Java Card APDU Error - javacard

I am trying to send a simple APDU to a Java Card (the applet's code is attached below). I already tested the applet in the Eclipse simulator, but when I send an APDU to the applet on the card it fails with the following error: send_APDU() returns 0x80206E00 (6E00: Wrong CLA byte.). The applet is already installed on the card (I used GPShell to do that). Here is the full output from the script I used to send the APDU.
D:\GPShell-1.4.4>GPShell.exe send_APDU.txt
establish_context
enable_trace
enable_timer
card_connect
command time: 15 ms
send_apdu -sc 0 -APDU b0000000010000
Command --> B0000000010000
Wrapped command --> B0000000010000
Response <-- 6E00
send_APDU() returns 0x80206E00 (6E00: Wrong CLA byte.)
command time: 860 ms
card_disconnect
command time: 31 ms
release_context
command time: 0 ms
Here is the full code of the applet.
import javacard.framework.APDU;
import javacard.framework.Applet;
import javacard.framework.ISO7816;
import javacard.framework.ISOException;

public class Contor extends Applet {

    private byte contor = 0;

    private final static byte CLS  = (byte) 0xB0;
    private final static byte INC  = (byte) 0x00;
    private final static byte DEC  = (byte) 0x01;
    private final static byte GET  = (byte) 0x02;
    private final static byte INIT = (byte) 0x03;

    private Contor() {
    }

    public static void install(byte bArray[], short bOffset, byte bLength) throws ISOException {
        new Contor().register();
    }

    public void process(APDU apdu) throws ISOException {
        if (this.selectingApplet()) return;
        byte[] buffer = apdu.getBuffer();
        if (buffer[ISO7816.OFFSET_CLA] != CLS)
            ISOException.throwIt(ISO7816.SW_CLA_NOT_SUPPORTED);
        switch (buffer[ISO7816.OFFSET_INS]) {
            case INC: contor++; break;
            case DEC: contor--; break;
            case GET:
                buffer[0] = contor;
                apdu.setOutgoingAndSend((short) 0, (short) 1);
                break;
            case INIT:
                apdu.setIncomingAndReceive();
                contor = buffer[ISO7816.OFFSET_CDATA];
                break;
        }
    }
}

In order to communicate with your applet, you must select it first.
To do that you have two options. The first option is to make your applet Default Selected in the installation phase, so that it is implicitly selected after each power-up. The second option is to send a SELECT APDU command containing your applet's AID before sending any other commands.
SELECT APDU Command = 00A40400 <AID Length> <AID>
Otherwise, the entity that responds to your command is not your applet; most probably it is the default-selected applet, i.e. the Card Manager.
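For example, with GPShell the SELECT can be sent as a plain APDU before the counter command from the question. This is only a sketch: A000000001 below is a placeholder AID (5 bytes, hence the 05 length byte); replace it with the AID your Contor applet was actually installed under.

    establish_context
    card_connect
    send_apdu -sc 0 -APDU 00A4040005A000000001
    send_apdu -sc 0 -APDU B0000000010000
    card_disconnect
    release_context

Once the SELECT returns 9000, the subsequent B0 command reaches the Contor applet instead of the Card Manager and the 6E00 should disappear.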

Related

How to assert keyboard is closed?

I want to check that my mobile keyboard is closed with an assertion.

    public void keyboardIsClosed() {
        boolean isKeyboardShown = driver.isKeyboardShown();
        Assert.assertFalse("Keyboard is opened.", isKeyboardShown());
    }

It doesn't accept the last isKeyboardShown() in the code. How should I write it?
I want to test whether or not the soft keyboard is shown, using Java.
Also

    public void keyboardIsClosed() {
        Boolean isKeyboardShown;
        isKeyboardShown = driver.isKeyboardShown();
        Assert.assertFalse("Keyboard is opened.", isKeyboardShown == true);
    }

works 🚀
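The root cause in the first snippet is that isKeyboardShown is a local boolean variable, but the assertion invokes it as a method with (). A minimal sketch of the direct fix, assuming the Appium driver.isKeyboardShown() call and the Assert class from the question, is simply to pass the variable itself:

    public void keyboardIsClosed() {
        boolean isKeyboardShown = driver.isKeyboardShown();
        // pass the boolean value itself; isKeyboardShown() would be a method call
        Assert.assertFalse("Keyboard is opened.", isKeyboardShown);
    }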

Is there a way to get how long a key is being pressed on in MIPS?

I'm making a synth for school, and I want to be able to play a note for as long as the key is being pressed. I already know that syscall 30 will give me the system time, but I don't know how to tell when the key is no longer pressed.
Here's some relevant code from the Keyboard and Display Simulator plug-in tool for MARS & RARS (the RISC-V version of MARS):
    private class KeyboardKeyListener implements KeyListener {
        public void keyTyped(KeyEvent e) {
            int updatedReceiverControl = readyBitSet(RECEIVER_CONTROL);
            updateMMIOControlAndData(RECEIVER_CONTROL, updatedReceiverControl, RECEIVER_DATA, e.getKeyChar() & 0x00000ff);
            if (updatedReceiverControl != 1) {
                InterruptController.registerExternalInterrupt(EXTERNAL_INTERRUPT_KEYBOARD);
            }
        }

        /* Ignore key pressed event from the text field. */
        public void keyPressed(KeyEvent e) {
        }

        /* Ignore key released event from the text field. */
        public void keyReleased(KeyEvent e) {
        }
    }
You could alter this code to do something in the lower-level event handlers, keyPressed and keyReleased, instead of the higher-level keyTyped event.
The idea is that you could supply key-code data rather than character data, either at some new MMIO register location, or by hijacking the existing receiver data and control locations for key codes instead of characters.
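As a rough Java-side illustration of that idea (this is not the actual MARS plug-in code; the class below is made up for the sketch), keyPressed and keyReleased can be used to measure how long a key is held:

    import java.awt.event.KeyAdapter;
    import java.awt.event.KeyEvent;
    import java.util.HashMap;
    import java.util.Map;

    // illustrative only: records how long each key is held down
    class KeyDurationListener extends KeyAdapter {
        private final Map<Integer, Long> pressedAt = new HashMap<Integer, Long>();

        @Override
        public void keyPressed(KeyEvent e) {
            // auto-repeat fires keyPressed repeatedly; only record the first press
            if (!pressedAt.containsKey(e.getKeyCode())) {
                pressedAt.put(e.getKeyCode(), System.currentTimeMillis());
            }
        }

        @Override
        public void keyReleased(KeyEvent e) {
            Long start = pressedAt.remove(e.getKeyCode());
            if (start != null) {
                long heldMillis = System.currentTimeMillis() - start;
                // heldMillis could then be written to a new MMIO register, or the raw
                // press/release key codes forwarded to the existing receiver registers
            }
        }
    }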

Memory leak in Camel netty TCP client when consuming lines with Windows line breaks (CR LF)

My Camel Netty TCP client consuming text lines seems to have a memory leak, but only if the test data lines end with Windows (CR LF) line breaks. I encountered no issues with Unix (LF) line breaks.
I made a short test to demonstrate the issue, simulating a TCP server that continuously sends test data lines.
With Unix (LF) line breaks in the test data I see a throughput of about 3,500 messages/second and a steady 180 MB of RAM use. No issues.
With Windows (CR LF) line breaks in the test data I see a throughput starting at 380,000 (woah!) messages/second, until the -Xmx4G heap limit is hit after about 30 seconds and everything slows down considerably, probably because of excessive GC; if given more heap, usage grows steadily until that limit is hit as well (tried with -Xmx20G).
The only difference is really the line breaks in my test data...
Am I missing something here?
Using Camel 2.24.0 (which uses Netty 4.1.32.Final) on Linux with OpenJDK 1.8.0_192. The problem also occurs with the latest Netty 4.1.36.Final, and with the OpenJ9 JVM, so it does not seem to be JVM-specific.
    public abstract class MyRouteBuilderTestBase extends CamelTestSupport {
        private final int nettyPort = AvailablePortFinder.getNextAvailable();
        private ServerSocket serverSocket;
        private Socket clientSocket;
        private PrintWriter out;

        @Override
        protected RouteBuilder createRouteBuilder() {
            return new RouteBuilder() {
                public void configure() {
                    from("netty4:tcp://localhost:" + nettyPort + "?clientMode=true&textline=true&sync=false")
                        .to("log:throughput?level=INFO&groupInterval=10000&groupActiveOnly=false");
                }
            };
        }

        protected void startServerStub(String testdata) throws Exception {
            serverSocket = new ServerSocket(nettyPort);
            clientSocket = serverSocket.accept();
            out = new PrintWriter(clientSocket.getOutputStream(), true);
            for (;;) {
                out.print(testdata);
            }
        }

        @After
        public void after() throws Exception {
            if (out != null) out.close();
            if (clientSocket != null) clientSocket.close();
            if (serverSocket != null) serverSocket.close();
        }
    }
    public class MyRouteBuilderTestUnixLineBreaks extends MyRouteBuilderTestBase {
        @Test
        public void testUnixLineBreaks() throws Exception {
            startServerStub("my test data\n"); // Unix LF
        }
    }

    public class MyRouteBuilderTestWindowsLineBreaks extends MyRouteBuilderTestBase {
        @Test
        public void testWindowsLineBreaks() throws Exception {
            startServerStub("my test data\r\n"); // Windows CR LF
        }
    }
Heap dump analysis showed that the memory is allocated by one instance of io.netty.util.concurrent.DefaultEventExecutor, which internally uses a LinkedBlockingQueue of unlimited size. This queue grows indefinitely under load, causing the issue.
The DefaultEventExecutor is created by Camel because of the parameter usingExecutorService, which is true by default (maybe not a good choice). Setting usingExecutorService=false makes Netty use its event loop instead of the executor, which works much better.
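Applied to the test route above, this amounts to appending the option to the endpoint URI (the complete adjusted test is shown further below):

    from("netty4:tcp://localhost:" + nettyPort + "?clientMode=true&textline=true&sync=false&usingExecutorService=false")
        .to("log:throughput?level=INFO&groupInterval=10000&groupActiveOnly=false");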
I now get 600,000 messages per second throughput with data using Windows line breaks (CR LF), with a steady RAM use of about 200 MB (-Xmx500M). Nice.
Though with data using Unix line breaks (LF) the throughput was only about 6,500 messages per second, two orders of magnitude slower, which was still puzzling.
The reason is that Camel creates its own org.apache.camel.component.netty4.codec.DelimiterBasedFrameDecoder class by subclassing Netty's io.netty.handler.codec.DelimiterBasedFrameDecoder; I don't know why, since Camel's class does not add any functionality. But by subclassing, Camel prevents an optimization inside Netty's DelimiterBasedFrameDecoder which switches to io.netty.handler.codec.LineBasedFrameDecoder internally, but only if not subclassed.
To overcome this I needed to explicitly declare the decoders and encoders using Netty's classes instead, in addition to setting usingExecutorService=false.
Now I get the 600,000 messages per second throughput with data using Unix line breaks (LF) too, and see a steady RAM use of about 200 MB. That looks much better.
    public abstract class MyRouteBuilderTestBase extends CamelTestSupport {
        private final int nettyPort = AvailablePortFinder.getNextAvailable();
        private ServerSocket serverSocket;
        private Socket clientSocket;
        private PrintWriter out;

        @Override
        protected JndiRegistry createRegistry() throws Exception {
            JndiRegistry registry = super.createRegistry();

            List<ChannelHandler> decoders = new ArrayList<>();
            DefaultChannelHandlerFactory decoderTextLine = new DefaultChannelHandlerFactory() {
                @Override
                public ChannelHandler newChannelHandler() {
                    return new io.netty.handler.codec.DelimiterBasedFrameDecoder(1024, true, Delimiters.lineDelimiter());
                    // Works too:
                    // return new LineBasedFrameDecoder(1024, true, true);
                }
            };
            decoders.add(decoderTextLine);
            ShareableChannelHandlerFactory decoderStr = new ShareableChannelHandlerFactory(new StringDecoder(CharsetUtil.US_ASCII));
            decoders.add(decoderStr);
            registry.bind("decoders", decoders);

            List<ChannelHandler> encoders = new ArrayList<>();
            ShareableChannelHandlerFactory encoderStr = new ShareableChannelHandlerFactory(new StringEncoder(CharsetUtil.US_ASCII));
            encoders.add(encoderStr);
            registry.bind("encoders", encoders);

            return registry;
        }

        @Override
        protected RouteBuilder createRouteBuilder() {
            return new RouteBuilder() {
                public void configure() {
                    from("netty4:tcp://localhost:" + nettyPort + "?clientMode=true&textline=true&sync=false&usingExecutorService=false&encoders=#encoders&decoders=#decoders")
                        .to("log:throughput?level=INFO&groupInterval=10000&groupActiveOnly=false");
                }
            };
        }

        protected void startServerStub(String testdata) throws Exception {
            serverSocket = new ServerSocket(nettyPort);
            clientSocket = serverSocket.accept();
            out = new PrintWriter(clientSocket.getOutputStream(), true);
            for (;;) {
                out.print(testdata);
            }
        }

        @After
        public void after() throws Exception {
            if (out != null) out.close();
            if (clientSocket != null) clientSocket.close();
            if (serverSocket != null) serverSocket.close();
        }
    }
Update: The memory usage issue is not a memory leak (and I regret phrasing my question that way) but a buffering problem. Please consult the comments on this answer by users Bedla and Claus Ibsen to get a good understanding of the consequences of the solution outlined above. Please also consult CAMEL-13527.

Java card send APDU command through server

I have created a server to send APDU commands to a Java Card application.
The connection is established successfully. The only problem I encounter is that I send a command successfully but the Java Card application doesn't receive it.
The following code represents the client from which I send the commands:
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.Socket;

    import com.sun.javacard.apduio.Apdu;
    import com.sun.javacard.apduio.CadClientInterface;
    import com.sun.javacard.apduio.CadDevice;
    import com.sun.javacard.apduio.CadTransportException;

    public class Client {
        public static void main(String[] args) {
            Socket sock;
            try {
                sock = new Socket("localhost", 9025);
                InputStream is = sock.getInputStream();
                OutputStream os = sock.getOutputStream();
                CadClientInterface cad = CadDevice.getCadClientInstance(CadDevice.PROTOCOL_T0, is, os);
                Apdu apdu = new Apdu();
                System.out.println("Initialized apdu !");
                byte[] installer = new byte[]{0x00, (byte) 0xA4, 0x04, 0x00, 0x09, (byte) 0xA0, 0x00, 0x00, 0x00, 0x62, 0x03, 0x01, 0x08, 0x01, 0x7F};
                System.out.println("Prepare installer of cap !");
                apdu.setDataIn(installer, installer.length);
                System.out.println("Apdu set !");
                System.out.println("Apdu sent !");
                System.out.println(apdu);
                cad.powerUp();
                System.out.println("Powered up !");
                cad.exchangeApdu(apdu);
                cad.powerDown();
                System.out.println("Powered down !");
            } catch (IOException | CadTransportException e) {
                e.printStackTrace();
                System.out.println("Fail! " + e.getMessage());
            }
        }
    }
The Java Card applet is a simple applet created by the IDE.
    import javacard.framework.APDU;
    import javacard.framework.Applet;

    public class Proj extends Applet {
        /**
         * Installs this applet.
         *
         * @param bArray
         *            the array containing installation parameters
         * @param bOffset
         *            the starting offset in bArray
         * @param bLength
         *            the length in bytes of the parameter data in bArray
         */
        public static void install(byte[] bArray, short bOffset, byte bLength) {
            new Proj();
        }

        /**
         * Only this class's install method should create the applet object.
         */
        protected Proj() {
            register();
        }

        /**
         * Processes an incoming APDU.
         *
         * @see APDU
         * @param apdu
         *            the incoming APDU
         */
        @Override
        public void process(APDU apdu) {
            // Insert your code here
        }
    }
On the Java Card side I power on the device and the port is established. I do not know why the command is sent successfully yet the Java Card side doesn't receive it.
Edit:
I found out why the Java Card didn't receive any data. The problem is within the client. When the statement cad.powerUp(); is reached, the whole client blocks and nothing more is executed, as if a sleep() function had been called. So now the real problem is why cad.powerUp() blocks the client.
I am assuming that you have pasted the full applet code here. The Applet.register() method is not called in your applet on installation. So the applet is never registered with the JCRE and therefore no APDU can be received by it.
In fact, it will not even be available for selection, because the JCRE does not have any information about it.
Modify the code as follows and share the result.
    public static void install(byte[] bArray, short bOffset, byte bLength) {
        new Proj();
    }

    protected Proj() {
        register();
    }
Also, make sure to install the applet.

J2ME Bluetooth client: startInquiry finds nothing

I am developing a simple J2ME Bluetooth client and have a problem with the Bluetooth device search.
The startInquiry call finds nothing.
Client: Nokia 5220
Server: my PC with a Bluetooth adapter
All Bluetooth devices are on.
    /*
     * To change this template, choose Tools | Templates
     * and open the template in the editor.
     */
    import javax.microedition.midlet.*;
    import javax.bluetooth.*;
    import java.util.Vector;
    import javax.microedition.lcdui.*;

    /**
     * @author Администратор
     */
    public class Midlet extends MIDlet implements DiscoveryListener
    {
        private static Vector vecDevices = new Vector();
        private static String connectionURL = null;
        private LocalDevice localDevice;
        private DiscoveryAgent agent;
        private RemoteDevice remoteDevice;
        private RemoteDevice[] devList;
        private Display display;
        private Form form;

        public void startApp() {
            display = Display.getDisplay(this);
            form = new Form( "Client" );
            try {
                localDevice = LocalDevice.getLocalDevice();
            } catch( BluetoothStateException e ) {
                e.printStackTrace();
            }
            form.append("Address: " + localDevice.getBluetoothAddress() + "\n\n");
            form.append("Name: " + localDevice.getFriendlyName() + "\n\n");
            try {
                agent = localDevice.getLocalDevice().getDiscoveryAgent();
                form.append("Starting device inquiry... \n\n");
                boolean si = agent.startInquiry(DiscoveryAgent.GIAC, this);
                if ( si ) {
                    form.append("true");
                } else {
                    form.append("false");
                }
            } catch( BluetoothStateException e ) {
            }
            int deviceCount = vecDevices.size();
            if (deviceCount <= 0) {
                form.append("No Devices Found .");
            }
            else {
                // print bluetooth device addresses and names in the format [ No. address (name) ]
                form.append("Bluetooth Devices: ");
                for (int i = 0; i < deviceCount; i++) {
                    remoteDevice = (RemoteDevice) vecDevices.elementAt(i);
                    form.append( remoteDevice.getBluetoothAddress() );
                }
            }
            display.setCurrent(form);
        }

        public void pauseApp() {
        }

        public void destroyApp(boolean unconditional) {
        }

        public void deviceDiscovered(RemoteDevice btDevice, DeviceClass cod) {
            // add the device to the vector
            if (!vecDevices.contains(btDevice)) {
                vecDevices.addElement(btDevice);
            }
        }

        public void inquiryCompleted(int discType)
        {
        }

        // implement this method since services are not being discovered
        public void servicesDiscovered(int transID, ServiceRecord[] servRecord) {
            if (servRecord != null && servRecord.length > 0) {
                connectionURL = servRecord[0].getConnectionURL(0, false);
            }
        }

        // implement this method since services are not being discovered
        public void serviceSearchCompleted(int transID, int respCode) {
        }
    }
Not sure what the exact problem is, but you definitely don't want to be doing this in your MIDlet's startApp() method. This is a system lifecycle method and should return quickly, but scanning for Bluetooth devices will block it for a long time. Your startApp() method is tying up the device's resources, which it could need for doing the actual scanning!
Refactor so your device scanning is done in a new thread, then see what happens.
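A minimal sketch of that refactoring (the runInquiry() method name is made up here; it would contain the discovery code currently in startApp()):

    public void startApp() {
        display = Display.getDisplay(this);
        form = new Form("Client");
        display.setCurrent(form);
        // run the slow, blocking Bluetooth inquiry off the lifecycle thread
        new Thread(new Runnable() {
            public void run() {
                runInquiry(); // hypothetical method holding the discovery code shown above
            }
        }).start();
    }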
You seem to have misunderstood how the Bluetooth API works. The startInquiry method only starts the device discovery process and returns immediately afterwards, leaving the discovery running in the background. When devices are discovered, you get a callback of the deviceDiscovered method for each of them, and when the discovery process has completed, you get a callback of the inquiryCompleted method. So you need to move the accessing of the vecDevices member and the form manipulation from startApp to inquiryCompleted to be able to actually show the discovered information.
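A rough sketch of that restructuring, reusing the fields from the MIDlet above and showing only the part that moves into the callback:

    public void inquiryCompleted(int discType) {
        // called by the Bluetooth stack once the background inquiry has finished
        int deviceCount = vecDevices.size();
        if (deviceCount <= 0) {
            form.append("No Devices Found .");
        } else {
            form.append("Bluetooth Devices: ");
            for (int i = 0; i < deviceCount; i++) {
                RemoteDevice dev = (RemoteDevice) vecDevices.elementAt(i);
                form.append(dev.getBluetoothAddress());
            }
        }
        display.setCurrent(form);
    }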
You say all devices are on, but also check whether all devices are discoverable.
I've made this mistake before myself!
Look up the method LocalDevice.setDiscoverable() if you want to toggle between modes programmatically.
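For example, on the device that should be found, the local device can be made discoverable for general inquiry like this (both calls can throw BluetoothStateException):

    LocalDevice local = LocalDevice.getLocalDevice();
    // request general discoverability (GIAC); returns false if the stack refuses
    boolean ok = local.setDiscoverable(DiscoveryAgent.GIAC);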
