I am trying to implement an equivalent of Java's PBEWithHmacSHA512AndAES_128 cipher for a Node.js project.
I basically need to encrypt a string using a cipher made by that algorithm.
Any help is appreciated. Thanks in advance.
This is the snippet of code I have written in Java:
public final static String ALGO_VIDEO_ENCRYPTOR = "PBEWithHmacSHA512AndAES_128";
static String password = "EncryptionSecretKey";
static byte[] salt = "NewSaltTest".getBytes();
static int iterCount = 12288;
static int derivedKeyLength = 256;
public static SecretKey encryptionKey;

public static SecretKey generateSecretKey() {
    SecretKey key = null;
    try {
        salt = Arrays.copyOf(salt, 128);
        KeySpec spec = new PBEKeySpec(password.toCharArray(), salt, iterCount, derivedKeyLength * 8);
        SecretKeyFactory skf = SecretKeyFactory.getInstance(ALGO_VIDEO_ENCRYPTOR);
        key = skf.generateSecret(spec);
    } catch (Exception e) {
        System.out.println(e);
    }
    return key;
}
public static String encryptString(SecretKey key, String input) {
    byte[] encrypted = null;
    try {
        byte[] iv = new byte[] { (byte) 0x8E, 0x12, 0x39, (byte) 0x9C,
                0x07, 0x72, 0x6F, 0x5A, (byte) 0x8E, 0x12, 0x39, (byte) 0x9C,
                0x07, 0x72, 0x6F, 0x5A };
        IvParameterSpec ivSpec = new IvParameterSpec(iv);
        PBEParameterSpec parameterSpec = new PBEParameterSpec(salt, iterCount, ivSpec);
        Cipher cipher = Cipher.getInstance(ALGO_VIDEO_ENCRYPTOR);
        cipher.init(Cipher.ENCRYPT_MODE, key, parameterSpec);
        String charSet = "UTF-8";
        byte[] in = input.getBytes(charSet);
        encrypted = cipher.doFinal(in);
    } catch (Exception e) {
        System.out.println("Encrypt error" + e);
    } finally {
        String encStr = new String(Base64.getEncoder().encode(encrypted));
        return encStr;
    }
}
public static String decryptString(SecretKey key, String input) throws UnsupportedEncodingException {
    byte[] decrypted = null;
    try {
        byte[] iv = new byte[] { (byte) 0x8E, 0x12, 0x39, (byte) 0x9C,
                0x07, 0x72, 0x6F, 0x5A, (byte) 0x8E, 0x12, 0x39, (byte) 0x9C,
                0x07, 0x72, 0x6F, 0x5A };
        IvParameterSpec ivSpec = new IvParameterSpec(iv);
        PBEParameterSpec parameterSpec = new PBEParameterSpec(salt, iterCount, ivSpec);
        Cipher cipher = Cipher.getInstance(ALGO_VIDEO_ENCRYPTOR);
        cipher.init(Cipher.DECRYPT_MODE, key, parameterSpec);
        byte[] enc = Base64.getDecoder().decode(input);
        decrypted = cipher.doFinal(enc);
    } catch (Exception e) {
        System.out.println("Decrypt error" + e);
    } finally {
        String charSet = "UTF-8";
        String plainStr = new String(decrypted, charSet);
        return plainStr;
    }
}
I am aware that the equivalent of this algorithm will basically derive an AES-128 key from the password using salted PBKDF2 (with HMAC-SHA512 as the PRF) and then encrypt the data using AES/CBC/PKCS5Padding with that key. (ref - https://neilmadden.blog/2017/11/17/java-keystores-the-gory-details/)
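To make that decomposition concrete, here is a rough sketch of the same two steps in C with OpenSSL (purely illustrative, not the Node.js code I'm after; assuming the Java provider really does a plain PBKDF2 derivation, the Node equivalents would be crypto.pbkdf2 with 'sha512' and crypto.createCipheriv('aes-128-cbc', key, iv)):

/* Illustrative only: the two steps PBEWithHmacSHA512AndAES_128 performs,
 * spelled out with OpenSSL. The key is 16 bytes because of the AES_128
 * suffix; salt, iteration count and IV must match the Java side. */
#include <openssl/evp.h>
#include <string.h>

int pbe_encrypt(const char *password,
                const unsigned char *salt, int salt_len, int iterations,
                const unsigned char iv[16],
                const unsigned char *plaintext, int pt_len,
                unsigned char *out, int *out_len)
{
    unsigned char key[16];
    int len = 0, total = 0, ok = 0;

    /* step 1: PBKDF2 with HMAC-SHA512 derives the AES-128 key */
    if (!PKCS5_PBKDF2_HMAC(password, (int)strlen(password), salt, salt_len,
                           iterations, EVP_sha512(), sizeof(key), key))
        return 0;

    /* step 2: AES-128-CBC with PKCS#5/PKCS#7 padding */
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    if (ctx &&
        EVP_EncryptInit_ex(ctx, EVP_aes_128_cbc(), NULL, key, iv) &&
        EVP_EncryptUpdate(ctx, out, &len, plaintext, pt_len)) {
        total = len;
        if (EVP_EncryptFinal_ex(ctx, out + total, &len)) {
            total += len;
            ok = 1;
        }
    }
    EVP_CIPHER_CTX_free(ctx);
    *out_len = total;
    return ok;
}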
I'm trying to simulate a Mastercard contactless transaction using a Raspberry Pi with a PN532 chip and a smartwatch. The response from SFI 2, record 2 contains the following CDOL1 data:
9F02069F03069F1A0295055F2A029A039C019F37049F35019F45029F4C089F34039F1D089F15029F4E14
which translates to:
TAG LENGTH
9F02 06
9F03 06
9F1A 02
95 05
5F2A 02
9A 03
9C 01
9F37 04
9F35 01
9F45 02
9F4C 08
9F34 03
9F1D 08
9F15 02
9F4E 14
I created the Get Application Cryptogram command following these specifications:
byte_t get_app_crypto[] = {
0x80, 0xAE, // CLA INS
0x80, 0x00, // P1 P2
0x43, // Lc (length of the CDOL1 data that follows)
0x00, 0x00, 0x00, 0x00, 0x01, 0x00, // amount
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // other amount
0x06, 0x42, // terminal country
0x00, 0x00, 0x00, 0x00, 0x00, // tvr terminal
0x09, 0x46, // currency code
0x20, 0x08, 0x23, // transaction date
0x00, // transaction type
0x11, 0x22, 0x33, 0x44, // UN
0x22, // terminal type
0x00, 0x00,// data auth code
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // icc dynamic
0x00, 0x00, 0x00, // cvm results
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // terminal risk management data (8 bytes)
0x54, 0x11, // merchant category code (2 bytes)
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // merchant name or location (14 bytes)
0x00, // Le
};
but the response from the watch is always "6700" (wrong length).
Any help would be appreciated.
As far as I remember, a very similar problem has already been resolved on SO, so it is generally worth searching first.
Back to your issue: you interpreted the 9F4E length as decimal 14, but it is hexadecimal 0x14, i.e. 20 bytes. Your data field is therefore six bytes short, and Lc should be 0x49 (73) rather than 0x43 (67).
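For reference, here is a sketch of how the corrected command could look. The field values are the ones from your post; only the two lengths change (Lc and the 9F4E field):

byte_t get_app_crypto[] = {
    0x80, 0xAE,                               // CLA INS
    0x80, 0x00,                               // P1 P2
    0x49,                                     // Lc: 73 bytes of CDOL1 data
    0x00, 0x00, 0x00, 0x00, 0x01, 0x00,       // 9F02 amount authorised
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00,       // 9F03 amount other
    0x06, 0x42,                               // 9F1A terminal country code
    0x00, 0x00, 0x00, 0x00, 0x00,             // 95   TVR
    0x09, 0x46,                               // 5F2A transaction currency code
    0x20, 0x08, 0x23,                         // 9A   transaction date
    0x00,                                     // 9C   transaction type
    0x11, 0x22, 0x33, 0x44,                   // 9F37 unpredictable number
    0x22,                                     // 9F35 terminal type
    0x00, 0x00,                               // 9F45 data authentication code
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // 9F4C ICC dynamic number
    0x00, 0x00, 0x00,                         // 9F34 CVM results
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // 9F1D terminal risk management data
    0x54, 0x11,                               // 9F15 merchant category code
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // 9F4E merchant name or location, 0x14 = 20 bytes
    0x00,                                     // Le
};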
EDIT: I'm reformulating this question, as I've managed to get the basics to work but am still experiencing problems.
I'm trying to emulate a USB device (bar code scanner) for testing purposes using usb-vhci, and I'm having some problems.
To give some context: the device is a CDC abstract modem, and the client (a Java program) communicates with it over the serial line using AT commands.
Basically, I've got my device up and running, it registers itself correctly and I'm able to receive commands from and respond to the client.
The main problem appears to be that as soon as the device starts up or receives a bulk transfer from the host, it triggers an ongoing stream of bulk and interrupt IN transfers (massive amounts; my usbmon log grows to 100 MB in a few seconds).
First, at startup, it keeps spewing out (mainly) bulk IN transfers until I receive the SET_CONTROL_LINE_STATE request, and then they stop. Then, when the client sends a command (an AT command via the serial device), it starts again.
I'm guessing this is because I'm not responding correctly to some transfer, but I can't figure out what it is.
I've been comparing the usbmon output of my device with that of the real device, but so far I haven't been able to detect any difference that would explain why my emulated device behaves like this and the real one doesn't.
I basically started out with the example code found in libusb_vhci/examples/virtual_device2.c and adapted it to mimic the actual device. First off the device descriptors:
const uint8_t dev_desc[] = {
/* Device Descriptor */
0x12, //bLength 18
0x01, //bDescriptorType 1
0x00, 0x02, //bcdUSB 2.00
0x02, //bDeviceClass 2 Communications
0x00, //bDeviceSubClass 0
0x00, //bDeviceProtocol 0
0x40, //bMaxPacketSize0 64
0x5a, 0x06, //idVendor 065a
0x02, 0xa0, //idProduct a002
0x00, 0x01, //bcdDevice 1.00
0x00, //iManufacturer 0
0x01, //iProduct 1
0x00, //iSerial 0
0x01 //bNumConfigurations 1
};
const uint8_t conf_desc[] = {
/* Configuration Descriptor */
0x09, //bLength 9
0x02, //bDescriptorType 2
0x43, 0x00, //wTotalLength 67 ??
0x02, //bNumInterfaces 2
0x01, //bConfigurationValue 1
0x00, //iConfiguration 0
0x80, //bmAttributes (Bus Powered) 0x80
250, //MaxPower 500mA
/* Interface Descriptor 0 */
0x09, //bLength 9
0x04, //bDescriptorType 4
0x00, //bInterfaceNumber 0
0x00, //bAlternateSetting 0
0x01, //bNumEndpoints 1
0x02, //bInterfaceClass 2 Communications
0x02, //bInterfaceSubClass 2 Abstract (modem)
0x00, //bInterfaceProtocol 0 None
0x00, //iInterface 0
/* CDC Header */
0x05, //bFunctionLength 5
0x24, //bDescriptorType CS_INTERFACE
0x00, //bDescriptorSubtype HEADER
0x10, //bcdCDC 1.10
0x01, //"
/* CDC Call Management */
0x05, //bFunctionLength 5
0x24, //bDescriptorType CS_INTERFACE
0x01, //bDescriptorSubtype CALL_MANAGEMENT
0x01, //bmCapabilities 0x01
0x00, //bDataInterface 0
/* CDC ACM */
0x04, //bFunctionLength 4
0x24, //bDescriptorType CS_INTERFACE
0x02, //bDescriptorSubtype ABSTRACT_CONTROL_MANAGEMENT
0x02, //bmCapabilities 0x02
/* CDC Union */
0x05, //bFunctionLength 5
0x24, //bDescriptorType CS_INTERFACE
0x06, //bDescriptorSubtype UNION
0x00, //bMasterInterface 0
0x01, //bSlaveInterface 1
/* Endpoint Descriptor */
0x07, //bLength 7
0x05, //bDescriptorType 5
0x83, //bEndpointAddress 0x83 EP 3 IN
0x03, //bmAttributes 3
0x40, 0x00, //wMaxPacketSize 0x0040 1x 64 bytes
0x0a, //bInterval 10
/* Interface Descriptor 1 */
0x09, //bLength 9
0x04, //bDescriptorType 4
0x01, //bInterfaceNumber 1
0x00, //bAlternateSetting 0
0x02, //bNumEndpoints 2
0x0a, //bInterfaceClass 10 CDC Data
0x00, //bInterfaceSubClass 0
0x00, //bInterfaceProtocol 0
0x00, //iInterface 0
/* Endpoint Descriptor */
0x07, //bLength 7
0x05, //bDescriptorType 5
0x01, //bEndpointAddress 0x01 EP 1 OUT
0x02, //bmAttributes 2
0x40, 0x00, //wMaxPacketSize 0x0040 1x 64 bytes
0x00, //bInterval 0
/* Endpoint Descriptor */
0x07, //bLength 7
0x05, //bDescriptorType 5
0x82, //bEndpointAddress 0x82 EP 2 IN
0x02, //bmAttributes 2
0x40,0x00, //wMaxPacketSize 0x0040 1x 64 bytes
0x00 //bInterval 0
};
const uint8_t str0_desc[] = {
0x04, //bLength 4
0x03, //bDescriptorType 3
0x09, 0x04 //bLanguage 0409 US
};
const uint8_t *str1_desc =
(uint8_t *)"\x36\x03O\0p\0t\0i\0c\0o\0n\0 \0U\0S\0B\00\0B\0a\0r\0c\0o\0d\0e\0 \0R\0e\0a\0d\0e\0r";
The main function is the same as in the example; the changes are mostly in the process_urb() function. The control section is largely intact, but I've added handling for some additional setup packets:
uint8_t rt = urb->bmRequestType;
uint8_t r = urb->bRequest;
if(rt == 0x00 && r == URB_RQ_SET_CONFIGURATION)
{
    devlog("URB_RQ_SET_CONFIGURATION\n");
    urb->status = USB_VHCI_STATUS_SUCCESS;
}
else if(rt == 0x00 && r == URB_RQ_SET_INTERFACE)
{
    devlog("URB_RQ_SET_INTERFACE\n");
    urb->status = USB_VHCI_STATUS_SUCCESS;
}
else if(rt == 0x21 && r == 0x20)
{
    devlog("URB_CDC_SET_LINE_CODING\n");
    urb->status = USB_VHCI_STATUS_SUCCESS;
}
else if(rt == 0x21 && r == 0x22)
{
    devlog("URB_CDC_SET_CONTROL_LINE_STATE\n");
    urb->status = USB_VHCI_STATUS_SUCCESS;
}
else if(rt == 0x80 && r == URB_RQ_GET_DESCRIPTOR)
{
    int l = urb->wLength;
    uint8_t *buffer = urb->buffer;
    devlog("GET_DESCRIPTOR ");
    switch(urb->wValue >> 8)
    {
    case 0:
        puts("WTF_DESCRIPTOR");
        urb->status = USB_VHCI_STATUS_SUCCESS;
        break;
    case 1:
        puts("DEV_DESC");
        if(dev_desc[0] < l) l = dev_desc[0];
        memcpy(buffer, dev_desc, l);
        urb->buffer_actual = l;
        urb->status = USB_VHCI_STATUS_SUCCESS;
        break;
    case 2:
        puts("CONF_DESC");
        if(conf_desc[2] < l) l = conf_desc[2];
        memcpy(buffer, conf_desc, l);
        urb->buffer_actual = l;
        urb->status = USB_VHCI_STATUS_SUCCESS;
        break;
    case 3:
        devlog(" Reading string %d\n", urb->wValue & 0xff);
        switch(urb->wValue & 0xff)
        {
        case 0:
            if(str0_desc[0] < l) l = str0_desc[0];
            memcpy(buffer, str0_desc, l);
            urb->buffer_actual = l;
            urb->status = USB_VHCI_STATUS_SUCCESS;
            break;
        case 1:
            if(str1_desc[0] < l) l = str1_desc[0];
            memcpy(buffer, str1_desc, l);
            urb->buffer_actual = l;
            urb->status = USB_VHCI_STATUS_SUCCESS;
            break;
        default:
            devlog(" Trying to read unknown string: %d\n", urb->wValue & 0xff);
            urb->status = USB_VHCI_STATUS_STALL;
            break;
        }
        break;
    default:
        devlog(" UNKNOWN: wValue=%d (%d)\n", urb->wValue, urb->wValue >> 8);
        urb->status = USB_VHCI_STATUS_STALL;
        break;
    }
}
else
{
    devlog("OTHER bmRequestType %x bRequest %x\n", rt, r);
    urb->status = USB_VHCI_STATUS_STALL;
}
The main issue is in handling the non-control transfers though. Here's my current implementation:
/* handle non-control sequences */
if(!usb_vhci_is_control(urb->type)) {
    /* if we have a BULK OUT transfer */
    if (usb_vhci_is_bulk(urb->type) && usb_vhci_is_out(urb->epadr)) {
        /* we have a bulk out transfer, i.e. a command from the client */
        int cmd = get_at_command(urb->buffer, urb->buffer_actual);
        if (cmd == COMMAND_Z1) {
            /* we have a request for the version, need to wait for the BULK IN transfer */
            last_command = cmd;
        }
        urb->status = USB_VHCI_STATUS_SUCCESS;
        return;
    }
    /* if we have a BULK IN transfer */
    if (usb_vhci_is_bulk(urb->type) && usb_vhci_is_in(urb->epadr)) {
        /* we have a BULK IN transfer, use it to respond to any buffered commands */
        if (last_command) {
            /* send version */
            memcpy(urb->buffer, VERSION_STR, strlen(VERSION_STR));
            urb->buffer_actual = strlen(VERSION_STR);
            last_command = 0;
            urb->status = USB_VHCI_STATUS_SUCCESS;
            return;
        }
    }
    urb->status = USB_VHCI_STATUS_SUCCESS;
    return;
}
Here's a snippet of the usbmon log I get as my device is starting up:
ffff880510727900 266671312 S Bi:5:002:2 -115 128 <
ffff880510727f00 266671315 C Bi:5:002:2 0 0
ffff880510727f00 266671316 S Bi:5:002:2 -115 128 <
ffff880510727cc0 266671319 C Ii:5:002:3 0:8 0
ffff880510727cc0 266671321 S Ii:5:002:3 -115:8 64 <
ffff880514d80900 266671323 S Co:5:002:0 s 21 22 0000 0000 0000 0
ffff880510727780 266671324 C Bi:5:002:2 0 0
ffff880510727780 266671325 S Bi:5:002:2 -115 128 <
ffff8805101096c0 266671329 C Bi:5:002:2 0 0
ffff8805101096c0 266671333 S Bi:5:002:2 -115 128 <
ffff8805107273c0 266671339 C Bi:5:002:2 0 0
ffff8805107273c0 266671344 S Bi:5:002:2 -115 128 <
ffff880510109b40 266671348 C Bi:5:002:2 0 0
ffff880510109b40 266671350 S Bi:5:002:2 -115 128 <
ffff880510109000 266671354 C Bi:5:002:2 0 0
ffff880510109000 266671357 S Bi:5:002:2 -115 128 <
ffff880510727d80 266671360 C Bi:5:002:2 0 0
ffff880510727d80 266671361 S Bi:5:002:2 -115 128 <
ffff880510109a80 266671363 C Bi:5:002:2 0 0
ffff880510109c00 266671370 C Bi:5:002:2 0 0
...
So, this is basically where I'm stuck. I've got a nearly functioning device, but the massive amount of transfers basically chokes my system, rendering it useless. Any help or info would be greatly appreciated!
It seems I have been able to resolve most of my issues now, and the problem was indeed me not responding correctly to events.
After doing some more detailed analysis of the usbmon output for the real device I noticed that it was responding to the superfluous interrupt transfers with -ENOENT, whereas I was responding with 0 (i.e. success). Some more digging into the usb-vhci code revealed that this error code corresponded to USB_VHCI_STATUS_CANCELED, and once I started responding with this I got the same behavior in my device as with the real device. Essentially I added this to my non-control section of process_urb:
/* if we have an INTERRUPT transfer */
if (usb_vhci_is_int(urb->type)) {
    urb->status = USB_VHCI_STATUS_CANCELED;
    return;
}
I'm not entirely out of the woods yet, though. The same thing seems to apply to bulk IN transfers: I'm getting a ton of them during startup (they stop as soon as setup is complete), which again does not appear to happen with the real device, and the real device again responds to these superfluous transfers with -ENOENT. I tried doing the same, and it mostly works: the additional transfers stop and the device behaves just like the real one, but unfortunately it also means my device can no longer send data back to the client. I modified my bulk IN handling code as follows:
/* if we have a BULK IN transfer */
if (usb_vhci_is_bulk(urb->type) && usb_vhci_is_in(urb->epadr)) {
    if (last_command) {
        // send version
        memcpy(urb->buffer, VERSION_STR, strlen(VERSION_STR));
        urb->buffer_actual = strlen(VERSION_STR);
        last_command = 0;
        urb->status = USB_VHCI_STATUS_SUCCESS;
    } else {
        urb->status = USB_VHCI_STATUS_CANCELED;
    }
    return;
}
I figure this should work: if I received a command in the previous bulk OUT transfer, I should be able to use the IN transfer to respond (as I've been doing all along), and if there was no command, I just respond with -ENOENT. For some reason this does not work, though, and I'm not sure why.
Another thing I noticed regarding the trace from the real device: although it does respond to these bulk transfers with -ENOENT, it sends the response more than 10 seconds (!) after receiving the request! Not sure what that's all about, but if anyone has an idea I'd be most grateful.
I have both a Visa Electron and a Debit Mastercard; however, whenever I try to select the application using their respective AIDs A0000000032010 and A0000000041010, I get a 6A82 response, meaning the file was not found.
Here's my code snippet for both selections:
//SELECTING VISA ELECRON CARD APPLICATION
byte[] ApduCMDVC = new byte[] { (byte) 0x00, (byte) 0xA4, (byte) 0x04,
(byte) 0x00, (byte) 0x0E,
(byte) 0x41, (byte)0x30, (byte) 0x30, (byte) 0x30,
(byte) 0x30,(byte) 0x30,(byte) 0x30,(byte) 0x30,(byte) 0x30,
(byte) 0x33,(byte) 0x32,(byte) 0x30,(byte) 0x31,(byte) 0x30};
//SELECTING MASTER CARD APPLICATION
byte[] ApduCMDMC = new byte[] { (byte) 0x00, (byte) 0xA4, (byte) 0x04,
(byte) 0x00, (byte) 0x0E,
(byte) 0x41, (byte)0x30, (byte) 0x30, (byte) 0x30,
(byte) 0x30,(byte) 0x30,(byte) 0x30,(byte) 0x30,(byte) 0x30,
(byte) 0x34,(byte) 0x31,(byte) 0x30,(byte) 0x31,(byte) 0x30};
Could there be something I'm doing wrong?
I looked back at some C++ code I used for this in the past, and I believe your problem is that you are sending the AID in ASCII, as opposed to binary. Here's the C++ byte array I use:
static BYTE selectAIDCmd[] = {0x00, 0xA4, 0x04, 0x00, 0x07, 0xA0, 0x00, 0x00, 0x00, 0x04, 0x10, 0x10 };
So, you need this:
byte[] ApduCMDMC = new byte[] { (byte) 0x00, (byte) 0xA4, (byte) 0x04,
(byte) 0x00, (byte) 0x07,
(byte) 0xA0, (byte)0x00, (byte) 0x00, (byte) 0x00, (byte)0x03,(byte) 0x20,(byte) 0x10};
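If you prefer to keep the AIDs as readable hex strings in your code, a small helper along these lines (an illustrative C sketch in the spirit of the C++ array above, not something provided by any framework) converts them to raw bytes and makes this ASCII-versus-binary mix-up impossible:

/* Hypothetical helper: convert a hex AID string such as "A0000000041010"
 * into raw bytes. Returns the number of bytes written, or -1 on malformed
 * input (odd length, non-hex characters, buffer too small). */
#include <stdio.h>
#include <string.h>

int aid_to_bytes(const char *hex, unsigned char *out, size_t out_len)
{
    size_t n = strlen(hex);
    if (n % 2 != 0 || n / 2 > out_len) return -1;
    for (size_t i = 0; i < n; i += 2) {
        unsigned int b;
        if (sscanf(hex + i, "%2x", &b) != 1) return -1;
        out[i / 2] = (unsigned char)b;
    }
    return (int)(n / 2);
}

Called as aid_to_bytes("A0000000041010", buf, sizeof(buf)), it returns 7 and fills buf with A0 00 00 00 04 10 10, which is what the SELECT command needs after the 0x07 length byte.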
Is it possible to pack an executable into a shared library and, upon calling a function inside said library:
unpack the executable
use the executable through fork
The reason I am asking is that I was recently faced with a situation where my shared library was being loaded in a "sandbox" environment (maybe chroot-based), and I would really have liked the possibility of spawning a separate process for an executable (loose coupling).
As long as you have permission to write to a directory on a filesystem that isn't mounted noexec, you can just store the executable in a large array of unsigned char, write it out with fwrite, and then use fork/exec to run it.
Really though, the best solution is just to use fork() without exec - just have the child side call into a different function after the fork() (and then exit with _exit() when that function is done).
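A minimal sketch of that fork()-without-exec() pattern (the function names here are just placeholders):

/* Sketch: the child jumps into a separate function and leaves with _exit()
 * when done, so it never returns into the parent's code path. */
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

static void child_work(void)
{
    /* whatever the "separate executable" would have done goes here */
}

void run_detached_work(void)
{
    pid_t child = fork();
    if (child == 0) {        /* child: do the work, then exit without
                                running the parent's atexit handlers */
        child_work();
        _exit(0);
    } else if (child > 0) {  /* parent: reap the child */
        waitpid(child, NULL, 0);
    }
}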
Completely plausible.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <sys/wait.h>

static const char program[] = {
    0x7f, 0x45, 0x4c, 0x46, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x43, 0x05, 0x02, 0x00, 0x03, 0x00, 0x1a, 0x00, 0x43, 0x05,
    0x1a, 0x00, 0x43, 0x05, 0x04, 0x00, 0x00, 0x00, 0xb9, 0x31, 0x00, 0x43,
    0x05, 0xb2, 0x0d, 0xcd, 0x80, 0x25, 0x20, 0x00, 0x01, 0x00, 0x93, 0xcd,
    0x80, 0x68, 0x65, 0x6c, 0x6c, 0x6f, 0x2c, 0x20, 0x77, 0x6f, 0x72, 0x6c,
    0x64, 0x0a
};

void hello(void) {
    int fd;
    pid_t child;
    char name[1024];
    char *tmp = getenv("TEMP") ?: getenv("TMP") ?: "/tmp";
    if (strlen(tmp) > sizeof(name) - 8) return;
    sprintf(name, "%s/XXXXXX", tmp);
    fd = mkstemp(name);
    if (fd == -1) return;
    if (write(fd, program, sizeof(program)) < (ssize_t)sizeof(program)) {
        close(fd);
        unlink(name);
        return;
    }
    fchmod(fd, 0700);
    close(fd);
    (child = fork()) ? waitpid(child, 0, 0) : execl(name, name, (char *)NULL);
    unlink(name);
}
When run on Linux x86 or compatible, this function will print "hello, world" to the screen.
However, I would definitely not recommend this. If you want a separate binary, just ship a separate binary, and require that it be installed in the sandbox along with your library.
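If you go that route, one way the library can locate the companion binary at run time is to resolve its own path with dladdr() and exec a helper installed next to it. A rough sketch follows; the "helper" name is a placeholder, and on glibc you need _GNU_SOURCE for dladdr() and may need to link with -ldl:

/* Rough sketch: find the directory this shared library was loaded from via
 * dladdr() and run a companion binary ("helper") installed alongside it. */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <libgen.h>
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

void run_companion(void)
{
    Dl_info info;
    char dir[4096], path[4096];
    pid_t child;

    /* any symbol defined in this library works as the lookup address */
    if (!dladdr((void *)run_companion, &info) || !info.dli_fname)
        return;

    /* build "<directory of the .so>/helper" */
    strncpy(dir, info.dli_fname, sizeof(dir) - 1);
    dir[sizeof(dir) - 1] = '\0';
    snprintf(path, sizeof(path), "%s/helper", dirname(dir));

    child = fork();
    if (child == 0) {
        execl(path, path, (char *)NULL);
        _exit(127);                 /* exec failed */
    } else if (child > 0) {
        waitpid(child, NULL, 0);
    }
}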