FINAL EDIT: SOLVED. Upgrading the local dev machine to Railo 3.3.4.003 resolved the issue.
I have to RC4-encrypt some strings and base64-encode the result, and I'm running into a situation where the same input generates different output on two different dev setups.
For instance, if I have a string test2#mail.com
On one machine (DEV-1) I'll get: DunU+ucIPz/Z7Ar+HTw=
and on the other (DEV-2) it'll be: DunU+ucIlZfZ7Ar+HTw=
First, I'm RC4-encrypting it with a function found here (reproduced below).
Next I'm feeding it to: toBase64( my_rc4_encrypted_data, "iso-8859-1")
As far as I can tell the rc4 encryption output is the same on both (or I'm missing something).
Below are SERVER variables from both machines as well as the encryption function.
Is this something we'll simply have to live with, or is there something I can do to 'handle it properly' (for lack of a better term)?
I'm concerned that this will bite me in the future and wonder if it can be averted.
Edit 1:
Output from my_rc4_encrypted_data.getBytes() returns:
dev-1:
Native Array (byte[])
14--23--44--6--25-8-63-63--39--20-10--2-29-60
dev-2:
Native Array (byte[])
14--23--44--6--25-8-63-63--39--20-10--2-29-60
(no encoding passed to getBytes() )
DEV-1 (remote)
server.coldfusion
productname Railo
productversion 9,0,0,1
server.java
archModel 64
vendor Sun Microsystems Inc.
version 1.6.0_26
server.os
arch amd64
archModel 64
name Windows Server 2008 R2
version 6.1
server.railo
version 3.3.2.002
server.servlet
name Resin/4.0.18
DEV-2 (local)
server.coldfusion
productname Railo
productversion 9,0,0,1
server.java
vendor Oracle Corporation
version 1.7.0_01
server.os
arch x86
name Windows 7
version 6.1
server.railo
version 3.2.2.000
server.servlet
name Resin/4.0.18
RC4 function:
function RC4(strPwd, plaintxt) {
    var sbox = ArrayNew(1);
    var key = ArrayNew(1);
    var tempSwap = 0;
    var a = 0;
    var b = 0;
    var intLength = len(strPwd);
    var temp = 0;
    var i = 0;
    var j = 0;
    var k = 0;
    var cipherby = 0;
    var cipher = "";
    // key scheduling: initialize the S-box and repeat the password across the key array
    for (a = 0; a lte 255; a = a + 1) {
        key[a + 1] = asc(mid(strPwd, (a mod intLength) + 1, 1));
        sbox[a + 1] = a;
    }
    // key scheduling: permute the S-box using the key
    for (a = 0; a lte 255; a = a + 1) {
        b = (b + sbox[a + 1] + key[a + 1]) mod 256;
        tempSwap = sbox[a + 1];
        sbox[a + 1] = sbox[b + 1];
        sbox[b + 1] = tempSwap;
    }
    // keystream generation: XOR each plaintext character with the next keystream byte
    for (a = 1; a lte len(plaintxt); a = a + 1) {
        i = (i + 1) mod 256;
        j = (j + sbox[i + 1]) mod 256;
        temp = sbox[i + 1];
        sbox[i + 1] = sbox[j + 1];
        sbox[j + 1] = temp;
        k = sbox[((sbox[i + 1] + sbox[j + 1]) mod 256) + 1];
        cipherby = BitXor(asc(mid(plaintxt, a, 1)), k);
        cipher = cipher & chr(cipherby);
    }
    return cipher;
}
Leigh wrote:
But be sure to use the same encoding in your test, i.e. String.getBytes(encoding). (Edit) If you omit it, the JVM default is used.
Leigh is right: RAILO-1393 resulted in a change to toBase64 related to charset encodings in 3.3.0.017, which falls between the 3.2.2.000 and 3.3.2.002 versions you are running.
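To see Leigh's point in isolation, here is a minimal Java sketch (Railo runs on the JVM, so the same rule applies to the string you pass to toBase64); the sample string is made up:
import java.nio.charset.Charset;
import java.util.Arrays;

public class CharsetDefaults {
    public static void main(String[] args) {
        // Made-up stand-in for RC4 output: raw RC4 bytes fall outside ASCII,
        // so the String <-> byte[] conversion depends on the charset used.
        String rc4Output = "\u00e9\u00fa\u0014?";

        System.out.println("JVM default charset: " + Charset.defaultCharset());

        // Uses the JVM default charset (file.encoding), so it can differ per machine
        System.out.println(Arrays.toString(rc4Output.getBytes()));

        // Explicit charset: identical bytes on every machine
        System.out.println(Arrays.toString(rc4Output.getBytes(Charset.forName("ISO-8859-1"))));
    }
}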
As far as I can tell the rc4 encryption output is the same on both (or I'm missing something). Below are SERVER variables from both machines as well as the encryption function.
I would suggest saving the output to two files and then comparing the file sizes or, even better, using a file comparison tool. Base64 encoding is a standard approach to converting binary data into string data.
Assuming that your binary files are both exactly 100% the same, try converting the data to base64 and then back to binary again on both of your servers. I would predict that only one (or neither) of the servers is able to convert the data back to binary again. At that point, you should have a clue about which server is causing your problem and can dig in further.
If they both can reverse the base 64 data to binary and the binary is correct on both servers... well, I'm not sure.
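If it helps, a rough Java sketch of that round-trip test might look like this (the byte values are arbitrary stand-ins for your saved RC4 output; in CFML, toBase64()/toBinary() play the same roles):
import java.util.Arrays;
import javax.xml.bind.DatatypeConverter;

public class Base64RoundTrip {
    public static void main(String[] args) {
        // Arbitrary stand-in for the raw RC4 bytes read back from the saved file
        byte[] original = { 14, -23, -44, -6, -25, 8, 63, 63 };

        // Encode to base64, then decode straight back to binary
        String encoded = DatatypeConverter.printBase64Binary(original);
        byte[] decoded = DatatypeConverter.parseBase64Binary(encoded);

        System.out.println("base64:   " + encoded);
        System.out.println("lossless: " + Arrays.equals(original, decoded));
    }
}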
Related
I'm creating a website and I've understood that the best authentication is through JWT, but I'm not a fan of frameworks because I like to go deep into the code and understand everything in my files. So I'm asking if someone has done it, or something similar, in pure Node.js and could give me an explanation of how to do that.
Thanks
Yes, it's of course possible; just consider how frameworks are made. There's no magic involved, just knowledge and a lot of JavaScript code.
You can find the sources of most frameworks on GitHub and study them there.
As a first step, you should make yourself familiar with the basics of JWT, e.g. with the help of this introduction and by reading RFC 7519.
You'll find out that a JWT basically consists of base64url-encoded JSON objects and a base64url-encoded signature.
The simplest signature algorithm is HS256 (HMAC-SHA256).
In the jwt.io debugger window, you see in the right column the pseudo code for creating a JWT signature:
HMACSHA256(
base64UrlEncode(header) + "." +
base64UrlEncode(payload),
secret
)
so basically you need to learn:
what information goes into the JWT header and payload (i.e. claims)
how to base64url encode a string or bytearray
how to create a SHA256 hash
how to use the hashing algorithm to create an HMAC.
With this, you already have a basic JWT framework that would allow you to create a signed token and verify the signature.
In the next step you can add "features" like
verification of expiration time
verification of issuer, audience
advanced signing algorithms.
You can use the jwt.io debugger to check whether your token can be decoded and verified.
Depending on your needs, you might use something like this:
export const setJwtToken = (headers, payload) => {
  // base64url-encode a string (the "-_" alphabet, no "=" padding)
  const base64Encode = str => {
    // convert to a byte-per-char representation so charCodeAt() yields UTF-8 bytes
    const utf8str = unescape(encodeURIComponent(str))
    const b64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_"
    const len = utf8str.length
    let dst = ""
    let i
    for (i = 0; i <= len - 3; i += 3) {
      dst += b64.charAt(utf8str.charCodeAt(i) >>> 2)
      dst += b64.charAt(((utf8str.charCodeAt(i) & 3) << 4) | (utf8str.charCodeAt(i + 1) >>> 4))
      dst += b64.charAt(((utf8str.charCodeAt(i + 1) & 15) << 2) | (utf8str.charCodeAt(i + 2) >>> 6))
      dst += b64.charAt(utf8str.charCodeAt(i + 2) & 63)
    }
    if (len % 3 == 2) {
      dst += b64.charAt(utf8str.charCodeAt(i) >>> 2)
      dst += b64.charAt(((utf8str.charCodeAt(i) & 3) << 4) | (utf8str.charCodeAt(i + 1) >>> 4))
      dst += b64.charAt(((utf8str.charCodeAt(i + 1) & 15) << 2))
    }
    else if (len % 3 == 1) {
      dst += b64.charAt(utf8str.charCodeAt(i) >>> 2)
      dst += b64.charAt(((utf8str.charCodeAt(i) & 3) << 4))
    }
    return dst
  }
  // use new names so the parameters aren't shadowed
  const headerJson = JSON.stringify(headers)
  const payloadJson = JSON.stringify(payload)
  // signing input only; the HMAC-SHA256 signature still has to be appended
  const token = `${base64Encode(headerJson)}.${base64Encode(payloadJson)}`
  console.log(token)
  return token
}
I have seen multiple people asking for help with C-MAC (retail MAC) generation. This question contains the answer as well.
Hopefully it will save you some time.
I have tested this function with a real card and it worked fine.
Note:
The efficiency of the function can be improved if you like.
If you find any improvements, please suggest them.
Before you start working on SCP02 where EXTERNAL AUTHENTICATE itself carries a C-MAC, please check the SCP 'i' (implementation option) value.
This function supports ICV encryption for the next command.
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// icvNextCommand is a class-level field holding the (encrypted) ICV to use for the
// next command; Utility.bytesToHex is a hex-dump helper from the same project.
public static byte[] generateCmac(byte[] apdu, byte[] sMacSessionKey, byte[] icv) throws Exception {
    // Expand a 16-byte 2-key 3DES session key to 24 bytes (K1 | K2 | K1)
    if (sMacSessionKey.length == 16) {
        byte[] temp = sMacSessionKey.clone();
        sMacSessionKey = new byte[24];
        System.arraycopy(temp, 0, sMacSessionKey, 0, temp.length);
        System.arraycopy(temp, 0, sMacSessionKey, 16, 8);
    }
    byte[] cMac = new byte[8];
    // ISO 9797-1 padding method 2: mandatory 0x80 followed by zeros
    byte[] padding = { (byte) 0x80, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
    int paddingRequired = 8 - (apdu.length) % 8;
    byte[] data = new byte[apdu.length + paddingRequired];
    System.arraycopy(apdu, 0, data, 0, apdu.length);
    System.arraycopy(padding, 0, data, apdu.length, paddingRequired);

    Cipher cipher = Cipher.getInstance("DESede/CBC/NoPadding");
    Cipher singleDesCipher = Cipher.getInstance("DES/CBC/NoPadding", "SunJCE");
    SecretKeySpec desSingleKey = new SecretKeySpec(sMacSessionKey, 0, 8, "DES");
    SecretKey secretKey = new SecretKeySpec(sMacSessionKey, "DESede");

    // Retail MAC: chain the first n - 1 blocks through single DES ...
    IvParameterSpec ivSpec = new IvParameterSpec(icv);
    int blocks = data.length / 8;
    for (int i = 0; i < blocks - 1; i++) {
        singleDesCipher.init(Cipher.ENCRYPT_MODE, desSingleKey, ivSpec);
        byte[] block = singleDesCipher.doFinal(data, i * 8, 8);
        ivSpec = new IvParameterSpec(block);
    }
    // ... then run the final block through full 3DES to get the 8-byte C-MAC
    int offset = (blocks - 1) * 8;
    cipher.init(Cipher.ENCRYPT_MODE, secretKey, ivSpec);
    cMac = cipher.doFinal(data, offset, 8);

    // ICV encryption: the ICV for the next command is this C-MAC encrypted
    // with single DES under the first key half (zero IV)
    ivSpec = new IvParameterSpec(new byte[8]);
    singleDesCipher.init(Cipher.ENCRYPT_MODE, desSingleKey, ivSpec);
    icvNextCommand = singleDesCipher.doFinal(cMac);
    System.out.println("icvNextCommand " + Utility.bytesToHex(icvNextCommand, icvNextCommand.length));
    return cMac;
}
A far simpler alternative is to use Signature.ALG_DES_MAC8_ISO9797_1_M2_ALG3 (if supported by the card) to calculate the retail MAC value for the C-MAC in SCP02.
Note: CMAC proper is a different message authentication code and is not used in SCP02 at all; the SCP02 "C-MAC" is a retail MAC.
EDIT: For the PC side, consider ISO9797Alg3Mac from Bouncy Castle.
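For reference, a rough sketch of that Bouncy Castle route might look like the following (the class and method names are mine, and I assume a 16-byte 2-key 3DES session key and a zero ICV; chaining the ICV from the previous command is left out):
import org.bouncycastle.crypto.engines.DESEngine;
import org.bouncycastle.crypto.macs.ISO9797Alg3Mac;
import org.bouncycastle.crypto.paddings.ISO7816d4Padding;
import org.bouncycastle.crypto.params.KeyParameter;

public class RetailMacSketch {
    // 8-byte retail MAC (ISO 9797-1 MAC algorithm 3, padding method 2) over the APDU
    public static byte[] retailMac(byte[] sMacSessionKey, byte[] apdu) {
        ISO9797Alg3Mac mac = new ISO9797Alg3Mac(new DESEngine(), 64, new ISO7816d4Padding());
        mac.init(new KeyParameter(sMacSessionKey));
        mac.update(apdu, 0, apdu.length);
        byte[] out = new byte[mac.getMacSize()];
        mac.doFinal(out, 0);
        return out;
    }
}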
I'm trying to verify dice rolls but I'm getting the wrong roll numbers. I've tried the official verification scripts from two sites:
primedice.com's Node.js verification script.
bitsler.com's PHP script to check a roll.
Can you help me understand why I can't get the same roll numbers as shown on the sites? Let's focus on Primedice, because I think the issue is similar for both.
details of a primedice.com roll:
roll #20,549,462,672
server seed (hashed) : a72c7d6e95b7122dc21968505b91d729f4eef30af2c019488bb76274290ad183
client seed (nonced) : a9e65af098fe00229917de5659746dda-28
roll number shown on site: 35.02
I suppose the nonce to use there is 28, so I fill the Node.js variables like this:
var clientSeed = "a9e65af098fe00229917de5659746dda";
var serverSeed =
"a72c7d6e95b7122dc21968505b91d729f4eef30af2c019488bb76274290ad183";
var nonce = 28;
//----- official roll function from primedice.com
var crypto = require('crypto');
var roll = function(key, text) {
var hash = crypto.createHmac('sha512', key).update(text).digest('hex');
var index = 0;
var lucky = parseInt(hash.substring(index * 5, index * 5 + 5), 16);
while (lucky >= Math.pow(10, 6)) {
index++;
lucky = parseInt(hash.substring(index * 5, index * 5 + 5), 16); //if we reach the end of the hash, just default to highest number
if (index * 5 + 5 > 128) {
lucky = 99.99;
break;
}
}
lucky %= Math.pow(10, 4);
lucky /= Math.pow(10, 2);
return lucky;
}
console.log(roll(serverSeed, clientSeed+'-'+nonce));
Then I used the roll function as given by the site.
the result generated was:
roll number generated : 50.64
so 50.64 is not what the site shows ==> 35.02
Any idea?
I understand what the problem is. Here is the roll to verify:
roll #20,549,462,672
server seed (hashed) :
a72c7d6e95b7122dc21968505b91d729f4eef30af2c019488bb76274290ad183
client seed (nonced) : a9e65af098fe00229917de5659746dda-28
As you can see, it shows "server seed (hashed)". You cannot calculate the right value with a hashed representation of the seed. You will need the real seed to get the right number.
Knowing the real seed would mean you can bet and be 100% sure to win on each new roll.
So the service won't show you the server seed that is currently being used for your bets. You must request a seed change; the server will then reveal the real seed it used for your previous session.
Therefore you can only verify roll numbers made under an old server seed, not under the current one, which is kept secret by the service for security reasons.
I'm programming an embedded board using Broadcom's Bluetooth LE device, and Broadcom's WICED smart IDE.
Can't figure out how to change the default UUID that the board advertises at power up.
The following is sample code that changes the advertisement to contain flags, a 128-bit service UUID and the local name.
BLE_ADV_FIELD adv[3];
// flags
adv[0].len = 1 + 1;
adv[0].val = ADV_FLAGS;
adv[0].data[0] = LE_LIMITED_DISCOVERABLE | BR_EDR_NOT_SUPPORTED;
// 128-bit service UUID (taken from db_pdu)
adv[1].len = 16 + 1;
adv[1].val = ADV_SERVICE_UUID128_COMP;
memcpy(adv[1].data, db_pdu.pdu, 16);
// name
adv[2].len = strlen(bleprofile_p_cfg->local_name) + 1;
adv[2].val = ADV_LOCAL_NAME_COMP;
memcpy(adv[2].data, bleprofile_p_cfg->local_name, adv[2].len - 1);
bleprofile_GenerateADVData(adv, 3);
I'm working on a checksum algorithm and I'm having some issues. The kicker is, when I hand-craft a "fake" message that is substantially smaller than the "real" data I'm receiving, I get a correct checksum. However, against the real data the checksum does not work properly.
Here's some information on the incoming data/environment:
This is a groovy project (see code below)
All bytes are to be treated as unsigned integers for the purpose of checksum calculation
You'll notice some finagling with shorts and longs in order to make that work.
The size of the real data is 491 bytes.
The size of my sample data (which appears to add correctly) is 26 bytes
None of my hex-to-decimal conversions are producing a negative number, as best I can tell
Some bytes in the file are not added to the checksum. I've verified that the switch for these is working properly, and when it is supposed to - so that's not the issue.
My calculated checksum, and the checksum packaged with the real transmission always differ by the same amount.
I have manually verified that the checksum packaged with the real data is correct.
Here is the code:
// add bytes to checksum
public void addToChecksum( byte[] bytes) {
    //if the checksum isn't enabled, don't add
    if(!checksumEnabled) {
        return;
    }
    long previouschecksum = this.checksum;
    for(int i = 0; i < bytes.length; i++) {
        byte[] tmpBytes = new byte[2];
        tmpBytes[0] = 0x00;
        tmpBytes[1] = bytes[i];
        ByteBuffer tmpBuf = ByteBuffer.wrap(tmpBytes);
        long computedBytes = tmpBuf.getShort();
        logger.info(getHex(bytes[i]) + " = " + computedBytes);
        this.checksum += computedBytes;
    }
    if(this.checksum < previouschecksum) {
        logger.error("Checksum DECREASED: " + this.checksum);
    }
    //logger.info("Checksum: " + this.checksum);
}
If anyone can find anything in this algorithm that could be causing drift from the expected result, I would greatly appreciate your help in tracking this down.
I don't see a line in your code where you reset this.checksum.
This way, you should always get this.checksum > previouschecksum, right? Is this intended?
Otherwise I can't find a flaw in the code above. Maybe your 'this.checksum' is of the wrong type (short, for instance). That could roll over so that you get negative values.
Here is an example of such behaviour:
import java.nio.ByteBuffer

short checksum = 0
byte[] bytes = new byte[491]
def count = 260
for (def i = 0; i < count; i++) {
    bytes[i] = 255
}
bytes.each { b ->
    byte[] tmpBytes = new byte[2];
    tmpBytes[0] = 0x00;
    tmpBytes[1] = b;
    ByteBuffer tmpBuf = ByteBuffer.wrap(tmpBytes);
    long computedBytes = tmpBuf.getShort();
    checksum += computedBytes
    println "${b} : ${computedBytes}"
}
println checksum + "!=" + 255 * count
Just play around with the value of the 'count' variable, which corresponds to the length of your input.
Your checksum will keep incrementing until it rolls over to being negative (as it is a signed long integer).
You can also shorten your method to:
public void addToChecksum( byte[] bytes) {
    //if the checksum isn't enabled, don't add
    if(!checksumEnabled) {
        return;
    }
    long previouschecksum = this.checksum;
    this.checksum += bytes.inject( 0L ) { tot, it -> tot += it & 0xFF }
    if(this.checksum < previouschecksum) {
        logger.error("Checksum DECREASED: " + this.checksum);
    }
    //logger.info("Checksum: " + this.checksum);
}
But that won't stop it rolling over to being negative. For the sake of saving 12 bytes per item that you are generating a hash for, I would still suggest that something like MD5, which is known to work, is probably better than rolling your own... However, I understand sometimes there are crazy requirements you have to stick to...
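To illustrate that route, here is a minimal sketch using plain Java (Groovy will run it as-is); the method name is my own:
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class DigestExample {
    // Returns the MD5 digest (16 bytes) of the message as a lowercase hex string
    public static String md5Hex(byte[] message) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("MD5").digest(message);
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}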