Using Node.js, I'm attempting to generate an HMAC-SHA256, base64-encoded signature of an XML message. I currently have the signature generation working in PHP.
The process seems fairly straightforward, and I'm able to generate a base64-encoded value with Node.js, but for some reason the value does not match and is much shorter than what I get using PHP.
Below, I've included an example PHP script and its result, as well as the Node.js implementation and its result. With Node.js, I'm using the native crypto module.
// PHP implementation
$xml = <<<EOD
<?xml version='1.0' encoding='UTF-8'?>
<dataset>
<record>
<id>1</id>
<first_name>Carlos</first_name>
<last_name>Ruiz</last_name>
<email>cruiz0#engadget.com</email>
<gender>Male</gender>
<ip_address>156.225.191.154</ip_address>
</record>
</dataset>
EOD;
$secret = 'secret';
$sig = base64_encode(hash_hmac('sha256', $xml, $secret));
echo $sig;
Result:
ODhkYTc1YmQzNzc0NWUyNDJlNjY3YTY1NzZhYzFhZGYwOTJlMTIxODdjNzYxOWYyNGQxNGExOGVkYTIyZDQ0ZQ==
// Node.js implementation
var crypto = require('crypto');
var fs = require('fs'); // needed for fs.readFile below

fs.readFile('example.xml', 'utf-8', function(err, data) {
    if (err) throw err;

    function sig(str, key) {
        return crypto.createHmac('sha256', key)
            .update(str)
            .digest('base64');
    }

    console.log(sig(data, 'secret'));
});
Result:
iNp1vTd0XiQuZnpldqwa3wkuEhh8dhnyTRShjtoi1E4=
I've spent the day trying to figure this out and after a year+ of using Stack Overflow, this is my first question.
Any assistance would be greatly appreciated!
The problem here is that PHP's hash_hmac() returns a hex-encoded string by default (see the $raw_output parameter here), so you are base64-encoding a hex string instead of the raw, binary result.
So change this:
$sig = base64_encode(hash_hmac('sha256', $xml, $secret));
to:
$sig = base64_encode(hash_hmac('sha256', $xml, $secret, true));
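Alternatively, if the PHP side could not be changed, the Node.js side could reproduce PHP's current value instead. This is only an illustrative sketch (assuming data holds the same XML as in the question), not the recommended fix:

// Sketch: mirror the original PHP behaviour by base64-encoding the hex digest
var hexSig = crypto.createHmac('sha256', 'secret').update(data).digest('hex');
var phpCompatibleSig = Buffer.from(hexSig, 'utf8').toString('base64');
// phpCompatibleSig now matches the longer PHP result shown above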
I need to write the reverse (encryption) of the following decryption function:
const crypto = require('crypto');

let AESDecrypt = (data, key) => {
    const decoded = Buffer.from(data, 'binary');
    const nonce = decoded.slice(0, 16);
    const ciphertext = decoded.slice(16, decoded.length - 16);
    const tag = decoded.slice(decoded.length - 16);

    let decipher = crypto.createDecipheriv('aes-256-gcm', key, nonce);
    decipher.setAuthTag(tag);
    decipher.setAutoPadding(false);

    try {
        let plaintext = decipher.update(ciphertext, 'binary', 'binary');
        plaintext += decipher.final('binary');
        return Buffer.from(plaintext, 'binary');
    } catch (ex) {
        console.log('AES Decrypt Failed. Exception: ', ex);
        throw ex;
    }
}
The above function allows me to properly decrypt encrypted buffers that follow the spec:
| Nonce/IV (First 16 bytes) | Ciphertext | Authentication Tag (Last 16 bytes) |
The reason why AESDecrypt is written the way it is (auth tag as the last 16 bytes) is that this is how the default standard library implementations of AES encrypt data in both Java and Go. I need to be able to bidirectionally decrypt/encrypt between Go, Java, and Node.js. The crypto-library-based encryption in Node.js does not put the auth tag anywhere; it is left to the developer to decide how to store it so it can be passed to setAuthTag() during decryption. In the above code, I am baking the tag directly into the final encrypted buffer.
So the AES encryption function I wrote needed to meet the above requirements (without modifying AESDecrypt, since it is working properly), and I have the following code, which is not working for me:
let AESEncrypt = (data, key) => {
    const nonce = 'BfVsfgErXsbfiA00'; // Do not copy paste this line in production code (https://crypto.stackexchange.com/questions/26790/how-bad-it-is-using-the-same-iv-twice-with-aes-gcm)
    const encoded = Buffer.from(data, 'binary');
    const cipher = crypto.createCipheriv('aes-256-gcm', key, nonce);

    try {
        let encrypted = nonce;
        encrypted += cipher.update(encoded, 'binary', 'binary');
        encrypted += cipher.final('binary');
        const tag = cipher.getAuthTag();
        encrypted += tag;
        return Buffer.from(encrypted, 'binary');
    } catch (ex) {
        console.log('AES Encrypt Failed. Exception: ', ex);
        throw ex;
    }
}
I am aware that hardcoding the nonce is insecure; I have it this way to make it easier to compare properly encrypted files with my broken implementation using a binary file diff program like vbindiff.
The more I looked at this from different angles, the more confounding the problem became.
I am actually quite used to implementing 256-bit AES GCM based encryption/decryption, and have properly working implementations in Go and Java. Furthermore, because of certain circumstances, I had a working implementation of AES decryption in Node.js months ago.
I know this to be true because I can decrypt in Node.js, files that I encrypted in Java and Go. I put up a quick repository that contains the source code implementations of a Go server written just for this purpose and the broken Node.js code.
For easy access for people that understand Node.js, but not Go, I put up the following Go server web interface for encrypting and decrypting using the above algorithm hosted at https://go-aes.voiceit.io/. You can confirm my Node.js decrypt function works just fine by encrypting a file of your choice at https://go-aes.voiceit.io/, and decrypting the file using decrypt.js (Please look at the README for more information on how to run this if you need to confirm this works properly.)
Furthermore, I know this issue is specifically with the following lines of AESEncrypt:
const tag = cipher.getAuthTag();
encrypted += tag;
Running vbindiff against the same file encrypted in Go and in Node.js, the files only start showing differences in the last 16 bytes (where the auth tag gets written). In other words, the nonce and the encrypted payload are identical in Go and Node.js.
Since getAuthTag() is so simple, and I believe I am using it correctly, I have no idea what I could even change at this point. Hence, I have also considered the remote possibility that this is a bug in the standard library. However, I figured I'd try Stack Overflow first before posting a GitHub issue, as it's most likely something I'm doing wrong.
I have a slightly more expanded description of the code, and proof of how I know what is working works, in the repo I set up to try to get help solving this problem.
Thank you in advance.
Further info: Node: v14.15.4 Go: go version go1.15.6 darwin/amd64
In the NodeJS code, the ciphertext is generated as a binary string, i.e. using the binary/latin1 or ISO-8859-1 encoding. ISO-8859-1 is a single byte charset which uniquely assigns each value between 0x00 and 0xFF to a specific character, and therefore allows the conversion of arbitrary binary data into a string without corruption, see also here.
In contrast, the authentication tag is not returned as a binary string by cipher.getAuthTag(), but as a buffer.
When concatenating both parts with:
encrypted += tag;
the buffer is converted into a string implicitly using buf.toString(), which applies UTF-8 encoding by default.
Unlike ISO-8859-1, UTF-8 is a multi-byte charset that defines specific byte sequences between 1 and 4 bytes in length that are assigned to characters, see the UTF-8 table. In arbitrary binary data (such as the authentication tag) there are generally byte sequences that are not defined for UTF-8 and are therefore invalid. Invalid bytes are represented by the Unicode replacement character with the code point U+FFFD during conversion (see also the comment by @dave_thompson_085). This corrupts the data because the original values are lost. Thus UTF-8 encoding is not suitable for converting arbitrary binary data into a string.
During the subsequent conversion into a buffer with the single byte charset binary/latin1 with:
return Buffer.from(encrypted, 'binary');
only the last byte (0xFD) of the replacement character is taken into account.
The bytes marked in the screenshot (0xBB, 0xA7, 0xEA etc.) are all invalid UTF-8 byte sequences (see the UTF-8 table) and are therefore replaced by the NodeJS code with 0xFD, resulting in a corrupted tag.
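A minimal snippet (illustrative only, not part of the original code) that demonstrates the corruption:

const buf = Buffer.from([0x88, 0xda, 0x75]);                     // arbitrary non-UTF-8 bytes
const viaUtf8 = Buffer.from('' + buf, 'binary');                 // implicit buf.toString() applies UTF-8
const viaLatin1 = Buffer.from(buf.toString('binary'), 'binary'); // explicit binary/latin1
console.log(viaUtf8);    // <Buffer fd fd 75> - invalid bytes collapsed to 0xFD
console.log(viaLatin1);  // <Buffer 88 da 75> - intact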
To fix the bug, the tag must be converted with binary/latin1, i.e. consistent with the encoding of the ciphertext:
let AESEncrypt = (data, key) => {
    const nonce = 'BfVsfgErXsbfiA00'; // Static IV for test purposes only
    const encoded = Buffer.from(data, 'binary');
    const cipher = crypto.createCipheriv('aes-256-gcm', key, nonce);

    let encrypted = nonce;
    encrypted += cipher.update(encoded, 'binary', 'binary');
    encrypted += cipher.final('binary');

    const tag = cipher.getAuthTag().toString('binary'); // Fix: Decode with binary/latin1!
    encrypted += tag;

    return Buffer.from(encrypted, 'binary');
}
Please note that in the update() call the input encoding (the 2nd 'binary' parameter) is ignored, since encoded is a buffer.
Alternatively, the buffers can be concatenated instead of the binary/latin1 converted strings:
let AESEncrypt_withBuffer = (data, key) => {
    const nonce = 'BfVsfgErXsbfiA00'; // Static IV for test purposes only
    const encoded = Buffer.from(data, 'binary');
    const cipher = crypto.createCipheriv('aes-256-gcm', key, nonce);

    return Buffer.concat([ // Fix: Concatenate buffers!
        Buffer.from(nonce, 'binary'),
        cipher.update(encoded),
        cipher.final(),
        cipher.getAuthTag()
    ]);
}
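A quick roundtrip sketch (illustrative, using a random key together with the AESDecrypt function from the question) to sanity-check the fixed encryption:

const key = crypto.randomBytes(32);                                       // aes-256-gcm needs a 32-byte key
const encrypted = AESEncrypt_withBuffer(Buffer.from('hello world'), key);
console.log(AESDecrypt(encrypted, key).toString());                       // should print 'hello world'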
For the GCM mode, a nonce length of 12 bytes is recommended by NIST for performance and compatibility reasons, see NIST SP 800-38D, chapter 5.2.1.1. The Go code (via NewGCMWithNonceSize()) and the NodeJS code use a nonce length of 16 bytes, which differs from this.
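If all sides can be changed, a sketch of the recommended variant (assuming the Go and Java code are switched to the default 12-byte GCM nonce size as well) would be:

const nonce = crypto.randomBytes(12);                            // 96-bit IV per the NIST recommendation
const cipher = crypto.createCipheriv('aes-256-gcm', key, nonce);
// ...and slice the first 12 bytes instead of 16 when decrypting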
I've been trying to crack this for a while with no success.
The server-side decryption uses RSACryptoServiceProvider with RSA-OAEP; I can't change this:
public void SetEncryptedPassword(string password) {
    using (RSACryptoServiceProvider decrypter = new RSACryptoServiceProvider()) {
        decrypter.FromXmlString(Resources.PrivateKey);
        var decryptedBytes = decrypter.Decrypt(Convert.FromBase64String(password), true);
        _password = Encoding.UTF8.GetString(decryptedBytes).ToSecureString();
    }
}
I am trying to implement a web client that can access this service, but I can't get the encryption right. I have tried loads of libraries but found the most help with SubtleCrypto, which at least can accept the public key provided by the server. I had to add the kty, alg and ext properties and encode the key as base64url, but it appears to import fine. Encryption does come back with something, so I guess it's working?
const encrypt = async (msg) => {
    let msgBytes = stringToBytes(msg); // helper that converts the message string into a Uint8Array
    let publicKey2 = await window.crypto.subtle.importKey("jwk", publicKey, { name: "RSA-OAEP", hash: "SHA-256" }, true, ["encrypt"]).catch((issue) => console.log(issue));
    var result = await window.crypto.subtle.encrypt({ name: "RSA-OAEP" }, publicKey2, msgBytes);
    var toBase64 = _arrayBufferToBase64(result);
    return toBase64;
}
I had a few issues getting a valid base64 string so now I'm using this
function _arrayBufferToBase64(buffer) {
    var binary = '';
    var bytes = new Uint8Array(buffer);
    var len = bytes.byteLength;
    for (var i = 0; i < len; i++) {
        binary += String.fromCharCode(bytes[i]);
    }
    return window.btoa(binary);
}
The result looks a little shorter than the outputs produced by the iPad and .net services, but I have no idea if that means anything.
The decryption always fails with the error "Error occurred while decoding OAEP padding.", which tells me that it fails at the first step.
Am I doing something wrong? Any advice would be helpful. I'll be watching comments and replies for most of the day so I can supply extra information if you ask for it.
Thanks in advance
CodeSandbox.io demo
The problem arises because the C# code (implicitly) uses SHA-1 for OAEP, while the JavaScript code uses SHA-256.
RSACryptoServiceProvider only supports PKCS#1 v1.5 padding and OAEP with SHA-1. Support for OAEP with SHA-2 is only
implemented in the newer RSA implementation, RSACng (available since .NET 4.6), which belongs to the new Cryptography API (Next Generation).
Since you can't change the C# code, per your own statement, the only option is to change the hash in the JavaScript code from SHA-256 to SHA-1.
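A sketch of that change in the question's importKey call (only the hash name differs; note that if the JWK's alg property was added by hand, it must stay consistent, i.e. "RSA-OAEP" rather than "RSA-OAEP-256"):

let publicKey2 = await window.crypto.subtle.importKey(
    "jwk", publicKey,
    { name: "RSA-OAEP", hash: "SHA-1" }, // was "SHA-256"
    true, ["encrypt"]
);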
I am trying to implement a Node API using Ruby documentation (ugh). The issue specifically is around verifying a secret, which is put through an HMAC digest and then base64 encoded.
I can't seem to get the two to equate. Here are the same snippets in Node & Ruby:
Note: The below can also be viewed online via repl.it:
Ruby (https://repl.it/repls/SarcasticSpottedSymbol)
Node (https://repl.it/repls/AncientQuarrelsomeWearable)
Node
const crypto = require('crypto');
let text = 'example';
let key = '123';
let h = crypto.createHmac('sha256', key).update(text).digest('binary');
Buffer.from(h).toString('base64');
# => 'acKNVMOwSUUowqdZw7HCnMKOwofCqcO5wp51wqXCiBvCkmfDrjkmwrzDtizCmS3ChMK6'
Ruby
require 'openssl'
require 'base64'
text = 'example'
key = '123'
h = OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha256'), key, text)
Base64.strict_encode64(h)
# => 'aY1U8ElFKKdZ8ZyOh6n5nnWliBuSZ+45Jrz2LJkthLo='
Switching both over to hex works, e.g.
Node
crypto.createHmac('sha256', key).update(text).digest('hex')
Ruby
OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new('sha256'), key, text)
Unfortunately it isn't up to me to switch to hex - the web service uses the ruby code to sign.
The Ruby docs for OpenSSL::HMAC.digest state:
Returns the authentication code as a binary string.
Just outputting the result from the HMAC, I can't tell whether this is a difference or just a rendering issue:
Node
crypto.createHmac('sha256', key).update(text).digest('binary');
# => 'iTðIE(§Yñ©ùu¥\u001bgî9&¼ö,-º'
Ruby
OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha256'), key, text)
# => "i\x8DT\xF0IE(\xA7Y\xF1\x9C\x8E\x87\xA9\xF9\x9Eu\xA5\x88\e\x92g\xEE9&\xBC\xF6,\x99-\x84\xBA"
How can I get these two to equate?
Thank you!
By not passing any specific encoding to Node's digest method, the raw Buffer is returned, matching Ruby's binary string.
This is the end result:
const crypto = require('crypto');
const text = 'example';
const key = '123';
const h = crypto.createHmac('sha256', key).update(text).digest();
Buffer.from(h).toString('base64');
Who would've thought - you could just pass nothing into the method...
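For what it's worth, the original 'binary' digest also lines up if the string is decoded with the same encoding when building the Buffer (an illustrative alternative, not needed once digest() is used):

const h2 = crypto.createHmac('sha256', key).update(text).digest('binary');
Buffer.from(h2, 'binary').toString('base64'); // => should match the Ruby output above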
To get Ruby to output Base64:
require 'openssl'
require 'base64'
text = 'example'
key = '123'
Base64.encode64(OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha256'), key, text))
# "aY1U8ElFKKdZ8ZyOh6n5nnWliBuSZ+45Jrz2LJkthLo=\n"
You can strip the trailing \n newline if you want, or use Base64.strict_encode64 (as in the question) to avoid it.
I am using the following code to encode the SAMLRequest value to the endpoint, i.e. the XYZ when calling https://login.microsoftonline.com/common/saml2?SAMLRequest=XYZ.
Is this the correct way to encode it?
private static string DeflateEncode(string val)
{
    var memoryStream = new MemoryStream();
    using (var writer = new StreamWriter(new DeflateStream(memoryStream, CompressionMode.Compress, true), new UTF8Encoding(false)))
    {
        writer.Write(val);
        writer.Close();
        return Convert.ToBase64String(memoryStream.GetBuffer(), 0, (int)memoryStream.Length, Base64FormattingOptions.None);
    }
}
If you just want to convert a string to a base64-encoded string, then you can use the following approach:
var encoded = Convert.ToBase64String(System.Text.Encoding.Default.GetBytes(val));
Console.WriteLine(encoded);
return encoded;
Yes, that looks correct for the Http Redirect binding.
But don't do this yourself unless you really know what you are doing. Sending the AuthnRequest is the simple part. Correctly validating the received response, including guarding against XML signature wrapping attacks, is hard. Use an existing library; there are both commercial and open source libraries available.
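For comparison, and purely as an illustrative sketch (not part of the question), the same Redirect-binding encoding in Node.js would be raw DEFLATE followed by base64:

const zlib = require('zlib');

function deflateEncode(xml) {
    // raw DEFLATE (no zlib header), then base64 - matching the C# DeflateStream above
    return zlib.deflateRawSync(Buffer.from(xml, 'utf8')).toString('base64');
}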
We have a legacy NodeJS API we are looking to move away from. We are extracting the auth component out of it.
We have a function that hashes a password, which looks a bit like this:
var sha256 = crypto.createHash('sha256');
sha256.update(password + secret);
return sha256.digest('hex');
Aside from the obvious security implications of a function like this, the input is encoded using Node.js's legacy 'binary' encoding.
If you pass a String to update() in this old Node version, it uses 'binary' (latin1) as the encoding format. This results in it actually encoding Unicode characters like "Kiełbasa" as "KieBbasa" before SHA-256'ing them, as the sketch below shows.
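A small sketch that reproduces the described behaviour by passing the encoding explicitly (on modern Node the default is 'utf8', so 'binary' must be spelled out; variable names here are illustrative):

const crypto = require('crypto');

const legacy = crypto.createHash('sha256').update('Kiełbasa', 'binary').digest('hex');
const utf8 = crypto.createHash('sha256').update('Kiełbasa', 'utf8').digest('hex');
console.log(legacy === utf8); // false - 'ł' (U+0142) is truncated to its low byte 0x42 ('B') under 'binary'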
In our Scala code we are now looking to rewrite this legacy function so we can auth old users. But we cannot find a charset to use on these strings that has the same resulting output. Our Scala code looks like:
def encryptPassword(password: String): String = {
  Hashing.sha256().hashString(in, Charsets.UTF_8).toString
}
We need the in string to be the same as what Node is using, but we can't figure it out.
Ideas? Node.js... not even once.
Well, turns out this is much easier than it seems. The following code works:
def encryptPassword(password: String): String = {
  val in = (password + secret).map(_.toByte.toChar).mkString("")
  Hashing.sha256().hashString(in, Charsets.UTF_8).toString
}
Thanks @stefanobaghino