Javacard KeyAgreement differs from BouncyCastle KeyAgreement

My problem looks like this. I have generated keys on the card and terminal sides. On the terminal side I have the card's public and private keys and the terminal's public and private keys, and the same on the card side (I'm doing tests, so that's why I have all of them on both the terminal and the card). When I generate the KeyAgreement on the terminal side, both with the card key as the private one and with the terminal key as the private one, the secrets are the same, so the generation is OK and I get a 24-byte (192-bit) secret. When I generate the secrets on the card (the same 2 cases as on the terminal) the secrets are also the same, but they are shorter - 20 bytes (160 bits). Here are the generation codes. The terminal:
ECPublicKey publicKey;
ECPrivateKey privateKey;
...
KeyAgreement aKeyAgree = KeyAgreement.getInstance("ECDH", "BC");
aKeyAgree.init(privateKey);
aKeyAgree.doPhase(publicKey, true);
byte[] aSecret = aKeyAgree.generateSecret();
and the card side:
keyAgreement = KeyAgreement.getInstance(KeyAgreement.ALG_EC_SVDP_DH, false);
short length = terminalEcPublicKey.getW(array, (short) 0);
keyAgreement.init(cardEcPrivateKey);
short secretlength = keyAgreement.generateSecret(array, (short)0, length, buffer, (short)0);

The problem is on the terminal side, not in KeyAgreement.ALG_EC_SVDP_DH: the output of this key agreement method is always 20 bytes, since the Java Card implementation performs SHA-1 on the derived output.
So on the terminal side you should perform SHA-1 on the secret after generating it; then both sides produce the same 20-byte value.
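For example, a minimal sketch reusing the aKeyAgree object from the terminal code above (MessageDigest is the plain JDK API):
import java.security.MessageDigest;
...
// Hash the 24-byte ECDH secret from BouncyCastle with SHA-1 so it matches
// the 20-byte output of ALG_EC_SVDP_DH on the card.
byte[] aSecret = aKeyAgree.generateSecret();                             // 24 bytes on a 192-bit curve
byte[] cardSecret = MessageDigest.getInstance("SHA-1").digest(aSecret);  // 20 bytes, matches the card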

Related

Encrypt/Decrypt aes256cbc in Nodejs

I'm working on a project where I need to encrypt/decrypt strings in Node.js.
I receive the string in the following format: pTS3JQzTxrSbd+cLESXHpg==
This string is generated from this page: https://encode-decode.com/aes-256-cbc-encrypt-online/
and uses the aes-256-cbc standard.
The code that I implemented is the following:
var CryptoJS = require("crypto-js");
var key = 'TEST_KEY';
var text = 'pTS3JQzTxrSbd+cLESXHpg==';
function decript(text, key) {
    return CryptoJS.AES.decrypt(text.trim(), key);
}
console.log(decript(text, key).toString(CryptoJS.enc.Utf8));
But I always get an empty response.
Could you tell me what the issue is?
Thanks a lot!
As the documentation explains (and I just answered yesterday), CryptoJS.AES, when given a 'key' that is a string, treats it as a password and uses password-based key derivation compatible with openssl enc. That is different from and incompatible with what your linked website does. The website doesn't state it clearly, but based on the list of cipher names it is almost certainly calling OpenSSL's 'EVP' interface internally. Among other things, that means that if you specify a key too short for the algorithm, as you did, it uses whatever happens to be adjacent in memory, which apparently was zero-value bytes (not unusual for programs run on operating systems newer than about 1980), and it either uses the default IV of zero bytes or similarly sets the IV to zero bytes. For CBC it uses PKCS5/7 padding, which is compatible with CryptoJS (and most other things). Therefore:
const CryptoJS = require('crypto-js');
var key = CryptoJS.enc.Latin1.parse("TEST_KEY\0\0\0\0\0\0\0\0"+"\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0")
var iv = CryptoJS.enc.Latin1.parse("\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0")
var ctx = CryptoJS.enc.Base64.parse("pTS3JQzTxrSbd+cLESXHpg==")
var enc = CryptoJS.lib.CipherParams.create({ciphertext:ctx})
console.log( CryptoJS.AES.decrypt (enc,key,{iv:iv}) .toString(CryptoJS.enc.Utf8) )
->
Test text
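If you want to cross-check outside of CryptoJS, the same operation can be reproduced with plain Java/JCE. This is only a hedged sketch of the assumptions above (key "TEST_KEY" zero-padded to 32 bytes, all-zero IV, PKCS5/7 padding); it should print the same plaintext as the CryptoJS snippet:
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.util.Arrays;
import java.util.Base64;

public class CrossCheck {
    public static void main(String[] args) throws Exception {
        byte[] key = Arrays.copyOf("TEST_KEY".getBytes("ASCII"), 32); // zero-padded to the AES-256 key length
        byte[] iv  = new byte[16];                                    // all-zero IV
        byte[] ct  = Base64.getDecoder().decode("pTS3JQzTxrSbd+cLESXHpg==");

        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
        System.out.println(new String(cipher.doFinal(ct), "UTF-8"));  // expected: Test text
    }
}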

Application Cryptogram Generation and Validation Failure

I am working on a MasterCard transaction-processing app which is still in the development stages. To be able to test my cryptogram validation app, I personalized a card using MChip with the following profile info:
MChip Jcop
app version 1.0
profile revision 1.0.11
requires universal OS.
After reading the contributions on these questions, Unable to Generate correct application Cryptogram and Generating Cryptogram Manually, I tried to check my card's cryptogram version number, but tag 0x9F10 was absent from my personalization data and there was no way I could add this tag before personalization. I have tried various cryptogram generation combinations on the Thales HSM but none is returning the same value as that returned by the card.
Being in the development stage with access to the development keys, I have checked to ensure that the keys are good and that the same data is passed for the cryptogram generation, and at this stage I am completely clueless about what to do. I will appreciate any help I can get on this issue. Thanks.
foreach (var tagLen in EMVTag.ParseDOL(crmDolstr))
{
    requestData.Append(EMVData[tagLen.Split(',')[0]]);
    dolData.AppendFormat("{0}|{1},", tagLen.Split(',')[0],
        EMVData[tagLen.Split(',')[0]]);
}

string commandStr = string.Format("80 AE 8000 {0} {1} 00",
    GetHexLen(requestData.ToString()), requestData.ToString());
byte[] hexData = Helpers.HexStringToBytes(commandStr);
apdu = new APDUCommand(hexData);

public APDUCommand(byte[] apdu)
{
    if (apdu.Length < 5)
        throw new Exception("Wrong APDU length.");

    this.cla = apdu[OFFSET_CLA];
    this.ins = apdu[OFFSET_INS];
    this.p1 = apdu[OFFSET_P1];
    this.p2 = apdu[OFFSET_P2];
    this.lc = apdu[OFFSET_LC];

    if (this.lc == apdu.Length - 5)
        this.le = (byte) 0;
    else if (this.lc == apdu.Length - 5 - 1)
        this.le = apdu[apdu.Length - 1];
    else
        throw new Exception("Wrong LC value.");

    this.data = new byte[this.lc];
    System.Array.Copy(apdu, OFFSET_CDATA, this.data, 0, this.data.Length);
}
The data you use (from the CDOL) are not sufficient to generate the cryptogram. The cryptogram input usually also includes the AIP, ATC and CVR.
Please look at the IAD in the response to the cryptogram generation command, as it usually also contains a dynamically generated CVR that is used in the cryptogram generation process.
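As a starting point for comparing against the Thales HSM, here is a hedged sketch (not the authoritative M/Chip algorithm) of the kind of computation usually involved: concatenate the CDOL1 data with the AIP, ATC and CVR and compute an 8-byte ISO/IEC 9797-1 Algorithm 3 ("Retail") MAC with the session key. The exact data elements, padding method and session-key derivation depend on the card profile and cryptogram version number, which is why tag 9F10 matters. All values below are placeholders.
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.io.ByteArrayOutputStream;
import java.util.Arrays;

public class AcMacSketch
{
    // ISO 9797-1 padding method 2: append 0x80, then zero-fill to a multiple of 8.
    static byte[] pad(byte[] data)
    {
        byte[] out = Arrays.copyOf(data, (data.length / 8 + 1) * 8);
        out[data.length] = (byte) 0x80;
        return out;
    }

    // ISO/IEC 9797-1 MAC Algorithm 3 ("Retail MAC") with a 16-byte key.
    static byte[] retailMac(byte[] key16, byte[] data) throws Exception
    {
        SecretKeySpec k1 = new SecretKeySpec(Arrays.copyOfRange(key16, 0, 8), "DES");
        SecretKeySpec k2 = new SecretKeySpec(Arrays.copyOfRange(key16, 8, 16), "DES");

        // Single-DES CBC chain under K1 with a zero IV.
        Cipher cbc = Cipher.getInstance("DES/CBC/NoPadding");
        cbc.init(Cipher.ENCRYPT_MODE, k1, new IvParameterSpec(new byte[8]));
        byte[] chained = cbc.doFinal(pad(data));
        byte[] last = Arrays.copyOfRange(chained, chained.length - 8, chained.length);

        // Final transformation: decrypt the last block with K2, re-encrypt with K1.
        Cipher ecb = Cipher.getInstance("DES/ECB/NoPadding");
        ecb.init(Cipher.DECRYPT_MODE, k2);
        byte[] step = ecb.doFinal(last);
        ecb.init(Cipher.ENCRYPT_MODE, k1);
        return ecb.doFinal(step);                     // 8-byte MAC (candidate AC)
    }

    public static void main(String[] args) throws Exception
    {
        // Placeholder values only - substitute the real session key and data elements.
        byte[] sessionKey = { 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
                              0x08, 0x09, 0x0A, 0x0B, 0x0C, 0x0D, 0x0E, 0x0F };
        byte[] cdol1Data  = new byte[29];             // data sent in the GENERATE AC command
        byte[] aip        = { 0x5C, 0x00 };           // Application Interchange Profile (tag 82)
        byte[] atc        = { 0x00, 0x01 };           // Application Transaction Counter (tag 9F36)
        byte[] cvr        = new byte[6];              // Card Verification Results, taken from the IAD (9F10)

        ByteArrayOutputStream in = new ByteArrayOutputStream();
        in.write(cdol1Data);
        in.write(aip);
        in.write(atc);
        in.write(cvr);
        System.out.println(retailMac(sessionKey, in.toByteArray()).length + "-byte AC candidate");
    }
}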

Difference between signing with SHA256 vs. signing with RSA-SHA256

I'm playing with digital signatures using node.js. For test purposes, I created a digital signature of some XML data, first using only SHA256, then using RSA-SHA256.
The thing that puzzles me is that both methods of signing create exactly the same signature. Both signatures are identical. If they're identical, then why two different methods (SHA256 vs. RSA-SHA256)?
I include code below:
var crypto = require('crypto'),
    path = require('path'),
    fs = require('fs'),
    pkey_path = path.normalize('private_key.pem'),
    pkey = '';

function testSignature(pkey) {
    var sign1 = crypto.createSign('RSA-SHA256'),
        sign2 = crypto.createSign('SHA256');

    fs.ReadStream('some_document.xml')
        .on('data', function (d) {
            sign1.update(d);
            sign2.update(d);
        })
        .on('end', function () {
            var s1 = sign1.sign(pkey, "base64"),
                s2 = sign2.sign(pkey, "base64");
            console.log(s1);
            console.log(s2);
        });
}
// You need to read private key into a string and pass it to crypto module.
// If the key is password protected, program execution will stop and
// a prompt will appear in console, awaiting input of password.
testSignature(fs.readFileSync(pkey_path));
The code above outputs some string, which is the signature, and then exactly the same string again, which is also a signature of the same data but created with a - supposedly - different algorithm, yet it's identical to the previous one...
A signature cannot be created by SHA256 alone.
SHA256 is a hashing algorithm; i.e. an algorithm creating a short fingerprint number representing an arbitrarily large amount of data. To produce a signature, this fingerprint still has to be treated somehow to allow identification of the holder of some private signature key. One such treatment is to encrypt the fingerprint using the private key of an RSA key pair, allowing others to decrypt the result using the associated public key and so verify that the keeper of the private key indeed must have been the signer.
In the context of your crypto API, that RSA encryption scheme is either the default treatment when the treatment is not explicitly named, or the kind of treatment is deduced from the private key you pass as a parameter to the sign call: if it is an RSA private key, it uses RSA; if it is a DSA key, it uses DSA; ...
What you are looking at is two times a PKCS#1 v1.5 signature. This is a deterministic scheme for signatures, so it always returns the same result (compare this to the PSS scheme, which is randomized, providing better security properties). RSA PKCS#1 v1.5 signature generation and PSS signature generation are defined in RFC 3447 (also known as the PKCS #1 v2.1 specification).
If you use your code with RSA 512 bits (testing purposes only, use a key of 2048 bits or over) then you will get the following result:
Private key:
-----BEGIN RSA PRIVATE KEY-----
MIIBOgIBAAJBALLA/Zk6+4JFJ+XdU6wmUkuEhGa8hLZ+m6J3puZbc9E+DSt7pW09
yMYwHF5MMICxE86cA6BrLjQLUUwvquNSK0ECAwEAAQJAcI/w4e3vdRABWNFvoCcd
iWpwSZWK6LR/YuZ/1e1e2DJw+NXyPXbilSrLvAdxnjlWTsTxUiEy1jFh36pSuvMk
AQIhAO4WtgysOoWkyvIOLIQwD0thWfdHxTpxqfd6flrBJ91hAiEAwDOQqHhnSeET
+N/hwUJQtCkHBJqvMF/kAi4Ry5G+OeECIEg1Exlc0pLdm781lUKx4LGX4NUiKyrC
di3cNJ4JnrGBAiEAi2gbYYbLbDO8F8TTayidfr9PXtCPhyfWKpqdv6i7cCECIH7A
6bh0tDCl6dOXQwbhgqF4hXiMsqe6DaHqIw8+XLnG
-----END RSA PRIVATE KEY-----
signature as base 64 (using your code):
YY6sur9gkHXH23cUbDMYjCJYqDdBK8GKp4XyRNl8H09cW8H/gKQI9Z6dkLMhNh7oPq1yABCRfTP8yRtfLVj7FA==
and in hexadecimals
618eacbabf609075c7db77146c33188c2258a837412bc18aa785f244d97c1f4f5c5bc1ff80a408f59e9d90b321361ee83ead720010917d33fcc91b5f2d58fb14
decrypted using RAW RSA (i.e. just modular exponentiation with the public exponent):
0001ffffffffffffffffffff003031300d0609608648016503040201050004202af565b95e5f4479492c520c430f07ae05d2bcff8923322e6f2ef6404d72ac64
This is a very clear example of a PKCS#1 signature, easily recognized by the FF padding, followed by the ASN.1 structure (starting with 30, SEQUENCE):
SEQUENCE (2 elem)
SEQUENCE (2 elem)
OBJECT IDENTIFIER 2.16.840.1.101.3.4.2.1 {joint-iso-itu-t(2) country(16) us(840) organization(1) gov(101) csor(3) nistAlgorithm(4) hashAlgs(2) sha256(1)}
NULL
OCTET STRING(32 byte) 2AF565B95E5F4479492C520C430F07AE05D2BCFF8923322E6F2EF6404D72AC64
So that thing in the end is the hash, in this case over just Test 123\n as I didn't want to type out any XML today.
$ sha256sum some_document.xml
2af565b95e5f4479492c520c430f07ae05d2bcff8923322e6f2ef6404d72ac64 some_document.xml
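If you want to reproduce that raw-RSA step yourself, here is a small sketch (Java for illustration; it assumes the DER-encoded public key and the signature are supplied as base64 command-line arguments) that prints the padded block shown above:
import java.math.BigInteger;
import java.security.KeyFactory;
import java.security.interfaces.RSAPublicKey;
import java.security.spec.X509EncodedKeySpec;
import java.util.Base64;

public class ShowPkcs1 {
    public static void main(String[] args) throws Exception {
        byte[] publicKeyDer = Base64.getDecoder().decode(args[0]); // SubjectPublicKeyInfo DER, base64
        byte[] signature    = Base64.getDecoder().decode(args[1]); // the signature bytes, base64

        RSAPublicKey pub = (RSAPublicKey) KeyFactory.getInstance("RSA")
                .generatePublic(new X509EncodedKeySpec(publicKeyDer));

        // Raw RSA: s^e mod n, i.e. plain modular exponentiation with the public exponent.
        BigInteger em = new BigInteger(1, signature)
                .modPow(pub.getPublicExponent(), pub.getModulus());

        // Left-pad with zeros to the modulus length so the leading 00 01 FF ... bytes are visible.
        StringBuilder hex = new StringBuilder(em.toString(16));
        while (hex.length() < (pub.getModulus().bitLength() + 7) / 8 * 2) hex.insert(0, '0');
        System.out.println(hex);
    }
}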

Node.js Crypto AES Cipher

For some odd reason, Node's built-in Cipher and Decipher classes aren't working as expected. The documentation states that cipher.update
"Returns the enciphered contents, and can be called many times with new data as it is streamed."
The docs also state that cipher.final
"Returns any remaining enciphered contents."
However, in my tests you must call cipher.final to get all of the data, thus rendering the Cipher object worthless, and to process the next block you have to create a new Cipher object.
var secret = crypto.randomBytes(16)
, source = crypto.randomBytes(8)
, cipher = crypto.createCipher("aes128", secret)
, decipher = crypto.createDecipher("aes128", secret);
var step = cipher.update(source);
var end = decipher.update(step);
assert.strictEqual(source.toString('binary'), end); // should not fail, but does
Note that this happens when using crypto.createCipher or crypto.createCipheriv, with the secret as the initialization vector. The fix is to replace lines 6 and 7 with the following:
var step = cipher.update(source) + cipher.final();
var end = decipher.update(step) + decipher.final();
But this, as previously noted, renders both cipher and decipher worthless.
This is how I expect Node's built-in cryptography to work, but it clearly doesn't. Is this a problem with how I'm using it or a bug in Node? Or am I expecting the wrong thing? I could go and implement AES directly, but that would be time-consuming and annoying. Should I just create a new Cipher or Decipher object every time I need to encrypt or decrypt? That seems expensive if I'm doing so as part of a stream.
I was having two problems: the first is that I assumed, incorrectly, that the size of a block would be 64 bits, or 8 bytes, which is what I use to create the "plaintext." In reality the internals of AES split the 128 bit plaintext into two 64 bit chunks, and go from there.
The second problem was that despite using the correct chunk size after applying the above changes, the crypto module was applying auto padding, and disabling auto padding solved the second problem. Thus, the working example is as follows:
var secret = crypto.randomBytes(16)
, source = crypto.randomBytes(16)
, cipher = crypto.createCipheriv("aes128", secret, secret) // or createCipher
, decipher = crypto.createDecipheriv("aes128", secret, secret);
cipher.setAutoPadding(false);
decipher.setAutoPadding(false);
var step = cipher.update(source);
var end = decipher.update(step);
assert.strictEqual(source.toString('binary'), end); // does not fail
AES uses block sizes of 16 bytes (not two times 8 as you were suggesting). Furthermore, if padding is enabled it should always pad. The reason for this is that otherwise the unpadding algorithm cannot distinguish between padding and the last bytes of the plaintext.
Most of the time you should not expect the ciphertext to be the same size as the plain text. Make sure that doFinal() is always called. You should only use update this way for encryption / decryption if you are implementing your own encryption scheme.
There's a node.js issue with calling update multiple times in a row. I suppose it's been solved and reflected in the next release.
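To make the update()/final() contract concrete, here is a small JCE sketch (Java used for illustration; node's cipher.update()/cipher.final() follow the same contract for block ciphers): update() only hands back complete cipher blocks, and the padded last block only appears from the final call.
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.security.SecureRandom;

public class UpdateFinalDemo {
    public static void main(String[] args) throws Exception {
        byte[] key = new byte[16], iv = new byte[16];
        new SecureRandom().nextBytes(key);
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));

        byte[] fromUpdate = cipher.update(new byte[8]);   // half a block of plaintext buffered
        byte[] fromFinal  = cipher.doFinal();             // emits the single padded block

        System.out.println("update: " + (fromUpdate == null ? 0 : fromUpdate.length) + " bytes");
        System.out.println("final : " + fromFinal.length + " bytes");  // 16 bytes
    }
}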

Verify detached PKSC #7 signature in java (may be by Bouncy Castle), generated by CAPICOM

Hi all!
I'm solving two problems:
Stop using CAPICOM to sign/verify documents - because it's no longer supported by Microsoft (see more: Alternatives to Using CAPICOM)
Following from the first, we want to widen the list of browsers supported by our system (a customized Documentum webtop). Right now it is fully supported only by IE 6+, because CAPICOM is used as an ActiveX control.
For signing we use the Windows module CryptoPro, because it is the only one that officially has legal effect in Russia. Our system is deployed in the government of one Russian region.
Our system has been working for 5 years already and there are many generated signatures (all made by CAPICOM). The signatures are detached and persist in a database.
We want to find a solution to verify those signatures in Java code (wrapped in an applet).
I have tried the code below, but I can't find any suitable method to verify any signature. This method always returns false.
public boolean verifyFile(String fileInput, String metadata, String base64Signature) throws Exception {
    Security.addProvider(new BouncyCastleProvider());

    byte[] signedContent = Base64.decode(base64Signature.getBytes("UTF-8"));
    CMSSignedData cms7 = new CMSSignedData(signedContent);
    CertStore certs = cms7.getCertificatesAndCRLs("Collection", "BC");
    SignerInformationStore signers = cms7.getSignerInfos();
    Collection c = signers.getSigners();
    SignerInformation signer = (SignerInformation) c.iterator().next();
    Collection certCollection = certs.getCertificates(signer.getSID());
    Iterator certIt = certCollection.iterator();
    X509Certificate cert = (X509Certificate) certIt.next();

    Signature signature = Signature.getInstance("SHA1withRSA", "BC");
    signature.initVerify(cert.getPublicKey());

    String signedContentString = getSignedDataString(fileInput, metadata);
    signature.update(signedContentString.getBytes("UTF-8"));

    return signature.verify(signer.getSignature());
}
Does anybody have a solution, or has anyone already encountered this problem?
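For reference, the usual verification path with a current BouncyCastle (1.47+) for a detached CMS/PKCS#7 signature looks roughly like the sketch below. It assumes the bytes passed as content are exactly the bytes that were originally hashed, which with CAPICOM-produced signatures (content often handed over as UTF-16LE text) is usually the tricky part:
import java.security.Security;
import java.security.cert.X509Certificate;
import java.util.Base64;

import org.bouncycastle.cert.X509CertificateHolder;
import org.bouncycastle.cert.jcajce.JcaX509CertificateConverter;
import org.bouncycastle.cms.CMSProcessableByteArray;
import org.bouncycastle.cms.CMSSignedData;
import org.bouncycastle.cms.SignerInformation;
import org.bouncycastle.cms.jcajce.JcaSimpleSignerInfoVerifierBuilder;
import org.bouncycastle.jce.provider.BouncyCastleProvider;
import org.bouncycastle.util.Store;

public class DetachedCmsVerify {
    public static boolean verify(byte[] content, String base64Signature) throws Exception {
        Security.addProvider(new BouncyCastleProvider());

        // For a detached signature the original content has to be supplied alongside the signature.
        CMSSignedData cms = new CMSSignedData(
                new CMSProcessableByteArray(content),
                Base64.getDecoder().decode(base64Signature));

        Store<X509CertificateHolder> certs = cms.getCertificates();
        for (SignerInformation signer : cms.getSignerInfos().getSigners()) {
            X509CertificateHolder holder = (X509CertificateHolder)
                    certs.getMatches(signer.getSID()).iterator().next();
            X509Certificate cert = new JcaX509CertificateConverter()
                    .setProvider("BC").getCertificate(holder);
            if (!signer.verify(new JcaSimpleSignerInfoVerifierBuilder()
                    .setProvider("BC").build(cert))) {
                return false;
            }
        }
        return true;
    }
}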
