encryption and decryption with openssl AES on Linux

I want to use AES to communicate over TCP/IP. However, I ran into difficulties writing the AES functions: during decryption, garbage values appear or characters are dropped.
I'd appreciate it if you could give me a little help.
int main(void)
{
    unsigned char mykey[] = "01234567890123456789012345678\0";
    unsigned char iv[] = "0123456789012\0";
    char buf[BUF_SIZE] = "hi";
    char enc[BUF_SIZE];
    char dec[BUF_SIZE];

    AES_encryption(buf, enc, mykey, iv);
    AES_decryption(enc, dec, mykey, iv);

    printf("buf : %s\n", buf);
    printf("enc: %s\n", enc);
    printf("dec: %s\n", dec);
    return 0;
}
void AES_encryption(char plainfn[], char cipherfn[], unsigned char key[], unsigned char iv[])
{
    EVP_CIPHER_CTX ctx;
    int in_len, out_len = 0;

    in_len = strlen(plainfn);
    EVP_CIPHER_CTX_init(&ctx);
    EVP_CipherInit_ex(&ctx, EVP_aes_128_cbc(), NULL, key, iv, AES_ENCRYPT);
    EVP_CipherUpdate(&ctx, cipherfn, &out_len, plainfn, in_len);
    EVP_CipherFinal_ex(&ctx, cipherfn, &out_len);
    EVP_CIPHER_CTX_cleanup(&ctx);
}
void AES_decryption(char cipherfn[], char plainfn[], unsigned char key[], unsigned char iv[])
{
    EVP_CIPHER_CTX ctx;
    int in_len, out_len = 0;

    in_len = strlen(cipherfn);
    EVP_CIPHER_CTX_init(&ctx);
    EVP_CipherInit_ex(&ctx, EVP_aes_128_cbc(), NULL, key, iv, AES_DECRYPT);
    EVP_CipherUpdate(&ctx, plainfn, &out_len, cipherfn, in_len);
    EVP_CipherFinal_ex(&ctx, plainfn, &out_len);
    EVP_CIPHER_CTX_cleanup(&ctx);
}
These are the results I get:
buf : hi
enc: U▒▒B▒ac▒▒]▒▒▒▒Y▒-
dec: hi▒?!▒

The main problem is that AES_encryption most likely puts NUL bytes into the enc buffer. You then measure the scrambled enc buffer's "string length" with strlen() in AES_decryption. That is certainly wrong, since decryption can stop too early and not process the entire input buffer.
You should pass a buffer-size argument to the encrypt and decrypt functions so they can process the buffer(s) correctly.
Calculate the string length before encryption and somehow pass that same length on to the decryption stage. Over the wire you will probably have to encode the length in your buffer before the actual data.
Also, since the enc buffer is by definition scrambled, you can't just printf("%s", enc) it, for the same reason strlen() doesn't work on it. You need to print the characters one by one with putchar(), or in some other way that is immune to NUL bytes.
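A minimal sketch of that idea (hypothetical helper names, assuming OpenSSL 1.1 or later and the 16-byte key and 16-byte IV that AES-128-CBC expects; with OpenSSL 1.0.x you would keep the stack-allocated EVP_CIPHER_CTX as in your code). Both helpers take and return explicit byte counts instead of calling strlen(), and the length returned by the encrypt side is what you would send over the socket along with the data:
#include <stdio.h>
#include <openssl/evp.h>

/* Encrypts in_len bytes; out must have room for in_len + 16 bytes of padding.
   Returns the ciphertext length, or -1 on error. */
int aes_encrypt(const unsigned char *in, int in_len, unsigned char *out,
                const unsigned char *key, const unsigned char *iv)
{
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    int len = 0, total = 0;
    if (!ctx) return -1;
    if (EVP_EncryptInit_ex(ctx, EVP_aes_128_cbc(), NULL, key, iv) != 1 ||
        EVP_EncryptUpdate(ctx, out, &len, in, in_len) != 1) {
        EVP_CIPHER_CTX_free(ctx);
        return -1;
    }
    total = len;
    /* the final (padding) block goes after the bytes already written, not back to out[0] */
    if (EVP_EncryptFinal_ex(ctx, out + total, &len) != 1) {
        EVP_CIPHER_CTX_free(ctx);
        return -1;
    }
    total += len;
    EVP_CIPHER_CTX_free(ctx);
    return total;
}

/* Decrypts exactly in_len ciphertext bytes (no strlen()).
   Returns the plaintext length, or -1 on error (e.g. bad padding). */
int aes_decrypt(const unsigned char *in, int in_len, unsigned char *out,
                const unsigned char *key, const unsigned char *iv)
{
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    int len = 0, total = 0;
    if (!ctx) return -1;
    if (EVP_DecryptInit_ex(ctx, EVP_aes_128_cbc(), NULL, key, iv) != 1 ||
        EVP_DecryptUpdate(ctx, out, &len, in, in_len) != 1) {
        EVP_CIPHER_CTX_free(ctx);
        return -1;
    }
    total = len;
    if (EVP_DecryptFinal_ex(ctx, out + total, &len) != 1) {
        EVP_CIPHER_CTX_free(ctx);
        return -1;
    }
    total += len;
    EVP_CIPHER_CTX_free(ctx);
    return total;
}

/* Print a scrambled buffer byte by byte (hex), since printf("%s") stops at the first NUL. */
void print_buf(const unsigned char *p, int n)
{
    int i;
    for (i = 0; i < n; i++)
        printf("%02x", p[i]);
    putchar('\n');
}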

Related

How to send an int over uint8_t data?

I'm using the RadioHead Packet Radio library from airspayce.com. In the example (nrf24_reliable_datagram_client & server) they let two nodes communicate with each other by sending strings back and forth. Now I want to send an int instead of a string there, and do something with this data. This is what they do in the example:
Define the buf byte.
uint8_t buf[RH_NRF24_MAX_MESSAGE_LEN];
This function receives the data:
manager.recvfromAckTimeout(buf, &len, 500, &from)
Print the buf variable.
Serial.print((char*)buf);
So far so good. Now I want to do something like:
int value = (char*)buf;
Or:
char value[10] = { (char*)buf };
But then I get:
invalid conversion from 'char*' to 'int' (or to 'char'...)
Next to that, on the other side where I'm sending the data, I have:
uint8_t data[] = { analogRead(A0) };
When I'm printing this data on the receiver side, using the code from the first question, I get weird characters. So I thought, let's try:
Serial.print((char*)buf, DEC); // or BYTE
But then I get:
call of overloaded 'print(char*, int)' is ambiguous
What am I doing wrong? Thanks in advance!
You can't just assign an array to an integer and hope that it merges the elements together for you - for example, how does it know how to merge them?
For converting a uint16_t to a uint8_t[2] array you would want to do something like this:
uint16_t analog = analogRead(A0); //read in as int.
uint8_t data[2] = {analog, (analog >> 8)}; // extract as {lower byte, upper byte}
Serial.write(data,2); //write the two bytes to the serial port, lower byte first.
You could do it in other ways like using a union of a uint16_t with an array of two uint8_t's, but the above way is more portable. You could also do it by type casting the pointer to an int, however if one end uses big endian and the other uses little endian, that won't work unless you flip the data around in the array as you are receiving it.
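The union alternative mentioned above would look roughly like this (a sketch only; the byte order you get depends on the MCU's endianness, which is exactly why the shift-based version is more portable):
#include <stdint.h>

/* The same 16 bits viewed either as one uint16_t or as two separate bytes. */
union analog_bytes {
    uint16_t value;
    uint8_t  bytes[2];
};

void send_reading(uint16_t reading)   /* hypothetical helper */
{
    union analog_bytes a;
    a.value = reading;
    /* on a little-endian AVR, a.bytes[0] is the low byte and a.bytes[1] the high byte;
       pass a.bytes and a length of 2 to whatever write/send call you use */
}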
For the receiver end, you would have:
uint8_t data[2];
...
... //whatever you do to receive the bytes that were sent over serial.
...
//Now assuming that data[] contains the received bytes where:
//data[0] was the first in (lower byte) and data[1] was the second in (upper byte)
uint16_t merged = (data[1] << 8) | data[0]; //merge them back together
Hopefully that helps.
Also, the 'ambiguous overload' error is saying that the compiler cannot match that particular set of arguments to a single print() function. From the Print class header you will find there is, however, this prototype:
write(const uint8_t *buffer, size_t size);
which does what you want - print a specified number of uint8_t's from an array.

How to convert BSTR string to Unsigned Char (Using com technology in the appln)

I am writing a small application which uses COM technology. I want to convert a BSTR string to an unsigned char array. To do this, I used the W2A() macro to convert from BSTR to std::string and then copied string.c_str() into an unsigned char array. The code snippet is as follows:
Send(BSTR *packet, int length)
{
    std::string strPacket = W2A(*packet);
    unsigned char * pBuffer = new unsigned char [strPacket.length()+1];
    memset(pBuffer, 0, strPacket.length()+1);
    memcpy(pBuffer, strPacket.c_str(), strPacket.length()+1);
}
This works fine when the packet contains a normal string. But if the packet contains a NUL character, the problem occurs: unknown characters appear in pBuffer after that NUL, i.e. after the conversion.
Can anyone please let me know how to avoid that? Or is there any other way to do it correctly?
A BSTR is a Windows API type and must be managed with API macros or functions. If you cannot use the W2A macro because your string may have NUL characters inside, you will have to use functions such as WideCharToMultiByte, which can convert the wide characters of a BSTR to narrow characters for a char*. Be sure to have the SDK documentation at hand. Alternatively, you could make your program use WCHARs.
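A rough sketch of that approach (a hypothetical helper, not drop-in code): SysStringLen() gives the BSTR length in WCHARs including any embedded NULs, and WideCharToMultiByte() is told both input and output lengths explicitly, so nothing stops at a NUL:
#include <windows.h>
#include <oleauto.h>
#include <stdlib.h>

/* Converts a BSTR to a malloc'd byte buffer and reports its length via out_len. */
unsigned char *BstrToBytes(BSTR packet, int *out_len)
{
    int wlen = (int)SysStringLen(packet);   /* length in WCHARs, embedded NULs included */
    int blen = WideCharToMultiByte(CP_ACP, 0, packet, wlen, NULL, 0, NULL, NULL);
    unsigned char *buf = (unsigned char *)malloc(blen);
    if (buf != NULL)
        WideCharToMultiByte(CP_ACP, 0, packet, wlen, (char *)buf, blen, NULL, NULL);
    *out_len = blen;   /* keep this count around; strlen() would stop at the first NUL */
    return buf;
}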

How to send integer as a string with WriteFile for serialport

I want to send an integer as a string buffer to a serial port with WriteFile. The value is a reading from a sensor, and it has at most 2 characters.
I have tried to convert with itoa
for example:
DWORD nbytes;
int a,b,c;
a=10;
char *tempa ="";
tempa = itoa(a, tempa,0);
if(!WriteFile( hnd_serial, a, 2, &nbytes, NULL )){MessageBox(L"Write Com Port fail!");return;}
This code is not working:
Unhandled exception at 0x1024d496 (msvcr100d.dll) in ENVSConfig.exe: 0xC0000094: Integer division by zero.
I have also tried the suggestion from this website: convert int to string, but it still does not work.
Is there any hint on how to do this?
You are not using itoa properly: you need to allocate space for your string, you need to provide a proper radix (this is where your divide-by-zero error is happening), and finally you need to use the buffer, not your original a value, as the buffer in your write.
Try the following:
DWORD nbytes;
int a,b,c;
a = 10;
char tempa[64]; // Randomly picked 64 characters as the max size
itoa(a, tempa, 10);
if (!WriteFile(hnd_serial, tempa, 2, &nbytes, NULL))
{
    MessageBox(L"Write Com Port fail!");
    return;
}
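If you would rather avoid the non-standard itoa, a similar sketch with snprintf (older MSVC runtimes spell it _snprintf) also tells you exactly how many characters were produced, so the hard-coded 2 is not needed:
DWORD nbytes;
char tempa[64];
int a = 10;
int len = snprintf(tempa, sizeof(tempa), "%d", a); // number of characters written
if (!WriteFile(hnd_serial, tempa, (DWORD)len, &nbytes, NULL))
{
    MessageBox(L"Write Com Port fail!");
    return;
}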

Arduino: Crashes and errors when concatenating Strings

I am trying to concatenate the output of AES-256 encryption to a String (to compare this String against the encrypted String sent from an Android phone).
Basically, the concatenation seems to work, but after a few runs errors occur (non-readable characters, the string getting shorter instead of longer) or the sketch crashes. It is reproducible; it crashes at the exact same point after a restart.
I extracted some lines of Arduino code that demonstrate the problem. It does the following:
Create a random number and write it into an array (works)
AES-encrypt this array (works)
Build a HEX representation of each array element (works)
Concatenate those hex values into a String (crashes)
#include <SPI.h>
#include "aes256.h" //include this lib

uint8_t key[] = {9,2,3,4,5,6,7,8,1,2,3,4,5,6,7,8,
                 1,2,3,4,5,6,7,8,1,2,3,4,5,6,7,8 }; //the encryption key
aes256_context ctxt; //context needed for aes library

void setup() {
    Serial.begin(9600);
}

void loop() {
    uint8_t data[] = {
        0x53, 0x73, 0x64, 0x66, 0x61, 0x73, 0x64, 0x66,
        0x61, 0x73, 0x64, 0x66, 0x61, 0x73, 0x64, 0x65, }; //the message to be encoded
    long InitialRandom = random(2147483647); //0 to highest possible long
    String RandomString = "";
    RandomString += InitialRandom; //random number to String
    Serial.println(RandomString); //for debugging

    //update data array: Random String into data array
    for (int x = 0; x < RandomString.length(); x++) {
        data[x] = RandomString[x];
    }

    //this encrypts data array, which changes
    aes256_init(&ctxt, key); //initialize the lib
    aes256_encrypt_ecb(&ctxt, data); //do the encryption
    aes256_done(&ctxt);

    //Here the problem starts.............................................
    String encrypted = ""; //the string I need finally
    for (int x = 0; x < sizeof(data); x++) { //data array is 16 in size
        int a = data[x];
        String b = String(a, HEX);
        if (b.length() == 1) b = "0" + b; //if result is e.g "a" it should be "0a"
        encrypted.concat(b); //this line causes the problem!!!
        //Serial.println(b); //works perfect if above line commented out
        Serial.println(encrypted); //see the string getting longer until problems occur
    }
    //Here the problem ends.............................................
    Serial.println(); //go for next round, until crashing
}
I have searched the forums and tried different ways to concatenate (the + operator, strcat). All had similar effects. I read that the String library had a bug, so I updated the Arduino IDE to 1.0.
This has kept me busy for days; any help is highly appreciated.
Thanks a lot!
You are probably running out of memory as Arduino only has a small amount.
Check how much memory you have free during your loop.
The culprit may be that the implementation of String (see Arduino WString.cpp) uses realloc(), and your memory is probably being heavily fragmented by all the one- and two-byte strings (each of which carries a 16-byte heap header cost).
You could re-write the above more efficiently by using the String reserve() function to pre-allocate the buffer, or re-write it using native C char arrays.
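A sketch of the plain char-array idea: build the 32-character hex string into a fixed buffer so that no heap allocation happens inside the loop at all:
#include <stdio.h>
#include <stdint.h>

/* Writes the 16 data bytes as 32 hex characters plus a terminating NUL. */
void to_hex(const uint8_t data[16], char encrypted[33])
{
    int x;
    for (x = 0; x < 16; x++) {
        /* two zero-padded hex digits per byte, written straight into the buffer */
        sprintf(&encrypted[x * 2], "%02x", data[x]);
    }
    encrypted[32] = '\0';
}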

openssl encryption and decryption using evp library

I have a plain text and a cipher text, and my task is to find the key that produced the declared cipher text. The key is a word from a word list, like a dictionary. I have written the code in C; it compiles fine and creates a file with all the ciphers.
The problem I am facing is that every time I run the code the cipher text is completely different. I have no clue where I am making a mistake.
The following is the code I have written:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <openssl/evp.h>

int main()
{
    int i;
    char words[32], t;
    FILE *key, *outFile;
    const char *out = "Output.txt";
    unsigned char outbuf[1024 + EVP_MAX_BLOCK_LENGTH];
    unsigned char iv[] = "0000000000000000";
    int outlen, tmplen;
    int num;
    EVP_CIPHER_CTX ctx;
    EVP_CIPHER_CTX_init(&ctx);
    char inText[] = "This is a top secret.";
    char cipherText[] = "8d20e5056a8d24d0462ce74e4904c1b513e10d1df4a2ef2ad4540fae1ca0aaf9";

    key = fopen("words.txt", "r");
    if( remove("ciphertext.txt") == -1 ) {
        perror("Error deleting file");
    }
    outFile = fopen("ciphertext.txt", "a+");
    if( key < 0 || outFile < 0 )
    {
        perror ("Cannot open file");
        exit(1);
    }
    char pbuffer[1024];
    while ( fgets(words, 32, key) )
    {
        i = strlen(words);
        words[i-1] = '\0';
        //printf("%s",words);
        i = 0;
        EVP_EncryptInit_ex(&ctx, EVP_aes_128_cbc(), NULL, words, iv);
        if(!EVP_EncryptUpdate(&ctx, outbuf, &outlen, inText, strlen(inText)))
        {
            EVP_CIPHER_CTX_cleanup(&ctx);
            return 0;
        }
        if(!EVP_EncryptFinal_ex(&ctx, outbuf + outlen, &tmplen))
        {
            EVP_CIPHER_CTX_cleanup(&ctx);
            return 0;
        }
        outlen += tmplen;
        print_hex(outbuf, outlen, outFile);
    }
    fclose(key);
    fclose(outFile);
    return 1;
}

int print_hex(unsigned char *buf, int len, FILE *outFile)
{
    int i, n;
    char x = '\n';
    for ( i = 0; i < len; i++ )
    {
        fprintf(outFile, "%02x", buf[i]);
    }
    fprintf(outFile, "%c", x);
    return (0);
}
Since the key is a word, the words in the word list can be shorter or longer than 16 bytes, and from my research on OpenSSL I understood that PKCS#5 padding is applied if the data does not fit into 16-byte blocks. Is it the same for the key?
The cipher text I declared does not match the cipher text I am generating with the program, so I am unable to find the key for the cipher text.
I need help from the experts. I would appreciate it if someone could help me out of this trouble.
Thanks in advance
What are you actually trying to achieve? Your code looks like an attempt to carry out a brute-force attack using a dictionary of passwords ... I'm not sure I should be trying to help with that!
I'll assume it's just an exercise ...
The first thing that strikes me is that you are setting your initialization vector (the variable iv) to a string of ASCII zeros. That's almost certainly wrong, and you probably need to use binary zeros.
unsigned char iv[16] = { 0 };
I don't know how the ciphertext that you have was generated (by another program, presumably) but I would imagine that that program didn't use the dictionary word itself as a key, but went through some sort of key derivation process first. You are using 128-bit AES as your encryption algorithm, so your keys should be 16 bytes long. You could achieve that by padding, as you suggest, but it's more usual to go through some process that mixes up the bits of the key to make it look more random and to distribute the key bits throughout the whole key. It wouldn't be unusual to hash the word and to use the output of the hash function rather than the word itself as key. Another possibility is that the dictionary word may be used as the input to a passphrase-based key derivation function such as that defined in PKCS#5.
You really need to find out how the word is used to generate a key before you can get any further with this.
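Purely as an illustration, and not a claim about how your target ciphertext was produced, here are two common ways a dictionary word could be turned into a 16-byte AES-128 key; the hash choice, salt, and iteration count below are assumptions:
#include <string.h>
#include <openssl/sha.h>
#include <openssl/evp.h>

/* (a) hash the word and take the first 16 bytes of the digest as the key */
void key_from_hash(const char *word, unsigned char key[16])
{
    unsigned char digest[SHA256_DIGEST_LENGTH];
    SHA256((const unsigned char *)word, strlen(word), digest);
    memcpy(key, digest, 16);
}

/* (b) PKCS#5 / PBKDF2 derivation (no salt and 1000 iterations chosen arbitrarily here) */
void key_from_pbkdf2(const char *word, unsigned char key[16])
{
    PKCS5_PBKDF2_HMAC_SHA1(word, (int)strlen(word), NULL, 0, 1000, 16, key);
}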
Thank you very much for the reply.
Yes, it is just an exercise, like a dictionary attack.
I am supposed to use an IV of zeros, but not ASCII zeros, which is one of the mistakes I had made.
I assume the given cipher text is encrypted purely with a word from the word list without any hashing, and maybe padding is done, but I am not sure, because I am supposed to find the key from the cipher text. The word list may have words shorter or longer than 16 bytes, so I am thinking the problem might be with the padding.
I am thinking that if the word length is less than 16 bytes, then I have to pad it with ASCII zeros or something like that. Which one do you suggest? With a little push I may be finished.
Thanks
