Converting Byte Array to String (NXC)

Is there a way to show a byte array on the NXT screen (using NXC)?
I've tried it like this:
unsigned char Data[];
string Result = ByteArrayToStr(Data[0]);
TextOut(0, 0, Result);
But it gives me a "File Error! -1".
If this isn't possible, how can I watch the value of Data[0] during the program?

If you want to show the byte array in hexadecimal format, you can do this:
byte buf[];
unsigned int buf_len = ArrayLen(buf);
string szOut = "";
string szTmp = "00";
// Convert each byte to a two-digit hexadecimal string.
for (unsigned int i = 0; i < buf_len; ++i)
{
    sprintf(szTmp, "%02X", buf[i]);
    szOut += szTmp;
}
// Display on screen, wrapping at the character level.
WordWrapOut(szOut,
            0, 63,
            NULL, WORD_WRAP_WRAP_BY_CHAR,
            DRAW_OPT_CLEAR_WHOLE_SCREEN);
You can find WordWrapOut() here.
If you simply want to convert it to ASCII:
unsigned char Data[];
string Result = ByteArrayToStr(Data);
TextOut(0, 0, Result);
If you only wish to display one character:
unsigned char Data[];
string Result = FlattenVar(Data[0]);
TextOut(0, 0, Result);
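If what you actually want is to watch the numeric value of Data[0] (rather than its raw byte interpreted as a character), NXC's NumToStr() does that conversion, e.g. TextOut(0, LCD_LINE1, NumToStr(Data[0])).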

Try byte. byte is an unsigned char in NXC.
P.S. There is a heavily-under-development debugger in BricxCC (I assume you're on Windows). Look here.
EDIT: The code compiles and runs, but does not do anything.

Related

String to Int Conversion in Arduino

I'm trying to convert a String to an int (like Integer.parseInt() in Java) in Arduino, in order to do some operations on the numbers. Unfortunately, none of my solutions worked.
Until now I have tried:
Creating a char array and calling the atoi function:
String StringPassword;
uint8_t *hash;
//Here I define hash
int j;
for (j = 0; j < 20; ++j) {
    StringPassword.concat(hash[j]);
}
//Checking String Size
Serial.println("Size");
//Checking String
Serial.println(StringPassword.length());
Serial.println(StringPassword);
int awa;
char PasswordCharArray[StringPassword.length()];
StringPassword.toCharArray(PasswordCharArray, StringPassword.length());
awa = atoi(PasswordCharArray);
Serial.println(awa);
Output:
Size
48
168179819314217391617011617743249832108225513297
18209
Creating a char array for a null-terminated string and calling the atoi function:
String StringPassword;
uint8_t *hash;
//Here I define hash
int j;
for (j = 0; j < 20; ++j) {
    StringPassword.concat(hash[j]);
}
//Checking String Size
Serial.println("Size");
//Checking String
Serial.println(StringPassword.length());
Serial.println(StringPassword);
int awa;
char PasswordCharArray[StringPassword.length()+1];
StringPassword.toCharArray(PasswordCharArray, StringPassword.length()+1);
awa = atoi(PasswordCharArray);
Serial.println(awa);
Output:
Size
48
168179819314217391617011617743249832108225513297
-14511
Using the toInt function:
String StringPassword;
uint8_t *hash;
//Here I define hash
int j;
for (j = 0; j < 20; ++j) {
    StringPassword.concat(hash[j]);
}
//Checking String Size
Serial.println("Size");
//Checking String
Serial.println(StringPassword.length());
Serial.println(StringPassword);
int awa = StringPassword.toInt();
Serial.println(awa);
Output:
Size
48
168179819314217391617011617743249832108225513297
-14511
What is the proper way of converting a String to an int so that:
awa = 168179819314217391617011617743249832108225513297 ?
And could someone explain to me why my solutions didn't work? I tried the functions that were mentioned on Stack Overflow and the Arduino forum to solve this.
The number 168179819314217391617011617743249832108225513297 exceeds the maximum value an int can hold (on AVR-based Arduinos an int is 16 bits, topping out at 32,767), so it will not convert into an int.
Try using atol() instead of atoi(). A long can hold more data than an int (up to 2,147,483,647 on Arduino), so larger numbers convert correctly. Note, though, that a 48-digit number like the one above is too large for any built-in integer type; a value of that size has to stay a string or go through a big-number library.
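As a quick illustration of the difference (plain C rather than an Arduino sketch; on AVR-based Arduinos int is 16 bits, so the threshold is even lower than on a PC):
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *s = "100000";  /* fits in a long, but not in a 16-bit int */

    int  i = atoi(s);  /* overflows where int is 16 bits; the result is garbage */
    long l = atol(s);  /* long is at least 32 bits, so this parses correctly */

    printf("atoi: %d\n", i);
    printf("atol: %ld\n", l);
    return 0;
}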

How to return an int converted to a char array back to main for displaying it

My doubts are as follows:
1: How do I return 'str' from the function 'fun' so that I can display it in main?
2: And is the return type correct in the code?
3: The current code displays some different output.
char * fun(int *arr)
{
    char *str[5];
    int i;
    for(i=0;i<5;i++)
    {
        char c[sizeof(int)];
        sprintf(c,"%d",arr[i]);
        str[i] = malloc(sizeof(c));
        strcpy(str[i],c);
    }
    return str;
}
int main()
{
    int arr[] = {2,1,3,4,5},i;
    char *str = fun(arr);
    for(i=0;i<5;i++)
    {
        printf("%c",str[i]);
    }
    return 0;
}
How do I return 'str' from the function 'fun' so that I can display it in main?
This is the way:
char* str = malloc( size );
if( str == NULL ) {
    fprintf( stderr, "Failed to malloc\n");
}
/* Do stuff with str, use str[index],
 * remember to free it in main */
free(str);
And is the return type correct in the code?
No. char** is probably the type you need to return.
The current code displays some different output.
Consider explaining what you want to do, and why. The way you have written it seems completely muddled to me: you're passing an array of integers but not its length, so how is fun() supposed to know the length of the array? Another problem is the array of pointers in fun().
You can't fit an int into a single char (compare their sizes), so I used a char array instead.
However, I'm not sure if this is what you want to do (it might be a quick and dirty way of doing it):
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
char**
fun(int *arr, int size)
{
    char **str = malloc( sizeof(char*)*size );
    if( str == NULL ) {
        fprintf( stderr, "Failed malloc\n");
    }
    int i;
    for(i=0;i<size;i++) {
        str[i] = malloc(12);   /* room for a 32-bit int, sign, and '\0' */
        if( str[i] == NULL ) {
            fprintf( stderr, "Failed malloc\n");
        }
        sprintf(str[i],"%d",arr[i]);
    }
    return str;
}
int
main()
{
    int arr[] = {2,1,3,4,5},i;
    char **str = fun(arr, 5);
    for(i=0;i<5;i++) {
        printf("%s\n",str[i]);
        free(str[i]);
    }
    free(str);
    return 0;
}
I made these changes to your code to get it working:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
char **fun(int *arr)
{
    char **str = malloc(sizeof(char *) * 5);
    int i;
    for(i = 0; i < 5; i++) {
        if ((arr[i] >= 0) && (arr[i] <= 9)) {
            char c[2];
            sprintf(c, "%d", arr[i]);
            str[i] = (char *) malloc(strlen(c) + 1);
            strcpy(str[i], c);
        }
    }
    return str;
}
int main()
{
    int arr[] = {2, 1, 3, 4, 5}, i;
    char **str = fun(arr);
    for(i = 0; i < 5; i++) {
        printf("%s", str[i]);
        free(str[i]);
    }
    printf("\n");
    free(str);
    return 0;
}
Output
21345
I added a check to make sure that arr[i] is a single-digit number. Also, returning a pointer to a stack variable results in undefined behavior, so I changed the code to allocate an array of strings. I don't check the return value of the malloc calls, which means this program could crash due to a NULL pointer dereference.
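To make the stack-variable point concrete, here is a deliberately wrong sketch: the array lives in bad()'s stack frame, which is recycled as soon as the function returns, so the caller is left with a dangling pointer.
char *bad(void)
{
    char local[8] = "oops";
    return local;  /* wrong: 'local' ceases to exist when bad() returns */
}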
This solution differs from the others in that it attempts to answer your question based on the intended use.
How do I return 'str' from the function 'fun' so that I can display it in main?
First, you need to define a function that returns a pointer to an array:
char (*fun(int arr[]))[]
Allocating variable-length strings doesn't buy you anything here: the longest string you'll ever need for a 64-bit unsigned int is 20 digits. All you need is to allocate an array of 5 elements, each a fixed number of characters long, and that allocation is done only once. This sample allows up to 2 digits plus 1 null character per element; you may adjust the length to suit your needs, e.g. 21 (20 digits and 1 null).
For readability - so it is clear which values relate to the number of digits, including the terminator - I'll define a macro that you can modify to suit your needs.
#define NUM_OF_DIGITS 3
You can then use this macro in the whole code.
char (*str)[NUM_OF_DIGITS] = malloc(5 * NUM_OF_DIGITS);
Finally the receiving variable in main() can be declared and assigned the returned array.
char (*str)[NUM_OF_DIGITS] = fun(arr);
Your complete code should look like this:
Code
#include <stdio.h>
#include <stdlib.h>

#define NUM_OF_DIGITS 3

char (*fun(int arr[]))[]
{
    char (*str)[NUM_OF_DIGITS] = malloc(5 * NUM_OF_DIGITS);
    int i;
    for(i=0;i<5;i++)
    {
        snprintf(str[i],NUM_OF_DIGITS,"%d",arr[i]); //control and limit to NUM_OF_DIGITS-1 digits + null
    }
    return str;
}
int main()
{
    int arr[] = {24,1,33,4,5},i;
    char (*str)[NUM_OF_DIGITS] = fun(arr);
    for(i=0;i<5;i++)
    {
        printf("%s",str[i]);
    }
    free(str);
    return 0;
}
Output
2413345
With this method you only need to free the allocated memory once.
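For comparison, here is a sketch (names are illustrative, error checks omitted) of the allocation bookkeeping behind the two approaches in this thread: the char** version needs one malloc and one free per string plus one more for the pointer table, while the pointer-to-array version is a single contiguous block.
#include <stdlib.h>

#define N 5
#define W 3   /* like NUM_OF_DIGITS: 2 digits + null */

void allocation_styles(void)
{
    /* Style 1: array of pointers - N+1 allocations, N+1 frees. */
    char **table = malloc(N * sizeof(char *));
    for (int i = 0; i < N; i++)
        table[i] = malloc(W);
    for (int i = 0; i < N; i++)
        free(table[i]);
    free(table);

    /* Style 2: pointer to array - one allocation, one free. */
    char (*rows)[W] = malloc(N * W);
    free(rows);
}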

How to use a set of numbers as the Key for RC4 Encryption

I would like to know how I can use a set of numbers as a key for RC4 encryption.
According to the internet and Wikipedia the key is actually a string of letters, but it is the bytes that get used. In my program, though, I need to use a 6-digit number as the key. Should I convert it to a string, or what?
The key scheduling algorithm is shown below:
void ksa(u_char *State, u_char *key) {
    int byte, i, keylen, j=0;
    keylen = (int) strlen((char *) key);
    for(i=0; i<256; i++) {
        j = (j + State[i] + key[i%keylen]) % 256;
        swap(&State[i], &State[j]);
    }
}
How can I modify the code, or should I just convert the numbers to a string?
Strings and numbers are both just bytes. Here is working RC4 code that accepts a key of unsigned chars:
#include <stdio.h>
#include <string.h>
#define SIZE 256

unsigned char SBox[SIZE];
int i;
int j;

void initRC4(unsigned char Key[]);
unsigned char getByte(void);

void initRC4(unsigned char Key[])
{
    unsigned char tmp;
    unsigned char KBox[SIZE];
    for(i=0;i<SIZE;i++)
        SBox[i]=i;
    for(i=0;i<SIZE;i++)
        KBox[i]=Key[i % strnlen((char *)Key,SIZE)];
    for(j=0,i=0;i<SIZE;i++)
    {
        j=(j+SBox[i]+KBox[i]) % SIZE;
        tmp=SBox[i];
        SBox[i]=SBox[j];
        SBox[j]=tmp;
    }
    i=0; j=0;   /* reset the stream counters before generating bytes */
}
unsigned char getByte(void)
{
    unsigned char tmp;
    i=(i+1)%SIZE;
    j=(j+SBox[i])%SIZE;
    tmp=SBox[i];
    SBox[i]=SBox[j];
    SBox[j]=tmp;
    return SBox[(SBox[i]+SBox[j])%SIZE];
}
First, you initialize the RC4 stream:
initRC4(key);
Then you do:
getByte()
...which always returns 1 byte from the RC4 stream you've set up.
One thing to remember, though - a letter in a string is not always equal to 1 byte. The same goes for integers and the digit characters in strings. Really, you should read an introduction to computer programming before you mess with ciphers.
Here is a demonstration of how the bytes differ between strings and integers:
#include <cstdio>
#include <string>
int main(int argc, char **argv) {
    const int n=67898;
    const std::string str = "67898";
    const int arrayLength = sizeof(int);
    const int stringArrayLength = str.size();
    unsigned char *bytePtr=(unsigned char*)&n;
    printf("Bytes for integer: ");
    for(int i=0;i<arrayLength;i++)
    {
        printf("%X ", bytePtr[i]);
    }
    printf("\n");
    printf("Bytes for string: ");
    for(int i=0;i<stringArrayLength;i++)
    {
        printf("%X ", str.at(i));
    }
    printf("\n");
    return 0;
}
Output:
Bytes for integer: 3A 9 1 0
Bytes for string: 36 37 38 39 38
There will usually be a terminating null byte at the end of a string, so you could add +1 byte to the string size.
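To tie this back to the original question: a 6-digit number can be fed to initRC4() either as its decimal digits or as its raw bytes (though with the strnlen-based key length above, raw bytes that happen to be zero would truncate the key). A minimal sketch, assuming the initRC4()/getByte() definitions above are in scope:
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned char key[8];
    unsigned char msg[] = "attack at dawn";
    size_t n = strlen((char *)msg);
    size_t k;

    /* Use the decimal digits of the 6-digit number as the key bytes. */
    snprintf((char *)key, sizeof key, "%06d", 123456);

    initRC4(key);
    for (k = 0; k < n; k++)  /* encrypt in place: XOR with the keystream */
        msg[k] ^= getByte();

    initRC4(key);            /* reset the cipher, then decrypt the same way */
    for (k = 0; k < n; k++)
        msg[k] ^= getByte();

    printf("%s\n", msg);     /* prints "attack at dawn" again */
    return 0;
}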

How to convert a char array to integer

I have an array of 1's and 0's which is compressed in such a way that when the number of consecutive 1's is greater than 10 it writes +n+, where n is the number of 1's, and when the number of consecutive 0's is greater than 10 it writes -n-, where n is the number of 0's; otherwise it writes them out as they are.
Now the issue is, I need to decompress the array to write it back to the file, but I can't find a way to convert the count of zeros or ones to an integer. It keeps giving me an error which says "initializing argument 1 of 'int atoi(const char*)'" and another one on the same line which says "invalid conversion from 'char' to 'const char*'".
I'm working in Linux.
Here's a piece of my code:
else if(str[i]=='+')
{
    n=atoi(str[i+1]);
    for(int j=0;j<n;j++)
    {
        strcat(temp,"1");
        i=i+n-1;
    }
}
Both errors come from passing a char where atoi() expects a const char*; atoi(&str[i+1]) would at least compile, since it parses starting from that position. Below is an algorithm that does the "expansion" - don't ever use it in production (for example, there is no error checking, so it is not safe); it is a quick example.
#include <stdlib.h>
#include <string.h>
#include <ctype.h>

char *decode(char *q)
{
    char *all=NULL;
    long i=0;
    int n='0';
    char *p;
    if(*q== '+')
        n='1';
    ++q;
    i=strtol(q, NULL, 10);
    all=calloc( i + 1, 1);
    for(p=all; i; i--)
        *p++=n;
    return all;
}
char *decompress(char *dest, char *str)
{
    char *p=str;
    char *q=dest;
    for(; *p; p++)
    {
        if( isdigit((int)*p) )
        {
            *q++=*p;
            *q=0x0;
        }
        else // - or +
        {
            char *tmp=decode(p);
            strcpy(q, tmp);
            q=strchr(q, '\0');
            free(tmp);
            p=strchr(p+1, *p); // skip to the matching closing marker
        }
    }
    return dest;
}
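A small driver showing the intended round trip (assuming decode() and decompress() above are in scope; the input uses the +n+/-n- format from the question):
#include <stdio.h>

int main(void)
{
    char dest[64] = {0};

    /* "+12+" expands to twelve 1's, "-11-" to eleven 0's */
    decompress(dest, "10+12+01-11-1");
    printf("%s\n", dest);  /* 1011111111111101000000000001 */
    return 0;
}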

Unicode <-> Multibyte conversion (native vs. managed)

I'm trying to convert Unicode strings coming from .NET to native C++ so that I can write them to a text file. The process is then reversed, so that the text from the file is read and converted back to a managed Unicode string.
I use the following code:
String^ FromNativeToDotNet(std::string value)
{
    // Convert an ASCII string to a Unicode String
    std::wstring wstrTo;
    wchar_t *wszTo = new wchar_t[value.length() + 1];
    wszTo[value.size()] = L'\0';
    MultiByteToWideChar(CP_UTF8, 0, value.c_str(), -1, wszTo, (int)value.length());
    wstrTo = wszTo;
    delete[] wszTo;
    return gcnew String(wstrTo.c_str());
}
std::string FromDotNetToNative(String^ value)
{
    // Pass on changes to native part
    pin_ptr<const wchar_t> wcValue = SafePtrToStringChars(value);
    std::wstring wsValue( wcValue );
    // Convert a Unicode string to an ASCII string
    std::string strTo;
    char *szTo = new char[wsValue.length() + 1];
    szTo[wsValue.size()] = '\0';
    WideCharToMultiByte(CP_UTF8, 0, wsValue.c_str(), -1, szTo, (int)wsValue.length(), NULL, NULL);
    strTo = szTo;
    delete[] szTo;
    return strTo;
}
What happens is that e.g. a Japanese character gets converted to two ASCII chars (漢 -> "w). I assume that's correct?
But the other way does not work: when I call FromNativeToDotNet with "w I only get "w back as a managed Unicode string...
How can I get the Japanese character correctly restored?
Best to use UTF8Encoding:
static String^ FromNativeToDotNet(std::string value)
{
    array<Byte>^ bytes = gcnew array<Byte>(value.length());
    System::Runtime::InteropServices::Marshal::Copy(IntPtr((void*)value.c_str()), bytes, 0, value.length());
    return (gcnew System::Text::UTF8Encoding)->GetString(bytes);
}
static std::string FromDotNetToNative(String^ value)
{
    if (value->Length == 0) return std::string("");
    array<Byte>^ bytes = (gcnew System::Text::UTF8Encoding)->GetBytes(value);
    pin_ptr<Byte> chars = &bytes[0];
    return std::string((char*)chars, bytes->Length);
}
a Japanese character gets converted to two ASCII chars (漢 -> "w). I assume that's correct?
No: that character, U+6F22, should be converted to three UTF-8 bytes: 0xE6 0xBC 0xA2.
In UTF-16 (little-endian), U+6F22 is stored in memory as 0x22 0x6F, which would look like "o in ASCII (rather than "w), so it looks like something is wrong with your conversion from String^ to std::string.
I'm not familiar enough with String^ to know the right way to convert from String^ to std::wstring, but I'm pretty sure that's where your problem is.
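If you want to verify those byte sequences outside of the managed code, a few lines of plain C11 will do it (this assumes a little-endian machine, which is what Windows/.NET runs on):
#include <stdio.h>
#include <uchar.h>

int main(void)
{
    const char *utf8 = u8"\u6F22";     /* UTF-8 encoding of 漢 */
    const char16_t utf16 = u'\u6F22';  /* one UTF-16 code unit */
    const unsigned char *b = (const unsigned char *)&utf16;
    const char *p;

    printf("UTF-8 bytes:    ");
    for (p = utf8; *p; ++p)
        printf("%02X ", (unsigned char)*p);           /* E6 BC A2 */
    printf("\nUTF-16LE bytes: %02X %02X\n", b[0], b[1]);  /* 22 6F */
    return 0;
}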
I don't think the following has anything to do with your problem, but it is obviously wrong:
std::string strTo;
char *szTo = new char[wsValue.length() + 1];
You already know that a single wide character can produce multiple narrow characters, so the number of wide characters is not necessarily equal to or greater than the number of corresponding narrow characters - a buffer of wsValue.length() + 1 chars can be too small.
You need to call WideCharToMultiByte once to calculate the required buffer size, and then call it again with a buffer of that size. Or you can simply allocate a buffer big enough to hold 3 chars for every wide char.
Try this instead:
#include <windows.h>
#include <string>
#include <vector>

String^ FromNativeToDotNet(std::string value)
{
    // Convert a UTF-8 string to a UTF-16 String
    int len = MultiByteToWideChar(CP_UTF8, 0, value.c_str(), value.length(), NULL, 0);
    if (len > 0)
    {
        std::vector<wchar_t> wszTo(len);
        MultiByteToWideChar(CP_UTF8, 0, value.c_str(), value.length(), &wszTo[0], len);
        return gcnew String(&wszTo[0], 0, len);
    }
    return gcnew String((wchar_t*)NULL);
}
std::string FromDotNetToNative(String^ value)
{
    // Pass on changes to native part
    pin_ptr<const wchar_t> wcValue = SafePtrToStringChars(value);
    // Convert a UTF-16 string to a UTF-8 string
    int len = WideCharToMultiByte(CP_UTF8, 0, wcValue, value->Length, NULL, 0, NULL, NULL);
    if (len > 0)
    {
        std::vector<char> szTo(len);
        WideCharToMultiByte(CP_UTF8, 0, wcValue, value->Length, &szTo[0], len, NULL, NULL);
        return std::string(&szTo[0], len);
    }
    return std::string();
}
