How do I encode a byte array into a string, and decode that string back into a byte array? Mainly I want to save the encoded string on a server and later decode it back into a byte array to load an image from the server.
If you want to encode and decode a byte[] in Java (Android), here is sample code you can use:
Encode Byte array into String
String strByteValue = Base64.encodeToString(byteArrayValue, Base64.URL_SAFE);
Decode String into byte[]
byte[] decodedBytes = Base64.decode(strByteValue, Base64.URL_SAFE);
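For comparison, the same round trip can be sketched in Python with the standard library's base64 module, using the URL-safe alphabet to match Base64.URL_SAFE above (the variable names here are illustrative):

```python
import base64

data = bytes([0xF2, 0x2D, 0x92, 0xE7])  # arbitrary binary data, e.g. image bytes

# Encode bytes -> URL-safe Base64 string
encoded = base64.urlsafe_b64encode(data).decode("ascii")

# Decode the string back into bytes
decoded = base64.urlsafe_b64decode(encoded)

assert decoded == data  # round trip is lossless
```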
k = b'\xf2-\x92\xe7\x98\x90#\xddF\xbf\x13I4\x92\x0f\xc5'
I tried encoding in 'utf-8', but I am getting an error
'utf-8' codec can't decode byte 0xf2 in position 0: invalid continuation byte
How can I properly convert this to a string object?
Update

OK, I had a look at your bytes: it was the wrong encoding. You need to use ISO-8859-1:
encoding = 'ISO-8859-1'
k = b'\xf2-\x92\xe7\x98\x90#\xddF\xbf\x13I4\x92\x0f\xc5'.decode(encoding)
print(type(k))
That will fix the issue
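ISO-8859-1 (Latin-1) maps every byte value 0–255 to the Unicode code point with the same number, so decoding can never fail and the round trip is lossless. A quick check:

```python
k = b'\xf2-\x92\xe7\x98\x90#\xddF\xbf\x13I4\x92\x0f\xc5'

s = k.decode('ISO-8859-1')   # never raises: each byte maps to U+0000..U+00FF
print(type(s))               # <class 'str'>

# Encoding back with the same codec recovers the original bytes exactly
assert s.encode('ISO-8859-1') == k
```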
I am trying to serialize a bytes object - which is an initialization vector for my program's encryption. But, the Google Protocol Buffer only accepts strings. It seems like the error starts with casting bytes to string. Am I using the correct method to do this? Thank you for any help or guidance!
Alternatively, can I make the initialization vector a string object for AES-CBC mode encryption?
Code
Cast the bytes to a string
string_iv = str(bytes_iv, 'utf-8')
Serialize the string using SerializeToString():
serialized_iv = IV.SerializeToString()
Use ParseFromString() to recover the string:
IV.ParseFromString( serialized_iv )
And finally, UTF-8 encode the string back to bytes:
bytes_iv = bytes(IV.string_iv, encoding= 'utf-8')
Error
string_iv = str(bytes_iv, 'utf-8')
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x9b in position 3: invalid start byte
If you must cast an arbitrary bytes object to str, these are your options:
simply call str() on the object. This produces its repr form, i.e. something that could be parsed as a bytes literal, e.g. "b'abc\x00\xffabc'"
decode with "latin1". This will always work, even though it technically makes no sense if the data isn't text encoded with Latin-1.
use base64 or base85 encoding (the standard library has a base64 module which covers both)
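Applied to the IV case above, the base64 option avoids the UnicodeDecodeError entirely, because the resulting string is plain ASCII. A minimal sketch, assuming bytes_iv is the 16-byte IV (the protobuf field itself is not shown here):

```python
import base64
import os

bytes_iv = os.urandom(16)  # example IV; in the question it comes from the cipher setup

# bytes -> ASCII-safe string that any protobuf string field can hold
string_iv = base64.b64encode(bytes_iv).decode('ascii')

# ... store string_iv in the message, SerializeToString(), ParseFromString() ...

# string -> original bytes
recovered = base64.b64decode(string_iv)
assert recovered == bytes_iv
```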
Is there another way to encode byte[] to String and decode String to byte[] without using Base64?
When I encode a byte[] to a String and then compress the String using LZW, I can't decode it back to byte[] using Base64. Is there an encoder/decoder that can still decode the String after it has been modified by LZW?
A not very practical, but easy-to-implement and easily reversible way to encode bytes as Unicode characters is to map each byte to (offset + byte_value), choosing the offset so that all 256 values fall inside a valid Unicode block.
For example, the Unicode block 2200..22FF (Mathematical Operators) is quite reasonable for this (C# sample):
char EncodeByte(byte x) { return (char)(0x2200 + x);}
byte DecodeByte(char x) { return (byte)(x - 0x2200);}
Note: regular LZW manipulates sequences of bytes - so no encoding necessary when starting from bytes.
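The same offset trick, sketched in Python with the 0x2200 offset from the C# sample (function names are illustrative):

```python
OFFSET = 0x2200  # start of the Mathematical Operators block

def encode_bytes(data: bytes) -> str:
    # Map each byte 0..255 to one character in U+2200..U+22FF
    return ''.join(chr(OFFSET + b) for b in data)

def decode_bytes(s: str) -> bytes:
    # Reverse the mapping: subtract the offset from each code point
    return bytes(ord(c) - OFFSET for c in s)

original = bytes(range(256))
assert decode_bytes(encode_bytes(original)) == original  # lossless for all byte values
```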
When I add a single byte with value 0x80 or above to my string, Go will add 0xc2 before my byte.
I think this has something to do with utf8 runes. Either way, how do I just add 0x80 to the end of my string?
Example:
var s string = ""
len(s) // this will be 0
s += string(0x80)
len(s) // this will be 2, string is now bytes 0xc2 0x80
From the specification:
Converting a signed or unsigned integer value to a string type yields a string containing the UTF-8 representation of the integer.
The expression string(0x80) evaluates to a string with the UTF-8 representation of 0x80, not a string containing the single byte 0x80. The UTF-8 representation of 0x80 is 0xc2 0x80.
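This two-byte encoding of U+0080 is not specific to Go; it can be observed from any language. For instance, a quick Python check:

```python
s = chr(0x80)                  # the code point U+0080

print(s.encode('utf-8'))       # b'\xc2\x80' -- two bytes, just like Go's string(0x80)
print(len(s.encode('utf-8')))  # 2
```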
Use the \x hex escape to specify the byte 0x80 in a string:
s += "\x80"
You can create a string from an arbitrary sequence of bytes using the string([]byte) conversion.
s += string([]byte{0x80})
I haven't found a way to avoid the extra byte when using string(0x80) to convert the byte. However, I did find that if I convert the whole string to a slice of bytes, append the byte, then convert back to a string, I get the correct byte sequence in the string.
Example:
bytearray := []byte(some_string)
bytearray = append(bytearray, 0x80)
some_string = string(bytearray)
Kind of a silly workaround; if anyone finds a better method, please post it.
Suppose there is a string:
String str = "Hello";
How can I get the ASCII values of the above string?
Given your comment, it sounds like all you need is:
char[] chars = str.ToCharArray();
Array.Sort(chars);
A char value in .NET is actually a UTF-16 code unit, but for all ASCII characters, the UTF-16 code unit value is the same as the ASCII value anyway.
You can create a new string from the array like this:
string sortedText = new string(chars);
Console.WriteLine(sortedText);
As it happens, "Hello" is already in ascending ASCII order...
byte[] asciiBytes = Encoding.ASCII.GetBytes(str);
You now have an array of the ASCII values of the characters.
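For comparison, the equivalent in Python: ord() gives each character's code point, which for ASCII characters equals the ASCII value.

```python
s = "Hello"

# ASCII value of each character
values = [ord(c) for c in s]
print(values)  # [72, 101, 108, 108, 111]

# Or as raw bytes, mirroring Encoding.ASCII.GetBytes
ascii_bytes = s.encode('ascii')
print(list(ascii_bytes))  # [72, 101, 108, 108, 111]
```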