Lua: decompress gzip from a string

I'm having trouble decompressing gzip data from a string in Lua (maybe I'm misunderstanding something).
The response of one web service is a base64-encoded gzip string; as a sample, I have some C# code that works:
public static string Decompress(byte[] value, Encoding Encoding = null)
{
    if (value == null) return null;
    Encoding = Encoding ?? System.Text.Encoding.Unicode;
    using (var inputStream = new MemoryStream(value))
    using (var outputStream = new MemoryStream())
    {
        using (var zip = new GZipStream(inputStream, CompressionMode.Decompress))
        {
            byte[] bytes = new byte[4096];
            int n;
            while ((n = zip.Read(bytes, 0, bytes.Length)) != 0)
            {
                outputStream.Write(bytes, 0, n);
            }
            zip.Close();
        }
        return Encoding.GetString(outputStream.ToArray());
    }
}

static void Main(string[] args)
{
    const string encodedText = "H4sIAAAAAAAEAHMNCvIPUlRwzS0oqVQoLinKzEtXyC9SyCvNyYFxM/OAqKC0RKEgsSgxN7UktQgAwOaxgjUAAAA=";
    byte[] decodedBytes = Convert.FromBase64String(encodedText);
    var decodedString = Decompress(decodedBytes, Encoding.UTF8);
    Console.WriteLine(decodedString);
}
I'm trying to do this with Lua (on nginx) and build an array of bytes from the base64 string:
local byte_table = {}
base64.base64_decode(res_string):gsub(".", function(c)
    table.insert(byte_table, string.byte(c))
end)
but I'm having problems with zlib.
Please help me understand how I can use an IO stream in Lua and decompress gzip.

I'm trying to do this with Lua (on nginx) and build an array of bytes from the base64 string
No, you are building a Lua table containing a bunch of numbers, not an array of bytes.
Decode the base64 and feed the whole resulting string to zlib.
I use http://luaforge.net/projects/lzlib/ for gzip decompression:
local result = zlib.decompress(str,31)
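To make the whole round trip concrete, here is a minimal sketch that assumes lzlib loaded via require("zlib"), and whatever base64 module the question already uses for base64_decode (under ngx_lua you could use the built-in ngx.decode_base64 instead); the helper name is just for illustration:

local zlib = require("zlib")      -- lzlib from http://luaforge.net/projects/lzlib/
local base64 = require("base64")  -- whichever module provides base64_decode in the question

local function gunzip_base64(res_string)
    -- a Lua string is already a byte buffer, so no byte table is needed
    local compressed = base64.base64_decode(res_string)  -- or: ngx.decode_base64(res_string)
    -- windowBits = 31 (15 + 16) tells zlib to expect a gzip header, as in the answer above
    return zlib.decompress(compressed, 31)
end

-- usage:
-- local plain = gunzip_base64(res_string)
-- ngx.say(plain)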

Related

Javascript GZIP and btoa and decompress with C#

I am developing an application where I compress large JSON data using pako.gzip and then use the btoa function to make it a base64 string in order to post the data to the server. In the JavaScript I wrote:
var data = JSON.stringify(JSONData);
var ZippedData = pako.gzip(data, { to: 'string' });
var base64String = btoa(ZippedData);
/* post to server*/
$http.post("URL?base64StringParam=" + base64String).then(function (response) {
    //do stuff
});
The problem is that I need to decompress the data again in C# after posting, in order to do further processing on it. In the C# code I wrote:
byte[] data = Convert.FromBase64String(base64StringParam);
string decodedString = System.Text.ASCIIEncoding.ASCII.GetString(data);
Encoding enc = Encoding.Unicode;
MemoryStream stream = new MemoryStream(enc.GetBytes(decodedString));
GZipStream decompress = new GZipStream(stream, CompressionMode.Decompress);
string plainDef = "";
and I get the error here:
using (var sr = new StreamReader(decompress))
{
    plainDef = sr.ReadToEnd();
}
Found invalid data while decoding.
Any help decompressing the data back in C# would be appreciated.
EDIT: to sum up what needs to be done:
The JavaScript does the following:
Plain text >> to >> gzip bytes >> to >> base64 string
I need C# to do the reverse:
Base64 string >> to >> gzip bytes >> to >> plain text
Assuming the following js:
dataToCommitString = btoa(pako.gzip(dataToCommitString, { to: "string" }));
This is the correct C# code to compress/decompress with GZip (taken from https://stackoverflow.com/a/7343623/679334):
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace YourNamespace
{
    public class GZipCompressor : ICompressor
    {
        private static void CopyTo(Stream src, Stream dest)
        {
            byte[] bytes = new byte[4096];
            int cnt;
            while ((cnt = src.Read(bytes, 0, bytes.Length)) != 0)
            {
                dest.Write(bytes, 0, cnt);
            }
        }

        public byte[] Zip(string str)
        {
            var bytes = Encoding.UTF8.GetBytes(str);
            using (var msi = new MemoryStream(bytes))
            using (var mso = new MemoryStream())
            {
                using (var gs = new GZipStream(mso, CompressionMode.Compress))
                {
                    //msi.CopyTo(gs);
                    CopyTo(msi, gs);
                }
                return mso.ToArray();
            }
        }

        public string Unzip(byte[] bytes)
        {
            using (var msi = new MemoryStream(bytes))
            using (var mso = new MemoryStream())
            {
                using (var gs = new GZipStream(msi, CompressionMode.Decompress))
                {
                    //gs.CopyTo(mso);
                    CopyTo(gs, mso);
                }
                return Encoding.UTF8.GetString(mso.ToArray());
            }
        }
    }
}
Calling it as follows:
value = _compressor.Unzip(Convert.FromBase64CharArray(value.ToCharArray(), 0, value.Length));
On the client, use:
let output = pako.gzip(JSON.stringify(obj));
and send it with 'Content-Type': 'application/octet-stream'.
=====================
then in C#:
[HttpPost]
[Route("ReceiveCtImage")]
public int ReceiveCtImage([FromBody] byte[] data)
{
    var json = Decompress(data);
    return 1;
}

public static string Decompress(byte[] data)
{
    // Read the last 4 bytes to get the length
    byte[] lengthBuffer = new byte[4];
    Array.Copy(data, data.Length - 4, lengthBuffer, 0, 4);
    int uncompressedSize = BitConverter.ToInt32(lengthBuffer, 0);
    var buffer = new byte[uncompressedSize];
    using (var ms = new MemoryStream(data))
    {
        using (var gzip = new GZipStream(ms, CompressionMode.Decompress))
        {
            gzip.Read(buffer, 0, uncompressedSize);
        }
    }
    string json = Encoding.UTF8.GetString(buffer);
    return json;
}

"Padding is invalid and cannot be removed" when decrypting a value

Hello, I want to encrypt and decrypt text. My encrypt code works fine and matches the value that I want, but when I try to decrypt I get the error "padding is invalid and cannot be removed". Below I give both my encrypt and decrypt code. I have also tried the fixes from two Stack Overflow answers (Stack Overflow link, Stack Overflow link 2), but they did not fix it.
string getHashKey1 = EncryptText("10002:1486703720424", "hpIw4SgN)TxJdoQj=GKo)p83$uHePgoF");
Result = 1ltQFLRGNif73uCNzi0YEvBqLKiRgx6fWsk5e/GcTQc=
string reverseKey = DecryptText("1ltQFLRGNif73uCNzi0YEvBqLKiRgx6fWsk5e/GcTQc=", "hpIw4SgN)TxJdoQj=GKo)p83$uHePgoF");
When I add aes.Padding = PaddingMode.Zeros; in AES_Decrypt, I get the result below.
Result : -����y�7�t���Ij���,���� Z��$�
public string EncryptText(string input, string password)
{
    string result = "";
    try
    {
        // Get the bytes of the string
        byte[] bytesToBeEncrypted = Encoding.UTF8.GetBytes(input);
        byte[] passwordBytes = Encoding.UTF8.GetBytes(password);
        byte[] bytesEncrypted = AES_Encrypt(bytesToBeEncrypted, passwordBytes);
        result = Convert.ToBase64String(bytesEncrypted);
        return result;
    }
    catch (Exception ex)
    {
    }
    return result;
}

public byte[] AES_Encrypt(byte[] bytesToBeEncrypted, byte[] passwordBytes)
{
    byte[] encryptedBytes = null;
    try
    {
        using (MemoryStream ms = new MemoryStream())
        {
            using (Aes aes = Aes.Create())
            {
                aes.Key = passwordBytes;
                aes.Mode = CipherMode.ECB;
                // "zero" IV
                aes.IV = new byte[16];
                using (var cs = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
                {
                    cs.Write(bytesToBeEncrypted, 0, bytesToBeEncrypted.Length);
                    cs.Close();
                }
                encryptedBytes = ms.ToArray();
            }
        }
    }
    catch (Exception ex)
    {
    }
    return encryptedBytes;
}
The code above works fine for encryption.
The code below gives the error
padding is invalid and cannot be removed
public string DecryptText(string input, string password)
{
    // Get the bytes of the string
    byte[] bytesToBeDecrypted = Convert.FromBase64String(input);
    byte[] passwordBytes = Encoding.UTF8.GetBytes(password);
    passwordBytes = SHA256.Create().ComputeHash(passwordBytes);
    byte[] bytesDecrypted = AES_Decrypt(bytesToBeDecrypted, passwordBytes);
    string result = Encoding.UTF8.GetString(bytesDecrypted);
    return result;
}

public byte[] AES_Decrypt(byte[] bytesToBeDecrypted, byte[] passwordBytes)
{
    byte[] decryptedBytes = null;
    using (MemoryStream ms = new MemoryStream())
    {
        using (Aes aes = Aes.Create())
        {
            aes.Key = passwordBytes;
            aes.Mode = CipherMode.ECB;
            aes.IV = new byte[16];
            using (var cs = new CryptoStream(ms, aes.CreateDecryptor(), CryptoStreamMode.Write))
            {
                cs.Write(bytesToBeDecrypted, 0, bytesToBeDecrypted.Length);
                cs.Close(); // here i am getting error
            }
            decryptedBytes = ms.ToArray();
        }
    }
    return decryptedBytes;
}
You have two problems:
1) (Already pointed out by pedrofb): You use UTF8.GetBytes in encrypt, but SHA256(UTF8.GetBytes()) in decrypt.
You shouldn't do either of these methods, but instead should use a proper Password-Based Key-Derivation Function, such as PBKDF2. In .NET PBKDF2 is available via the Rfc2898DeriveBytes class.
// salt: 8 or more bytes that you always pass in as the same value
// (the salt could be fixed for your application, but if you have users it should be
// unique per user and stored along with the output value)
byte[] salt = ...;
// or bigger; if you were making a user management system you should record this
// number as well, so you can increase it over time. It should be whatever number
// makes the derivation take 100ms or more on the fastest relevant computer.
int iterations = 100000;
Rfc2898DeriveBytes pbkdf2 = new Rfc2898DeriveBytes(password, salt, iterations);
passwordBytes = pbkdf2.GetBytes(16); // 16 = AES128, 24 = AES192, 32 = AES256
2) You use Base64-encoding in encrypt, but UTF8.GetBytes in decrypt.
Bonus problems:
3) You are using Electronic Codebook (ECB) chaining. Cipher Block Chaining (CBC) is recommended over ECB.
To use CBC properly, let a random initialization vector (IV) be generated in encrypt (which is done automatically when you create a new Aes object, or you can call GenerateIV() in encrypt if you re-use the object). Then you can just prepend the IV (which will always be 16 bytes for AES) to the ciphertext. In decrypt you can either a) chop off the first 16 bytes and assign it as the IV (then decrypt the rest of the data) or b) decrypt the whole blob and ignore the first 16 bytes of decrypted output.
You are hashing the password when you decrypt,
passwordBytes = SHA256.Create().ComputeHash(passwordBytes);
but not when you encrypt. This means you are using two different keys.

Handling a WebRTC byte[] stream?

I'm basically trying to record audio chunks from a WebRTC stream; I've been able to send the binary data with the help of this resource: HTML Audio Capture streaming to Node.js.
I'm using netty-socketio, as this library plays well with socket.io on the client side.
Here are my server endpoints:
server.addEventListener("audio-blob", byte[].class, (socketIOClient, bytes, ackRequest) -> {
    byteArrayList.add(bytes);
});

server.addEventListener("audio-blob-end", Object.class, (socket, string, ackRequest) -> {
    ByteArrayInputStream in = new ByteArrayInputStream(byteArrayList.getArray());
    AudioInputStream audiIn = new AudioInputStream(in, getAudioFormat(), 48000l);
    AudioFileFormat.Type fileType = AudioFileFormat.Type.WAVE;
    File wavFile = new File("RecordAudio.wav");
    AudioSystem.write(audiIn, fileType, wavFile);
});
The format settings:
public static AudioFormat getAudioFormat() {
    float sampleRate = 48000;
    int sampleSizeInBits = 8;
    int channels = 2;
    boolean signed = true;
    boolean bigEndian = true;
    AudioFormat format = new AudioFormat(sampleRate, sampleSizeInBits,
            channels, signed, bigEndian);
    return format;
}
I'm using this class to collect the byte arrays (and yes, I know the risks of this solution):
class ByteArrayList {
    private List<Byte> bytesList;

    public ByteArrayList() {
        bytesList = new ArrayList<Byte>();
    }

    public void add(byte[] bytes) {
        add(bytes, 0, bytes.length);
    }

    public void add(byte[] bytes, int offset, int length) {
        for (int i = offset; i < (offset + length); i++) {
            bytesList.add(bytes[i]);
        }
    }

    public int size() {
        return bytesList.size();
    }

    public byte[] getArray() {
        byte[] bytes = new byte[bytesList.size()];
        for (int i = 0; i < bytesList.size(); i++) {
            bytes[i] = bytesList.get(i);
        }
        return bytes;
    }
}
The generated WAV file only plays noise, though; no recording is present. What am I doing wrong?
While Googling around for answers I stumbled upon this resource on how to save a WAV file.
What I was doing wrong was that I had a fixed size in the AudioInputStream constructor parameter:
new AudioInputStream(in, getAudioFormat(), 48000l)
I changed it to:
new AudioInputStream(in, getAudioFormat(), byteArrayList.getArray().length);

How to get a decoded string from base64-encoded, gzipped text

I am reading text from an XML document; it is supposed to be a base64-encoded and gzip-compressed string. I am following the steps below:
string text = childNodes.Item(i).InnerText.Trim();
byte[] compressed = Convert.FromBase64String(text);
using (var uncompressed = new MemoryStream())
using (var inStream = new MemoryStream(compressed))
using (var outStream = new GZipStream(inStream, CompressionMode.Decompress))
{
    outStream.CopyTo(uncompressed);
    var reader = new StreamReader(uncompressed);
    uncompressed.Position = 0;
    string myStr = reader.ReadToEnd();
    Console.WriteLine(myStr);
}
I am getting the myStr value as something like:
�\b\0\0\0\0\0\0Ľk��ƒ �Y��ؘX{���z:�n�,ɏ�ek��xϞ�`�\0؍�|\t ��_3�\n(\0$�s.Cb�\0*3++��|
͛ �-7�6�fW\r\t�\b���W\"�\n�ə��L&���Ez�-����E��\t�%���/���O��Q����
i�����]�T�b�<_�dŦ�W۫���ܭn^[X�ϕ��{�"
I am expecting a decoded string. Any hint on this is much appreciated.
Thanks in advance. :)

Deflate/Inflate errors causing "incorrect header check" errors

I am working on implementing a SAML SLO through the HTTP-Redirect binding mechanism. Using deflate/inflate tools gives me a DataFormatException with "incorrect header check".
I tried this as a stand-alone program. Although I did not get a DataFormatException here, I observed that the whole message is not being returned.
import java.io.UnsupportedEncodingException;
import java.util.logging.Level;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;
public class InflateDeflate {

    public static void main(String[] args) {
        String source = "This is the SAML String";
        String outcome = null;
        byte[] bytesource = null;
        try {
            bytesource = source.getBytes("UTF-8");
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
        int byteLength = bytesource.length;
        Deflater compresser = new Deflater();
        compresser.setInput(bytesource);
        compresser.finish();
        byte[] output = new byte[byteLength];
        int compressedDataLength = compresser.deflate(output);
        outcome = new String(output);
        String trimmedoutcome = outcome.trim();
        //String trimmedoutcome = outcome; // behaves the same way as trimmed

        // Now try to inflate it
        Inflater decompresser = new Inflater();
        decompresser.setInput(trimmedoutcome.getBytes());
        byte[] result = new byte[4096];
        int resultLength = 0;
        try {
            resultLength = decompresser.inflate(result);
        } catch (DataFormatException e) {
            e.printStackTrace();
        }
        decompresser.end();
        System.out.println("result length [" + resultLength + "]");
        String outputString = null;
        outputString = new String(result, 0, resultLength);
        String returndoc = outputString;
        System.out.println(returndoc);
    }
}
Surprisingly, I get the result as [22] bytes; the original is [23] bytes, and the 'g' is missing after inflating.
Am I doing something fundamentally wrong here?
Java's String is a CharSequence of 2-byte chars. Using new String(byte[]) may not correctly convert your byte[] to a String representation; at the very least you should specify a character encoding, new String(byte[], "UTF-8"), to prevent invalid character conversions.
Here's an example of compressing and decompressing:
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.Deflater;
import java.util.zip.InflaterInputStream;
...
byte[] sourceData; // bytes to compress
byte[] data;       // buffer that receives the compressed data (may reuse sourceData)
String filename;   // where to write
{
    // compress the data
    Deflater deflater = new Deflater(Deflater.DEFAULT_COMPRESSION);
    deflater.setInput(sourceData);
    deflater.finish();
    int compressedSize = deflater.deflate(data, 0, sourceData.length, Deflater.FULL_FLUSH);

    // write the data
    OutputStream stream = new FileOutputStream(filename);
    stream.write(data, 0, compressedSize);
    stream.close();
}
{
    byte[] uncompressedData = new byte[1024]; // where to store the data

    // read the data
    InputStream stream = new InflaterInputStream(new FileInputStream(filename));

    // read data - note: may not read fully (or evenly), read from stream until len == 0
    int len, offset = 0;
    while ((len = stream.read(uncompressedData, offset, uncompressedData.length - offset)) > 0) {
        offset += len;
    }
    stream.close();
}
