What's the Deno equivalent of Node.js Buffer.from(string)?

How can I convert a string to a buffer?
I tried Uint8Array.from('hello world'), but it isn't working.

The equivalent of Buffer.from('Hello World') is:
const encoder = new TextEncoder();
const buffer = encoder.encode('Hello World');
If you want to decode it back, you'll need to use TextDecoder.
const decoder = new TextDecoder();
console.log(decoder.decode(buffer));
Deno implements Web APIs whenever possible, which is why this works the same way in the browser:
const decoder = new TextDecoder();
const encoder = new TextEncoder();
const buffer = encoder.encode('Hello World');
console.log(buffer);
console.log(decoder.decode(buffer));
Keep in mind that Node.js' Buffer supports multiple encodings, such as base64 or hex, which won't work with TextDecoder.
So if you have a base64 string and want to convert it to UTF-8, instead of doing:
const base64String = Buffer.from('Hello World').toString('base64'); // SGVsbG8gV29ybGQ=
const utf8String = Buffer.from(base64String, 'base64').toString(); // Hello World
you would need to use btoa and atob (the same as the Web APIs) instead:
const base64String = btoa('Hello World');
const utf8String = atob(base64String);
console.log('Base64:', base64String);
console.log('UTF-8:', utf8String);
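Note that atob returns a string, not bytes. If you also need the raw bytes (what Buffer.from(base64String, 'base64') returns in Node.js), a minimal sketch that works in Deno and browsers is to map each decoded character's char code into a Uint8Array:
const base64String = btoa('Hello World');
// atob yields a binary string; each char code is one byte of the decoded data
const bytes = Uint8Array.from(atob(base64String), (c) => c.charCodeAt(0));
console.log(bytes); // Uint8Array(11) [ 72, 101, 108, 108, ... ]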

Related

Node Zlib. Unzip response with file structure in-memory

I'm receiving Buffer data in a response; the data is a zipped file.
// Save it just to check if content is correct
const fd = fs.openSync('data.zip', 'w')
fs.writeSync(fd, data)
fs.closeSync(fd)
This produces the file data.zip, which contains the file foo.csv.
I can unzip it in-memory with UZIP:
const unzipArray = UZIP.parse(data)['foo.csv']
However, I cannot do it with Zlib.
const unzipArray = zlib.unzipSync(data)
// Raises: incorrect header check
It looks like Zlib cannot parse the file structure.
How to unzip the above buffer in-memory with Zlib, without saving files to the filesystem?
You have a zip file, not a zlib or gzip stream. As you found, zlib doesn't process zip files. There are many solutions out there for Node.js, which you can find using your friend Google. Here is one.
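For example, here is a minimal in-memory sketch using the third-party adm-zip package (an assumption on my part; any zip library with a Buffer API would do):
const AdmZip = require('adm-zip'); // npm install adm-zip

// 'data' is the zip file Buffer received in the response
const zip = new AdmZip(data);
// Read a single entry's contents into a Buffer, entirely in memory
const unzipArray = zip.readFile('foo.csv');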
For a single file:
const fs = require('fs');
const zlib = require('zlib');
const fileContents = fs.createReadStream('./data/file1.txt.gz');
const writeStream = fs.createWriteStream('./data/file1.txt');
const unzip = zlib.createGunzip();
fileContents.pipe(unzip).pipe(writeStream);
For a group of files:
const fs = require('fs');
const zlib = require('zlib');
const directoryFiles = fs.readdirSync('./data');
directoryFiles.forEach(filename => {
  const fileContents = fs.createReadStream(`./data/${filename}`);
  const writeStream = fs.createWriteStream(`./data/${filename.slice(0, -3)}`);
  const unzip = zlib.createGunzip();
  fileContents.pipe(unzip).pipe(writeStream);
});
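One caveat: chained .pipe() calls don't propagate errors. A sketch of the same single-file case using stream.pipeline (built into Node.js 10+), which surfaces failures in one callback:
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('./data/file1.txt.gz'),
  zlib.createGunzip(),
  fs.createWriteStream('./data/file1.txt'),
  (err) => {
    if (err) console.error('gunzip failed:', err);
  }
);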

XML scraping using Node.js

I have a huge XML file that I got by exporting all the data from Tally. I'm trying to scrape elements out of it using cheerio, but I'm having trouble with the formatting or something similar. Reading it with fs.readFileSync() works fine, and console.log shows the complete XML file, but the file I write with fs.writeFileSync comes out mangled.
My scraping code also outputs an empty file:
const cheerio = require('cheerio');
const fs = require('fs');

const xml = fs.readFileSync('Master.xml', 'utf8');
const htmlC = cheerio.load(xml);
const list = [];
htmlC('ENVELOPE')
  .find('BODY>TALLYMESSAGE>STOCKITEM>LANGUAGENAME.LIST>NAME.LIST>NAME')
  .each(function (index, element) {
    list.push(htmlC(element).attr('data-prefix'));
  });
console.log(list);
fs.writeFileSync('data.html', list.join('\n'));
You might try checking to make sure that Cheerio isn't decoding all the HTML entities. Change:
const htmlC = cheerio.load(xml);
to:
const htmlC = cheerio.load(xml, { decodeEntities: false });
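Since the input is XML rather than HTML, enabling cheerio's XML mode may also help; a sketch, assuming a cheerio version that accepts these top-level options:
const htmlC = cheerio.load(xml, { xmlMode: true, decodeEntities: false });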

React gives "Error: not supported" when I try to import a local module

I have a local module (speech.js) in my create-react-app src folder; it's the Google Text-to-Speech sample code from their website. I adjusted it to be an arrow function and to use the ES-module export syntax.
const textToSpeech = require('@google-cloud/text-to-speech');
// Import other required libraries
const fs = require('fs');
const util = require('util');
export const main = async () => {
  // Creates a client
  const client = new textToSpeech.TextToSpeechClient();

  // The text to synthesize
  const text = 'Hello world';

  // Construct the request
  const request = {
    input: { text: text },
    // Select the language and SSML voice gender (optional)
    voice: { languageCode: 'en-US', ssmlGender: 'NEUTRAL' },
    // Select the type of audio encoding
    audioConfig: { audioEncoding: 'MP3' },
  };

  // Performs the text-to-speech request
  const [response] = await client.synthesizeSpeech(request);

  // Write the binary audio content to a local file
  const writeFile = util.promisify(fs.writeFile);
  await writeFile('output.mp3', response.audioContent, 'binary');
  console.log('Audio content written to file: output.mp3');
};
What I'm not understanding is why this syntax isn't working in App.js.
import {main} from './speech';
I get the error "Error: not supported" and "4 stack frames were collapsed". Quite informative!
Does anyone know what the error could be here? I thought that as long as I used ES6-style imports and exports I wouldn't get errors. Could this be due to the require() statement at the top of speech.js? Any help would be appreciated; I've been banging my head against the wall for the past 40 minutes.
This may not be the correct answer, but I believe it has a good chance of being right. Node is a server-side runtime, not part of the browser, so you can't use Node modules (fs, util, or the server-side Text-to-Speech client) in code React runs in the browser. The solution to this quandary would be to run that code in a Node environment your app can reach, or use something like Electron.
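As a minimal sketch of that split, assuming an Express backend and a hypothetical /api/tts route (neither is part of the original question):
// server.js (Node.js): the Google client runs where fs and util exist
const express = require('express');
const textToSpeech = require('@google-cloud/text-to-speech');

const app = express();
app.use(express.json());
const client = new textToSpeech.TextToSpeechClient();

app.post('/api/tts', async (req, res) => {
  // Synthesize the posted text and return the MP3 bytes
  const [response] = await client.synthesizeSpeech({
    input: { text: req.body.text },
    voice: { languageCode: 'en-US', ssmlGender: 'NEUTRAL' },
    audioConfig: { audioEncoding: 'MP3' },
  });
  res.set('Content-Type', 'audio/mpeg');
  res.send(Buffer.from(response.audioContent));
});

app.listen(3001);
The React app then calls the endpoint over HTTP instead of importing speech.js:
// App.js (React): no Node modules needed in the browser
const res = await fetch('/api/tts', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ text: 'Hello world' }),
});
const audioBlob = await res.blob();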

Extract WAV header on javascript frontend (ReactJS)

I'm trying to analyze a file I'll be uploading from React; I need to know whether it can be uploaded based on several factors.
I found https://github.com/TooTallNate/node-wav
It works great on Node.js, and I'm trying to use it in React. The sample creates a readable stream and pipes it to the WAV reader.
var fs = require('fs');
var wav = require('wav');

var file = fs.createReadStream('track01.wav');
var reader = new wav.Reader();

// the "format" event gets emitted at the end of the WAVE header
reader.on('format', function (format) {
  // Format of the file
  console.log(format);
});

file.pipe(reader);
Using the FilePond controller I'm able to get a base64 string of the file, but I can't figure out how to pass it to the reader.
This is what I have so far in React:
const { Readable } = require('stream');

var reader = new wav.Reader();
reader.on('format', function (format) {
  // Format of the file
  console.log('format', format);
});

// Buffer.from() replaces the deprecated new Buffer() constructor
const buffer = Buffer.from(base64String, 'base64');
const readable = new Readable();
readable._read = () => {};
readable.push(buffer);
readable.push(null);
readable.pipe(reader);
But I get Error: bad "chunk id": expected "RIFF" or "RIFX", got "u+Zj"
Since this file works on Node.js with the same lib, it's obvious I'm doing something wrong.
EDIT: this turned out to be a problem with my base64 string; the method above works if anyone needs to analyze a WAV on the frontend.
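For anyone hitting the same wall, one common cause (an assumption here, since the question doesn't show the string) is that the encoder hands back a full data URL rather than the bare base64 payload. A sketch of stripping the prefix before decoding:
// e.g. 'data:audio/wav;base64,UklGR...' - keep only the payload after the comma
const base64Payload = dataUrl.includes(',') ? dataUrl.split(',')[1] : dataUrl;
const buffer = Buffer.from(base64Payload, 'base64');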

How to gzip HTTP POST (client) data for a Node.js server

I have implemented a Node.js server application which accepts POST data from the client (a long JSON string). Is there a way I can gzip the POST data at the browser end and unzip it in Node.js?
I specifically want to gzip the request, not the response.
Check https://jsfiddle.net/gynz82tg/. Decompressing in Node.js works just the same once you get the base64-encoded request string.
var jsonStr = JSON.stringify({
  name: 'JiangYD'
});
$('#origin').text(jsonStr);

var zip = new JSZip();
zip.file("data", jsonStr);
var content = zip.generate();
$('#compressed').text(content);

zip = new JSZip(content, { base64: true });
$('#decompressed').text(zip.file("data").asText());
<script src="https://raw.githubusercontent.com/Stuk/jszip/master/dist/jszip.js"></script>
<div id='origin'></div>
<div id='compressed'></div>
<div id='decompressed'></div>
UPDATE: JSZip has since changed its API; see https://jsfiddle.net/cvuqr6h4/
async function go() {
  const jsonStr = JSON.stringify({
    name: 'JiangYD'
  });
  $('#origin').text(jsonStr);

  let zip = new JSZip();
  zip.file("data", jsonStr);
  const content = await zip.generateAsync({ type: "base64" });
  $('#compressed').text(content);

  zip = new JSZip();
  await zip.loadAsync(content, { base64: true });
  const decoded = await zip.file("data").async('string');
  $('#decompressed').text(decoded);
}
go();
You could try this: https://github.com/sapienlab/jsonpack
Example client code:
<script src="jsonpack.js"></script>
<script>
  var BIG_JSON = {.....};
  var packed = jsonpack.pack(BIG_JSON);
  $.post('path_to_server', packed);
</script>
Example Node.js code:
var jsonpack = require('jsonpack/main');
app.on('/packed_data', function (req, res) {
  try {
    jsonpack.unpack(req.data);
  } catch (e) {
    // not well-formed packed data
  }
});
This is sample code; of course I don't know which framework or libraries you use, but you can see how it could be implemented.
Anyway, be careful with this: compressing and decompressing data is always a heavy CPU-bound task. If you have several megabytes of data, you don't want to force users on phones, tablets, etc. to do this work!
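In browsers that support the CompressionStream API (a newer option these answers predate), you can gzip the request body natively; a minimal sketch, with /upload as a hypothetical endpoint:
// Browser: gzip the JSON body before POSTing it
async function postGzipped(obj) {
  const stream = new Blob([JSON.stringify(obj)])
    .stream()
    .pipeThrough(new CompressionStream('gzip'));
  const body = await new Response(stream).arrayBuffer();
  await fetch('/upload', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'Content-Encoding': 'gzip' },
    body,
  });
}

// Node.js handler: collect the raw request chunks, then gunzip
const zlib = require('zlib');
// ...with 'chunks' being the Buffers collected from the request stream:
const json = JSON.parse(zlib.gunzipSync(Buffer.concat(chunks)).toString());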
