Viewing piped data inside pipe error - node.js

In the following code block, how can I log the piped data that causes JSONStream.parse to fail?
data.pipe(JSONStream.parse('*'))
    .on('error', () => {
        // here I'd like to see the data causing JSONStream.parse to blow up
        reject("Error parsing the json!");
    })
    .pipe(objectStream);
I ask because currently, when it blows up, I get this sort of error message but no context about what the invalid UTF-8 character is:
Error: Invalid JSON (Invalid UTF-8 character at position 2397 in state STRING1)
at Parser.proto.write (/var/www/data-site/pop-service/node_modules/JSONStream/node_modules/jsonparse/jsonparse.js:120:31)
at Stream.<anonymous> (/var/www/data-site/pop-service/node_modules/JSONStream/index.js:23:12)
at Stream.stream.write (/var/www/data-site/pop-service/node_modules/JSONStream/node_modules/through/index.js:26:11)
at IncomingMessage.ondata (_stream_readable.js:536:20)
at emitOne (events.js:82:20)
at IncomingMessage.emit (events.js:169:7)
at readableAddChunk (_stream_readable.js:153:18)
at IncomingMessage.Readable.push (_stream_readable.js:111:10)
at HTTPParser.parserOnBody (_http_common.js:124:22)
at TLSSocket.socketOnData (_http_client.js:320:20)
Something like .on('error', (data) => { ... doesn't seem to work.
Solution, using the answer from #drinchev:
// save the last chunk so that we can log it in case of a parsing error
let lastRetrievedChunk = '';
data.on('data', (dd: any) => {
    lastRetrievedChunk = dd.toString();
});
let jsonParser = JSONStream.parse('*');
jsonParser.on('error', (err: any) => {
    reject(err.stack + ' lastchunk = ' + lastRetrievedChunk);
});
data.pipe(jsonParser).pipe(objectStream);

Indeed, it does not work, since this is an event-driven process.
You can try to store the data in a variable and use it in the error handler:
var someString = '*';
data.pipe(JSONStream.parse(someString))
    .on('error', () => {
        console.log(someString);
        reject("Error parsing the json!");
    })
    .pipe(objectStream);
At any point you can also use a library like through2, which helps you build a transform stream that logs the piped data.
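For example, a minimal pass-through logger with through2 might look roughly like this (a sketch; the logging format and the chunkLogger name are my own, not part of the original answer):
const through2 = require('through2');

// Pass-through transform: log every chunk, then forward it unchanged.
const chunkLogger = through2(function (chunk, enc, callback) {
    console.log('chunk:', chunk.toString());
    this.push(chunk);
    callback();
});

data.pipe(chunkLogger)
    .pipe(JSONStream.parse('*'))
    .pipe(objectStream);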

Related

Stripe Webhook constructEvent method is returning 400 error when pointed to ec2 instance

I developed my application on Windows and integrated the Stripe webhook with the help of ngrok. Everything was smooth and I was able to receive the events from the webhook, but once I moved it to an EC2 instance on AWS, it started throwing an error.
Error: (In Stripe Dashboard Webhook Attempts Section)
Webhook Error: The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received undefined
It is strange that I have not changed any piece of code and it still works on Windows.
Complete error:
TypeError [ERR_INVALID_ARG_TYPE]: The "key" argument must be of type string or an instance of Buffer, TypedArray, DataView, or KeyObject. Received undefined
at prepareSecretKey (internal/crypto/keys.js:322:11)
at new Hmac (internal/crypto/hash.js:111:9)
at Object.createHmac (crypto.js:147:10)
at Object._computeSignature (/home/psuser/middleware/node_modules/stripe/lib/Webhooks.js:65:8)
at Object.verifyHeader (/home/psuser/middleware/node_modules/stripe/lib/Webhooks.js:107:36)
at Object.constructEvent (/home/psuser/middleware/node_modules/stripe/lib/Webhooks.js:12:20)
at /home/psuser/middleware/routes/carts.js:210:29
at Layer.handle [as handle_request] (/home/psuser/middleware/node_modules/express/lib/router/layer.js:95:5)
at next (/home/psuser/middleware/node_modules/express/lib/router/route.js:137:13)
at /home/psuser/middleware/node_modules/body-parser/lib/read.js:130:5
at invokeCallback (/home/psuser/middleware/node_modules/raw-body/index.js:224:16)
at done (/home/psuser/middleware/node_modules/raw-body/index.js:213:7)
at IncomingMessage.onEnd (/home/psuser/middleware/node_modules/raw-body/index.js:273:7)
at IncomingMessage.emit (events.js:327:22)
at endReadableNT (_stream_readable.js:1327:12)
at processTicksAndRejections (internal/process/task_queues.js:80:21) {
code: 'ERR_INVALID_ARG_TYPE'
}
The "key" argument must be of type string or an instance of Buffer, TypedArray, DataView, or KeyObject. Received undefined
// Endpoint code
router.post('/webhook', async (request, response) => {
    const sig = request.headers['stripe-signature'];
    let event;
    try {
        event = stripe.webhooks.constructEvent(request.rawBody, sig, process.env.STRIPE_WEBHOOK_SECRET);
    }
    catch (err) {
        console.log(err.message);
        return response.status(400).send(`Webhook Error: ${err.message}`);
    }
});
// In app.js
app.use(express.json({
    // Because Stripe needs the raw body, we compute it but only when hitting the Stripe callback URL.
    verify: function (req, res, buf) {
        var url = req.originalUrl;
        if (url.endsWith('/webhook')) {
            req.rawBody = buf.toString();
        }
    }
}));
You can install micro (npm install micro), which has its own buffer method:
const { buffer } = require('micro')
in order to create the required type for the request:
const reqBuffer = await buffer(req)
then you can do:
event = stripe.webhooks.constructEvent(reqBuffer, sig, process.env.STRIPE_WEBHOOK_SECRET);
But I think your problem is simply a missing environment variable on your EC2 server: did you export STRIPE_WEBHOOK_SECRET=secret on that server?
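If you want to confirm that quickly, a tiny startup sanity check (my own sketch, not part of the answer) makes a missing variable obvious:
// Fail fast with a clear message if the webhook secret is absent on this host.
if (!process.env.STRIPE_WEBHOOK_SECRET) {
    throw new Error('STRIPE_WEBHOOK_SECRET is not set');
}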

Json cannot read property of undefined

When I execute a command in my Discord bot (the error doesn't occur when I type a normal message), I get an error saying:
Cannot read property 'active' of undefined
It occurs when I try to console.log an object from a JSON file where I store user data.
These are the first lines of my bot's index.js file; the line where I try to console.log is where the error occurs:
const Discord = require('discord.js');
const client = new Discord.Client();
const fs = require("fs");
const prefix = '>';
let xp = require("./storage/dbGeneral.json");
let pr = require("./storage/dbPremium.json");
let lv = require("./storage/levels.json");
client.on('message', message => {
    console.log(pr[message.author.id].active);
});
This is the json file where I store the data
{
    "397387465024864257": {
        "active": false,
        "dateStart": "",
        "dateEnd": ""
    }
}
This is the error:
index.js:14
console.log(pr[message.author.id].active);
^
TypeError: Cannot read property 'active' of undefined
at Client.client.on.message (index.js:14:39)
at emitOne (events.js:116:13)
at Client.emit (events.js:211:7)
at MessageCreateHandler.handle (\node_modules\discord.js\src\client\websocket\packets\handlers\MessageCreate.js:9:34)
at WebSocketPacketManager.handle (\node_modules\discord.js\src\client\websocket\packets\WebSocketPacketManager.js:103:65)
at WebSocketConnection.onPacket (\node_modules\discord.js\src\client\websocket\WebSocketConnection.js:333:35)
at WebSocketConnection.onMessage (\node_modules\discord.js\src\client\websocket\WebSocketConnection.js:296:17)
at WebSocket.onMessage (\node_modules\ws\lib\event-target.js:120:16)
at emitOne (events.js:116:13)
at WebSocket.emit (events.js:211:7)
I really don't know what the cause of the error is; all other JSON lookups work fine with the same method.
The error is caused by trying to access a property of something that is undefined.
To fix the problem, check whether the user exists in your database before reading its properties:
let user = pr[message.author.id];
if (user) console.log(user.active);
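A slightly fuller sketch along the same lines; the default record shape is assumed from the JSON file shown above, not taken from the original answer:
client.on('message', message => {
    let user = pr[message.author.id];
    if (!user) {
        // Hypothetical fallback: seed a default record for users not yet in dbPremium.json.
        user = { "active": false, "dateStart": "", "dateEnd": "" };
        pr[message.author.id] = user;
    }
    console.log(user.active);
});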

Cannot read property 'toString' of undefined node.js filesystem

I have this script here: upon loading the page, it should first load an HTML header boilerplate, then some dynamic paragraphs, and end the request with an HTML footer boilerplate. Both boilerplates come from external files.
After testing, I am receiving:
TypeError: Cannot read property 'toString' of undefined
at ReadFileContext.callback (/user_code/index.js:20:23)
at FSReqWrap.readFileAfterOpen [as oncomplete] (fs.js:367:13)
My understanding was that I needed .toString() to ensure that what was loaded from the external files was written as a string.
exports.request = functions.https.onRequest((req, res) => {
    fs.readFile('/public/templates/header-template.html', function(_err, data) {
        res.write(data.toString());
        console.log('Header Loaded');
    });
    res.writeHead(200, {'Content-Type': 'text/html'});
    var q = url.parse(req.url, true).query;
    var txt = '<p>Thanks <b>' + q.firstname + ' ' + q.lastname + '</b>. An email has been sent to <b>' + q.email + '</b> with details about your request.';
    res.write(txt);
    fs.readFile('/public/templates/footer-template.html', function(_err, data) {
        res.end(data.toString());
        console.log('Footer Loaded');
    });
});
Any idea what I am doing wrong?
You should pass the encoding to toString() to convert from a Buffer.
data.toString('utf8');
Additionally, it seems your variable data is not defined, so perhaps your file doesn't exist. Capture the error and see what's wrong:
if (_err) {
    console.error(_err);
}
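Putting that together for the header read, a sketch might look like this (assuming the failure really is a missing or wrong file path; how you respond to the error is up to you):
fs.readFile('/public/templates/header-template.html', function (_err, data) {
    if (_err) {
        // Log why the read failed (e.g. the path does not exist) and stop,
        // instead of calling toString() on undefined.
        console.error(_err);
        res.end('Could not load header template');
        return;
    }
    res.write(data.toString('utf8'));
    console.log('Header Loaded');
});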

Error: stream.push() after EOF

Playing with Node streams.
This code reads from index.js and writes to indexCopy.js - a kind of file copy.
The target file gets created, but during execution an exception is thrown:
node index.js
events.js:183
throw er; // Unhandled 'error' event
^
Error: stream.push() after EOF
at readableAddChunk (_stream_readable.js:240:30)
at MyStream.Readable.push (_stream_readable.js:208:10)
at ReadStream.f.on (C:\Node\index.js:16:28)
at emitOne (events.js:116:13)
at ReadStream.emit (events.js:211:7)
at addChunk (_stream_readable.js:263:12)
at readableAddChunk (_stream_readable.js:250:11)
at ReadStream.Readable.push (_stream_readable.js:208:10)
at fs.read (fs.js:2042:12)
at FSReqWrap.wrapper [as oncomplete] (fs.js:658:17)
C:\Node>
This is the code:
var util = require('util');
var stream = require('stream');
var fs = require('fs');

var MyStream = function(){
    stream.Readable.call(this)
}

util.inherits(MyStream, stream.Readable);

MyStream.prototype._read = function(d){
    f = fs.createReadStream("index.js");
    f.on('data', (d) => { this.push(d) });
    f.on('end', () => { this.push(null) }); // when the file is finished we need to close the stream
}

var f = fs.createWriteStream("indexCopy.js")
var myStream = new MyStream()
myStream.pipe(f);
I tried to call this.push(null) in the 'data' event handler; in that case even the target file is not created and the code fails with the same exception.
I realize that copy file should be done easier with pipe() function - I am just experimenting/learning.
What is wrong with my approach?
You don't want the f = fs.createReadStream("index.js") line inside the _read method -- _read gets called repeatedly so you're creating multiple read streams. Put that in your constructor instead.
function MyStream () {
    stream.Readable.call(this);
    this.source = fs.createReadStream("index.js");
    this.haveBound = false;
}

MyStream.prototype._read = function () {
    if (this.haveBound) return; // Don't bind to events repeatedly
    this.haveBound = true;
    this.source.on("data", d => this.push(d));
    this.source.on("end", () => this.push(null));
};
This is awkward though. Streams are meant to be piped.
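For comparison, the piped version of the same copy is a one-liner:
var fs = require('fs');

// Copy index.js to indexCopy.js by piping the read stream straight into the write stream.
fs.createReadStream('index.js').pipe(fs.createWriteStream('indexCopy.js'));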

Chunking stream into data chunks using Node.js

I am trying to split a file into data chunks. I found a library (the link above) that does the job beautifully, but when I use it in the following manner:
var in = fs.createReadStream(__dirname + '/try.html'),
    chunker = new SizeChunker({
        chunkSize: 2048
    }),
    output;

chunker.on('chunkStart', function(id, done) {
    output = fs.createWriteStream('./output-' + id);
    console.log("Chunkstart!");
    console.log("Input: " + in.length);
    done();
});

chunker.on('chunkEnd', function(id, done) {
    output.end();
    console.log("Chunkend!");
    done();
});

chunker.on('data', function(dat) {
    console.log("Writing chunk to output!")
    output.write(dat.chunk);
    console.log(dat.chunk);
});

input.pipe(chunker);
But I am getting this error:
_stream_writable.js:201
var len = state.objectMode ? 1 : chunk.length;
^
TypeError: Cannot read property 'length' of undefined
at writeOrBuffer (_stream_writable.js:201:41)
at WriteStream.Writable.write (_stream_writable.js:180:11)
at SizeChunker.<anonymous> (/Users/admin/Documents/chunk.js:16:15)
at SizeChunker.EventEmitter.emit (events.js:95:17)
at SizeChunker.<anonymous> (_stream_readable.js:746:14)
at SizeChunker.EventEmitter.emit (events.js:92:17)
at emitReadable_ (_stream_readable.js:408:10)
at emitReadable (_stream_readable.js:404:5)
at readableAddChunk (_stream_readable.js:165:9)
at SizeChunker.Readable.push (_stream_readable.js:127:10)
Also, in.length is undefined when displayed using console.log(). Can anyone please help me resolve this issue? Thanks in advance.
When you listen for data on the chunker stream, the dat argument has no chunk property. You can read the following in the chunking-stream readme:
Each data chunk is an object with the following fields:
id: number of chunk (starts from 1)
data: Buffer with data
You can do something like this instead:
chunker.on('data', function(dat) {
    console.log("Writing chunk to output!");
    output.write(dat.data);
    console.log(dat);
});
Also, in is a stream and has no length property defined.
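If you actually need the input size that console.log("Input: " + in.length) was trying to print, ask the filesystem rather than the stream (a sketch, assuming the same file path as in the question):
fs.stat(__dirname + '/try.html', function (err, stats) {
    if (err) return console.error(err);
    console.log('Input: ' + stats.size + ' bytes'); // size of the file being chunked
});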
