Redis in Node.js: how to read Node Buffers

I must be missing something here, but I'm using node_redis as the Node.js client for Redis.
I'm testing Redis' Lrange command which per that doc returns a 'multi-bulk reply'.
From the node_redis docs this gets exposed as a 'JavaScript Array of node Buffers' in Node.
That's all well and good, but what are Node Buffers and, more importantly, how do I read them in Node.js? I just want to transform them into an array of (JSON) strings and from there into an array of object literals.
For reference, here is what I get when grabbing the first element of the array and printing it (trying all kinds of things):
console.log(multibulk[i]) -> [object Object]
console.log(multibulk[i].toString("binary")) -> [object Object]
etc.
thanks.
EDIT:
I verified the data is actually there in Redis (and is not stored as the string "[object Object]", as I began to suspect). In Java, using JRedis's lrange command, I get a List<String>; the first element of that list is the correct String, as expected.

Just to close this:
As part of a kind of locking mechanism, I made sure a key was written to in Node. Stupidly, I did this by inserting an object literal without stringifying it, which caused all later inserts into the list to fail.
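Putting the question and the fix together, a minimal sketch of the round trip, assuming each list entry is stored as a JSON string (the multi-bulk reply is simulated here with Buffer.from rather than an actual lrange call):

```javascript
// Simulated multi-bulk reply: a JavaScript Array of Node Buffers, as
// node_redis would return for LRANGE. Each entry was written with
// JSON.stringify, which is the step the original code was missing.
const reply = [
  Buffer.from(JSON.stringify({ id: 1, name: "first" })),
  Buffer.from(JSON.stringify({ id: 2, name: "second" })),
];

// Buffer -> UTF-8 string -> object literal
const objects = reply.map((buf) => JSON.parse(buf.toString("utf8")));

console.log(objects[0].name); // "first"
```

The key point is that a Buffer is raw bytes: buf.toString("utf8") decodes it into a string, and only then can JSON.parse turn it into an object literal.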

Related

Problem accessing a dictionary value in Dialogflow fulfillment

I am using the fulfillment section on Dialogflow in a fairly basic program I started, to show myself I can do a bigger project on Dialogflow.
I have an object setup that is a dictionary.
I can make the keys a constant through
const KEYS = Object.keys(overflow);
I am going through the values using
if (KEYS.length > 0) {
  var dictionary = overflow[KEYS[i]];
if I stringify dictionary using
JSON.stringify(item);
I get:
{"stack":"overflow","stack2":"overflowtoo", "stack3":3}
This leads me to believe I am actually facing a dictionary, hence the name of the variable.
I am accessing a string property such as stack (unlike stack3, which is a number).
Everything I have read online tells me
dictionary.stack
Should work since
JSON.stringify(item);
Shows me:
{"stack":"overflow","stack2":"overflowtoo","stack3":3}
Whenever I:
Try to add the variable to the response string.
Append it to a string using output += `${item.tipo}`;
I get an error that the function crashed. I can replace it with the stringify line and it works, giving me the JSON shown above, so the issue isn't there.
Dictionary values are created as such before being accessed on another function:
dictionary[request.body.responseId]={
"stack":"overflow",
"stack2":"overflowtoo",
"stack3":3 };
Based on the suggestion here, I saw that the properties were being accessed properly, but their type was undefined. Going over things repeatedly, I saw that the properties were defined as lists rather than single values.
Dot notation works when they stop being lists.
Thanks for guiding me towards the problem.
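The failure mode above can be illustrated with a small sketch (the object shapes here are hypothetical, chosen to mirror the question):

```javascript
// The same keys stored as lists vs. as single values.
const asList = { stack: ["overflow"], stack3: [3] };
const asScalar = { stack: "overflow", stack3: 3 };

// Dot notation resolves in both cases, but the types differ, which is why
// the property looked "undefined"-ish when treated as a plain string:
console.log(typeof asScalar.stack);       // "string"
console.log(Array.isArray(asList.stack)); // true

// Appending a list-wrapped value into a string needs an index:
const output = `value: ${asList.stack[0]}`;
```

Checking Array.isArray(value) (or logging typeof) before interpolating is a quick way to catch this kind of shape mismatch.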

AND occasionally produces wrong result on shell and node native driver

I've built a dynamic query generator to create my desired queries based on many factors; however, in rare cases it acted strangely. After a day of reading logs, I found a situation that can be simplified to this:
db.users.find({att: 'a', att: 'b'})
What I expect is that MongoDB uses AND by default, so the above query's result should be an empty array. However, it's not!
But when I use AND explicitly, the result is an empty array
db.users.find({$and: [{att: 'a'}, {att: 'b'}]})
In JavaScript, an object's keys must be unique; otherwise the value is replaced by the latest one. (The MongoDB shell is based on JavaScript, so it follows some of JavaScript's rules.)
const t = {att: 'a', att: 'b'};
console.log(t); // → { att: 'b' }
So in your case your query is acting like this:
db.users.find({att: 'b'})
You have to handle this situation in your code if you want the result to be empty under the mentioned condition.
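One way to handle it (a sketch of an approach, not something from the original answer) is to have the query generator collect conditions as an array of clauses and combine them with $and, so a repeated field is ANDed instead of silently overwriting an earlier key in the object literal:

```javascript
// Build a filter from [field, value] pairs, which may repeat fields.
// Each pair becomes its own clause, so duplicates survive as $and terms
// instead of colliding as duplicate object keys.
function buildQuery(conditions) {
  const clauses = conditions.map(([field, value]) => ({ [field]: value }));
  return clauses.length > 1 ? { $and: clauses } : clauses[0] || {};
}

const query = buildQuery([["att", "a"], ["att", "b"]]);
// query is { $and: [ { att: 'a' }, { att: 'b' } ] } -- both conditions kept
```

Passing the resulting object to find() then yields the expected empty result, because the contradictory conditions are no longer collapsed before MongoDB ever sees them.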

JSON.stringify() Invalid String Length when preparing a large JSON object for storage in Postgres JSONB

Background
I'm building out a NodeJS application that will capture data from one source database and store these snapshots as JSON in a Postgres DB. For the most part, this is working great, as this removes the overhead of managing the storage tables which can change over time ever so slightly, based upon what SQL is run on the source to capture this data. In essence, take the data and dump in JSONB until needed later on in the process.
Issues
When running some of the larger queries against the source DB (Oracle EBS), the column count is significant, likewise the row count (150+ columns, 250k+ rows). Although this is pulled into NodeJS fine and stored as JSON, I hit an issue when trying to prepare this for storage in Postgres, which is:
RangeError: Invalid string length
at JSON.stringify (<anonymous>)
at prepareObject (/***/node_modules/pg/lib/utils.js:81:15)
at prepareValue (/***/node_modules/pg/lib/utils.js:66:12)
at prepareValueWrapper (/***/node_modules/pg/lib/utils.js:182:12)
at writeValues (/***/node_modules/pg-protocol/dist/serializer.js:66:41)
at Object.bind (/***/node_modules/pg-protocol/dist/serializer.js:97:5)
at Connection.bind (/***/node_modules/pg/lib/connection.js:161:26)
at Query.prepare (/***/node_modules/pg/lib/query.js:204:18)
at Query.submit (/***/node_modules/pg/lib/query.js:155:12)
at Client._pulseQueryQueue (/***/node_modules/pg/lib/client.js:481:45)
Reading what's already on here, I seem to be hitting something like the below, whereby I'm reaching the maximum of what JSON.stringify() can handle; however, what's called out is pretty old. I've also researched JSONStream, but my problem is that the Postgres driver still calls JSON.stringify() itself, so anything I do seems to get ignored anyway.
Is there a way I can pass this object directly to Postgres without stringify being called? Or is there a way to chunk this data up and append the column data?
RangeError: Invalid string length --- it should be saying Out Of Memory #14170
JSON.stringify throws RangeError: Invalid string length for huge objects

Representing NodeJS MongoDB Query as String

I'd like to store MongoDB queries as strings, then load and execute them. How can I convert the parameter object used in find() to a string without losing information?
I have already tried JSON.stringify and JSON.parse, but it seems that JSON.parse does not fully invert JSON.stringify (the Date object, for example, comes back as a plain string).
historyCollection.find({"date":{$gte:new Date("2018-12-20 13:47:57.461Z")}, $or:[{"source":2}, {"source":3}]}).limit(4).toArray(function (err2, result2) {
// Do Stuff
});
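The lossy step is the Date: JSON.stringify serializes it as an ISO-8601 string, and plain JSON.parse leaves it a string. A sketch of a reviver that restores Dates is below; for full fidelity (ObjectId, regex, etc.) the bson package's EJSON.stringify/EJSON.parse would be the more complete option:

```javascript
// A find() filter containing a Date, like the one in the question.
const filter = {
  date: { $gte: new Date("2018-12-20T13:47:57.461Z") },
  $or: [{ source: 2 }, { source: 3 }],
};

// Dates become ISO-8601 strings here...
const text = JSON.stringify(filter);

// ...so revive any string that looks like one back into a Date.
// (Heuristic: a real-world version may need stricter matching.)
const ISO_DATE = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?Z$/;
const revived = JSON.parse(text, (key, value) =>
  typeof value === "string" && ISO_DATE.test(value) ? new Date(value) : value
);

console.log(revived.date.$gte instanceof Date); // true
```

Query operators like $gte and $or survive stringify/parse unchanged, since they are ordinary object keys; only non-JSON types such as Date need special handling.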

How to get string and Date object from TIMESTAMP in Firebase Functions?

I get a timestamp in Firebase Functions with
admin.database.ServerValue.TIMESTAMP
However, when I pass it as a string, it becomes [object Object].
I saved it in the Firebase Realtime Database, and when I checked it from the console I saw 1505298866520, which I think is milliseconds since the UNIX epoch.
Do you know how I can get a string from it, or how I can do time calculations with it? Or what type of object does TIMESTAMP return?
firebase.database.ServerValue.TIMESTAMP and admin.database.ServerValue.TIMESTAMP hold the following object: {".sv": "timestamp"}. This is a Firebase-Database-specific placeholder that tells the server to use the server's UNIX time when the object is written to the database.
That is the reason you cannot use it as a Date object in JavaScript.
If you use admin.database.ServerValue.TIMESTAMP on the server via Cloud Functions, it should represent the same value as new Date().getTime().
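Once the value has been written and read back, it is just milliseconds since the UNIX epoch, so converting it for display or arithmetic is ordinary Date handling (a minimal sketch, using the value from the question):

```javascript
// Value read back from the Realtime Database after the server resolved
// the {".sv": "timestamp"} placeholder: milliseconds since the UNIX epoch.
const millis = 1505298866520;

// A normal JavaScript Date for formatting...
const date = new Date(millis);
const iso = date.toISOString(); // "2017-09-13T10:34:26.520Z"

// ...while time calculations can stay on the raw numbers.
const ageMs = Date.now() - millis;
```

The placeholder object itself ({".sv": "timestamp"}) is only meaningful on the way in; anything stringified from it client-side before the write shows up as [object Object].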
