JSON stringify and PostgreSQL bigint compliance - node.js

I am trying to add BigInt support within my library, and ran into an issue with JSON.stringify.
The nature of the library means I need not worry about type ambiguity or de-serialization: everything that's serialized goes to the server and never needs to be de-serialized.
I initially came up with the following simplified approach, just to counteract Node.js throwing TypeError: Do not know how to serialize a BigInt at me:
// Does JSON.stringify, with support for BigInt:
function toJson(data) {
    return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? v.toString() : v);
}
But since it converts each BigInt into a string, each value ends up wrapped in double quotes.
Is there any work-around, perhaps some trick within Node.js formatting utilities, to produce a result from JSON.stringify where each BigInt is formatted as an open (unquoted) value? This is what PostgreSQL understands and supports, so I'm looking for a way to generate JSON with BigInt that's compliant with PostgreSQL.
Example
const obj = {
    value: 123n
};
console.log(toJson(obj));
// This is what I'm getting: {"value":"123"}
// This is what I want: {"value":123}
Obviously, I cannot just convert BigInt into number, as I would be losing information then. And rewriting the entire JSON.stringify for this probably would be too complicated.
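To make the loss concrete, here's a quick sketch (plain Node.js, nothing assumed beyond standard BigInt/Number semantics):
// Number can only represent integers exactly up to 2^53 - 1:
const big = 9007199254740993n; // 2^53 + 1
console.log(Number(big)); // 9007199254740992, off by one
console.log(Number.isSafeInteger(Number(big))); // false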
UPDATE
At this point I have reviewed and played with several polyfills, like these ones:
polyfill-1
polyfill-2
But they all seem like an awkward solution, to bring in so much code, and then modify for BigInt support. I am hoping to find something more elegant.

Solution that I ended up with...
Inject full 123n numbers, and then un-quote those with the help of RegEx:
function toJson(data) {
    return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}n` : v)
        .replace(/"(-?\d+)n"/g, (_, a) => a);
}
It does exactly what's needed, and it is fast. The only downside is that if your data contains a string value that itself looks like 123n, it will become an open number; but you can easily obfuscate the marker above into something like ${^123^} or 123-bigint, which the algorithm accommodates easily.
As per the question, the operation is not meant to be reversible, so if you use JSON.parse on the result, those will be numbers, losing precision for anything between 2^53 and 2^64 - 1, as expected.
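For illustration, a quick usage sketch of the toJson above (values chosen arbitrarily):
const data = { small: 123n, big: 18446744073709551615n }; // 2^64 - 1
console.log(toJson(data));
// {"small":123,"big":18446744073709551615}
// JSON.parse would read `big` back as an imprecise number, as noted above.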
Whoever said it was impossible - huh? :)
UPDATE-1
For compatibility with JSON.stringify, undefined input must result in undefined output. And within the actual pg-promise implementation I am now using the "123#bigint" pattern, to make an accidental match far less likely.
And so here's the final code from there:
function toJson(data) {
    if (data !== undefined) {
        return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}#bigint` : v)
            .replace(/"(-?\d+)#bigint"/g, (_, a) => a);
    }
}
UPDATE-2
Going through the comments below, you can make it safe by counting the number of replacements against the number of BigInt injections, and throwing an error when there is a mismatch:
function toJson(data) {
    if (data !== undefined) {
        let intCount = 0, repCount = 0;
        const json = JSON.stringify(data, (_, v) => {
            if (typeof v === 'bigint') {
                intCount++;
                return `${v}#bigint`;
            }
            return v;
        });
        const res = json.replace(/"(-?\d+)#bigint"/g, (_, a) => {
            repCount++;
            return a;
        });
        if (repCount > intCount) {
            // You have a string somewhere that looks like "123#bigint";
            throw new Error(`BigInt serialization conflict with a string value.`);
        }
        return res;
    }
}
though I personally think it is overkill, and the approach within UPDATE-1 is quite good enough.

If you are using TypeScript on Express, place the following code in the main server file. Easy hack 😎 and it works fine:
// Note: parseInt here silently loses precision for values
// above Number.MAX_SAFE_INTEGER (2^53 - 1).
BigInt.prototype['toJSON'] = function () {
    return parseInt(this.toString());
};
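For what it's worth, a quick check of what this patch does, and where it silently rounds:
console.log(JSON.stringify({ n: 123n })); // {"n":123}
console.log(JSON.stringify({ n: 9007199254740993n })); // {"n":9007199254740992} - rounded!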

Related

Type check with typeof === custom type with Flow error

Given the following code:
type CustomTypeA = {
x: number,
y: number,
}
type CustomTypeB = CustomTypeA & {
z: number,
}
type CustomType = CustomTypeA | CustomTypeB
export const myFunction = (a: CustomType | number): number | void => {
if (typeof a === 'number') {
return a; // THIS WORKS WELL, it's considered number after the check, which is correct
} else if (typeof a === CustomType) {
const newX = a.x // ERR: Flow: Cannot get `a.x` because property `x` is missing in `Number`
const newZ = a.z // SAME ERR, Flow: Cannot get `a.z` because property `z` is missing in `Number`.
}
}
Also, the typeof a === CustomType check is highlighted as an error as well:
Flow: Cannot reference type `CustomType` from a value position.
This however doesn't happen for the typeof a === 'number' one.
It's like the check against the custom object type I created is not valid/recognized.
Can someone explain why and possibly how to escape this?
Thanks.
Flow custom types are not values; they do not exist at runtime and vanish after transpilation, so you cannot use them with a JS operator like typeof, which requires a value. When you do typeof a === CustomType it fails because, after compilation, CustomType is simply stripped out, leaving you with typeof a === , which is invalid JS.
This seems to be a flow limitation to be honest.
There is the %checks operator, which allows you to build type guard functions.
One might think you could use this feature to build a type refinement for your custom types with a function that has the proper logic, but nothing in its documentation suggests that it can be used to refine custom types.
It also requires the body of the guard function to be very simple so Flow can understand what you mean. Some type guard function examples may look like this (from the Flow docs):
function truthy(a, b): boolean %checks {
    return !!a && !!b;
}
function isString(y): %checks {
    return typeof y === "string";
}
function isNumber(y): %checks {
    return typeof y === "number";
}
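For context, a small usage sketch building on the isNumber guard above; once a predicate is declared with %checks, calling it refines the union just like a direct typeof check would:
const double = (x: string | number): number => {
    if (isNumber(x)) {
        return x * 2; // refined to number here
    }
    return x.length; // and to string here
};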
However, when you try a more complex check, for example checking that something is an object but not an array or a date, Flow fails to understand your intention and the predicate function will not work. Something like this:
function isObject(obj: mixed): boolean %checks {
    return Object.prototype.toString.call(obj) === '[object Object]'
}
will fail because Flow doesn't understand that as a type refinement for object. For that particular case, there is a workaround suggested in a GitHub issue that consists of declaring the function at the type level, asserting that it checks for the object type:
declare function isObject(obj: mixed): boolean %checks(obj instanceof Object)
But you cannot use that either for your particular case, because you cannot do instanceof on a custom type; it is not a class.
So your options are either go verbose and check all the expected properties are present on a type check, like this:
if (typeof a.x === 'number' && typeof a.y === 'number' && typeof a.z === 'number') {
    const {x: ax, y: ay, z: az} = a
    // now you can safely use the extracted variables
}
Note that you need to extract the props from the object because any time you call a function, Flow will invalidate your type refinement, and the next line that accesses a.x will fail.
You can declare your point as a class and use the type system to check for instances of that class.
Or you build a validation function that returns either the correct type or null, so flow can understand the type has been refined:
function isCustomType(p: mixed): CustomType | null {
    const validated = ((p: any): CustomType)
    if (typeof validated.x === 'number' && typeof validated.y === 'number') return validated
    return null
}
const validA = isCustomType(a)
if (validA) {
    const {x: ax, y: ay} = validA
    // do stuff
}
This has the disadvantage that you need to allocate extra variables just to satisfy the type system, but I think that is a minor problem.
Also, it will not allow Flow to validate the isCustomType function for you, because we are doing type casts to basically cheat Flow. But given that the surface is small and the objective very focused, it should be OK to keep it manually correct.

Nestjs & TypeOrm: No results from Query Builder using getOne() / getMany()

I don't get this. I have a service that injects entity repositories and has dedicated methods for some business logic.
Besides that, I expose a method that just returns a QueryBuilder (to avoid injecting repositories all over the place) for the few occasions when another service needs just a quick query:
type EntityFields = keyof MyEntity;

entityQueryBuilder(alias?: string, id?: number, ...select: EntityFields[]) {
    const q = this.entityRepository.createQueryBuilder(alias);
    if (id) {
        q.where({id});
    }
    if (select) {
        q.select(select);
    }
    return q;
}
Now when I am trying to use this and call:
const r = await service.entityQueryBuilder('a', 1, 'settings').getOne();
the result is always empty, although the generated SQL in the log is correct.
However when I do:
const r = await service.entityQueryBuilder('a', 1, 'settings').execute();
I get (almost) what I need: an array instead of an entity object, but the data is there.
I am unhappy though, as I need to map the result to the object I wanted, which is something getOne() should do on my behalf. getMany() does not return results either.
What did I do wrong?
Edit:
FWIW, here is the final solution I came up with, based on the hint in the accepted reply:
entityQueryBuilder(id?: number, ...select: EntityFields[]) {
    const q = this.entityRepository.createQueryBuilder('alias');
    if (id) {
        q.where({id});
    }
    if (select) {
        q.select(select.map(f => `alias.${f}`));
    }
    return q;
}
Admittedly it has a hardcoded alias, but I can live with that and it is OK for my purpose.
Hope this helps someone in the future.
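For illustration, calling this final version would look something like this (entity and method names as above):
// The caller passes bare field names; the method qualifies them with the alias internally:
const r = await service.entityQueryBuilder(1, 'settings').getOne();
// r is now a properly hydrated entity instance (or no result when no row matches).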
It happens because the select is not properly qualified. In your case, you need a.settings instead of settings:
const r = await service.entityQueryBuilder('a', 1, 'a.settings').getOne(); // it should work

how to turn the flattenObj function from the ramda cookbook into an iterative function

I'm dealing with a test environment nodejs/sequelize/mocha/chai.
I find this flattenObj extremely useful when testing objects generated by sequelize, for instance.
It makes those structures digestible for chai, and the results become more concise.
Too bad it's implemented in a recursive way :( Especially in JavaScript this spells doom, as there is always a call stack limit lurking.
Hacks like wrapping the recursive function in a setTimeout don't seem to work for me and are kind of ugly.
I'm currently trying to figure out how to rewrite it in an iterative way, but that's quite a brain teaser, at least for me.
Dealing with while loops inside a ramda function doesn't feel right.
Is there a way of doing this in a call stack friendly way without breaking ramda conventions?
const go = obj_ => chain(([k, v]) => {
    if (type(v) === 'Object' || type(v) === 'Array') {
        return pipe(
            tap(console.log),
            map(([k_, v_]) => [`${k}.${k_}`, v_])
        )(go(v))
    } else {
        return [[k, v]]
    }
}, toPairs(obj_))

const flattenObj = obj => {
    return fromPairs(go(obj))
}

flattenObj({a: 1, b: {c: 3}, d: {e: {f: 6}, g: [{h: 8, i: 9}, 0]}})
{
    "a": 1,
    "b.c": 3,
    "d.e.f": 6,
    "d.g.0.h": 8,
    "d.g.0.i": 9,
    "d.g.1": 0
}
This works as expected, but it breaks down with a call stack exceeded error, because of the recursive go function, when the object becomes too complex.
It would be super useful if it were applicable to more complex structures as well.
I don't think it's a bad thing that it's implemented in a recursive way. That's the best way to deal with recursive data structures such as JS objects.
But you can always convert recursive solutions to iterative ones if you want to manage your own stack. Here's a fairly ugly approach, but which seems to work for that simple test case:
const flattenObj = (obj) => {
    const results = [];
    const steps = Object.entries(obj)
    while (steps.length) {
        const [key, val] = steps.splice(0, 1)[0]
        if (val !== null && typeof val == 'object') { // guard: typeof null is also 'object'
            Array.prototype.push.apply(steps, Object.entries(val).map(
                ([k, v]) => [key + '.' + k, v]
            ))
        } else {
            results.push([key, val])
        }
    }
    return results.reduce((a, [k, v]) => ({...a, [k]: v}), {})
}
const foo = {a:1, b:{c:3}, d:{e:{f:6}, g:[{h:8, i:9}, 0]}}
console.log(flattenObj(foo))
This will not work with cyclical structures, but the cookbook version would not have either, presumably.
I wrote this originally using some Ramda functions (toPairs in place of Object.entries, is(Object, val) in place of typeof val == 'object', and return fromPairs(results) in place of return results.reduce(...)). But with all the mutation going on (splice and push), it felt like a very unRamda-ish solution, and I removed them. (Ramda functions, you understand, don't want to be associated with gauche mutability!)
I don't know if this will solve your problem. I've only used flattenObj a few times, although I can see the utility in tests. But it strikes me that if this is causing recursion problems, cyclical data structures are a more likely issue than actual depth. But of course I don't know your data, so who knows?
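Since cycles came up twice above, here is a minimal sketch of the same iterative approach with a WeakSet guard added (the name flattenObjSafe is mine, not from the cookbook):
const flattenObjSafe = (obj) => {
    const seen = new WeakSet(); // tracks objects already expanded
    const results = [];
    const steps = Object.entries(obj);
    while (steps.length) {
        const [key, val] = steps.shift();
        if (val !== null && typeof val === 'object') {
            if (seen.has(val)) continue; // already expanded: skip to avoid looping forever
            seen.add(val);
            steps.push(...Object.entries(val).map(([k, v]) => [key + '.' + k, v]));
        } else {
            results.push([key, val]);
        }
    }
    return Object.fromEntries(results);
};

const cyc = { a: 1 };
cyc.self = cyc; // deliberately cyclical
console.log(flattenObjSafe(cyc)); // { a: 1, 'self.a': 1 } - no stack or loop blow-up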

How do I get the result of class getters into JSON? [duplicate]

Take this object:
x = {
    "key1": "xxx",
    "key2": function () { return this.key1 }
}
If I do this:
y = JSON.parse( JSON.stringify(x) );
Then y will return { "key1": "xxx" }. Is there anything one could do to transfer functions via stringify? Creating an object with attached functions is possible with ye goode olde eval(), but what about packing it?
json-stringify-function is a similar post to this one.
A snippet discovered via that post may be useful to anyone stumbling across this answer. It works by making use of the replacer parameter in JSON.stringify and the reviver parameter in JSON.parse.
More specifically, when a value happens to be of type function, .toString() is called on it via the replacer. When it comes time to parse, eval() is performed via the reviver when a function is present in string form.
var JSONfn;
if (!JSONfn) {
    JSONfn = {};
}
(function () {
    JSONfn.stringify = function (obj) {
        return JSON.stringify(obj, function (key, value) {
            return (typeof value === 'function') ? value.toString() : value;
        });
    };
    JSONfn.parse = function (str) {
        return JSON.parse(str, function (key, value) {
            if (typeof value != 'string') return value;
            return (value.substring(0, 8) == 'function') ? eval('(' + value + ')') : value;
        });
    };
}());
Code Snippet taken from Vadim Kiryukhin's JSONfn.js or see documentation at Home Page
I've had a similar requirement lately. To be clear, the output looks like JSON but in fact is just javascript.
JSON.stringify works well in most cases, but "fails" with functions.
I got it working with a few tricks:
- make use of the replacer (2nd parameter of JSON.stringify())
- use func.toString() to get the JS code for a function
- remember which functions have been stringified and replace them directly in the result
And here's how it looks:
// our source data
const source = {
    "aaa": 123,
    "bbb": function (c) {
        // do something
        return c + 1;
    }
};

// keep a list of serialized functions
const functions = [];

// json replacer - returns a placeholder for functions
const jsonReplacer = function (key, val) {
    if (typeof val === 'function') {
        functions.push(val.toString());
        return "{func_" + (functions.length - 1) + "}";
    }
    return val;
};

// regex replacer - replaces placeholders with functions
const funcReplacer = function (match, id) {
    return functions[id];
};

const result = JSON
    .stringify(source, jsonReplacer)             // generate json with placeholders
    .replace(/"\{func_(\d+)\}"/g, funcReplacer); // replace placeholders with functions

// show the result
document.body.innerText = result;
Important: Be careful about the placeholder format - make sure it's not too generic. If you change it, also change the regex as applicable.
Technically this is not JSON, and I can hardly imagine why you would want to do this, but try the following hack:
x.key2 = x.key2.toString();
JSON.stringify(x) //"{"key1":"xxx","key2":"function (){return this.key1}"}"
Of course the first line can be automated by iterating recursively over the object. The reverse operation is harder: the function is only a string, so eval will work, but you have to guess whether a given key contains stringified function code or not.
You can't pack functions since the data they close over is not visible to any serializer.
Even Mozilla's uneval cannot pack closures properly.
Your best bet is to use a reviver and a replacer.
https://yuilibrary.com/yui/docs/json/json-freeze-thaw.html
The reviver function passed to JSON.parse is applied to all key:value pairs in the raw parsed object from the deepest keys to the highest level. In our case, this means that the name and discovered properties will be passed through the reviver, and then the object containing those keys will be passed through.
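A tiny demonstration of that ordering, using nothing but JSON.parse itself:
JSON.parse('{"outer":{"inner":1}}', (key, value) => {
    console.log(key === '' ? '(root)' : key);
    return value;
});
// logs: inner, then outer, then (root)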
This is what I did https://gist.github.com/Lepozepo/3275d686bc56e4fb5d11d27ef330a8ed
function stringifyWithFunctions(object) {
    return JSON.stringify(object, (key, val) => {
        if (typeof val === 'function') {
            return `(${val})`; // make it a string, surrounded by parentheses so we can revive it as an anonymous function
        }
        return val;
    });
}
function parseWithFunctions(obj) {
    return JSON.parse(obj, (k, v) => {
        if (typeof v === 'string' && v.indexOf('function') >= 0) {
            return eval(v);
        }
        return v;
    });
}
The naughty but effective way would be to simply:
Function.prototype.toJSON = function() { return this.toString(); }
Though your real problem (aside from modifying the prototype of Function) would be deserialization without the use of eval.
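As a side note, a commonly used alternative to eval here is the Function constructor; it still executes arbitrary code, so it is no safer, but it avoids eval itself. A minimal sketch:
const src = 'function (a, b) { return a + b; }';
const revived = new Function('return (' + src + ')')(); // build and immediately invoke a factory
console.log(revived(2, 3)); // 5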
I have come up with this solution, which takes care of the conversion of functions (no eval). All you have to do is put this code before you use the JSON methods. Usage is exactly the same, but right now it takes only one param, the value to convert to a JSON string, so if you pass the remaining replacer and space params, they will be ignored.
void function () {
    window.JSON = Object.create(JSON)
    JSON.stringify = function (obj) {
        return JSON.__proto__.stringify(obj, function (key, value) {
            if (typeof value === 'function') {
                return value.toString()
            }
            return value
        })
    }
    JSON.parse = function (obj) {
        return JSON.__proto__.parse(obj, function (key, value) {
            if (typeof value === 'string' && value.slice(0, 8) == 'function') {
                return Function('return ' + value)()
            }
            return value
        })
    }
}()

// YOUR CODE GOES BELOW HERE
x = {
    "key1": "xxx",
    "key2": function () { return this.key1 }
}
const y = JSON.parse(JSON.stringify(x))
console.log(y.key2())
It is entirely possible to create functions from a string without eval().
var obj = {
    a: function (a, b) {
        return a + b;
    }
};
var serialized = JSON.stringify(obj, function (k, v) {
    // special treatment for function types
    if (typeof v === "function")
        return v.toString(); // we save the function as a string
    return v;
});
/* output:
"{"a":"function (a,b){\n    return a+b;\n  }"}"
*/
Now some magic to turn the string back into a function:
var compileFunction = function (str) {
    // find parameters
    var pstart = str.indexOf('('), pend = str.indexOf(')');
    var params = str.substring(pstart + 1, pend);
    params = params.trim();
    // find function body
    var bstart = str.indexOf('{'), bend = str.lastIndexOf('}');
    var body = str.substring(bstart + 1, bend);
    return Function(params, body);
}
Now use JSON.parse with a reviver:
var revivedObj = JSON.parse(serialized, function (k, v) {
    // there is probably a better way to determine if a value is a function string
    if (typeof v === "string" && v.indexOf("function") !== -1)
        return compileFunction(v);
    return v;
});
// output:
// > revivedObj.a
// function anonymous(a,b /**/) {
//     return a+b;
// }
// > revivedObj.a(1,2)
// 3
To my knowledge, there are no serialization libraries that persist functions - in any language. Serialization is what one does to preserve data. Compilation is what one does to preserve functions.
It seems that people landing here are dealing with structures that would be valid JSON if not for the fact that they contain functions. So how do we handle stringifying these structures?
I ran into the problem while writing a script to modify RequireJS configurations. This is how I did it. First, there's a bit of code earlier that makes sure that the placeholder used internally (">>>F<<<") does not show up as a value in the RequireJS configuration. Very unlikely to happen but better safe than sorry. The input configuration is read as a JavaScript Object, which may contain arrays, atomic values, other Objects and functions. It would be straightforwardly stringifiable as JSON if functions were not present. This configuration is the config object in the code that follows:
// Holds functions we encounter.
var functions = [];
var placeholder = ">>>F<<<";

// This handler just records a function object in `functions` and returns the
// placeholder as the value to insert into the JSON structure.
function handler(key, value) {
    if (value instanceof Function) {
        functions.push(value);
        return placeholder;
    }
    return value;
}

// We stringify, using our custom handler.
var pre = JSON.stringify(config, handler, 4);

// Then we replace the placeholders, in the order they were encountered, with
// the functions we've recorded.
var post = pre.replace(new RegExp('"' + placeholder + '"', 'g'),
                       functions.shift.bind(functions));
The post variable contains the final value. This code relies on the fact that the order in which handler is called is the same as the order of the various pieces of data in the final JSON. I've checked the ECMAScript 5th edition, which defines the stringification algorithm, and cannot find a case where there would be an ordering problem. If this algorithm were to change in a future edition, the fix would be to use unique placeholders per function and use these to refer back to the functions, which would be stored in an associative array mapping unique placeholders to functions.
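For completeness, a minimal sketch of that unique-placeholder variant (assuming the same config object as above; the helper names are mine):
var fnById = {};
var counter = 0;
function uniqueHandler(key, value) {
    if (value instanceof Function) {
        var id = '>>>F' + (counter++) + '<<<';
        fnById[id] = value;
        return id;
    }
    return value;
}
var pre2 = JSON.stringify(config, uniqueHandler, 4);
// Each placeholder maps back to exactly one function, so ordering no longer matters.
var post2 = pre2.replace(/">>>F\d+<<<"/g, function (match) {
    return fnById[match.slice(1, -1)].toString(); // strip the surrounding quotes to get the key
});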

Mongodb $where query always true with nodejs

When I query my database with a function passed in the $where clause in node.js, it always returns all the documents in the db.
For example, if I do
var stream = timetables.find({$where: function() { return false; }}).stream();
it returns all the documents.
Instead, if I do
var stream = timetables.find({$where: 'function() { return false; }'}).stream();
the function is really executed, and this code doesn't return any document.
The problem is that if I convert my function to a string, the context's bindings are removed, and I need them for more complex queries. For example:
var n = 1;
var f = function() { return this.number == n; }
var stream = timetables.find({$where: f.toString()}).stream();
// error: n is not defined
Is this a normal behaviour? How can I solve my problem?
Please excuse my poor English!
First off, keep in mind that the $where operator should almost never be used, for the reasons explained here (credit goes to @WiredPrairie).
Back to your issue, the approach you'd like to take won't work even in the mongodb shell (which explicitly allows naked js functions with the $where operator). The javascript code provided to the $where operator is executed on the mongo server and won't have access to the enclosing environment (the "context bindings").
> db.test.insert({a: 42})
> db.test.find({a: 42})
{ "_id" : ObjectId("5150433c73f604984a7dff91"), "a" : 42 }
> db.test.find({$where: function() { return this.a == 42 }}) // works
{ "_id" : ObjectId("5150433c73f604984a7dff91"), "a" : 42 }
> var local_var = 42
> db.test.find({$where: function() { return this.a == local_var }})
error: {
    "$err" : "error on invocation of $where function:\nJS Error: ReferenceError: local_var is not defined nofile_b:1",
    "code" : 10071
}
Moreover, it looks like the node.js native mongo driver behaves differently from the shell, in that it doesn't automatically serialize a js function you provide in the query object; instead it likely drops the clause altogether. This will leave you with the equivalent of timetables.find({}), which will return all the documents in the collection.
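Applying that to the original example, the working pattern with the native driver is to bake the local value into the function source before sending it (a sketch, using the question's own collection):
var n = 1;
// Interpolate the local binding into the string; the server never sees `n` itself.
var stream = timetables.find({
    $where: 'function() { return this.number == ' + n + '; }'
}).stream();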
This one works for me. Just store the query as a string in one variable, then concatenate your variable into the query string:
var local_var = 42;
var query = { $where: "function() { return this.a == " + local_var + " }" };
db.test.find(query);
Store your query in a variable and use that variable in your find query. It works..... :D
The context will always be that of the mongo database, since the function is executed there. There is no way to share the context between the two instances. You have to rethink the way you query and come up with a different strategy.
You can use a wrapper to pass basic JSON objects, i.e. (pardon the coffee-script):
# That's the main wrapper.
wrap = (f, args...) ->
  "function() { return (#{f}).apply(this, #{JSON.stringify(args)}) }"

# Example 1
where1 = (flag) ->
  @myattr == 'foo' or flag

# Example 2 with different arguments
where2 = (foo, options = {}) ->
  if foo == options.bar or @_id % 2 == 0
    true
  else
    false

db.collection('coll1').count $where: wrap(where1, true), (err, count) ->
  console.log err, count

db.collection('coll1').count $where: wrap(where2, true, bar: true), (err, count) ->
  console.log err, count
Your functions are going to be passed as something like:
function () {
    return (function (flag) {
        return this.myattr === 'foo' || flag;
    }).apply(this, [true])
}
...and example 2:
function () {
    return (
        function (foo, options) {
            if (options == null) {
                options = {};
            }
            if (foo === options.bar || this._id % 2 === 0) {
                return true;
            } else {
                return false;
            }
        }
    ).apply(this, [true, { "bar": true }])
}
This is how it is supposed to be. The drivers don't translate the client code into the mongo function javascript code.
I'm assuming you are using Mongoose to query your database.
If you take a look at the actual Query object implementation, you'll find that only strings are valid arguments for the where prototype.
When using the where clause, you should use it along with the standard operators such as gt and lt, which operate on the path created by the where function.
Remember that Mongoose querying, as in Mongo, is by example; you may want to reconsider your query specification in a more descriptive fashion.
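For example, a condition like the $where above can usually be expressed declaratively (a sketch; Timetable is a hypothetical Mongoose model):
var n = 1;
// Same intent as `$where: this.number == n`, but evaluated as a normal query:
Timetable.find().where('number').equals(n).exec(function (err, docs) {
    // docs are the matching timetables
});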
