How does Mongo determine how to serialize an object? - node.js

I'm working with BigNumbers from the bignumber.js package.
I'm new to Mongo and I'm curious how Mongo knows how to serialize this object correctly (or any other object, for that matter).
I ask because I have two scripts which are seemingly identical in the way they insert these objects, but in one script the BigNumbers are inserted as strings, while in the other they are inserted as numbers.
The object comes in from a stream typed as any, and I need to convert or cast that any object to Altitude.
Sample code:
export interface Altitude {
  altitude: BigNumber;
  minAltitude: BigNumber;
  maxAltitude: BigNumber;
}

// ... 'message' object of type any from stream ...

// this JSON back and forth stuff seems like it should be unnecessary
let jsonString = JSON.stringify(message);
let alt = JSON.parse(jsonString) as Altitude;

collection.insert(alt, (error: MongoError, result: InsertOneWriteOpResult) => {
  ...
});
Strings are inserted for the BigNumbers.
Seeing the values being inserted into Mongo as strings, I tried forcing the issue even more, just to see if it would work:
let alt = JSON.parse(jsonString) as Altitude;

let alt1 = {
  size: alt.altitude as BigNumber,
  minAltitude: alt.minAltitude as BigNumber,
  maxAltitude: alt.minAltitude as BigNumber,
};

collection.insert(alt1, (error: MongoError, result: InsertOneWriteOpResult) => {
  ...
});
However the values are still inserted as strings. And in my other script everything is numbers.
Ideas?

MongoDB uses BSON serialization; the actual implementation depends on the driver.
Reference: https://www.mongodb.com/json-and-bson
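In this case the strings are most likely coming from the JSON round trip rather than from the driver: JSON.stringify turns each BigNumber into its string representation, so after JSON.parse the fields are plain strings, and a TypeScript as cast does not change the runtime value. If the other script hands the driver native JS numbers, BSON stores them as numbers. Here is a minimal sketch of my own (assuming bignumber.js; toAltitudeDoc is a hypothetical helper) that converts the fields to native numbers before inserting:
// Sketch only: convert the (possibly string) fields to native numbers before inserting.
// Caveat: a BSON double holds ~15-17 significant digits; very precise values would
// need Decimal128 or a string instead.
import { BigNumber } from 'bignumber.js';

function toAltitudeDoc(alt: Altitude) {
  return {
    altitude: new BigNumber(alt.altitude).toNumber(),
    minAltitude: new BigNumber(alt.minAltitude).toNumber(),
    maxAltitude: new BigNumber(alt.maxAltitude).toNumber(),
  };
}

collection.insert(toAltitudeDoc(alt), (error: MongoError, result: InsertOneWriteOpResult) => {
  /* handle result */
});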

Related

How to get TransactionUnspentOutput as a hex encoded bytes string programmatically

So basically I want to convert a normal UTxO hash like:
550665309dee7e2f64d13f999297f001763f65fe50bb05524afc0990c7dce0c3
to a TransactionUnspentOutput as a hex encoded bytes string like:
828258205537396d59c1b0546bb9cec5cb6b930238af2d8998d24ca1d47e89a3dd400a8701825839016af9a0d2c9b5bce8999bc6430eb48f424399b73f0ecc143f40e8cac89b130cc3198a8594862fe25df331cb79447304dcd49712c86834fdf1821a00150bd0a1581cb0df0ee7dbb96b18b682a1091514f250eb0ec1122e6c4bf3b4d45123a14b436f6e766963743033363701
This is how it is done with a nami wallet implementation:
cardano.getUtxos(amount?: Value, paginate?: {page: number, limit: number}) : [TransactionUnspentOutput]
I tried to pass a UTxO into the lucid utxoToCore() function:
export const utxoToCore = (utxo: UTxO): Core.TransactionUnspentOutput => {
  const output = C.TransactionOutput.new(
    C.Address.from_bech32(utxo.address),
    assetsToValue(utxo.assets)
  );
  if (utxo.datumHash) {
    output.set_datum(
      C.Datum.new_data_hash(C.DataHash.from_bytes(fromHex(utxo.datumHash)))
    );
  }
  return C.TransactionUnspentOutput.new(
    C.TransactionInput.new(
      C.TransactionHash.from_bytes(fromHex(utxo.txHash)),
      C.BigNum.from_str(utxo.outputIndex.toString())
    ),
    output
  );
};
However the only output I get is:
TransactionUnspentOutput { ptr: 1247376 }
How do I get the unpacked form, or at least the format I want, of the TransactionUnspentOutput?
It looks like you are calling an external (WebAssembly-backed) library. Usually, in such cases, it returns you a wrapper around a memory address, just like you get with wasm calls through a browser client. One way is to deserialize the object inside your Node.js code.
Maybe you can use some sort of utility function to get it as a string.
E.g. https://github.com/Emurgo/cardano-serialization-lib/blob/master/doc/getting-started/metadata.md#json-conversion
Hope this helps
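For example, here is a sketch (not verified against your exact lucid/CSL versions, but most @emurgo/cardano-serialization-lib classes, including TransactionUnspentOutput, expose a to_bytes() method) of getting the hex-encoded bytes string:
// Sketch: serialize the CSL object to raw bytes, then hex-encode them.
const toHex = (bytes) => Buffer.from(bytes).toString('hex');

const utxo = utxoToCore(myUtxo); // myUtxo: your UTxO object (hypothetical variable name)
const utxoHex = toHex(utxo.to_bytes());
console.log(utxoHex); // e.g. "8282582055..."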

Is it absolutely the case that arrays are used by reference in other modules in Node.js?

I have
// file cars.js
var bodyshop = require('./bodyshop')
var connections = [];
many functions which operate on connections: adding them, changing them, etc.
code in this file includes things like
bodyshop.meld(blah)
bodyshop.mix(blah)
exports.connections = connections
and then
// file bodyshop.js
let cars = require('./cars');
even more functions which operate on connections: adding them, changing them, etc.
code in this file includes things like
cars.connections[3].color = pink
cars.connections.splice(deleteMe, 1)
module.exports = { meld, mix, flatten }
Is it absolutely, honestly the case that code in bodyshop such as cars.connections.splice(deleteMe, 1) will indeed delete an item from "the" connections (i.e., the one and only connections, declared in cars.js), and that code in bodyshop such as cars.connections[3].color = pink will indeed change the color of index 3 of that self-same one and only connections?
Is it quite OK / safe / acceptable that I used the syntax "module.exports = { }" at the end of bodyshop, rather than three lines like "exports.meld = meld" ?
Is this sentence indeed totally correct? "In Node.js, if you export an array from module M, then when using the array in another module X which requires M, the array will be by reference in X, i.e. not a copy" ... ?
I created two files with the following methods and the array as you mentioned.
First File: test1.js
const testArray = [];

const getArray = () => {
  return testArray;
};

module.exports = {
  testArray,
  getArray
}
Second File: test2.js
const { testArray, getArray } = require('./test1');
console.log('testing the required array before modifying it');
console.log(getArray());
testArray.push('test');
console.log('testing the method result after modifying the required array content');
console.log(getArray());
If you can create the mentioned files and run them locally, you will see the following result.
>node test2.js
testing the required array before modifying it
[]
testing the method result after modifying the required array content
[ 'test' ]
The points to observe are:
Yes, it's okay to export with the module.exports = { } syntax; it's not much of an issue.
If any method modifies this array outside of the file that declares it, the change will be visible here as well. This is because require gives you a reference, not a copy.
One possible solution is to create a JSON copy of it after requiring, as below:
const { testArray, getArray } = require('./test1');
const testArrayCopy = JSON.parse(JSON.stringify(testArray));
console.log('testing the required array before modifying it');
console.log(getArray());
testArrayCopy.push('test');
console.log('testing the method result after modifying the required array content');
console.log(getArray());
This is the result:
>node test2.js
testing the required array before modifying it
[]
testing the method result after modifying the required array content
[]
Note: a JSON copy will not handle Date values properly (they come back as strings).
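As an aside (my own addition, not part of the original answer): if the array can contain Dates or other non-JSON values, structuredClone, available globally in Node.js 17+, gives a deep copy that preserves Date objects:
const { testArray, getArray } = require('./test1');

// structuredClone makes a deep copy, so pushing to the copy does not touch the exported array
const testArrayCopy = structuredClone(testArray);
testArrayCopy.push('test');

console.log(getArray()); // still [] - the original exported array is unchanged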

Convert google.protobuf.Timestamp to ISO format in Fabric 1.4 in NodeJS

I am running a Hyperledger Fabric 1.4 chaincode and am trying to retrieve the history of a key with the getHistoryForKey stub method. I am iterating over each entry and want to convert the entries for standardization across all my chaincode functions.
Now, I can handle all keys in the iterator except the timestamp, which is a google.protobuf.Timestamp. All my attempts to convert it to an ISO datetime string have failed.
Code
// Entry method to retrieve the full history of any asset
async (stub, args) => {
  const idToSearch = args.id
  const historyIterator = await stub.getHistoryForKey(idToSearch)
  let historyData = []
  await iterate(historyData, historyIterator)
  if (historyData.length === 0) throw errors.ASSET_NOT_FOUND(idToSearch)
  return historyData
}

// I use Node v8 and thus cannot use for await over the iterator, so I wrote a recursive helper func
const iterate = async (historyData, historyIterator) => {
  const element = await historyIterator.next()
  if (!element) return historyIterator.close()
  const {value} = element
  if (!value) return historyIterator.close()
  historyData.push({
    value: value.value.toString('utf8'),
    isDeleted: value.is_delete,
    txId: value.tx_id,
    timestamp: value.timestamp // <-- WANT TO CONVERT TO ISO DATE TIME STRING
  })
  await iterate(historyData, historyIterator)
}
My Approaches
1. toISOString()
The protobuf Timestamp documentation says: "In JavaScript, one can convert a Date object to this format using the standard toISOString()". This does not work here, since I get "toISOString is not a function".
2. new Date()
Further, I tried to run new Date(protobufTimestamp), which results in "Invalid Date".
3. Using the seconds
I thought maybe I could use the seconds, which are one of the two keys in the timestamp (Object.keys(protobufTimestamp) => [seconds, nanos]), to create the Date. But that Date object also says "Invalid Date". That could be explained by what I read: a Protobuf Timestamp covers the range from year 0 to 9999, so maybe the conversion fails.
Question
Can someone explain me how to convert the google protobuf timestamp to an ISO timestamp in Fabric 1.4 in NodeJS?
You can try something like
new Date(protobufTimestamp.seconds * 1000).toISOString()
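A slightly more defensive sketch of my own (protoTimestampToISO is a hypothetical helper; it assumes the { seconds, nanos } shape, and that seconds may arrive as a protobufjs Long rather than a plain number, which is common with the Fabric Node shim - verify against your actual objects):
// Sketch: handle both plain-number and Long-like 'seconds', and include nanos.
const protoTimestampToISO = (ts) => {
  const seconds = typeof ts.seconds === 'number' ? ts.seconds : Number(ts.seconds.toString());
  const millis = seconds * 1000 + Math.round((ts.nanos || 0) / 1e6);
  return new Date(millis).toISOString();
};

// usage inside the iterator:
//   timestamp: protoTimestampToISO(value.timestamp)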

JSON stringify and PostgreSQL bigint compliance

I am trying to add BigInt support within my library, and ran into an issue with JSON.stringify.
The nature of the library permits not to worry about type ambiguity and de-serialization, as everything that's serialized goes into the server, and never needs any de-serialization.
I initially came up with the following simplified approach, just to counteract Node.js throwing TypeError: Do not know how to serialize a BigInt at me:
// Does JSON.stringify, with support for BigInt:
function toJson(data) {
  return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? v.toString() : v);
}
But since it converts each BigInt into a string, each value ends up wrapped into double quotes.
Is there any work-around, perhaps some trick within Node.js formatting utilities, to produce a result from JSON.stringify where each BigInt would be formatted as an open value? This is what PostgreSQL understands and supports, and so I'm looking for a way to generate JSON with BigInt that's compliant with PostgreSQL.
Example
const obj = {
  value: 123n
};

console.log(toJson(obj));
// This is what I'm getting: {"value":"123"}
// This is what I want: {"value":123}
Obviously, I cannot just convert BigInt into number, as I would be losing information then. And rewriting the entire JSON.stringify for this probably would be too complicated.
UPDATE
At this point I have reviewed and played with several polyfills, like these ones:
polyfill-1
polyfill-2
But they all seem like an awkward solution, to bring in so much code, and then modify for BigInt support. I am hoping to find something more elegant.
Solution that I ended up with...
Inject the full value with an n suffix (like 123n), and then un-quote those with the help of a RegEx:
function toJson(data) {
  return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}n` : v)
    .replace(/"(-?\d+)n"/g, (_, a) => a);
}
It does exactly what's needed, and it is fast. The only downside is that if your data contains a string value that itself looks like 123n, it will become an unquoted number; but you can easily obfuscate the marker into something like ${^123^} or 123-bigint, and the algorithm allows that easily.
As per the question, the operation is not meant to be reversible, so if you use JSON.parse on the result, those will be number-s, losing anything that's between 2^53 and 2^64 - 1, as expected.
Whoever said it was impossible - huh? :)
UPDATE-1
For compatibility with JSON.stringify, undefined must result in undefined. And within the actual pg-promise implementation I am now using "123#bigint" pattern, to make an accidental match way less likely.
And so here's the final code from there:
function toJson(data) {
  if (data !== undefined) {
    return JSON.stringify(data, (_, v) => typeof v === 'bigint' ? `${v}#bigint` : v)
      .replace(/"(-?\d+)#bigint"/g, (_, a) => a);
  }
}
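A quick usage check of the function above (illustrative values only):
console.log(toJson({ value: 123n }));            // {"value":123}
console.log(toJson({ big: 9007199254740993n })); // {"big":9007199254740993}
console.log(toJson(undefined));                  // undefined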
UPDATE-2
Going through the comments below, you can make it safe by counting the number of replacements and matching it against the number of BigInt injections, throwing an error when there is a mismatch:
function toJson(data) {
  if (data !== undefined) {
    let intCount = 0, repCount = 0;
    const json = JSON.stringify(data, (_, v) => {
      if (typeof v === 'bigint') {
        intCount++;
        return `${v}#bigint`;
      }
      return v;
    });
    const res = json.replace(/"(-?\d+)#bigint"/g, (_, a) => {
      repCount++;
      return a;
    });
    if (repCount > intCount) {
      // You have a string somewhere that looks like "123#bigint";
      throw new Error(`BigInt serialization conflict with a string value.`);
    }
    return res;
  }
}
though I personally think it is overkill, and the approach in UPDATE-1 is quite good enough.
If you are using TypeScript with Express, place the following code in the main server file. Easy hack 😎, works fine.
BigInt.prototype['toJSON'] = function () {
  return parseInt(this.toString());
};
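Note that parseInt silently loses precision above Number.MAX_SAFE_INTEGER. A variant of the same hack (my own sketch, not from the answer above) only returns a plain number when it is safe, and falls back to a string otherwise:
BigInt.prototype['toJSON'] = function () {
  const asNumber = Number(this);
  // Only use a plain number when the value fits without precision loss
  return Number.isSafeInteger(asNumber) ? asNumber : this.toString();
};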

How do I filter keys from JSON in Node.js?

I'm trying to select certain keys from a JSON object, and filter out the rest.
var json = JSON.stringify(body);
which is:
{
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  },
  "more keys": "foo",
  "unwanted key": "foo"
}
What I want:
{
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  }
}
I've checked out How to filter JSON data in node.js?, but I'm looking to do this without any packages.
Now you can use Object.fromEntries like so:
Object.fromEntries(Object.entries(raw).filter(([key]) => wantedKeys.includes(key)))
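raw and wantedKeys are not defined in that one-liner, so here is a self-contained sketch using the data from the question:
const raw = {
  "FirstName": "foo",
  "typeform_form_submits": { "foo": true, "bar": true, "baz": true },
  "more keys": "foo",
  "unwanted key": "foo"
};
const wantedKeys = ["FirstName", "typeform_form_submits"];

const filtered = Object.fromEntries(
  Object.entries(raw).filter(([key]) => wantedKeys.includes(key))
);

console.log(JSON.stringify(filtered));
// {"FirstName":"foo","typeform_form_submits":{"foo":true,"bar":true,"baz":true}}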
You need to filter your object before passing it to JSON.stringify:
const rawJson = {
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  },
  "more keys": "foo",
  "unwanted key": "foo"
};

// This array will serve as a whitelist to select keys you want to keep in rawJson
const filterArray = [
  "FirstName",
  "typeform_form_submits",
];

// this function filters source keys (one level deep) according to the whitelist
function filterObj(source, whiteList) {
  const res = {};
  // iterate over each key of source
  Object.keys(source).forEach((key) => {
    // if whiteList contains the current key, add this key to res
    if (whiteList.indexOf(key) !== -1) {
      res[key] = source[key];
    }
  });
  return res;
}

// outputs the desired result
console.log(JSON.stringify(filterObj(rawJson, filterArray)));
var raw = {
  "FirstName": "foo",
  "typeform_form_submits": {
    "foo": true,
    "bar": true,
    "baz": true
  },
  "more keys": "foo",
  "unwanted key": "foo"
}

var wantedKeys = ["FirstName", "typeform_form_submits"]
var opObj = {}

Object.keys(raw).forEach(key => {
  if (wantedKeys.includes(key)) {
    opObj[key] = raw[key]
  }
})

console.log(JSON.stringify(opObj))
I know this question was asked a while back, but I wanted to toss this out there, since nobody else did:
If you're bound and determined to do this with stringify, one of its less-well-known capabilities involves replacer, its second parameter. For example:
// Creating a demo data set
let dataToReduce = {a:1, b:2, c:3, d:4, e:5};
console.log('Demo data:', dataToReduce);
// Providing an array to reduce the results down to only those specified.
let reducedData = JSON.stringify(dataToReduce, ['a','c','e']);
console.log('Using [reducer] as an array of IDs:', reducedData);
// Running a function against the key/value pairs to reduce the results down to those desired.
let processedData = JSON.stringify(dataToReduce, (key, value) => (value%2 === 0) ? undefined: value);
console.log('Using [reducer] as an operation on the values:', processedData);
// And, of course, restoring them back to their original object format:
console.log('Restoration of the results:', '\nreducedData:', JSON.parse(reducedData), '\nprocessedData:', JSON.parse(processedData));
In the above code snippet, the key-value pairs are filtered using stringify exclusively:
In the first case, by providing an array of strings representing the keys you wish to preserve (as you were requesting).
In the second, by running a function against the values and dynamically determining those to keep (which you didn't request, but it is part of the same parameter, and may help someone else).
In the third, the results are converted back to objects (using JSON.parse()).
Now, I want to stress that I'm not advocating this as the appropriate method to reduce an object (though it will make a clean SHALLOW copy of said object, and is actually surprisingly performant), if only from an obscurity/readability standpoint, but it IS a totally-effective (and mainstream; that is: it's built into the language, not a hack) option/tool to add to the arsenal.
