So, I am receiving some JSON data from a client on my Node.js server. I want to insert that JSON into my MongoDB instance using Mongoose.
I can insert the JSON as-is, and it works great, because it's just text. However, I want to parse it before insertion so that when I extract it later it will be all nice and neat.
So, this works:
wordStream.words.push(wordData);
And this doesn't:
wordStream.words.push(JSON.parse(wordData));
So, should I even want to parse the JSON before insertion?
And if I should parse the JSON, how do I do it without throwing an error? I believe I need to put everything in double quotes "" before it will parse, but for some reason whenever I make a string with double quotes and parse it, everything comes out wrong.
Here is the JSON:
{ word: 'bundle',
  definitions:
   [ { definition: 'A group of objects held together by wrapping or tying.',
       partOfSpeech: 'noun' } ],
  urlSource: 'testurl',
  otherSource: '' }
And here is the error when I try to parse it:
/Users/spence450/Documents/development/wordly-dev/wordly-server/node_modules/mongoose/lib/utils.js:409
throw err;
^
SyntaxError: Unexpected token o
Ideas?
So, should I even want to parse the JSON before insertion?
Converting the strings to JSON objects will benefit you later, when you need to make queries against your MongoDB database.
And if I should parse the JSON, how do I do it without throwing an error? I believe I need to put everything in double quotes "" before it will parse, but for some reason whenever I make a string with double quotes and parse it, everything comes out wrong.
You aren't receiving JSON documents. JSON documents must have their keys in double quotes.
You can:
Use a library that recognizes invalid JSON objects (please don't)
Use eval (this is a security issue, so don't do it)
Fix the source of the problem by creating real JSON objects, as sketched below. This isn't difficult; you can see the JSON features here.
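A minimal sketch of why the parse fails and what valid JSON looks like (the wordData value below is assumed for illustration):

// If wordData is already a JavaScript object, JSON.parse first coerces it
// to the string "[object Object]", and parsing that fails at the "o":
var wordData = { word: 'bundle' };
// JSON.parse(wordData); // SyntaxError: Unexpected token o

// Valid JSON double-quotes every key and string value:
var validJson = '{ "word": "bundle", "urlSource": "testurl", "otherSource": "" }';
var parsed = JSON.parse(validJson); // works
console.log(parsed.word); // 'bundle'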
I have an API created in LoopBack 4 which retrieves data from a PostgreSQL 13 database encoded with UTF8. Visiting the API explorer (localhost:3000/explorer) and executing the GET requests, I realize that even when the database fields contain characters like accented letters and ñ's, the retrieved JSON only shows blanks in the positions where those characters should appear. For example, if the database has a field with a word like 'piña', the JSON returns 'pi a'.
When I try a POST request, inserting a field like 'ramírez' (note the í) into the database, the field is shown as 'ramφrez', and when I execute a GET of that entry, the JSON now has the correct value, 'ramírez'.
How can I fix that?
I'd recommend using the Buffer class:
var encodedString = Buffer.from('string', 'utf-8');
This way you will be able to return anything you want. In Node.js the Buffer class is already included, so you don't need to install any dependencies.
If you don't get what you need, you can change the 'utf-8' part to a different encoding.
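A quick sketch of the round trip (the sample string is taken from the question; the exact encoding mismatch is a guess):

// Decode the raw bytes with an explicit encoding, then re-encode as UTF-8.
// 'ramφrez' in the database suggests the bytes were decoded with the wrong
// encoding somewhere along the way:
var raw = Buffer.from('ramírez', 'utf-8'); // the original UTF-8 bytes
console.log(raw.toString('utf-8'));   // 'ramírez' - correct round trip
console.log(raw.toString('latin1'));  // 'ramÃ­rez' - what a mismatched decode looks like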
I like to store MongoDB queries as strings. I would like to load and execute them. How can I convert the parameter object used in find to a string without losing information?
I have already tried to use JSON.stringify and JSON.parse, but it seems that JSON.parse does not completely revert JSON.stringify.
historyCollection.find({
    "date": { $gte: new Date("2018-12-20T13:47:57.461Z") },
    $or: [{ "source": 2 }, { "source": 3 }]
}).limit(4).toArray(function (err2, result2) {
    // Do Stuff
});
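For example, a minimal repro of the information loss (the query object here is illustrative):

var query = { date: { $gte: new Date("2018-12-20T13:47:57.461Z") } };
var restored = JSON.parse(JSON.stringify(query));
// JSON has no Date type: JSON.stringify turns the Date into an ISO string,
// and JSON.parse hands that string back as-is, not as a Date object,
// so the restored query no longer compares dates correctly.
console.log(query.date.$gte instanceof Date);    // true
console.log(restored.date.$gte instanceof Date); // false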
Hi, I am using the "XML to JSON" policy to convert my XML into JSON, but it is adding an extra "$" character. I am not sure what the benefit of having it is, or how to get rid of it.
Currently:
<a>hello</a> becomes { "a": { "$" : "hello" } }
Expecting it to return { "a": "hello" }
Can anyone please help here?
That is because it uses the BadgerFish convention for the transformation. To avoid it, you can use the Mapper policy to map your attributes when transforming from XML to JSON.
The best way to do XML to JSON and JSON to XML transforms is to use the mapping node. If you use the "automatic" nodes, the transform is slower than with mapping nodes and might also have weird behaviours like adding "$" (or later removing it). In my experience, if the XML has attributes it might remove the "$", so I moved to mapping nodes.
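If switching to mapping nodes is not an option, the "$" wrappers can also be collapsed after the fact. A sketch in Node.js (the helper name and the no-attributes assumption are mine, not from the policy's docs):

// Recursively replace { "$": "text" } wrappers with the text itself.
// This assumes elements without attributes; BadgerFish stores attributes
// as "@name" keys alongside "$", and such nodes are left untouched here.
function collapseBadgerFish(node) {
    if (node === null || typeof node !== 'object') return node;
    var keys = Object.keys(node);
    if (keys.length === 1 && keys[0] === '$') return node['$'];
    var out = Array.isArray(node) ? [] : {};
    keys.forEach(function (key) {
        out[key] = collapseBadgerFish(node[key]);
    });
    return out;
}

console.log(collapseBadgerFish({ a: { '$': 'hello' } })); // { a: 'hello' }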
I would like to invoke array_prepend on a json[] column using a parameterized query. I am using the pg-promise npm package, but it uses the normal node-postgres adapter under the hood.
My query is:
db.query(`update ${schema}.chats set messages =
    array_prepend('{"sender":"${sender}","tstamp":${lib.ustamp()},"body":$1}', messages)
    where chat_id = ${chat_id}`, message);
The same error occurs with "$1" in double quotes.
It works with a non-parameterized query.
The above code produces:
{ [error: syntax error at or near "hiya"]
The main reason for doing this is to avoid SQL injection (the docs say that values are escaped adequately when using parameterized queries).
There are 2 problems in your query.
The first one is that you are using ES6 template strings while also using the library's SQL formatting with the same ${propName} syntax, so the two conflict.
From the library's documentation:
Named Parameters are defined using syntax $*propName*, where * is any of the following open-close pairs: {}, (), [], <>, //, so you can use one to your liking, but remember that {} are also used for expressions within ES6 template strings.
So you either change from ES6 template strings to standard strings, or simply switch to a different variable syntax, like $/propName/ or $[propName]; this way you will avoid the conflict.
The second problem is, as I pointed out earlier in the comments: when injecting the schema name, use what is documented as SQL Names.
Below is a cleaner approach to the query formatting:
db.query('update ${schema~}.chats set messages = array_prepend(${message}, messages) where chat_id = ${chatId}', {
    schema: 'your schema name',
    chatId: 'your chat id',
    message: {
        sender: 'set the sender here',
        tstamp: 'set the time you need',
        body: 'set the body as needed'
    }
});
When in doubt about what kind of query you are trying to execute, the quickest way to peek at it is via pgp.as.format(query, values), which will give you the exact query string.
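For example, a quick sketch of such a peek (the values are placeholders):

var pgp = require('pg-promise')();
var sql = pgp.as.format(
    'update ${schema~}.chats set messages = array_prepend(${message}, messages) where chat_id = ${chatId}',
    { schema: 'public', chatId: 123, message: { sender: 'bob', tstamp: 0, body: 'hiya' } }
);
console.log(sql); // prints the final query with every value properly escaped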
And if you still want to use ES6 template strings for something else, then you can change the string to:
`update $/schema~/.chats set messages = array_prepend($/message/, messages) where chat_id = $/chatId/`
That's only one example; the syntax is flexible. Just remember not to use ES6 template string formatting to inject values into queries, because ES6 templates have no knowledge of how to properly format JavaScript types to comply with PostgreSQL; only the library knows that.
I'm running a Node.js EB (Elastic Beanstalk) container and trying to store JSON inside an environment variable. The JSON is stored correctly, but when retrieving it via process.env.MYVARIABLE it is returned with all the double quotes stripped.
E.g. MYVARIABLE looks like this:
{ "prop": "value" }
when I retrieve it via process.env.MYVARIABLE, its value is actually { prop: value }, which isn't valid JSON. I've tried to escape the quotes with '\', i.e. { \"prop\": \"value\" }, but that just adds more weird behavior where the string comes back as {\ \"prop\\":\ \"value\\" }. I've also tried wrapping the whole thing in single quotes, e.g. '{ "prop": "value" }', but it seems to strip those out too.
Anyone know how to store JSON in environment variables?
EDIT: some more info. It would appear that certain characters are being doubly escaped when you set an environment variable. E.g. if I wrap the object in single quotes, the value when I fetch it using the SDK becomes:
\'{ "prop": "value"}\'
Also, if I leave the quotes out, backslashes get escaped, so if the object looks like {"url": "http://..."}, the result when I query via the SDK is {"url": "http:\\/\\/..."}.
Not only is it mangling the text, it's also rearranging the JSON properties, so properties are appearing in a different order than what I set them to.
UPDATE
So I would say this seems to be a bug in AWS, based on the fact that it mangles the values that are submitted. This happens whether I use the Node.js SDK or the web console. As a workaround, I've taken to replacing double quotes with single quotes on the JSON object during deployment and then back again in the application (sketched below).
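The swap-back in the application looks roughly like this (a sketch; it assumes none of the keys or values themselves contain single quotes):

// MYVARIABLE was deployed with single quotes standing in for double quotes,
// so restore them before parsing:
var json = process.env.MYVARIABLE.replace(/'/g, '"');
var config = JSON.parse(json);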
Use base64 encoding
An important string is being auto-magically mangled. We don't know the internals of EB, but we can guess it is parsing JSON. So don't store JSON, store the base64-encoded JSON:
a = `{ "public": { "s3path": "https://d2v4p3rms9rvi3.cloudfront.net" } }`
x = btoa(a) // store this as B_MYVAR
// "eyAicHVibGljIjogeyAiczNwYXRoIjogImh0dHBzOi8vZDJ2NHAzcm1zOXJ2aTMuY2xvdWRmcm9udC5uZXQiIH0gfQ=="
settings = JSON.parse(atob(process.env.B_MYVAR))
settings.public.s3path
// "https://d2v4p3rms9rvi3.cloudfront.net"
// Or even:
process.env.MYVAR = atob(process.env.B_MYVAR)
// Sets MYVAR at runtime, hopefully soon enough for your purposes
Since this is JS, there are caveats about UTF8 and node/browser support, but I think atob and btoa are common. Docs.
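In Node specifically, the same round trip can be done with Buffer, which sidesteps the btoa/atob browser-API caveats (a sketch reusing the value from above):

var a = '{ "public": { "s3path": "https://d2v4p3rms9rvi3.cloudfront.net" } }';
var encoded = Buffer.from(a, 'utf-8').toString('base64'); // store this as B_MYVAR
// Later, at runtime:
var settings = JSON.parse(Buffer.from(process.env.B_MYVAR, 'base64').toString('utf-8'));
console.log(settings.public.s3path); // "https://d2v4p3rms9rvi3.cloudfront.net"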