I was trying to extract data from a JSON document using the jsonpath package, for the following JSON structure -
[
{
"id": 1,
"images": [
{ "id": 1,"url": "http://url1.jpg" },
{ "id": 2,"url": "http://url2.jpg" }
]
},
{
"id": 2,
"images": [
{ "id": 3,"url": "http://url3.jpg" },
{ "id": 4,"url": "http://url4.jpg" }
]
},
{
"id": 3,
"images": [
{ "id": 5,"url": "http://url5.jpg" },
{ "id": 6,"url": "http://url6.jpg" }
]
}
]
In the above example, the $..id JSONPath expression returns the following array -
[ 1, 1, 2, 2, 3, 4, 3, 5, 6 ]
What I have understood from the documentation is that .. is a recursive descent operator, so $..id recursively matches every occurrence of the id field in the structure.
This is where my problem is: I need an expression that only matches the id of each top-level object and does not descend into the images array.
So the expected output is -
[ 1, 2, 3 ]
I used the JSONPath Online Evaluator to verify the results.
Thanks.
According to the jsonpath documentation, .. is the recursive descent operator;
JSONPath borrows this syntax from E4X.
$..* is used when you need to search through all members of the JSON structure.
So, if you want to restrict the match to the immediate children, you need to use $.*.
The code you want is:
const fs = require('fs');
const jp = require('jsonpath');
// Load and parse the JSON file shown above
const rawdata = fs.readFileSync('data.json');
const data = JSON.parse(rawdata);
// $.*.id only looks at the id property of each top-level element
const ids = jp.query(data, '$.*.id');
console.log(ids);
The output is:
[1,2,3]
You can use the * wildcard, which restricts the match to the level you need.
To get the output you want, you would use:
$.*.id
and that will give you the output of:
[
1,
2,
3
]
But let's say you want the image-specific ids; then you would use:
$.*.images.*.id
which would give you the output of:
[
1,
2,
3,
4,
5,
6
]
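If you are running this with the jsonpath npm package (as in the other answer), you can verify both expressions with a small self-contained snippet like the following, using the sample data from the question:
const jp = require('jsonpath');
const data = [
  { id: 1, images: [{ id: 1, url: 'http://url1.jpg' }, { id: 2, url: 'http://url2.jpg' }] },
  { id: 2, images: [{ id: 3, url: 'http://url3.jpg' }, { id: 4, url: 'http://url4.jpg' }] },
  { id: 3, images: [{ id: 5, url: 'http://url5.jpg' }, { id: 6, url: 'http://url6.jpg' }] }
];
console.log(jp.query(data, '$.*.id'));          // [ 1, 2, 3 ]
console.log(jp.query(data, '$.*.images.*.id')); // [ 1, 2, 3, 4, 5, 6 ]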
Thank you aspok for your help!
My goal is to get my list to be [3, 3, 4] and then get a count of unique values within it. Can anyone point me in the right direction for doing this?
My script consumes a JSON payload and puts all F4211_LNID values into a list: [3.1, 3.9, 4]. I now need to round each value down.
I'm not sure if it's doable, but I am trying to use Math.floor(intListItems) to round my array values down. When I try this I receive the following error: Exception No signature of method: static java.lang.Math.floor() is applicable for argument types: (ArrayList) values: [[3.1, 3.9, 4]] Possible solutions: floor(double), log(double), find(), macro(groovy.lang.Closure), acos(double), cos(double)
I can see my simplified list in the error, but I can't get the values to round down, and I'm not sure what the error means.
(UPDATED) My Working Groovy
import groovy.json.JsonSlurper

// Read Input Values
String aInputJson = aInputMap.InputJson ?: "{}"
// Initialize Output Values
def intListItems = []
def uniqueCount = 0
// Parse JSON
def json = new JsonSlurper().parseText( aInputJson )
// Determine Row Numbers
def rowset = json?.fs_DATABROWSE_F4211?.data?.gridData?.rowset
intListItems = rowset.collect{ Math.floor(it.F4211_LNID) }
intListItems.unique()
uniqueCount = intListItems.size()
The JSON I am using:
{
"fs_DATABROWSE_F4211": {
"title": "Data Browser - F4211 [Sales Order Detail File]",
"data": {
"gridData": {
"id": 58,
"fullGridId": "58",
"rowset": [
{
"F4211_LNTY": "S",
"F4211_CPNT": 0,
"F4211_MCU": " 114000",
"F4211_DSC2": "NAS133N3EK166",
"F4211_NXTR": "580",
"F4211_LNID": 3.1,
"F4211_DOCO": 2845436
},
{
"F4211_LNTY": "S",
"F4211_CPNT": 0,
"F4211_MCU": " 114000",
"F4211_DSC2": "NAS133N3EK166",
"F4211_NXTR": "580",
"F4211_LNID": 3.9,
"F4211_DOCO": 2845436
},
{
"F4211_LNTY": "S",
"F4211_CPNT": 0,
"F4211_MCU": " 114000",
"F4211_DSC2": "NAS133N3EK166",
"F4211_NXTR": "580",
"F4211_LNID": 4,
"F4211_DOCO": 2845436
}
],
"summary": {
"records": 1,
"moreRecords": false
}
}
},
"errors": [],
"warnings": []
},
"currentApp": "DATABROWSE_F4211",
"timeStamp": "2000-06-01:09.42.02",
"sysErrors": []
}
You are getting the error "Exception No signature of method: static java.lang.Math.floor() is applicable for argument types: (ArrayList)" because there is no version of Math.floor() that accepts a List as a parameter.
Instead, you need to call Math.floor() on each individual item in the list. The easiest way to do this is in the collect { } call you are already doing.
def flooredList = rowset.collect { Math.floor(it.F4211_LNID) }
assert flooredList == [3.0, 3.0, 4.0]
I need to update (in bulk) many entities.
Each entity has a field whose value is an array.
I want to concatenate a whole array onto the existing array in Mongo.
For example:
Assume we have the field 'myField', and its value (the array saved in Mongo) is: [4, 5, 6]
I want to concatenate the array [1, 2, 3] into this field, so the result would be:
myField: [1, 2, 3, 4, 5, 6]
I tried some options:
pushAll - but it is no longer available.
usePushEach: true in the options - not working, I still get the same error:
"Unknown modifier: $pushAll. Expected a valid update modifier or pipeline-style update specified as an array"
I read about concat - but it does not look compatible.
Thanks in advance!
You can use $addToSet to add the values to the existing array and avoid duplicates, like so:
[
{
id: 1,
values: [
1,
2,
3
]
}
]
db.collection.update({
id: 1
},
{
"$addToSet": {
values: {
"$each": [
5,
7,
1,
44
]
}
}
})
https://mongoplayground.net/p/S3HfWajg9r_
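Note that $addToSet only adds values that are not already present in the array. If you need a plain concatenation that keeps duplicates, a sketch using $push with the $each modifier (the documented replacement for the removed $pushAll) would be:
db.collection.update(
  { id: 1 },
  {
    "$push": {
      values: {
        "$each": [1, 2, 3]   // appends every element, duplicates included
      }
    }
  }
)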
I'm trying to get the position (key) of the first object so I can delete it with a function. The problem is: I don't know how to get the "index" value.
{
"2": {
"nome": "sou1",
"email": "adsa#lala.com",
"gênero": "masculino",
"id": 1,
"pos": 0
},
"3": {
"nome": "sou1",
"email": "adsa#lala.com",
"gênero": "masculino",
"id": 1,
"pos": 1
}
}
In this case, my output would be 2.
You can try the code below:
var someObject={
"2": {
"nome": "sou1",
"email": "adsa#lala.com",
"gênero": "masculino",
"id": 1,
"pos": 0
},
"3": {
"nome": "sou1",
"email": "adsa#lala.com",
"gênero": "masculino",
"id": 1,
"pos": 1
}};
console.log(Object.keys(someObject)[0]);
delete someObject[Object.keys(someObject)[0]];
console.log(someObject);
There is no guaranteed ordering of keys in JSON objects.
However, for your use case (if I understand it correctly), you can convert the JSON object into an array like this...
const someObject = {
a: "A",
b: "B",
c: "C"
}
const keys = Object.keys(someObject);
console.log(keys); // array of keys
let arr = [];
for (const key of keys) {
arr.push(someObject[key]);
}
console.log(arr); // array of objects
...where the index of each element in the keys array (a key in your original JSON object) matches the index of the corresponding value in the arr array, which is probably what you are looking for. Then you can access both by array index.
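As a side note, modern JavaScript engines also provide Object.values(), which builds that same array of values in a single call:
console.log(Object.values(someObject)); // same array of objects as arr above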
Hope this helps :)
[updated 17:15 on 28/09]
I'm manipulating JSON data of this shape:
[
{
"id": 1,
"title": "Sun",
"seeAlso": [
{
"id": 2,
"title": "Rain"
},
{
"id": 3,
"title": "Cloud"
}
]
},
{
"id": 2,
"title": "Rain",
"seeAlso": [
{
"id": 3,
"title": "Cloud"
}
]
},
{
"id": 3,
"title": "Cloud",
"seeAlso": [
{
"id": 1,
"title": "Sun"
}
]
},
];
After loading it into the database, a Node.js search using
db.documents.query(
q.where(
q.collection('test films'),
q.value('title','Sun')
).withOptions({categories: 'none'})
)
.result( function(results) {
console.log(JSON.stringify(results, null,2));
});
will return both the film titled 'Sun' and the films that have a seeAlso/title property (forgive the XPath syntax) equal to 'Sun'.
I need to find, separately: 1/ films with title = 'Sun', 2/ films with seeAlso/title = 'Sun'.
I tried a container query using q.scope() with no success; I can't find how to scope the root object node (first case), and for the second case,
q.where(q.scope(q.property('seeAlso'), q.value('title','Sun')))
returns as its first result an item that matches all of the text inside the root object node:
{
"index": 1,
"uri": "/1.json",
"path": "fn:doc(\"/1.json\")",
"score": 137216,
"confidence": 0.6202662,
"fitness": 0.6701325,
"href": "/v1/documents?uri=%2F1.json&database=Documents",
"mimetype": "application/json",
"format": "json",
"matches": [
{
"path": "fn:doc(\"/1.json\")/object-node()",
"match-text": [
"Sun Rain Cloud"
]
}
]
},
which seems crazy.
Any idea how to do such searches on denormalized JSON data?
Laurent:
XPaths on JSON are supported by MarkLogic.
In particular, you might consider setting up a path range index to match /title at the root:
http://docs.marklogic.com/guide/admin/range_index#id_54948
Scoped property matching requires either filtering or indexed positions to be accurate. An alternative is to set up another path range index on /seeAlso/title.
For the match issue it would be useful to know the MarkLogic version and to see the entire query.
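As an untested sketch (the pathIndex and range builder calls here are an assumption on my part; adjust them to your version of the MarkLogic Node.js client), the two searches might then look like:
const marklogic = require('marklogic');
const db = marklogic.createDatabaseClient({ /* connection settings */ });
const q = marklogic.queryBuilder;
// 1/ films whose own title is 'Sun' (assumes a string path range index on /title)
db.documents.query(
  q.where(
    q.collection('test films'),
    q.range(q.pathIndex('/title'), '=', 'Sun')
  )
).result(results => console.log(JSON.stringify(results, null, 2)));
// 2/ films that reference 'Sun' via seeAlso (assumes a string path range index on /seeAlso/title)
db.documents.query(
  q.where(
    q.collection('test films'),
    q.range(q.pathIndex('/seeAlso/title'), '=', 'Sun')
  )
).result(results => console.log(JSON.stringify(results, null, 2)));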
Hoping that helps,
I'm fairly new to CouchDB and the concept of views and reduces, and I could not find anything that would help me get my data into the format I want to consume it in.
My data - each set is its own document:
{
"_id": "2012-10-28",
"scores" : [
{
"bob": 3,
"dole": 5
}
]
}
{
"_id" : "2012-10-29",
"scores" : [
{
"bob": 3,
"dole": 6
}
]
}
I would like a view/reduce that returns something like:
"bob" : {
"2012-10-27": 3,
"2012-10-28": 3,
...
},
"dole": {
"2012-10-27": 5,
"2012-10-28": 6,
...
}
If this is not possible with my source data, I can reorganize it, but it will be tough.
Any help is greatly appreciated. I would also like to know of any good resources that explain the best practices for views and reduces.
Unless all the dates are known and you can hardcode them in the reduce function, I think it's a bit difficult to do what you need with map/reduce functions.
If it is ok to output something like:
{
"key": ["bob", "2012-10-27"],
"value": {"score": 3}
}
Then this map function should work:
var scoresMapFn = function (doc) {
var scores = doc.scores[0];
for (var k in scores) {
emit([k, doc._id], scores[k]);
}
};
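Once that map function is saved in a design document, you can query a key range per player and fold the rows into the per-date shape you described. A sketch, assuming a hypothetical design document named scores with the view saved as by_player, and a runtime where fetch is available:
// Hypothetical names: database "mydb", design doc "scores", view "by_player"
const url = 'http://localhost:5984/mydb/_design/scores/_view/by_player'
  + '?startkey=' + encodeURIComponent(JSON.stringify(['bob']))
  + '&endkey=' + encodeURIComponent(JSON.stringify(['bob', {}]));
fetch(url)
  .then(res => res.json())
  .then(body => {
    // body.rows looks like [{ key: ["bob", "2012-10-28"], value: 3 }, ...]
    const scoresByDate = {};
    for (const row of body.rows) {
      scoresByDate[row.key[1]] = row.value;
    }
    console.log(scoresByDate); // e.g. { "2012-10-28": 3, "2012-10-29": 3 }
  });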
Note that the structure of the original document could be optimised, in my opinion. You have an array for scores, but it only has one element: an object with one key per player name. This could be changed to:
{
"_id": "2012-10-28",
"scores": [
{
"name": "bob,
"score": 3
},
{
"name": "dole,
"score": 5
}
]
}
which would make it easier to manipulate.
Hope this helps a bit.