Token Comma expected - cannot run JSON query - Excel

This is a problem I have while working in Excel's Power Query.
I have this query saved in a variable named "content", which is passed to the call Web.Contents.
I cannot run the query; I get a "Token Comma expected" error. Can somebody tell me what that is about?
let
content = "{
"query": [
{
"code": "Region",
"selection": {
"filter": "vs:RegionKommun07",
"values": [
"1283"
]
}
},
{
"code": "Sysselsattning",
"selection": {
"filter": "item",
"values": [
"FÖRV"
]
}
},
{
"code": "Alder",
"selection": {
"filter": "item",
"values": [
"30-34"
]
}
},
{
"code": "Kon",
"selection": {
"filter": "item",
"values": [
"1"
]
}
},
{
"code": "Tid",
"selection": {
"filter": "item",
"values": [
"2015"
]
}
}
],
"response": {
"format": "px"
}
}",
Source = Json.Document(Web.Contents("http://api.scb.se/OV0104/v1/doris/sv/ssd/START/AM/AM0207/AM0207H/BefSyssAldKonK", [Content=Text.ToBinary(content)]))
in
Source

If you want a " inside a quoted string, you need to double it up as "" to escape it.
let
content = "{
""query"": [
{
""code"": ""Region"",
""selection"": {
""filter"": ""vs:RegionKommun07"",
""values"": [
""1283""
]
}
},
...
...
}"
See page 21 here: http://download.microsoft.com/download/8/1/A/81A62C9B-04D5-4B6D-B162-D28E4D848552/Power%20Query%20M%20Formula%20Language%20Specification%20(July%202019).pdf
To include quotes in a text value, the quote mark is repeated, as
follows: "The ""quoted"" text" // The "quoted" text
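An alternative worth considering (assuming your Power Query version has Json.FromValue) is to build the payload as an M record and let the library handle serialization and escaping, rather than hand-writing the JSON string. A sketch, showing only two of the five query clauses for brevity:

```
let
    // Records ([...]) become JSON objects, lists ({...}) become JSON arrays.
    payload = [
        query = {
            [code = "Region", selection = [filter = "vs:RegionKommun07", values = {"1283"}]],
            [code = "Tid", selection = [filter = "item", values = {"2015"}]]
        },
        response = [format = "px"]
    ],
    // Json.FromValue returns the record serialized as binary JSON,
    // ready to pass as the POST body to Web.Contents.
    Source = Json.Document(Web.Contents(
        "http://api.scb.se/OV0104/v1/doris/sv/ssd/START/AM/AM0207/AM0207H/BefSyssAldKonK",
        [Content = Json.FromValue(payload)]))
in
    Source
```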

Related

Return the documents if the array field length is greater than 0 in esclient nodejs

I have millions of documents in my ES index.
I want to fetch the documents where the array field length is greater than 0.
My docs look like this:
[
{
"primaryKey": "9c30d9e8-af04-4cc8-afcb-0c1311988c1e",
"language": "all",
"industry": [
"Accounting & auditing"
],
"text": "what's the status of my incident?",
"textId": "d0c70fc4-5e2a-4cab-a5f6-32339e6632dd",
"extractions": [],
"active": true,
"status": "active",
"createdAt": 1620208485092,
"updatedAt": 1620208485092,
"secondaryKey": "5db5f725-ec09-49da-9507-7bb2f94fd741"
},
{
"primaryKey": "9c30d9e8-af04-4cc8-afcb-0c1311988c1e",
"language": "all",
"industry": [
"Accounting"
],
"text": "What is the rating of my incident",
"textId": "4a53533f-293e-440c-aaa9-f7e5ae1436ca",
"extractions": [
{
"name": "Abinas Patra",
"role": "api-user",
"primaryKey": "ed12851d-c18d-4c92-8cc3-1782e41bc9d0"
},
{
"name": "Anil Patra",
"role": "ui-user",
"primaryKey": "933fad33-78b3-4779-a7bd-c62c6e02af75"
}
],
"active": true,
"status": "active",
"createdAt": 1620208485092,
"updatedAt": 1620208485092,
"secondaryKey": "5db5f725-ec09-49da-9507-7bb2f94fd741"
}
]
I am using the elasticsearch Node.js client.
I tried the following:
let dataCount = await esClient.count({
index: "indexName",
type: "docType",
body: {
query: {
bool: {
must: [
{
"script": {
"script": {
"inline": "doc['extractions'].values.length > 0",
"lang": "painless"
}
}
},
{
"match": {
"primaryKey": {
query: primaryKey,
"operator": "and"
}
}
},
{
"match": {
"language": {
query: language,
"operator": "and"
}
}
}
]
}
}
}
});
I get a runtime parsing error every time; I tried with the exists field as well.
{"error":{"root_cause":[{"type":"script_exception","reason":"runtime error","script_stack":["org.elasticsearch.search.lookup.LeafDocLookup.get(LeafDocLookup.java:65)","org.elasticsearch.search.lookup.LeafDocLookup.get(LeafDocLookup.java:27)","doc[\'extractions\'].values.length > 1"," ^---- HERE"],
I tried this as well:
must_not:[
{
"script": {
"script": "_source.extractions.size() > 0"
}
}
]
Can anyone please help here?
Thanks :)
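One hedged alternative (an assumption, since the index mapping isn't shown): the script fails because `doc['extractions']` has no doc values when `extractions` is an array of objects, so scripts cannot read the field as a whole. An `exists` query on a leaf sub-field such as `extractions.primaryKey` matches documents whose array holds at least one element, with no script at all. A sketch of the request body:

```javascript
// Sketch: replace the script clause with an `exists` query on a leaf field.
// Assumption: every element of `extractions` carries a `primaryKey`, so
// "at least one element exists" is equivalent to "array length > 0".
function buildCountBody(primaryKey, language) {
  return {
    query: {
      bool: {
        must: [
          { exists: { field: "extractions.primaryKey" } },
          { match: { primaryKey: { query: primaryKey, operator: "and" } } },
          { match: { language: { query: language, operator: "and" } } }
        ]
      }
    }
  };
}

// Passed as `body` to esClient.count({ index: "indexName", body: buildCountBody(pk, lang) }).
```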

Logstash - How to copy a field into an array

I am using Logstash 5.6.
In my document, I have a subfield [emailHeaders][reingested-on], and another field called [attributes], which contains several subfields ([string], [double]), each of which is an array:
{
"emailHeaders": {
"reingested-on": ["1613986076000"]
},
"attributes": {
"string": [
{
"name": "attributeString1",
"value": "attributeStringValue1"
},
{
"name": "attributeString2",
"value": "attributeStringValue2"
}
],
"double": [
{
"name": "attributeDouble1",
"value": 1.0
}
]
}
}
If the element [emailHeaders][reingested-on] is present in the document, I want to copy 1613986076000 (i.e. the first element of [emailHeaders][reingested-on]) into [attributes][date] as follows:
{
"emailHeaders": {
"reingested-on": ["1613986076000"]
},
"attributes": {
"string": [
{
"name": "attributeString1",
"value": "attributeStringValue1"
},
{
"name": "attributeString2",
"value": "attributeStringValue2"
}
],
"double": [
{
"name": "attributeDouble1",
"value": 1.0
}
],
"date": [
{
"name": "Reingested on",
"value": 1613986076000
}
]
}
}
Note that if [attributes][date] already exists, and already contains an array of name/value pairs, I want my new object to be appended to the array.
Also, note that [attributes][date] is an array of objects which contain a date in their [value] attribute, as per the mapping of my ElasticSearch index:
...
"attributes": {
"properties": {
...
"date": {
"type": "nested",
"properties": {
"id": {"type": "keyword"},
"name": {"type": "keyword"},
"value": {"type": "date"}
}
},
...
}
},
...
I tried the following logstash configuration, with no success:
filter {
# See https://stackoverflow.com/questions/30309096/logstash-check-if-field-exists : this is supposed to allow to "test" if [#metadata][reingested-on] exists
mutate {
add_field => { "[#metadata][reingested-on]" => "None" }
copy => { "[emailHeaders][reingested-on][0]" => "[#metadata][reingested-on]" }
}
if [#metadata][reingested-on] != "None" {
# See https://stackoverflow.com/questions/36127961/append-array-of-json-logstash-elasticsearch: I create a temporary [error] field, and I try to append it to [attributes][date]
mutate {
add_field => { "[error][name]" => "Reingested on" }
add_field => { "[error][value]" => "[#metadata][reingested-on]" }
}
mutate {
merge => {"[attributes][date]" => "[error]"}
}
}
}
But what I get is:
{
"emailHeaders": {
"reingested-on": ["1613986076000"]
},
"error": {
"name": "Reingested on",
"value": "[#metadata][reingested-on]"
},
"attributes": {
"string": [
{
"name": "attributeString1",
"value": "attributeStringValue1"
},
{
"name": "attributeString2",
"value": "attributeStringValue2"
}
],
"double": [
{
"name": "attributeDouble1",
"value": 1.0
}
]
}
}
My temporary [error] object is created, but its value is wrong: it should be 1613986076000 instead of the literal text [#metadata][reingested-on].
Also, it is not appended to the array [attributes][date]. In this example, this array does not exist, so I want it to be created with my temporary object as its first element, as per the expected result above.
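A hedged sketch of what may be going wrong, assuming Logstash's usual sprintf rules: `add_field` only interpolates field references written as `%{[field]}`; a bare `[#metadata][reingested-on]` is assigned as a literal string. The existence check can also be done directly in a conditional, avoiding the "None" sentinel entirely:

```
filter {
  # A conditional on the field itself tests for its presence.
  if [emailHeaders][reingested-on] {
    mutate {
      add_field => { "[error][name]" => "Reingested on" }
      # %{...} interpolates the field's value; without it the text is literal.
      add_field => { "[error][value]" => "%{[emailHeaders][reingested-on][0]}" }
    }
    mutate {
      # merge appends [error] to the [attributes][date] array, creating it if absent.
      merge => { "[attributes][date]" => "[error]" }
      remove_field => [ "error" ]
    }
  }
}
```

Note that the interpolated value arrives as a string; if the index mapping requires a numeric date, an additional convert step (or a ruby filter) may still be needed.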

How to make a query return all values in the array that match the criteria in Mongoose?

I want to return all values in the array that match the criteria,
but when I query, it returns only one result.
This is the content of my DB:
[{
"channel": "a",
"video": [
{
"name": 1,
"status": ""
},
{
"name": 2,
"status": "err"
},
{
"name": 3,
"status": "err"
}
]
},
{
"channel": "b",
"video": [
{
"name": 4,
"status": "err"
},
{
"name": 5,
"status": "err"
},
{
"name": 6,
"status": ""
}
]
}]
I want to get a result like this:
[
{
"channel": "a",
"video": [
{
"name": 2,
"status": "err"
},
{
"name": 3,
"status": "err"
}
]
},
{
"channel": "b",
"video": [
{
"name": 4,
"status": "err"
},
{
"name": 5,
"status": "err"
}
]
}
]
but when I use this code:
var errData = await DB.find({
'video.status' : { $in : 'err' }
},
{
'channel': true,
'video' : {
$elemMatch : {
'status': { $in : 'err' }
}
}
} )
it returns this:
[
{
"channel": "a",
"video": [
{
"name": 2,
"status": "err"
}
]
},
{
"channel": "b",
"video": [
{
"name": 4,
"status": "err"
}
]
}
]
How can I fix this in Mongoose (without using aggregate)?
If it is impossible without an aggregate, how can I do it using one?
Would you please help me?
As the docs explain:
The $elemMatch operator limits the contents of an array field from the query results to contain only the first element matching the $elemMatch condition.
That's why you get only one element for each document.
Also, it seems (look here) that this is not possible without aggregation.
Using aggregation you can use $filter in this way:
db.collection.aggregate([
{
"$set": {
"video": {
"$filter": {
"input": "$video",
"as": "v",
"cond": {"$eq": ["$$v.status","err"]}
}
}
}
}
])
Example here
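For illustration, $filter behaves like JavaScript's Array.prototype.filter applied inside each document; a plain-JS sketch of the same transformation over the sample data from the question:

```javascript
// Plain-JS analogue of the $filter stage above: keep every channel document,
// but restrict its `video` array to the elements whose status is "err".
function keepErrVideos(docs) {
  return docs.map(doc => ({
    ...doc,
    video: doc.video.filter(v => v.status === "err"),
  }));
}

const sample = [
  { channel: "a", video: [{ name: 1, status: "" }, { name: 2, status: "err" }, { name: 3, status: "err" }] },
  { channel: "b", video: [{ name: 4, status: "err" }, { name: 5, status: "err" }, { name: 6, status: "" }] },
];

const result = keepErrVideos(sample);
// result[0].video keeps names 2 and 3; result[1].video keeps names 4 and 5.
```

In Mongoose the pipeline runs server-side via DB.aggregate([...]); note that $set requires MongoDB 4.2+, with $addFields as the equivalent on older versions.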

parsing exception on numbers

I am trying to index data that looks like the following:
var json = {
"kwg": {
"kwg0List": [
{
"lemma": "bilingue",
"categories": [
"terme"
],
"occList": [
{
"startTimeSec": 537.1,
"endTimeSec": 537.46,
"value": "bilingue"
},
{
"startTimeSec": 563.2,
"endTimeSec": 563.55,
"value": "bilingue"
}
]
}
]
}
}
Everything works fine. Now, suppose that, for whatever reason, one of the startTimeSec fields is equal to 10. It is then interpreted as a long and not as a double anymore.
I then get the following error: mapper_parsing_exception, telling me that I should have a double and not a long.
My question is: is there a way to "force" the long to be cast to a double when indexing, or is checking beforehand that the data is correctly formatted the only way to do it?
Trace:
{
"took": 1112,
"errors": true,
"items": [
{
"create": {
"_index": "library",
"_type": "relevance",
"_id": "AViRhRJ-_Tb2laJ1W4JH",
"status": 400,
"error": {
"type": "mapper_parsing_exception",
"reason": "failed to parse",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "mapper [kwg.kwg0List.occList.endTimeSec] of different type, current_type [double], merged_type [long]"
}
}
}
}
]
}
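The error comes from dynamic mapping: JSON whole numbers such as 10 are inferred as long, which then conflicts with the double that earlier documents established. A hedged way around it (a sketch; the real index settings aren't shown, and the type name relevance is taken from the trace) is to declare the numeric fields explicitly when creating the index. With an explicit double mapping, whole-number input like 10 is accepted and coerced, since numeric coercion is on by default:

```json
{
  "mappings": {
    "relevance": {
      "properties": {
        "kwg": {
          "properties": {
            "kwg0List": {
              "properties": {
                "occList": {
                  "properties": {
                    "startTimeSec": { "type": "double" },
                    "endTimeSec": { "type": "double" }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```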

cloudant searching index by multiple values

Cloudant is returning error message:
{"error":"invalid_key","reason":"Invalid key use-index for this request."}
whenever I try to query against an index with the combination operator, "$or".
A sample of what my documents look like is:
{
"_id": "28f240f1bcc2fbd9e1e5174af6905349",
"_rev": "1-fb9a9150acbecd105f1616aff88c26a8",
"type": "Feature",
"properties": {
"PageName": "A8",
"PageNumber": 1,
"Lat": 43.051523,
"Long": -71.498852
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[
-71.49978935969642,
43.0508382914137
],
[
-71.49978564033566,
43.052210148524
],
[
-71.49791499857444,
43.05220740550381
],
[
-71.49791875962663,
43.05083554852429
],
[
-71.49978935969642,
43.0508382914137
]
]
]
}
}
The index that I created is on the field "properties.PageName", and it works fine when I query for a single document, but as soon as I try for multiple ones, I receive the error response quoted at the beginning.
If it helps, here is the call:
POST https://xyz.cloudant.com/db/_find
request body:
{
"selector": {
"$or": [
{ "properties.PageName": "A8" },
{ "properties.PageName": "M30" },
{ "properties.PageName": "AH30" }
]
},
"use-index": "pagename-index"
}
In order to perform an $or query you need to create a text (full-text) index, rather than a json index. (Note also that the option is spelled use_index, with an underscore; the invalid_key error is complaining about the key name use-index itself.) For example, I just created the following index:
{
"index": {
"fields": [
{"name": "properties.PageName", "type": "string"}
]
},
"type": "text"
}
I was then able to perform the following query:
{
"selector": {
"$or": [
{ "properties.PageName": "A8" },
{ "properties.PageName": "M30" },
{ "properties.PageName": "AH30" }
]
}
}
