CosmosDB: JSON reader was expecting a value but found 'db' - azure

I have a MongoDB update query that works when I run it in the mongo shell, but the same query throws an error when I run it in the Query Shell of Azure Cosmos DB. Initially I used the following query:
db.sample.update(
    {
        _id: ObjectId("690655905jgj580")
    },
    {
        $set: {
            user_data: {
                eid: "E9076", name: "Jhon", email: "Jhon#xyz.com", posted_on: "1621509348056"
            },
        }
    }
)
The above query fails to update the record via the Query Shell and gives an error.
After looking into other related questions, I found that both the key and the value must be in quotes, so I modified the query as follows. But I'm still getting the same error: JSON reader was expecting a value but found 'db'
db.sample.update(
    {
        "_id": ObjectId("690655905jgj580")
    },
    {
        "$set": {
            "user_data": {
                "eid": "E9076", "name": "Jhon", "email": "Jhon#xyz.com",
                "posted_on": "1621509348056"
            },
        }
    }
)
Can someone help me figure out what's wrong here?
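A note for readers (not confirmed in the thread): the error text suggests the portal's query box parsed the input as plain JSON and stopped at the leading db token, so quoting keys inside the document would not change anything. One thing worth trying is running the update from a full mongo shell or a driver connected with the Cosmos DB connection string, and dropping the trailing comma after the user_data object. A sketch of that same update:
// Sketch only: the same update with the trailing comma removed, intended to be
// run from a full mongo shell (or driver) connected via the Cosmos DB
// connection string rather than typed into the portal's query box.
db.sample.update(
    { "_id": ObjectId("690655905jgj580") },
    {
        "$set": {
            "user_data": {
                "eid": "E9076",
                "name": "Jhon",
                "email": "Jhon#xyz.com",
                "posted_on": "1621509348056"
            }
        }
    }
)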

Related

How do I make a field path include a value in the same document using MongoDB/pyMongo?

The documents in the MongoDB I am querying can be stripped down to this as an example:
{
    "date": "2019-08-15",
    "status": "5345",
    "foo": {
        "bar": {
            "years": {
                "2018": {
                    "const": 1234
                },
                "2019": {
                    "const": 4321
                }
            }
        }
    }
}
I am trying to get the "const" values from this document using pyMongo.
The keys in "years" vary with the "date" of the document.
I have attempted to use this pipeline, where I try to use the year of "date" to get the "const" for that year:
pipeline = [
    {'$match': {'status': {'$exists': True}}},
    {'$project': {
        'const_thisYear': {
            '$let': {
                'vars': {
                    'yr': {'$year': {'$convert': {'input': '$date', 'to': 'date'}}},
                    'res': '$foo.bar.years'
                },
                'in': '$$res.$$yr.const'
            }
        }
    }}
]
When aggregating I get the following python exception:
OperationFailure: FieldPath field names may not start with '$'.
How do I do this correctly?
Python 3.7.7
You should revise your collection structure so that it does not store data in keys; but regardless, regular Python dict manipulation can get you out of the hole:
for doc in db.mycollection.find({'status': {'$exists': True}}, {'foo.bar.years': 1}):
    for year, year_value in doc['foo']['bar']['years'].items():
        print(year, year_value.get('const'))
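If you do want to keep this inside the aggregation pipeline rather than in Python, a common workaround for data stored in keys is $objectToArray, which turns the years sub-document into an array of {k, v} pairs you can filter on. A minimal sketch, assuming a reasonably recent MongoDB version (4.0+ for $toDate/$toString) and the document shape above; the connection and collection names are placeholders:
from pymongo import MongoClient

client = MongoClient()                        # connection details assumed
coll = client["mydb"]["mycollection"]         # placeholder names

pipeline = [
    {'$match': {'status': {'$exists': True}}},
    {'$project': {
        'const_thisYear': {
            '$let': {
                'vars': {
                    # the document's year as a string, e.g. "2019"
                    'yr': {'$toString': {'$year': {'$toDate': '$date'}}},
                    # {"2018": {...}, "2019": {...}} -> [{"k": "2018", "v": {...}}, ...]
                    'years': {'$objectToArray': '$foo.bar.years'},
                },
                'in': {
                    '$arrayElemAt': [
                        {'$map': {
                            'input': {'$filter': {
                                'input': '$$years',
                                'as': 'y',
                                'cond': {'$eq': ['$$y.k', '$$yr']},
                            }},
                            'as': 'y',
                            'in': '$$y.v.const',
                        }},
                        0,
                    ]
                },
            }
        }
    }},
]

for doc in coll.aggregate(pipeline):
    print(doc.get('const_thisYear'))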

Cannot query on a date range, get back no results each time

I'm having a hard time understanding why I keep getting 0 results back from a query I am trying to perform. Basically I am trying to return only results within a date range. On a given table I have a createdAt field, which is a DateTime scalar. This gets filled in automatically by Prisma (or GraphQL, I'm not sure which one sets it). So on any table I have a createdAt that is a DateTime string representing when the record was created.
Here is my schema for this given table:
type Audit {
    id: ID! @unique
    user: User!
    code: AuditCode!
    createdAt: DateTime!
    updatedAt: DateTime!
    message: String
}
I queried this table and got back some results, I'll share them here:
"getAuditLogsForUser": [
{
"id": "cjrgleyvtorqi0b67jnhod8ee",
"code": {
"action": "login"
},
"createdAt": "2019-01-28T17:14:30.047Z"
},
{
"id": "cjrgn99m9osjz0b67568u9415",
"code": {
"action": "adminLogin"
},
"createdAt": "2019-01-28T18:06:03.254Z"
},
{
"id": "cjrgnhoddosnv0b67kqefm0sb",
"code": {
"action": "adminLogin"
},
"createdAt": "2019-01-28T18:12:35.631Z"
},
{
"id": "cjrgnn6ufosqo0b67r2tlo1e2",
"code": {
"action": "login"
},
"createdAt": "2019-01-28T18:16:52.850Z"
},
{
"id": "cjrgq8wwdotwy0b67ydi6bg01",
"code": {
"action": "adminLogin"
},
"createdAt": "2019-01-28T19:29:45.616Z"
},
{
"id": "cjrgqaoreoty50b67ksd04s2h",
"code": {
"action": "adminLogin"
},
"createdAt": "2019-01-28T19:31:08.382Z"
}]
Here is my getAuditLogsForUser schema definition
getAuditLogsForUser(userId: String!, before: DateTime, after: DateTime): [Audit!]!
So to test, I want to get all the results between the first and the last:
2019-01-28T19:31:08.382Z is the last,
2019-01-28T17:14:30.047Z is the first.
Here is the code that injects these arguments into the query:
if (args.after && args.before) {
    where['createdAt_lte'] = args.after;
    where['createdAt_gte'] = args.before;
}
console.log(where)
return await context.db.query.audits({ where }, info);
In the playground I execute this statement:
getAuditLogsForUser(before: "2019-01-28T19:31:08.382Z" after: "2019-01-28T17:14:30.047Z") { id code { action } createdAt }
So I want anything that has createdAt_lte (less than or equal) set to 2019-01-28T17:14:30.047Z and createdAt_gte (greater than or equal) set to 2019-01-28T19:31:08.382Z.
However, I get literally no results back even though we KNOW there are results.
I tried to look up documentation on the DateTime scalar on the GraphQL website. I couldn't find anything on it, but I see it in my generated Prisma schema. It's just defined as a scalar, with nothing else special about it. I don't think I'm defining it anywhere else either. I am using graphql-yoga if that makes any difference.
(generated prisma file)
scalar DateTime
I'm wondering if it's even truly handling this as a datetime. It must be, though, because it gets generated as an ISO DateTime string in UTC.
I'm just having a hard time grasping what my issue could be at this moment; maybe I need to define it in some other way? Any help is appreciated.
Sorry, I misread your example in my first reply. This is what you tried in the playground, correct?
getAuditLogsForUser(
    before: "2019-01-28T19:31:08.382Z",
    after: "2019-01-28T17:14:30.047Z"
) {
    id
    code { action }
    createdAt
}
This will not work, since before and after do not refer to time but are cursors used for pagination. They expect an id. Since ids are also strings, this query does not throw an error but will not find anything. Here is how pagination is used: https://www.prisma.io/docs/prisma-graphql-api/reference/queries-qwe1/#pagination
What I think you want to do is use a filter in the query. For this you can use the where argument. The query would look like this:
getAuditLogsForUser(
    where: {
        AND: [
            {createdAt_lte: "2019-01-28T19:31:08.382Z"},
            {createdAt_gte: "2019-01-28T17:14:30.047Z"}
        ]
    }
) {
    id
    code { action }
    createdAt
}
Here are the docs for filtering: https://www.prisma.io/docs/prisma-graphql-api/reference/queries-qwe1/#filtering
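One more thing worth double-checking (it isn't called out in the thread): in the resolver snippet from the question, the bounds look inverted — after is assigned to createdAt_lte and before to createdAt_gte, which asks for records that are both older than the start and newer than the end of the range. With the where-style filter from this answer, the resolver side would presumably look more like the sketch below; the per-user filter is an assumption based on the userId argument in the schema:
// Sketch only: build the range so it actually brackets [after, before].
const where = { user: { id: args.userId } };   // assumption: logs are filtered per user
if (args.after && args.before) {
    where['createdAt_gte'] = args.after;   // on/after the start of the range
    where['createdAt_lte'] = args.before;  // on/before the end of the range
}
return await context.db.query.audits({ where }, info);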
OK, so I figured out it had to do with the fact that I used "after" and "before" as argument names. I have no clue why this completely screws everything up, but it just won't return ANY results if you use these as arguments. Very strange. It must be clashing with some other variable somehow; probably a bug on GraphQL's end.
As soon as I tried a new variable name, voila, it works.
This is also possible:
const fileData = await prismaClient.fileCuratedData.findFirst({
    where: {
        fileId: fileId,
        createdAt: {
            gte: fromdate
        }
    },
});
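For completeness, a range version of that last snippet — the model and field names are hypothetical, assuming the newer Prisma Client API used above and an async context:
// Sketch: both bounds on createdAt in a single where clause (newer Prisma Client).
const logs = await prismaClient.audit.findMany({
    where: {
        userId: userId,                                  // placeholder field
        createdAt: {
            gte: new Date('2019-01-28T17:14:30.047Z'),   // "after"
            lte: new Date('2019-01-28T19:31:08.382Z'),   // "before"
        },
    },
    orderBy: { createdAt: 'asc' },
});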

How to build a NodeJS variable to create a regexp query for ElasticSearch

I'm working at a company that used to have a monolithic PHP/MySQL CMS which controlled the website, but we are now trying to get the website to pull data from our API rather than directly from MySQL. The API is simply ElasticSearch on AWS. I wrote some code which now moves our data from MySQL to ElasticSearch. And now I can get the data I want with a curl call like this:
curl --verbose -d '{"from" : 0, "size" : 10000, "query": { "bool": { "should": [ { "regexp": { "string-of-words-associated-with-this-document": { "value": ".*steel.*" } } }, { "regexp": { "string-of-words-associated-with-this-document": { "value": ".*services.*" } } } ] } } }' -H 'Content-Type: application/json' -X GET "https://search-sameday01-ntsw7b7shy3wu.us-east-1.es.amazonaws.com/crawlers/_search?pretty=true"
This works great. Each document in ElasticSearch has a field that contains the words we want to query against, and we match against that field using regexp queries.
Now I'm writing a new app that checks data coming in from our web crawlers, and looks to see if we have certain names already in our database. The new app is a NodeJS app, so I decided to use this library:
https://github.com/elastic/elasticsearch-js
I need to build up what might be many regexp clauses, so I go into a loop and build up many clauses in an array:
array_of_elasticsearch_clauses_should_match.push( { "regexp": { "string-of-words-associated-with-this-document": { "value": ".*" + word_sanitized + ".*" } } } );
So I thought I could then pass in this variable like this:
es_client.search({
    index: 'crawlers',
    type: 'sameday',
    body: {
        query: {
            bool: {
                should: array_of_elasticsearch_clauses_should_match
            }
        }
    }
}).then(function (resp) {
But I get this error:
Trace: [parsing_exception] [array_of_elasticsearch_clauses_should_match] query malformed, no start_object after query name, with { line=1 & col=75 }
How could I build up the regexp clauses in a variable and then pass them in?
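No answer is preserved here, but the error message is a strong hint: the literal variable name array_of_elasticsearch_clauses_should_match ended up inside the query that was sent to ElasticSearch, which usually means the body was assembled as a string instead of a plain object. A sketch of the object-based approach, assuming the legacy elasticsearch-js client already configured as es_client and an array of sanitized words (sanitized_words is a placeholder):
// Build one regexp clause per word; the array itself becomes the "should" value.
var should_clauses = sanitized_words.map(function (word) {
    return {
        regexp: {
            "string-of-words-associated-with-this-document": {
                value: ".*" + word + ".*"
            }
        }
    };
});

es_client.search({
    index: 'crawlers',
    type: 'sameday',
    body: {
        // Pass a real object (not a pre-built JSON string); the client
        // serializes it, so the clause array is embedded as structure.
        query: {
            bool: {
                should: should_clauses,
                minimum_should_match: 1   // assumption: at least one word must match
            }
        }
    }
}).then(function (resp) {
    console.log(resp.hits.hits.length + ' documents matched');
}).catch(function (err) {
    console.error(err);
});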

id cannot be used in a GraphQL where clause?

{
    members {
        id
        lastName
    }
}
When I try to get the data from the members table, I get the following response.
{ "data": {
"members": [
{
"id": "TWVtYmVyOjE=",
"lastName": "temp"
},
{
"id": "TWVtYmVyOjI=",
"lastName": "temp2"
}
] } }
However, when I try to update the row with an 'id' where clause, the console shows an error.
mutation {
    updateMembers(
        input: {
            values: {
                email: "testing#test.com"
            },
            where: {
                id: 3
            }
        }
    ) {
        affectedCount
        clientMutationId
    }
}
"message": "Unknown column 'NaN' in 'where clause'",
Some of the results above confuse me.
Why is the returned id not a numeric value? In the db, it is a number.
When I update the record, can I use a numeric id value in the where clause?
I am using Node.js, apollo-client and graphql-sequelize-crud.
TL;DR: check out my possibly-not-Relay-compatible PR here: https://github.com/Glavin001/graphql-sequelize-crud/pull/30
Basically, the internal source code calls the fromGlobalId API from graphql-relay but passes a primitive value into it (e.g. your 3), causing it to return undefined. Hence I just removed the call from the source code and made a pull request.
P.S. This bug, which took me 2 hours to solve, failed in the build, so I think this solution may not be robust enough.
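For reference, the opaque ids in the query result are Relay-style global ids (TWVtYmVyOjE= is base64 for Member:1), so another way around the problem, rather than patching the library, is to encode the numeric database id before putting it in the where clause. A sketch using the graphql-relay helpers, assuming the type name is Member as the decoded ids suggest:
const { toGlobalId, fromGlobalId } = require('graphql-relay');

const globalId = toGlobalId('Member', 3);   // base64 of "Member:3"
console.log(globalId);                      // "TWVtYmVyOjM="
console.log(fromGlobalId(globalId));        // { type: 'Member', id: '3' }
// Use globalId as the value of `id` in the mutation's where clause.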
Please try this
mutation {
    updateMembers(
        input: {
            values: {
                email: "testing#test.com"
            },
            where: {
                id: "3"
            }
        }
    ) {
        affectedCount
        clientMutationId
    }
}

Mongo text search with AND operation for multiple words partially entered

{
    "TypeList" : [
        {
            "TypeName" : "Carrier"
        },
        {
            "TypeName" : "Not a Channel Member"
        },
        {
            "TypeName" : "Service Provider"
        }
    ]
}
Question:
db.supplies.find("text", {search:"\"chann\" \"mem\""})
For the above query I want to display:
{
    "TypeName" : "Not a Channel Member"
}
But I am unable to get my result.
What changes do I have to make to the query?
Please help me.
The below query will return your desired result.
db.supplies.aggregate([
    {$unwind: "$TypeList"},
    {$match: {"TypeList.TypeName": {$regex: /.*chann.*mem.*/, $options: "i"}}},
    {$project: {_id: 0, TypeName: "$TypeList.TypeName"}}
])
If you can accept getting output like this:
{
    "TypeList" : [
        {
            "TypeName" : "Not a Channel Member"
        }
    ]
}
then you can avoid the aggregation framework (which generally helps performance) by running the following query:
db.supplies.find(
    {
        "TypeList.TypeName": /chann.*mem/i
    },
    { // project the list in the following way
        "_id": 0, // do not include the "_id" field in the output
        "TypeList": { // only include the items from the TypeList array...
            $elemMatch: { // ...where
                "TypeName": /chann.*mem/i // the "TypeName" field matches the regular expression
            }
        }
    })
Also see this link: Retrieve only the queried element in an object array in MongoDB collection
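As a side note on the original $text attempt: MongoDB's text search requires a text index and matches whole words/stems, so partial fragments such as chann and mem will not match "Not a Channel Member" — which is why both answers fall back to regular expressions. For completeness, the text-search form with full words would look roughly like this:
// Requires a text index on the field; matches whole words/stems, not fragments.
db.supplies.createIndex({ "TypeList.TypeName": "text" })
db.supplies.find({ $text: { $search: "\"Channel\" \"Member\"" } })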
