I am trying to solve a problem that occurs when inserting related nodes in Neo4j. Nodes are inserted by several threads using the standard save method of org.springframework.data.neo4j.repository.GraphRepository.
Sometimes the insertion fails when fetching a related node in order to define a relationship. The exception messages are like this: org.neo4j.graphdb.NotFoundException: '__type__' on http://neo4j:7474/db/data/relationship/105550.
Calling this URL from curl returns a JSON object which appears to have __type__ correctly defined, which suggests that the exception is caused by a race between inserting threads.
The method that originates the calls to the repository is annotated @Neo4jTransactional. What atomicity and transaction isolation does @Neo4jTransactional guarantee? And how should I use it for multi-threaded insertions?
Update:
I have now been able to see this happening in the debugger. The code is trying to fetch the node at one end of this relationship, together with all its relationships. It throws an exception because the __type__ property is missing from the relationship's data. This is the JSON initially returned:
{
  "extensions" : { },
  "start" : "http://localhost:7474/db/data/node/617",
  "property" : "http://localhost:7474/db/data/relationship/533/properties/{key}",
  "self" : "http://localhost:7474/db/data/relationship/533",
  "properties" : "http://localhost:7474/db/data/relationship/533/properties",
  "type" : "CONTAINS",
  "end" : "http://localhost:7474/db/data/node/650",
  "metadata" : {
    "id" : 533,
    "type" : "CONTAINS"
  },
  "data" : { }
}
A few seconds later, the same REST call returns this JSON:
{
  "extensions" : { },
  "start" : "http://localhost:7474/db/data/node/617",
  "property" : "http://localhost:7474/db/data/relationship/533/properties/{key}",
  "self" : "http://localhost:7474/db/data/relationship/533",
  "properties" : "http://localhost:7474/db/data/relationship/533/properties",
  "type" : "CONTAINS",
  "end" : "http://localhost:7474/db/data/node/650",
  "metadata" : {
    "id" : 533,
    "type" : "CONTAINS"
  },
  "data" : {
    "__type__" : "ProductRelationship"
  }
}
I can't understand why there is such a long delay between inserting the relationship and setting its __type__ property. Why doesn't it all happen at once?
Related
I am new to MongoDB and NodeJS.
When I create a JSON schema with the data types string, integer, date and bool, the collection is created, but inserting data always fails with a document validation error. I then changed the bsonType of one field to number, and records started being inserted, but I noticed the values are stored as the Double data type. I read elsewhere on Stack Overflow that this is simply how MongoDB stores them, but my question is: why does it behave this way? Why is the error not thrown at the time the JSON schema is created, but only at the time the data is inserted?
Also, if we have nested objects, say a Customer object with Address as a nested object, the main object's int/number values are stored as Double, whereas the nested Address object's pincode is stored as Int32. This is also very confusing: what is the difference between these objects when the structure of the schema is the same?
What are the other ways to implement a properly validated schema for MongoDB?
> db.getCollectionInfos({name:"companysInt1s1"})
[
  {
    "name" : "companysInt1s1",
    "type" : "collection",
    "options" : {
      "validator" : {
        "$jsonSchema" : {
          "bsonType" : "object",
          "required" : [
            "tin"
          ],
          "properties" : {
            "tin" : {
              "bsonType" : "int",
              "minLength" : 2,
              "maxLength" : 11,
              "description" : "must be a string and is not required, should be 11 characters length"
            }
          }
        }
      }
    },
    "info" : {
      "readOnly" : false,
      "uuid" : UUID("27cba650-7bd3-4930-8d3e-7e6cbbf517db")
    },
    "idIndex" : {
      "v" : 2,
      "key" : {
        "_id" : 1
      },
      "name" : "_id_",
      "ns" : "invoice.companysInt1s1"
    }
  }
]
> db.companysInt1s1.insertOne({tin:22222})
2019-02-14T15:04:28.712+0530 E QUERY [js] WriteError: Document failed validation :
WriteError({
  "index" : 0,
  "code" : 121,
  "errmsg" : "Document failed validation",
  "op" : {
    "_id" : ObjectId("5c653624e382c2ec16c16893"),
    "tin" : 22222
  }
})
WriteError@src/mongo/shell/bulk_api.js:461:48
Bulk/mergeBatchResults@src/mongo/shell/bulk_api.js:841:49
Bulk/executeBatch@src/mongo/shell/bulk_api.js:906:13
Bulk/this.execute@src/mongo/shell/bulk_api.js:1150:21
DBCollection.prototype.insertOne@src/mongo/shell/crud_api.js:252:9
@(shell):1:1
Am I missing something, or is there other documentation I should be following? I appreciate your guidance.
You need to insert the value as NumberInt.
When you run this:
db.companysInt1s1.insertOne({tin:22222})
you are actually inserting tin as a double, because the mongo shell treats every numeric literal as a 64-bit floating-point value by default.
So the correct way to do it is:
db.companysInt1s1.insertOne({tin: NumberInt(22222) })
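As a quick sanity check, here is a minimal shell sketch against the same companysInt1s1 collection that confirms which BSON type was actually stored, using the $type query operator:

// Insert with an explicit Int32 wrapper so the "bsonType: int" validator passes
db.companysInt1s1.insertOne({ tin: NumberInt(22222) })
// Confirm the stored BSON type: the first query matches, the second does not
db.companysInt1s1.find({ tin: { $type: "int" } })
db.companysInt1s1.find({ tin: { $type: "double" } })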
I am using the Firebase Realtime Database. I don't want to get all child nodes for a particular parent node; I am only concerned with a particular child node, not its sibling nodes. Fetching all the sibling nodes increases my Firebase billing, as an extra XXX MB of data is fetched. I am using the Node.js admin library for fetching this.
Here is a sample JSON:
{
  "phone" : {
    "shsjsj" : {
      "battery" : {
        "isCharging" : true,
        "level" : 0.25999999046325684,
        "updatedAt" : "2018-05-15 12:45:29"
      },
      "details" : {
        "deviceCodeName" : "sailfish",
        "deviceManufacturer" : "Google",
        "deviceName" : "Google Pixel"
      },
      "downloadFiles" : {
        "7bfb21ff683f8652ea390cd3a380ef53" : {
          "uploadedAt" : 1526141772270
        }
      },
      "token" : "cgcGiH9Orbs:APA91bHDT3mI5L3N62hqUT2LojqsC0IhwntirCd6x0zz1CmVBz6CqkrbC",
      "uploadServer" : {
        "createdAt" : 1526221336542
      }
    },
    "hshssjjs" : {
      "battery" : {
        "isCharging" : true,
        "level" : 0.25999999046325684,
        "updatedAt" : "2018-05-15 12:45:29"
      },
      "details" : {
        "deviceCodeName" : "sailfish",
        "deviceManufacturer" : "Google",
        "deviceName" : "Google Pixel"
      },
      "downloadFiles" : {
        "7bfb21ff683f8652ea390cd3a380ef53" : {
          "uploadedAt" : 1526141772270
        }
      },
      "token" : "cgcGiH9Orbs:APA91bH_oC18U56xct4dRuyw9qhI5L3N62hqUT2LojqsC0IhwntirCd6x0zz1CmVBz6CqkrbC",
      "uploadServer" : {
        "createdAt" : 1526221336542
      }
    }
  }
}
In the above sample JSON, I want to fetch all phone->$deviceId->token values. Currently, I am fetching the whole phone object and then iterating over all the phone IDs to fetch the token. This spikes my database download usage and increases billing. I am only concerned with the token of each device; the siblings of the token are unnecessary.
All queries to Realtime Database fetch everything under the location requested. There is no way to limit the results to certain children under that location. If you want only certain children at a location, but not everything under that location, you'll have to query for each one of them separately. Or, you can restructure or duplicate your data to support the specific queries you want to perform; duplication is common for NoSQL-type databases.
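As an illustration of the "query each one separately" option, here is a minimal sketch with the Node.js admin SDK; it assumes you already know the device IDs (for example from your own user records), and the project-specific values below are placeholders:

const admin = require('firebase-admin');

// Hypothetical initialization; substitute your own credentials and databaseURL
admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: 'https://<your-project>.firebaseio.com'
});

// Hypothetical device IDs you already track elsewhere
const deviceIds = ['shsjsj', 'hshssjjs'];

// Each read downloads only the token string, not its sibling nodes
async function fetchTokens() {
  const snapshots = await Promise.all(
    deviceIds.map(id => admin.database().ref(`phone/${id}/token`).once('value'))
  );
  return snapshots.map(snap => snap.val());
}

fetchTokens().then(tokens => console.log(tokens));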
We upgraded from Graylog 2.1.3 to 2.3.2 and now receive this message repeatedly. Some parts of the UI load, but not Search or Streams. Alerts are still going out. Does anyone know how I can fix this? Rolling back does not seem to work at all.
Could not apply filter [StreamMatcher] on message <d8fa4293-dc7a-11e7-bc81-0a206782e8c1>:
java.lang.IllegalStateException: index set must not be null! (stream id=5a00a043a9b2c72984c581b6 title="My Streams")
What seems to have happened is that some streams did not get the "index_set_id" added to their definition in the streams collection in MongoDB. Here is an example of a bad one:
{
  "_id" : ObjectId("5a1d6bb2a9b2c72984e24dc0"),
  "creator_user_id" : "admin",
  "matching_type" : "AND",
  "description" : "EU2 Queue Prod",
  "created_at" : ISODate("2017-11-28T13:59:14.546Z"),
  "disabled" : false,
  "title" : "EU2 Queue Prod",
  "content_pack" : null
}
I was able to add the "index_set_id" : "59bb08b469d42f3bcfa6f18e" value in and restore the streams:
{
  "_id" : ObjectId("5a1d6bb2a9b2c72984e24dc0"),
  "creator_user_id" : "admin",
  "index_set_id" : "59bb08b469d42f3bcfa6f18e",
  "matching_type" : "AND",
  "description" : "EU2 Queue Prod",
  "created_at" : ISODate("2017-11-28T13:59:14.546Z"),
  "disabled" : false,
  "title" : "EU2 Queue Prod",
  "content_pack" : null
}
I faced this issue too with another version of Graylog, in a Kubernetes environment.
I took the following actions to fix it:
In the Graylog UI, under the Streams menu, select "More actions" next to your stream (in your case "My Streams"), then "Edit stream", and choose "Default index set" from the drop-down list.
Do this for all the available streams.
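If many streams are affected, the same fix can also be applied in bulk from the mongo shell, along the lines of the manual edit shown earlier. This is only a sketch: it assumes Graylog's database is named graylog, that streams live in the streams collection, and that the index set id from the document above is the one you want (back up first and use the _id of your own default index set):

// Run in the mongo shell against Graylog's database
use graylog
// Attach the default index set to every stream that is missing one
db.streams.updateMany(
  { index_set_id: { $exists: false } },
  { $set: { index_set_id: "59bb08b469d42f3bcfa6f18e" } }
)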
In Node with Mongoose I want to find an object in the collection Content. It has a list of sub-documents called users, which have the properties stream, user and added. I do this to get all documents with a certain user's _id property in their users.user field:
Content.find( { 'users.user': user._id } ).sort( { 'users.added': -1 } )
This seems to work (although I'm unsure whether .sort is really working here). However, I want to match two fields, like this:
Content.find( { 'users.user': user._id, 'users.stream': stream } ).sort( { 'users.added': -1 } )
That does not seem to work. What is the right way to do this?
Here is a sample document
{
  "_id" : ObjectId("551c6b37859e51fb9e9fde83"),
  "url" : "https://www.youtube.com/watch?v=f9v_XN7Wxh8",
  "title" : "Playing Games in 360°",
  "date" : "2015-03-10T00:19:53.000Z",
  "author" : "Econael",
  "description" : "Blinky is a proof of concept of enhanced peripheral vision in video games, showcasing different kinds of lens projections in Quake (a mod of Fisheye Quake, using the TyrQuake engine).\n\nDemo and additional info here:\nhttps://github.com/shaunlebron/blinky\n\nThanks to #shaunlebron for making this very interesting proof of concept!\n\nSubscribe: http://www.youtube.com/subscription_center?add_user=econaelgaming\nTwitter: https://twitter.com/EconaelGaming",
  "duration" : 442,
  "likes" : 516,
  "dislikes" : 13,
  "views" : 65568,
  "users" : [
    {
      "user" : "54f6688c55407c0300b883f2",
      "added" : 1427925815190,
      "_id" : ObjectId("551c6b37859e51fb9e9fde84"),
      "tags" : []
    }
  ],
  "images" : [
    {
      "hash" : "1ab544648d7dff6e15826cda7a170ddb",
      "thumb" : "...",
      "orig" : "..."
    }
  ],
  "tags" : [],
  "__v" : 0
}
Use the $elemMatch operator to specify multiple criteria on an array of embedded documents:
Content.find({"users": {$elemMatch: {"user": user._id, "stream": stream}}});
So I have this document within the course collection
{
  "_id" : ObjectId("53580ff62e868947708073a9"),
  "startDate" : ISODate("2014-04-23T19:08:32.401Z"),
  "scoreId" : ObjectId("531f28fd495c533e5eaeb00b"),
  "rewardId" : null,
  "type" : "certificationCourse",
  "description" : "This is a description",
  "name" : "testingAutoSteps1",
  "authorId" : ObjectId("532a121e518cf5402d5dc276"),
  "steps" : [
    {
      "name" : "This is a step",
      "description" : "This is a description",
      "action" : "submitCategory",
      "value" : "532368bc2ab8b9182716f339",
      "statusId" : ObjectId("5357e26be86f746b68482c8a"),
      "_id" : ObjectId("53580ff62e868947708073ac"),
      "required" : true,
      "quantity" : 1,
      "userId" : [
        ObjectId("53554b56e3a1e1dc17db903f")
      ]
    },...
What I want to do is create a query that returns all courses that have a specific userId in the userId array inside the steps array. I've tried using $elemMatch like so:
Course.find({
  "steps": {
    "$elemMatch": {
      "userId": {
        "$elemMatch": "53554b56e3a1e1dc17db903f"
      }
    }
  }
},
But it seems to be returning an empty document.
I think this will work for you; you have the syntax off a bit, plus you need to use ObjectId():
db.Course.find({ steps : { $elemMatch: { userId:ObjectId("53554b56e3a1e1dc17db903f")} } })
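Since the question appears to use Mongoose (Course.find), here is a hedged equivalent sketch; it assumes a Course model whose schema defines steps.userId as an array of ObjectIds:

const mongoose = require('mongoose');

// Course is the asker's Mongoose model. The explicit cast is optional, because
// Mongoose casts the hex string according to the schema, but it makes the intent clear.
Course.find({
  steps: { $elemMatch: { userId: new mongoose.Types.ObjectId("53554b56e3a1e1dc17db903f") } }
})
  .exec()
  .then(courses => console.log(courses));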
The $elemMatch usage is only necessary when you actually have compound sub-documents in that nested array element, and even then only if the value being referenced could be duplicated in another compound document.
Since this is an ObjectId we are talking about, it is going to be unique, at least within this array. So just use the "dot-notation" form:
Course.find({
  "steps.userId": ObjectId("53554b56e3a1e1dc17db903f")
},
Go back and look at the $elemMatch documentation. In this case, the direct "dot-notation" form is all you need.
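To make that distinction concrete, here is a small hedged sketch that pairs userId with the required field from the sample document: dot-notation lets each condition match a different array element, while $elemMatch requires a single step to satisfy both.

// Dot-notation: each condition may be satisfied by a different element of steps
Course.find({
  "steps.userId": ObjectId("53554b56e3a1e1dc17db903f"),
  "steps.required": true
})

// $elemMatch: one and the same step element must satisfy both conditions
Course.find({
  steps: { $elemMatch: { userId: ObjectId("53554b56e3a1e1dc17db903f"), required: true } }
})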