ArangoDB - Create AQL INSERT Scripts

I would like to prepare scripts for populating databases. How can I do this?
Something like this:
INSERT { _class: 'Entity', name: 'First'} IN wholesales
INSERT { _class: 'Entity', name: 'Second' } IN wholesales
INSERT { _class: 'Entity', name: 'Three' } IN wholesales
INSERT { _class: 'Entity', name: 'Four' } IN wholesales

Only one INSERT operation per collection is allowed in an AQL query. You can, however, use a loop to make this work:
FOR doc IN [
{ _class: 'Entity', name: 'First'},
{ _class: 'Entity', name: 'Second' },
{ _class: 'Entity', name: 'Third' },
{ _class: 'Entity', name: 'Fourth' }
]
INSERT doc INTO wholesales
The documents as well as the collection name can also be passed as bind parameters.
Query:
FOR doc IN @docs INSERT doc INTO @@coll
Bind parameters:
{ "docs": [ { ... }, { ... } ], "#coll": "wholesales" }
Another way to import data is to use arangoimport.

Related

Atomic deep merge for document MongoDB

I want to atomically deep merge a document in MongoDB. (I am using Node.js)
For example,
I have this document in my DB -
{
id: 123,
field1: {
field2: "abc"
}
}
And I am given this object:
{
field1: {
field3: "def"
}
}
So I'm looking for my document in the DB to change to this:
{
id: 123,
field1: {
field2: "abc",
field3: "def"
}
}
Of course I can do this in a non-atomic way by:
First fetching the document from the DB.
Then doing the merge in JS and saving the result to a new object.
And finally overwriting the item in the DB with the new object.
But I want this to happen in an atomic manner (so the read and the write stages will not happen separately). Is there any way of doing this?
I have heard the aggregation methods $merge and $mergeObjects might help me with that. Question is - are they atomic?
Thanks all :)
You can use the $function operator and go recursive.
Consider this doc in the DB:
{
field1: {
field6: "pre",
field2: "abc",
fieldX: {
aa: [1,2,3],
bb: "pre",
cc: "pre"
}
}
};
This is our candidate merge/overwrite doc:
var my_object = {
field1: {
field3: "def",
field6: "post",
fieldX: {
bb: "post",
zz: "post"
}
},
field4: 77,
field22: "qqq"
};
field4 and field22 are "easy"; they just get added to the doc. Where a field such as field1 already exists on both sides, we have to test for object vs. non-object. For objects, we recursively descend into the db object and the candidate. For non-objects, we let the candidate "win" and its value simply overwrites the one in the db. This pipeline:
c = db.foo.aggregate([
{$replaceRoot: {newRoot: {$function: {
body: function(db_object, my_object) {
var process = function(myo, key, dbo) {
var db_value = dbo[key];
if(undefined == db_value) {
// Easy; db does not have entry at all; just set it:
dbo[key] = myo[key];
} else {
// my_value has to be non-null because this function
// is driven from the keyset in myo.
var my_value = myo[key];
if(db_value instanceof Object && my_value instanceof Object) {
// Both objects with same name; descend!
walkObj(db_value, my_value)
} else {
// my_object "wins" the slot and just overwrites dbo:
dbo[key] = myo[key];
}
}
};
var walkObj = function(dbo,myo) {
// Drive the walk from my_object; it overwrites/appends the db:
Object.keys(myo).forEach(function(k) {
process(myo, k, dbo);
});
}
walkObj(db_object, my_object); // entry point!
return db_object;
},
args: [ "$$CURRENT", my_object ],
lang: "js"
}}
}}
// Take each doc and put it back using _id as the key:
,{$merge: {
into: "foo",
on: [ "_id" ],
whenMatched: "merge",
whenNotMatched: "fail"
}}
]);
will produce this result in the DB:
{
_id: ObjectId("63f0f5e45d0b16d1033771e7"),
field1: {
field6: 'post',
field2: 'abc',
fieldX: { aa: [ 1, 2, 3 ], bb: 'post', cc: 'pre', zz: 'post' },
field3: 'def'
},
field22: 'qqq',
field4: 77
}
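As a side note on the $mergeObjects part of the question: a single-document update is atomic, so for the specific two-level shape shown there, an update with an aggregation pipeline and $mergeObjects also avoids a separate read step. A minimal sketch, assuming MongoDB 4.2+ pipeline updates and the id field from the question:

db.foo.updateOne(
  { id: 123 },
  // $mergeObjects does a shallow merge, which is enough for one level of nesting;
  // the recursive $function above is what handles arbitrary depth.
  [ { $set: { field1: { $mergeObjects: [ "$field1", { field3: "def" } ] } } } ]
);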

In a SELECT, how can I return a mask of a jsonb column according to the same/similar string used for json-mask?

We have a jsonb column acting as a document store, and do our object key filtering in the application.
Our jsonb column contains nested arrays and objects with no consistent structure:
CREATE TABLE test(id serial, doc jsonb);
INSERT INTO test(doc) values ('{"name": "name1", "custom":{ "role": "admin", "valid": true}, "tags": ["tag1", "tag2"]}');
INSERT INTO test(doc) values ('{"name": "name2", "custom":{ "role": "admin", "valid": true}, "tags": ["tag1", "tag2"]}');
INSERT INTO test(doc) values ('{"name": "name3", "custom":{ "role": "user", "valid": true}, "tags": ["tag1", "tag2"]}');
INSERT INTO test(doc) values ('{"custom":{ "role": "app", "valid": "on_tuesdays"}}');
INSERT INTO test(doc) values ('{"name": "name4", "custom":{ "role": "admin", "valid": true}, "tags": ["tag1", "tag2"]}');
(dbfiddle)
We use json-mask to filter the response:
import mask from 'json-mask'
let str = 'name,custom(role,valid)'
let output = [];
const results = await runSql();
for(let row of results){
output.push({
doc: mask(row.doc, str)
})
}
output:
[{
doc: {
name: 'name1',
custom: {
role: 'admin',
valid: true
}
}
},{
doc: {
name: 'name2',
custom: {
role: 'admin',
valid: true
}
}
},{
doc: {
name: 'name3',
custom: {
role: 'user',
valid: true
}
}
},{
doc: {
custom: {
role: 'app',
valid: "on_tuesdays"
}
}
},{
doc: {
name: 'name4',
custom: {
role: 'admin',
valid: true
}
}
}]
How can I do the masking in SQL instead? Do I need to build separate jsonb objects for each key/path, then merge them all together using ||, or is there a better way?
From the PostgreSQL JSON docs it seems like jsonb_object_agg or jsonb_build_object could potentially do what I need, but I would need to destructure the mask string and do some type of looping for the nested objects.
Something like this (doesn't account for arrays):
select
jsonb_build_object(
'name', test.doc->>'name',
'custom' , (
jsonb_build_object(
'role', test.doc->'custom'->>'role'
)
)
)
...
The best I've come up with. At least the output is precisely the same.
select jsonb_agg(jsonb_strip_nulls(jsonb_build_object(
'name', test.doc->>'name',
'custom' , (
jsonb_build_object(
'role', test.doc->'custom'->>'role',
'valid', test.doc->'custom'->'valid'
)
)
))) from test;
DB fiddle.
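For completeness, a hedged sketch of consuming the same idea from Node.js with node-postgres, dropping the jsonb_agg so each row carries its own masked doc, like the json-mask loop in the question (the pool setup is an assumption):

import pg from 'pg'

const pool = new pg.Pool() // connection settings come from the usual PG* environment variables

const { rows } = await pool.query(`
  select jsonb_strip_nulls(jsonb_build_object(
    'name', doc->>'name',
    'custom', jsonb_build_object(
      'role', doc->'custom'->>'role',
      'valid', doc->'custom'->'valid'))) as doc
  from test`)

// rows is already shaped like [{ doc: { name, custom: { role, valid } } }, ...],
// so no client-side masking with json-mask is needed.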

MongoDB - Update/Insert the Array of Objects

I have one collection, i.e. 'comments', which contains two fields, like below -
Comments -
{
_id: 'unique_string',
dataVar: [{
name: {
type: string
},
value: {
type: string
}
}]
}
You can assume the collection data is like below -
[{
_id: 'xxxxxx',
user: 'robin',
dataVar: [{
name: 'abc',
value: 123
}, {
name: 'bcd',
value: 12345
}]
}]
Now the problem is (I'm using mongoose in a Node.js application):
- How to update and insert data in 'dataVar'?
- If the data is not available, a new document should be created.
Case 1.
If the user sends post data like
{
_id: 'xxxxxx',
user: 'robin',
dataVar: [{name: 'abc', value: 12345}]
}
then after performing the query, the above document (whose _id is 'xxxxxx') should be updated like below -
{
_id: 'xxxxxx',
user: 'robin',
dataVar: [
{name: 'abc', value: 12345},
{name: 'bcd', value: 12345}
]
}
Case 2.
If the data is not present, a new document should be created.
To update a record in MongoDB, you can use something like:
comments.findOneAndUpdate({ _id: 'xxxxxx' }, { $push: { dataVar: {$each: [{name: 'abc', value: 123}, {name: 'def', value: 456}] } } })
You can use the { upsert: true } option to create the document if it does not already exist, as in the sketch below.
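A hedged sketch combining both (the _id and field values are taken from the question, the rest is illustrative):

comments.findOneAndUpdate(
  { _id: 'xxxxxx' },
  { $push: { dataVar: { $each: [{ name: 'abc', value: 123 }] } } },
  { upsert: true } // creates the document if no match is found (Case 2)
);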
For updating a record, you have to use a sub-schema inside your main schema, so that an _id field is created in every object of the array. Then you can uniquely identify the object to update inside the array.
comments.findOneAndUpdate({ "dataVar._id": 'xxxxxx' }, { $set: { name: 'xyz', value: 789 } })
In this way, you can update an object that is inside the array.
If you don't want to use _id, you need at least one field that contains unique values, so that you can findOneAndUpdate that specific object inside the array, as sketched below.
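For example, using the name field from the question as that unique key (a hedged sketch, values only illustrative):

comments.findOneAndUpdate(
  { _id: 'xxxxxx', 'dataVar.name': 'abc' },  // match the array element by its name
  { $set: { 'dataVar.$.value': 12345 } }     // positional $ updates only that element
);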

Querying nested object arrays from MongoDB / Node.js

I am trying to figure out how to query the nested objects inside the Components object. The data was inserted from a parsed json file.
Query
var query = {}
cursor = db.collection("workflows").find(query).toArray(function(err, result) {
if (err) throw err;
console.log(result);
db.close();
});
This data is returned when I run the query above:
At this point I'm just trying to get it to filter in some manner. I've tried Name: 'Test WF' and other variations of that but still can't get a filtered response.
[ { _id: 5c77040838f9d322b89bbd82,
texto:
{ _id: 12,
LocalCachePath: 'Z:\\Test\\Cache',
SharedCachePath: [],
Name: 'Test WF',
Desc: 'I\'m testing',
Components: [Array] } },
{ _id: 5c7704164413692978a9dd1a,
texto:
{ _id: 'Workflow-2019.02.22-23.21.15-MKRU',
LocalCachePath: 'Z:\\MAITest\\Cache',
SharedCachePath: [],
Name: 'Test WF',
Desc: 'I\'m testing',
Components: [Array] } },
{ _id: 5c77046335b012379c99951b,
texto:
{ _id: '154',
LocalCachePath: 'Z:\\Test\\Cache',
SharedCachePath: [],
Name: 'Test WF',
Desc: 'I\'m testing',
Components: [Array] } },
{ _id: 5c7704787bde6f36543d1016,
texto:
{ _id: 'Workflow-2019.02.22-23.21.15-MKRU',
LocalCachePath: 'Z:\\Test\\Cache',
SharedCachePath: [],
Name: 'Test WF',
Desc: 'I\'m testing',
Components: [Array] } } ]
Any insight would be helpful; I'm stumbling through this one step at a time.
Here's another query that is giving me results, but I guess my issue is going to be to parse out my results as variables.
var query = {'texto.Components.0.Name' : {$gt: ''}}
// var query = {'testo.Name' : {$gt: ''} }
cursor = db.collection("workflows").find(query).toArray(function(err, result) {
if (err) throw err;
console.log(result);
db.close();
});
Use dot notation (e.g. texto.Name) to query and retrieve fields from nested objects, example:
var query = {'texto.Name': 'Test WF'}
Simply
db.getCollection('TestQueries').find({'texto.Name': 'Test WF'})
A regex can be used for a case-insensitive match:
db.getCollection('TestQueries').find({"texto.Name":{
'$regex' : '^test wf$', '$options' : 'i'
}})
Using collation
db.fruit.createIndex( {"texto.Name": 1},{ collation: {
locale: 'en', strength: 2
} } )
db.getCollection('TestQueries').find(
{ "texto.Name": "test wa" } ).collation( { locale: 'en', strength: 2 }
)
You can also use $elemMatch. It is longer, but allows querying on multiple fields.
db.getCollection('TestQueries').find({ texto: { $elemMatch: { Name: "Test WF" } } })
See the official MongoDB documentation on $elemMatch.
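Back to the original goal of filtering on the nested Components entries: dot notation also reaches into arrays without an index and matches any element. A hedged sketch in the same driver style as the question (the component name is only illustrative):

var query = { "texto.Components.Name": "SomeComponent" }; // matches any element of Components
db.collection("workflows").find(query).toArray(function(err, result) {
  if (err) throw err;
  console.log(result);
  db.close();
});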

MongoDB remove nested objects

I have a collection like this:
_id: {name: 'name', family: 'family'}
I want to remove some documents by _id using $in. How can I do this?
For example, my query should be something like:
db.persons.remove({_id: {$in: [ { name: 'name1', family: 'family1' }
, { name: 'name2', family: 'family2' }
]
}
})
You can also do this with an $or query and dot notation if your fields are not always in the same order:
db.persons.remove({
"$or": [
{ "_id.name": "name1", "_id.family": "family1" },
{ "_id.name": "name2", "_id.family": "family2" },
}
})
Not the mongoose syntax, but you get the idea. It is logically the same thing, but unlike the full documents you pass to $in, it does not depend on the field order.
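To illustrate that caveat (a hedged sketch): equality against an embedded document, and therefore $in, compares fields in order, so the reordered form below matches nothing, while the dot-notation $or above matches regardless of the stored order.

// Matches nothing if _id was stored with name before family:
db.persons.remove({ _id: { $in: [ { family: 'family1', name: 'name1' } ] } })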
