I wrote this code for testing. I expected it to execute sequentially and the output to be:
1- Search for user no. 1
2- Search for user no. 1
3- Search for user no. 1
4- user 1 Exist (if it exists) or 4- user 1 not Exist (if not)
but the statements printed in a different order: the statements before the find printed together, and the statements inside the find printed together.
I added a delay before executing the next loop iteration to allow the find to complete, but without success.
How can I solve this problem and execute the statements in order?
app.get("/alert2", function(req, res) {
for (let y = 1; y < 5; y++) {
sleep(1000).then(() => {
console.log("1- Search for user no. " + y);
console.log("2- Search for user no. " + y);
User.findOne({ firsName: "Abdelmenem" + y }, function(err, foundList) {
console.log("3- Search for user no. " + y);
if (!err) {
if (!foundList) {
console.log("4- user " + y + " not Exist");
} else {
console.log("4- user " + y + " Exist");
}
} else {
console.log(err);
};
});
});
y1 = y;
};
console.log("finished : " + y1);
res.redirect("/");
})
How can I add promises and await to this code?
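For comparison, here is a minimal sketch of the same route written with async/await. It assumes Mongoose's promise-returning findOne (no callback passed, with .exec()) and the same sleep helper used in the original code; it is a sketch, not a drop-in answer:

app.get("/alert2", async function (req, res) {
  for (let y = 1; y < 5; y++) {
    await sleep(1000); // assumed helper that returns a Promise
    console.log("1- Search for user no. " + y);
    console.log("2- Search for user no. " + y);
    try {
      // Without a callback, findOne returns a query that can be awaited
      const foundList = await User.findOne({ firsName: "Abdelmenem" + y }).exec();
      console.log("3- Search for user no. " + y);
      if (!foundList) {
        console.log("4- user " + y + " not Exist");
      } else {
        console.log("4- user " + y + " Exist");
      }
    } catch (err) {
      console.log(err);
    }
  }
  console.log("finished");
  res.redirect("/");
});

Because each findOne is awaited before the loop continues, the four log lines for user 1 print before any line for user 2.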
I'm using the Realtime Database with the following data structure. I can add more machines in the web app; each one is assigned an autogenerated ID, and all machines have the same data structure.
machines
  -autogeneratedID1
    -id1 : 1
    -id2 : 2
  -autogeneratedID2
    -id1 : 4
    -id2 : 3
I want to track changes to the ids inside the autogenerated IDs; for example, if id1 under autogeneratedID1 changes from 1 to 3, I want something that returns the change together with a timestamp.
I'm trying to use Cloud Functions with the following code:
exports.foo = functions.database.ref("machines/").onUpdate((change) => {
const before = change.before.val(); // DataSnapshot before the change
const after = change.after.val(); // DataSnapshot after the change
console.log(before);
console.log(after);
return null;
but the before and after objects give me the JSON of the whole structure of the database.
My first guess was to compare the two JSON objects, detect where the changes were, and add a timestamp. Then I want to store the changes in the database.
Is there a way to do this?
Best Regards
I finally found the answer; here is the code I'm using:
exports.foo = functions.database.ref("machines/{machineID}/{ValueID}").onUpdate((change,context) => {
const before = change.before.val(); // DataSnapshot before the change
const after = change.after.val(); // DataSnapshot after the change
const machineID = context.params.machineID; //Machine ID
const valueID = context.params.ValueID; //Value ID
console.log("MachineID: " + machineID + " " +"IDChanged: " + valueID + " " + "BeforeValue: " + before +" "+ "AfterValue: " + after);
return null;
});
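Building on that, here is a hedged sketch (my addition, not part of the answer above) of how the detected change could also be written back to the database with a server-side timestamp; the trackChange name and the top-level changes node are placeholders:

const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

exports.trackChange = functions.database
  .ref("machines/{machineID}/{ValueID}")
  .onUpdate((change, context) => {
    // Push one change record under a hypothetical "changes" node
    return change.after.ref.root.child("changes").push({
      machineID: context.params.machineID,
      valueID: context.params.ValueID,
      before: change.before.val(),
      after: change.after.val(),
      timestamp: admin.database.ServerValue.TIMESTAMP // resolved on the server
    });
  });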
What's the query, or some other quick way, to delete all the documents matching a WHERE condition in a collection?
I want something like DELETE * FROM c WHERE c.DocumentType = 'EULA' but, apparently, it doesn't work.
Note: I'm not looking for any C# implementation for this.
This is a bit old, but I just had the same requirement and found a concrete example of what @Gaurav Mantri wrote about.
The stored procedure script is here:
https://social.msdn.microsoft.com/Forums/azure/en-US/ec9aa862-0516-47af-badd-dad8a4789dd8/delete-multiple-docdb-documents-within-the-azure-portal?forum=AzureDocumentDB
Go to the Azure portal, grab the script from above and make a new stored procedure in the database->collection you need to delete from.
Then, right at the bottom of the stored procedure pane, underneath the script textarea, is a place to put in the parameter. In my case I just wanted to delete everything, so I used:
SELECT c._self FROM c
I guess yours would be:
SELECT c._self FROM c WHERE c.DocumentType = 'EULA'
Then hit 'Save and Execute'. Voila, some documents get deleted. After I got it working in the Azure Portal I switched over to Azure DocumentDB Studio and got a better view of what was happening, i.e. I could see I was throttled to deleting 18 at a time (returned in the results). For some reason I couldn't see this in the Azure Portal.
Anyway, pretty handy even if limited to a certain number of deletes per execution. Executing the stored procedure is also throttled, so you can't just mash the keyboard. I think I would just delete and recreate the collection unless I had a manageable number of documents to delete (thinking <500).
Props to Mimi Gentz @Microsoft for sharing the script in the link above.
HTH
I want something like DELETE * FROM c WHERE c.DocumentType = 'EULA'
but, apparently, it doesn't work.
Deleting documents this way is not supported. You would need to first select the documents using a SELECT query and then delete them separately. If you want, you can write the code for fetching & deleting in a stored procedure and then execute that stored procedure.
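For illustration, here is a rough sketch of such a stored procedure. This is my own sketch using the standard Cosmos DB server-side JavaScript API (getContext, queryDocuments, deleteDocument), not the exact script referenced elsewhere in this thread; the caller passes the SELECT query, e.g. one with the c.DocumentType = 'EULA' condition:

function bulkDeleteSproc(query) {
  var collection = getContext().getCollection();
  var response = getContext().getResponse();
  var deleted = 0;

  // e.g. "SELECT c._self FROM c WHERE c.DocumentType = 'EULA'"
  query = query || "SELECT c._self FROM c";

  queryAndDelete();

  // Query one batch of matching documents, then delete them one by one
  function queryAndDelete() {
    var accepted = collection.queryDocuments(collection.getSelfLink(), query, {},
      function (err, docs) {
        if (err) throw err;
        if (docs.length > 0) {
          deleteDocs(docs, 0);
        } else {
          response.setBody(deleted); // nothing left to delete
        }
      });
    if (!accepted) response.setBody(deleted); // execution budget exhausted; report progress
  }

  function deleteDocs(docs, index) {
    if (index >= docs.length) {
      queryAndDelete(); // fetch the next batch
      return;
    }
    var accepted = collection.deleteDocument(docs[index]._self, {},
      function (err) {
        if (err) throw err;
        deleted++;
        deleteDocs(docs, index + 1);
      });
    if (!accepted) response.setBody(deleted);
  }
}

The response body reports how many documents were deleted in that run, so you re-execute the procedure until it returns zero.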
I wrote a script to list all the documents and delete all the documents; it can be modified to delete only selected documents as well.
var docdb = require("documentdb");
var async = require("async");

var config = {
  host: "https://xxxx.documents.azure.com:443/",
  auth: {
    masterKey: "xxxx"
  }
};

var client = new docdb.DocumentClient(config.host, config.auth);
var messagesLink = docdb.UriFactory.createDocumentCollectionUri("xxxx", "xxxx");

// Fetch every document in the collection
var listAll = function(callback) {
  var spec = {
    query: "SELECT * FROM c",
    parameters: []
  };
  client.queryDocuments(messagesLink, spec).toArray((err, results) => {
    callback(err, results);
  });
};

// Delete each document returned by listAll, using its _self link
var deleteAll = function() {
  listAll((err, results) => {
    if (err) {
      console.log(err);
    } else {
      async.forEach(results, (message, next) => {
        client.deleteDocument(message._self, err => {
          if (err) {
            console.log(err);
            next(err);
          } else {
            next();
          }
        });
      });
    }
  });
};

var task = process.argv[2];

switch (task) {
  case "listAll":
    listAll((err, results) => {
      if (err) {
        console.error(err);
      } else {
        console.log(results);
      }
    });
    break;
  case "deleteAll":
    deleteAll();
    break;
  default:
    console.log("Commands:");
    console.log("listAll deleteAll");
    break;
}
And if you want to do it in C#/.NET Core, this project may help: https://github.com/lokijota/CosmosDbDeleteDocumentsByQuery. It's a simple Visual Studio project where you specify a SELECT query, and all the matches will be a) backed up to a file and b) deleted, based on a set of flags.
Create a stored procedure in the collection and execute it by passing a SELECT query with the condition for the documents to delete. The major reason to use this stored procedure is its use of a continuation token, which reduces RUs to a huge extent and costs less.
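For reference, a minimal client-side sketch of executing such a stored procedure with the same documentdb package used in the script above; the "bulkDelete" name and the "xxxx" database/collection ids are placeholders:

var docdb = require("documentdb");

var client = new docdb.DocumentClient("https://xxxx.documents.azure.com:443/", { masterKey: "xxxx" });
var sprocLink = docdb.UriFactory.createStoredProcedureUri("xxxx", "xxxx", "bulkDelete");

// Pass the SELECT query with the delete condition as the stored procedure's parameter
client.executeStoredProcedure(
  sprocLink,
  ["SELECT c._self FROM c WHERE c.DocumentType = 'EULA'"],
  function (err, result) {
    if (err) return console.error(err);
    // A bulk-delete sproc typically reports how many documents it removed this run;
    // re-execute until the count drops to zero.
    console.log("Deleted this run:", result);
  }
);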
Here is a Python script which can be used to delete data from a partitioned Cosmos collection. It deletes documents id by id based on the result set data.
Identify the data that needs to be deleted before the step below.
# Assumes the classic pyDocumentDB SDK and a Spark sqlContext for the source query
from pydocumentdb import document_client

# Identify the ids that need to be deleted
res_list = "select id from id_del"
res_id = [{'id': x["id"]} for x in sqlContext.sql(res_list).rdd.collect()]

config = {
    "Endpoint": "Use EndPoint",
    "Masterkey": "UseKey",
    "WritingBatchSize": "5000",
    'DOCUMENTDB_DATABASE': 'Database',
    'DOCUMENTDB_COLLECTION': 'collection-core'
}

for row in res_id:
    # Initialize the Python DocumentDB client
    client = document_client.DocumentClient(config['Endpoint'], {'masterKey': config['Masterkey']})

    # Use a SQL based query to get the documents, looping through partitions to delete
    query = {'query': "SELECT c.id FROM c where c.id = " + "'" + row['id'] + "'"}
    print(query)

    options = {}
    options['enableCrossPartitionQuery'] = True
    options['maxItemCount'] = 1000

    result_iterable = client.QueryDocuments('dbs/Database/colls/collection-core', query, options)
    results = list(result_iterable)
    print('DOCS TO BE DELETED : ' + str(len(results)))

    if len(results) > 0:
        for i in range(0, len(results)):
            # print(results[i]['id'])
            docID = results[i]['id']
            print("docID :" + docID)

            options = {}
            options['enableCrossPartitionQuery'] = True
            options['maxItemCount'] = 1000
            options['partitionKey'] = docID  # assumes the partition key is the document id

            client.DeleteDocument('dbs/Database/colls/collection-core/docs/' + docID, options=options)
            print('deleted Partition:' + docID)
I have the following function in Node.js that makes a query to Postgres based on name_url. Sometimes it works and sometimes it just doesn't.
I'm also using the pg-promise library:
exports.getLawyerByUrlName = function (name_url, callback) {
  console.log(typeof name_url) //NOTICE: output string
  db.one({
    text: "SELECT * FROM " + lawyersTable + " WHERE name_url LIKE $1::varchar",
    values: name_url,
    name: "get-lawyer-by-name_url"
  })
    .then(function (lawyer) {
      callback(lawyer);
    })
    .catch(function (err) {
      console.log("getLawyerByUrlName() " + err)
    });
}
When it does not work, it throws this error:
getLawyerByUrlName() error: invalid input syntax for integer: "roberto-guzman-barquero"
This is a very weird bug and I can't figure out why it's happening. I'm checking beforehand with console.log that I'm actually passing a string:
console.log(typeof name_url) //NOTICE: output string
My table field for name_url is:
CREATE TABLE lawyers(
  ...
  name_url VARCHAR check(translate(name_url, 'abcdefghijklmnopqrstuvwxyz-', '') = '') NOT NULL UNIQUE,
It seems unlikely that that particular query could ever throw that error, so I'll suggest three possibilities. The first is that the code causing the error is actually somewhere else, and that:
.catch(function (err) {
  console.log("getLawyerByUrlName() " + err)
was cut and pasted into a different part of the code.
The second possibility is that the lawyersTable variable is getting populated with something unexpected.
The third possibility is that my first two scenarios are wrong. ;-)
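If the second possibility is the culprit, one way to rule it out is a sketch like the following (my addition, assuming pg-promise's SQL Names filter $1~ is available): log the table name and let pg-promise escape it, so a bad value fails loudly instead of silently producing a different query:

exports.getLawyerByUrlName = function (name_url, callback) {
  console.log("lawyersTable =", lawyersTable); // confirm the table name is what you expect

  // "$1~" is pg-promise's SQL Names filter: it validates and escapes the identifier
  db.one("SELECT * FROM $1~ WHERE name_url LIKE $2::varchar", [lawyersTable, name_url])
    .then(function (lawyer) {
      callback(lawyer);
    })
    .catch(function (err) {
      console.log("getLawyerByUrlName() " + err);
    });
};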
I want to be able to pass multiple data sets to my view. Here is how I am currently doing it in my controller:
transactions: function (req, res) {
  var queryexpenses = 'select * from expense order by name';
  Expense.query(queryexpenses, function (err, expense) {
    this.expenses = expense;
  });

  if (req.param('filter')) {
    var where = 'where fk_expense = ' + req.param('expensefilter');
    where += ' and datePosted > "' + req.param('yearfilter') + '-01-01" ';
    where += ' and datePosted < "' + req.param('yearfilter') + '-12-31" ';
  } else {
    var where = 'where fk_expense IS NULL';
  }

  var query = 'select * from accounting ' + where + ' order by description';
  Accounting.query(query, function (err, trans) {
    this.transactions = trans;
  });

  var total = 0;
  _.each(this.transactions, function (element, index, list) {
    // format dates
    element.datePosted = dateFormat(element.datePosted, 'dd/mm/yyyy');
    var tmp0 = element.amount;
    var tmp1 = tmp0.replace(/ /g, '');
    var tmp2 = parseFloat(tmp1);
    total += tmp2;
  });
  this.total = total.toFixed(2);

  return res.view();
}
This is the only way I am able to accomplish what I'm trying to do, but there are problems which I believe are caused by me putting the query objects in the "this" scope. The first problem is that the page crashes on the first reload after a server restart. The second problem is that everything seems to happen one step behind: if I issue commands in the UI (e.g. submit a form), nothing happens unless I take the same action twice.
So how do I pass multiple sets of data to my views without scoping them in "this"?
You can pass data to the view by handing res.view() an object of locals:

res.view({
  corndogs: [{ name: 'Hank the Corndog' }, { name: 'Lenny the Corndog' }]
});
Here is the relevant docs page: http://sailsjs.org/#!documentation/views
Also, it looks like you're not taking full advantage of Waterline for making SQL queries.
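For example, here is a rough sketch of the same controller using Waterline model methods and locals instead of "this". It is my own sketch under a few assumptions: the Expense and Accounting models accept Waterline criteria with the '>'/'<' modifiers, and the same dateFormat helper from your code is in scope:

transactions: function (req, res) {
  Expense.find().sort('name').exec(function (err, expenses) {
    if (err) return res.serverError(err);

    var criteria = req.param('filter')
      ? {
          fk_expense: req.param('expensefilter'),
          datePosted: {
            '>': req.param('yearfilter') + '-01-01',
            '<': req.param('yearfilter') + '-12-31'
          }
        }
      : { fk_expense: null };

    Accounting.find(criteria).sort('description').exec(function (err, trans) {
      if (err) return res.serverError(err);

      var total = 0;
      trans.forEach(function (element) {
        element.datePosted = dateFormat(element.datePosted, 'dd/mm/yyyy');
        total += parseFloat(String(element.amount).replace(/ /g, ''));
      });

      // Pass everything the view needs as locals instead of storing it on "this"
      return res.view({
        expenses: expenses,
        transactions: trans,
        total: total.toFixed(2)
      });
    });
  });
}

Because the second query runs inside the first query's callback and res.view() runs inside the second, the data is guaranteed to be ready before the view renders, which avoids both the crash and the one-step-behind behaviour.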