How to execute Requests one after the other - node.js

Library - Tedious
var productsIds = []

const getProductsIdsRequest = new Request(GET_PRODUCT_ID_QUERY, function (err) {
  if (err) {
    console.log(err);
  } else {
    connection.close()
  }
});

getProductsIdsRequest.on("row", function (columns) {
  // collecting products Ids
})

getProductsIdsRequest.on("requestCompleted", async function (rowCount, more) {
  // Performing some operations on the data
  updateColumnInTable()
})

const updateColumnInTableRequest = new Request(UPLOAD_PRODUCT_ID_QUERY, function (err) {
  if (err) {
    console.log(err);
  } else {
    connection.close()
  }
});
I need to run two requests: first GET_PRODUCT_ID_QUERY and then UPLOAD_PRODUCT_ID_QUERY. After fetching the product ids and doing some operations on them, I need to run the second query to update them. How can I do this? I have tried running the update function inside the "requestCompleted" event, but it throws an error saying I can't run another request in the final state. Note that inside the "requestCompleted" event I am doing some heavy operations on the ids, and only after that do I need to call the update function (the second request).
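One way to sequence the two requests is to wrap each Tedious Request in a Promise and await them in order, closing the connection only after the second request has finished. This is only a minimal sketch, assuming a single open connection and the query constants above; execSqlAsync is a hypothetical helper, not part of the Tedious API:

const { Request } = require("tedious");

// Hypothetical helper: wraps one Tedious request in a Promise so that
// requests can be awaited one after the other on the same connection.
function execSqlAsync(connection, sql, onRow) {
  return new Promise((resolve, reject) => {
    const request = new Request(sql, (err, rowCount) => {
      if (err) reject(err);
      else resolve(rowCount);
    });
    if (onRow) request.on("row", onRow);
    connection.execSql(request);
  });
}

async function run(connection) {
  const productIds = [];
  await execSqlAsync(connection, GET_PRODUCT_ID_QUERY, (columns) => {
    productIds.push(columns[0].value); // collect product ids
  });
  // ...heavy operations on productIds here...
  await execSqlAsync(connection, UPLOAD_PRODUCT_ID_QUERY);
  connection.close(); // close only after both requests have completed
}

Closing the connection inside the first request's callback is likely what puts it in its final state before the second request is issued; deferring connection.close() until the end avoids that.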

Related

How to query mongoDB to see if matching record exists, and if it does return it to update

Seems like a super basic task, but I just cannot get this to work (I'm not very experienced with mongo or nodeJS).
I have an array of records. I need to check the DB to see if any records with a matching name already exist, and if they do, grab that record so I can update it.
Right now I am trying this:
function hit_the_db(db, record_name, site_id) {
  return new Promise((resolve, reject) => {
    var record = db.collection('' + site_id + '_campaigns').find({name: record_name}).toArray(function(err, result) {
      if (err) {
        console.log('...error => ' + err.message);
        reject(err);
      } else {
        console.log('...promise resolved...');
        resolve(result);
      }
    });
    console.log('...second layer of select successful, returning data for ' + record.length + ' records...');
    return record;
  });
}
This query works in another part of the app, so I tried to just copy it over, but I am not getting any records returned even though I know there should be, given the data I am sending over.
site_id is just a string that would look like ksdlfnsdlfu893hdsvSFJSDgfsdk. The record_name is also just a string that could really be anything, but it is previously filtered so there are no spaces or special characters; most are something along the lines of this-is-the-name.
With the names coming through there should be at least one found record for each, but I am getting nothing returned. I just cannot wrap my head around using mongo for these basic tasks, if anyone can help it would be greatly appreciated.
I am just using nodeJS and connecting to mongoDB, there is no express or mongoose or anything like that.
The problem here is that you are mixing callbacks and promises for async code handling. When you call:
var record = db.collection('' + site_id + '_campaigns').find({name: record_name}).toArray(function(err, result) {
You are passing in a callback function, which will receive the resulting array of mongo records in a parameter called result, but you are also assigning the immediate return value to a variable called 'record', which is not going to contain anything useful.
Here is a cleaned up version of your function.
function hit_the_db(db, site_id, record_name, callback) {
  // Find all records matching 'record_name'
  db.collection(site_id + 'test_campaigns').find({ name: record_name }).toArray(function(err, results) {
    // matching records are now stored in 'results'
    if (err) {
      console.log('err:', err);
    }
    return callback(err, results);
  });
}
Here is optional code for testing the above function.
// This is called to generate test data
function insert_test_records_callback(db, site_id, record_name, insert_count, callback) {
  const testRecords = [];
  for (let i = 0; i < insert_count; ++i) {
    testRecords.push({name: record_name, val: i});
  }
  db.collection(site_id + 'test_campaigns').insertMany(testRecords, function(err, result) {
    return callback(err);
  });
}

// This cleans up by deleting all test records.
function delete_test_records_callback(db, site_id, record_name, callback) {
  db.collection(site_id + 'test_campaigns').deleteMany({name: record_name}, function(err, result) {
    return callback(err);
  });
}

// Test function to insert, query, clean up test records.
function test_callback(db) {
  const site_id = 'ksdlfnsdlfu893hdsvSFJSDgfsdk';
  const test_record_name = 'test_record_callback';
  // First call the insert function
  insert_test_records_callback(db, site_id, test_record_name, 3, function(err) {
    // Once execution reaches here, insertion has completed.
    if (err) {
      console.log(err);
      return;
    }
    // Do the query function
    hit_the_db(db, site_id, test_record_name, function(err, records) {
      // The query function has now completed
      console.log('hit_the_db - err:', err);
      console.log('hit_the_db - records:', records);
      delete_test_records_callback(db, site_id, test_record_name, function(err, records) {
        console.log('cleaned up test records.');
      });
    });
  });
}
Output:
hit_the_db - err: null
hit_the_db - records: [ { _id: 5efe09084d078f4b7952dea8,
name: 'test_record_callback',
val: 0 },
{ _id: 5efe09084d078f4b7952dea9,
name: 'test_record_callback',
val: 1 },
{ _id: 5efe09084d078f4b7952deaa,
name: 'test_record_callback',
val: 2 } ]
cleaned up test records.
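If you would rather keep the promise style from the original hit_the_db, an async/await version looks roughly like this. A sketch only, assuming a driver version where toArray() returns a promise when no callback is passed (collection name as in the question):

async function hit_the_db(db, site_id, record_name) {
  // toArray() with no callback returns a promise in recent driver versions
  const results = await db.collection('' + site_id + '_campaigns')
    .find({ name: record_name })
    .toArray();
  console.log('found ' + results.length + ' records for ' + record_name);
  return results;
}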

How to return an object from inside a for loop

I want to return "items", which is inside the for loop, along with two additional functions. "items" is an object (I would not say variable) that consists of three array elements, and there can be more depending on the situation. So I need to return "items" so I can access it outside the loop and send it to the client using res.send(). If I send data inside the loop and callback, it fails with "Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client". I found fixes for it, but on implementing them nothing changes; it throws the same error. I was thinking of using a callback function, but I am confused about how to use one in this case. Thanks in advance.
router.get("/send" , async (req, res) => {
try {
res.send("hello")
//await sendData()
getCollectionNames()
} catch (error) {
console.log(error);
}
function getCollectionNames(){
MongoClient.connect(url, function(err, db) {
var db = db.db('admin')
mongoose.connection.db.listCollections().toArray(function (err, names) {
for(let index = 0; index < names.length; index ++){
if (err) {
console.log(err);
}
let name = names[index].name
const collection = db.collection(name)
collection.find().toArray(function(err, items){
console.log(items)
})
}
});
})
}
})
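One common pattern for this kind of problem is to gather everything first and call res.send() exactly once after the loop has finished. A minimal sketch only, assuming the mongoose connection from the snippet and a driver version where listCollections().toArray() and find().toArray() return promises when no callback is passed:

router.get("/send", async (req, res) => {
  try {
    const db = mongoose.connection.db;
    const names = await db.listCollections().toArray();
    const items = {};
    for (const { name } of names) {
      items[name] = await db.collection(name).find().toArray();
    }
    res.send(items); // called exactly once, after the loop has finished
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
});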

setTimeout() in Node.js

I am writing cloud functions on Cloud Firestore triggers. What I want is: when a document is added under some uuid, it has to be deleted after 2 minutes and the same data assigned to another document. I wrote some code for that, shown below.
exports.createdOpenOrder = functions.firestore.document('Some/{psId}/Open/{OrderId}').onCreate((snap, context) => {
  // Get an object representing the document
  console.log("Deleting function execution started:");
  const newValue = snap.data();
  var OrderId = context.params.OrderId;
  var psId = context.params.psId;
  setTimeout(delete_cur, 120000);
  function delete_cur() {
    var data = db.collection('Some').doc(psId).collection('Open').doc(OrderId).delete().then(function() {
      console.log("Document successfully deleted!");
      // calling another function to reassign
      reassign(OrderId);
      return;
    }).catch(function(error) {
      console.error("Error removing document: ", error);
      return;
    });
  }
});
Now my problem is that the setTimeout callback is not firing after 2 minutes and the data is not being deleted. Is anything wrong with my code? Please let me know how to make setTimeout work correctly here.
To find the problem, put a log before, and a try/catch around, the contents of your setTimeout handler.
Currently you are only trapping errors after the delete promise settles; any exception thrown in the chain before delete() is called is not caught.
function delete_cur() {
  console.log('handler called')
  try {
    var data = db.collection('Some').doc(psId).collection('Open').doc(OrderId).delete().then(function() {
      console.log("Document successfully deleted!");
      // calling another function to reassign
      reassign(OrderId);
      return;
    }).catch(function(error) {
      console.error("Error removing document: ", error);
      return;
    });
  } catch (e) {
    console.error('could not invoke delete', e)
  }
}

res.send after two forEach loops have finished executing

const collect = [];
req.body.product.forEach(function(entry) {
  mongoClient.connect(databaseServerUrl, function(err, db) {
    let testCollection = db.collection('Tests');
    testCollection.find({Product: entry}).toArray((err, docs) => {
      let waiting = docs.length;
      docs.forEach(function (doc) {
        collect.push(doc);
        finish();
      });
      function finish() {
        waiting--;
        if (waiting === 0) {
          res.send(collect);
        }
      }
    });
    db.close();
  });
});
This is only getting back the first set. If I have two entries in my req.body.product array, for example, I am only getting back results for the first one. But I need to get back everything, not just the results from one collection.
Rather than performing two queries and combining the results into one array, I suggest performing a single query that gets all of the results, which would look something like this:
mongoClient.connect(databaseServerUrl, function(err, db) {
  const query = { $or: req.body.product.map(Product => ({ Product })) };
  db.collection('Tests').find(query).toArray((err, docs) => {
    // ...handle `err` here...
    res.send(docs);
    db.close();
  });
});
Note that I haven't tested this since I don't have a MongoDB database in front of me.
Your mongoClient.connect() is asynchronous, but your loop just executes without waiting for the callback.
Try an async forEach loop: enter link description here
This should solve your problem.
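As a hedged sketch of that suggestion, using async/await instead of a helper library (this assumes the 2.x driver style used in the question, where connect() resolves to a Db when no callback is given, and that the surrounding route handler is async):

// Sequential awaits instead of fire-and-forget callbacks.
const db = await mongoClient.connect(databaseServerUrl);
const collect = [];
for (const entry of req.body.product) {
  const docs = await db.collection('Tests').find({ Product: entry }).toArray();
  collect.push(...docs);
}
db.close();
res.send(collect); // sent once, after every query has completed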

nodejs pg transactions without nesting

I would like to know if it's possible to run a series of SQL statements and have them all committed in a single transaction.
The scenario I am looking at is where an array has a series of values that I wish to insert into a table, not individually but as a unit.
I was looking at the following item, which provides a framework for transactions in node using pg. The individual queries appear to be nested within one another, so I am unsure how this would work with an array containing a variable number of elements.
https://github.com/brianc/node-postgres/wiki/Transactions
var pg = require('pg');

var rollback = function(client, done) {
  client.query('ROLLBACK', function(err) {
    //if there was a problem rolling back the query
    //something is seriously messed up. Return the error
    //to the done function to close & remove this client from
    //the pool. If you leave a client in the pool with an unaborted
    //transaction weird, hard to diagnose problems might happen.
    return done(err);
  });
};

pg.connect(function(err, client, done) {
  if (err) throw err;
  client.query('BEGIN', function(err) {
    if (err) return rollback(client, done);
    //as long as we do not call the `done` callback we can do
    //whatever we want...the client is ours until we call `done`
    //on the flip side, if you do call `done` before either COMMIT or ROLLBACK
    //what you are doing is returning a client back to the pool while it
    //is in the middle of a transaction.
    //Returning a client while its in the middle of a transaction
    //will lead to weird & hard to diagnose errors.
    process.nextTick(function() {
      var text = 'INSERT INTO account(money) VALUES($1) WHERE id = $2';
      client.query(text, [100, 1], function(err) {
        if (err) return rollback(client, done);
        client.query(text, [-100, 2], function(err) {
          if (err) return rollback(client, done);
          client.query('COMMIT', done);
        });
      });
    });
  });
});
My array logic is:
banking.forEach(function(batch) {
  client.query(text, [batch.amount, batch.id], function(err, result) {
    // ...
  });
});
pg-promise offers very flexible support for transactions. See Transactions.
It also supports partial nested transactions, aka savepoints.
The library manages transactions automatically, which is what should be used these days, because too many things can go wrong if you try to organize a transaction manually, as you do in your example.
See a related question: Optional INSERT statement in a transaction
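For illustration, a minimal pg-promise sketch of the array case (the connection string is a placeholder and the INSERT statement is illustrative only, since the statement quoted in the question mixes INSERT with a WHERE clause):

const pgp = require('pg-promise')();
const db = pgp('postgres://user:password@localhost:5432/testdb'); // hypothetical connection details

// One transaction that inserts every element of the array.
function insertBatch(banking) {
  return db.tx(t => {
    const queries = banking.map(batch =>
      t.none('INSERT INTO account(money, id) VALUES($1, $2)', [batch.amount, batch.id])
    );
    return t.batch(queries); // resolves when all inserts succeed; rejects and rolls back otherwise
  });
}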
Here's a simple TypeScript solution that avoids pg-promise:
import { PoolClient } from "pg"
import { pool } from "../database"

const tx = async (callback: (client: PoolClient) => void) => {
  const client = await pool.connect();
  try {
    await client.query('BEGIN')
    try {
      await callback(client)
      await client.query('COMMIT')
    } catch (e) {
      await client.query('ROLLBACK')
    }
  } finally {
    client.release()
  }
}

export { tx }
export { tx }
Usage:
...
let result;
await tx(async client => {
  const { rows } = await client.query<{ cnt: string }>('SELECT COUNT(*) AS cnt FROM users WHERE username = $1', [username]);
  result = parseInt(rows[0].cnt) > 0;
});
return result;
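Note one design consequence of the helper as written: the inner catch issues the ROLLBACK but does not rethrow, so tx resolves normally even when the transaction fails. Callers that need to distinguish success from rollback may want to rethrow e (or return a flag) after the ROLLBACK.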
