I'm trying to bulk insert/update data in a database, but I get an error and I don't know why. This is my code.
This is the function that performs the bulk insert inside a transaction:
async bulkInsertFeatureTreatment(req, res) {
const { featuresTreatments } = await decryptRequest(req);
await db
.tx((t) => {
const queries = featuresTreatments.map((featureTreatment) => {
return t.none(
featuresTreatmentsDB.insertFeatureTreatment(featureTreatment)
);
});
return t.batch(queries);
})
.then((data) =>
cryptedResponse(res, response(200, INSERT_DATA_SUCCESS, data))
)
.catch((error) =>
cryptedResponse(res, response(500, INSERT_DATA_NOT_SUCCESS, error))
);
}
This is the insert/update query builder that the function above calls:
insertFeatureTreatment: (featuresTreatments) =>
`INSERT INTO ${TABLE_NAME} (${COL_ID_TREATMENT}, ${COL_ID_FEATURE}, ${COL_VALUE})
VALUES ('${featuresTreatments.id_treatment}', '${featuresTreatments.id_feature}', '${featuresTreatments.value}')
ON CONFLICT (${COL_ID_TREATMENT}, ${COL_ID_FEATURE})
DO UPDATE ${TABLE_NAME}
SET ${COL_VALUE} = '${featuresTreatments.value}'
WHERE ${COL_ID_TREATMENT} = '${featuresTreatments.id_treatment}'
AND ${COL_ID_FEATURE} = '${featuresTreatments.id_feature}'`,
OK, that was a syntax error in the ON CONFLICT clause, so this is the result:
insertFeatureTreatment: (featuresTreatments) =>
`INSERT INTO ${TABLE_NAME} (${COL_ID_TREATMENT}, ${COL_ID_FEATURE}, ${COL_VALUE})
VALUES ('${featuresTreatments.id_treatment}', '${featuresTreatments.id_feature}', '${featuresTreatments.value}')
ON CONFLICT (${COL_ID_TREATMENT}, ${COL_ID_FEATURE})
DO UPDATE SET ${COL_VALUE} = '${featuresTreatments.value}'`,
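One caveat with this approach: interpolating the values directly into the SQL string is vulnerable to SQL injection and breaks on values containing quotes. A safer sketch (assuming pg-promise; the table and column names below are hypothetical stand-ins for the real constants, which aren't shown) uses named parameters and `EXCLUDED`:

```javascript
// Sketch: build a parameterized upsert for pg-promise.
// Table and column names are hypothetical stand-ins for the original constants.
const TABLE_NAME = 'features_treatments';
const COL_ID_TREATMENT = 'id_treatment';
const COL_ID_FEATURE = 'id_feature';
const COL_VALUE = 'value';

const insertFeatureTreatment = (ft) => ({
  // $/name/ is pg-promise's named-parameter syntax; values are escaped by the driver.
  query: `INSERT INTO ${TABLE_NAME} (${COL_ID_TREATMENT}, ${COL_ID_FEATURE}, ${COL_VALUE})
          VALUES ($/id_treatment/, $/id_feature/, $/value/)
          ON CONFLICT (${COL_ID_TREATMENT}, ${COL_ID_FEATURE})
          DO UPDATE SET ${COL_VALUE} = EXCLUDED.${COL_VALUE}`,
  values: ft,
});

// Inside the transaction, the call would then become:
//   const q = insertFeatureTreatment(featureTreatment);
//   t.none(q.query, q.values)
```

`EXCLUDED` refers to the row that failed to insert, so the update clause needs no repeated WHERE conditions.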
I'm building a NodeJS REST API.
I'm trying to load data from a Firebase collection and send it back to the user as the API response.
It looks like the problem is that the code doesn't wait for the Firebase fetch to resolve, but sends back a response without the collection data (I tried async/await, but it's not working).
exports.getChatMessages = async (req, res, next) => {
const chatId = req.params.chatId
const getChatData = () => {
db
.collection('chats')
.doc(chatId)
.collection('messages')
.orderBy('timeStamp', 'asc')
.onSnapshot((snapshot) => {
snapshot.docs.forEach(msg => {
console.log(msg.data().messageContent)
return {
authorID: msg.data().authorID,
messageContent: msg.data().messageContent,
timeStamp: msg.data().timeStamp,
}
})
})
}
try {
const chatData = await getChatData()
console.log(chatData)
res.status(200).json({
message: 'Chat Has Found',
chatData: chatData
})
} catch (err) {
if (!err.statusCode) {
err.statusCode = 500
}
next(err)
}
}
As you can see, I've used two console.logs to figure out the problem. The terminal output looks like:
[] (from console.log(chatData))
All the messages (from console.log(msg.data().messageContent))
Is there any way to block the code until the Firebase data has really been fetched?
If I understand correctly, you want to send back an array of all the documents present in the messages subcollection. The following should do the trick.
exports.getChatMessages = async (req, res, next) => {
const chatId = req.params.chatId;
const collectionRef = db
.collection('chats')
.doc(chatId)
.collection('messages')
.orderBy('timeStamp', 'asc');
try {
const chatsQuerySnapshot = await collectionRef.get();
const chatData = [];
chatsQuerySnapshot.forEach((msg) => {
console.log(msg.data().messageContent);
chatData.push({
authorID: msg.data().authorID,
messageContent: msg.data().messageContent,
timeStamp: msg.data().timeStamp,
});
});
console.log(chatData);
res.status(200).json({
message: 'Chat Has Found',
chatData: chatData,
});
} catch (err) {
if (!err.statusCode) {
err.statusCode = 500;
}
next(err);
}
};
The asynchronous get() method returns a QuerySnapshot on which you can call forEach() for enumerating all of the documents in the QuerySnapshot.
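Since the QuerySnapshot exposes its documents through forEach() (or the docs array), the transform step can be factored into a small pure function. A minimal sketch, using a fake snapshot object that only stands in for a real Firestore result (the real one comes from collectionRef.get()):

```javascript
// Sketch: extract plain message objects from a QuerySnapshot-like object.
// `snapshot.docs` is the array of documents; each document's data() returns its fields.
const toChatData = (snapshot) =>
  snapshot.docs.map((msg) => ({
    authorID: msg.data().authorID,
    messageContent: msg.data().messageContent,
    timeStamp: msg.data().timeStamp,
  }));

// Fake snapshot for illustration only; Firestore builds the real one.
const fakeSnapshot = {
  docs: [
    { data: () => ({ authorID: 'a1', messageContent: 'hi', timeStamp: 1 }) },
    { data: () => ({ authorID: 'a2', messageContent: 'yo', timeStamp: 2 }) },
  ],
};
console.log(toChatData(fakeSnapshot).length); // 2
```

Keeping the mapping separate from the query also makes this logic trivially unit-testable without touching Firestore.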
You can only await a Promise. Currently, getChatData() does not return a Promise, so awaiting it is pointless. You are trying to await a fixed value, so it resolves immediately and jumps to the next line. console.log(chatData) happens. Then, later, your (snapshot) => callback happens, but too late.
const getChatData = () => new Promise(resolve => { // Return a Promise, so it can be awaited
db.collection('chats')
.doc(chatId)
.collection('messages')
.orderBy('timeStamp', 'asc')
.onSnapshot(resolve) // Equivalent to .onSnapshot((snapshot) => resolve(snapshot))
})
const snapshot = await getChatData();
console.log(snapshot)
// Put your transform logic out of the function that calls the DB. A function should only do one thing if possible : call or transform, not both.
const chatData = snapshot.docs.map(msg => ({
authorID: msg.data().authorID,
messageContent: msg.data().messageContent,
timeStamp: msg.data().timeStamp,
}));
res.status(200).json({
message: 'Chat Has Found',
chatData
})
Right now, getChatData is this (short version):
const getChatData = () => {
db
.collection('chats')
.doc(chatId)
.collection('messages')
.orderBy('timeStamp', 'asc')
.onSnapshot((snapshot) => {}) // some things inside
}
What that means is that the getChatData function calls some db query, and then returns void (nothing). I bet you'd want to return the db call (hopefully it's a Promise), so that your await does some work for you. Something along the lines of:
const getChatData = async () =>
db
.collection('chats')
// ...
Which is the same as const getChatData = async() => { return db... }
Update: Now that I've reviewed the docs once again, I see that you use onSnapshot, which is meant for updates and can fire multiple times. The first call actually makes a request, but it then continues to listen for those updates. Since this seems like a regular request-response and you want it to happen only once, use .get() (see the docs) instead of .onSnapshot(); otherwise those listeners would stay around and cause trouble. .get() returns a Promise, so the sample fix I mentioned above works perfectly, and you don't need to change other pieces of the code.
I currently have the following code
router.get('/uri', (request,response) => {
let final = [];
TP.find({userID: request.userID})
.then(tests =>{
tests.forEach(test => {
A.findById(test.assignmentID)
.then(assignment => {
final.push({
testID: test._id,
name: assignment.title,
successRate: `${test.passedTests}/${test.totalTests}`
})
})
.catch(error => {
console.log(error)
})
})
return response.send(final);
})
.catch(err => {
console.log(err);
return response.sendStatus(500);
})
})
The code is supposed to query 2 MongoDB databases and construct an array of objects with specific information which will be sent to the client.
However, I always get an empty array when I call that endpoint.
I have tried making the functions async and making them wait for the results of the nested functions, but without success: still an empty array.
Any suggestions are appreciated!
forEach doesn't wait for the promises inside it. Either use a for..of loop or change it to Promise.all. The above code can be simplified as:
router.get('/uri', async (request,response) => {
const tests = await TP.find({userID: request.userID});
const final = await Promise.all(tests.map(async test => {
const assignment = await A.findById(test.assignmentID);
return {
testID: test._id,
name: assignment.title,
successRate: `${test.passedTests}/${test.totalTests}`
};
}));
return response.send(final);
});
Hope this helps.
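For completeness, the for..of variant runs the lookups sequentially instead of in parallel, which can be easier to reason about. A sketch with TP and A stubbed out (in the original they are Mongoose models, so the stubs below are only stand-ins):

```javascript
// Sketch: sequential version of the same aggregation using for..of.
// TP.find and A.findById are stubs here; the real ones are Mongoose model calls.
const TP = {
  find: async () => [{ _id: 't1', assignmentID: 'a1', passedTests: 3, totalTests: 4 }],
};
const A = { findById: async () => ({ title: 'Essay' }) };

const buildFinal = async (userID) => {
  const tests = await TP.find({ userID });
  const final = [];
  for (const test of tests) {
    // Each iteration awaits before the next starts (unlike forEach).
    const assignment = await A.findById(test.assignmentID);
    final.push({
      testID: test._id,
      name: assignment.title,
      successRate: `${test.passedTests}/${test.totalTests}`,
    });
  }
  return final;
};
```

Promise.all is usually faster since the findById calls overlap; for..of is preferable when each step depends on the previous one or when the database dislikes concurrent queries.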
In Azure, I have a javascript HTTPTrigger Function App with:
const azure = require('azure-storage')
const tableSvc = azure.createTableService(
process.env.COSMOS_TABLE_ACCOUNT,
process.env.COSMOS_TABLE_KEY,
process.env.COSMOS_TABLE_ENDPOINT
)
const entGen = azure.TableUtilities.entityGenerator
const testData = {
PartitionKey: entGen.String('test'),
RowKey: entGen.String(1),
name: entGen.String('It works!')
}
const insertTestData = () => new Promise((resolve, reject) => {
tableSvc.insertEntity('tests', testData, (error, res) => {
if (error) return reject(error)
resolve(res)
})
})
...
I've confirmed that the environment variables are all set and populated with the values from Azure Cosmos DB -> Cosmos Table Instance -> Connection String.
I've also tried connecting with:
const tableSvc = azure.createTableService(
process.env.COSMOS_TABLE_CONNECTION_STRING
)
When I call insertTestData(), I'm getting an error back from the .insertEntity callback with an empty object: {}. No entities are being added to my tests table, as confirmed by the Data Explorer.
Any ideas how to perform this operation or get more information in my debugger? I have an Insight monitor attached to the process, but it reports a successful completion.
I noticed that you're passing a numeric value for RowKey attribute.
RowKey: entGen.String(1)
When I used the code, it complained about that.
When I changed the code to:
RowKey: entGen.String('1')
I was able to insert the entity.
Here's my complete code:
const azure = require('azure-storage')
const tableSvc = azure.createTableService(
'account-name',
'account-key',
'https://account-name.table.core.windows.net'
)
const entGen = azure.TableUtilities.entityGenerator
const testData = {
PartitionKey: entGen.String('test'),
RowKey: entGen.String('1'),
name: entGen.String('It works!')
}
console.log(testData);
const insertTestData = () => new Promise((resolve, reject) => {
tableSvc.insertEntity('test', testData, (error, res) => {
if (error) return reject(error)
resolve(res)
})
})
console.log('----------------');
insertTestData()
.then((result) => {
console.log('result');
console.log(result);
})
.catch((error) => {
console.log('error');
console.log(error);
})
I used azure-storage NPM package (version 2.10.2).
I am using Knex.js to insert values from an array into a PostgreSQL database. The problem I keep running into is that Knex will hang after inserting rows in the database.
I've been struggling with this for several hours, and have tried a variety of solutions, including Get Knex.js transactions working with ES7 async/await, Make KnexJS Transactions work with async/await, and Knex Transaction with Promises.
No matter which flavor I try, I come back to the hang. I'm pretty sure I'm missing something obvious, but it's possible I haven't had enough coffee.
Here's my test code:
const testArray = [
{line: 'Canterbury Tales'},
{line: 'Moby Dick'},
{line: 'Hamlet'}
];
const insertData = (dataArray) => {
return new Promise( (resolve, reject) => {
const data = dataArray.map(x => {
return {
file_line: x.line
};
});
let insertedRows;
db.insert(data)
.into('file_import')
.then((result) => {
insertedRows = result.rowCount;
resolve(insertedRows);
})
});
}
const testCall = (b) => {
insertData(b).then((result) => {
console.log(`${result} rows inserted.`);
})
}
testCall(testArray);
This returns the following, after which the script hangs:
3 rows inserted.
EDIT: Updating with solution
Thanks to @sigmus, I was able to get this working by adding db.destroy(). Here's the updated code block, fully functional:
const testArray = [
{line: 'Canterbury Tales'},
{line: 'Moby Dick'},
{line: 'Hamlet'}
];
const insertData = (dataArray) => {
return new Promise( (resolve, reject) => {
const data = dataArray.map(x => {
return {
file_line: x.line
};
});
let insertedRows;
db.insert(data)
.into('file_import')
.then((result) => {
insertedRows = result.rowCount;
resolve(insertedRows);
})
.finally(() => {
db.destroy();
});
});
}
const testCall = (b) => {
insertData(b).then((result) => {
console.log(`${result} rows inserted.`);
process.exit(0);
})
}
testCall(testArray);
If you add process.exit(0); right after console.log(`${result} rows inserted.`);, the script should exit.
It may also be a connection pool issue; try using destroy as explained here: https://knexjs.org/#Installation-pooling
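Since the insert itself already returns a promise, the new Promise wrapper isn't needed, and the pool teardown fits naturally in a try/finally. A sketch of that pattern; `db` below is a tiny mock standing in for a configured Knex instance (with real Knex, db.insert(...).into(...) is thenable and db.destroy() closes the pool so Node can exit):

```javascript
// Sketch: insert rows and always tear down the pool afterwards.
// `db` is a mock standing in for a configured Knex instance.
const db = {
  insert: (rows) => ({ into: async (table) => ({ rowCount: rows.length }) }),
  destroyed: false,
  destroy() { this.destroyed = true; },
};

const insertData = async (dataArray) => {
  const data = dataArray.map((x) => ({ file_line: x.line }));
  try {
    const result = await db.insert(data).into('file_import');
    return result.rowCount;
  } finally {
    db.destroy(); // release the pool even if the insert throws
  }
};
```

Note that destroy() ends the pool for good; in a long-running server you would keep the pool alive and only destroy it on shutdown.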
This code results in a "Connection not yet open." error. The pool is connected and available for the first select, where I get the records to update.
After I process my data, I have an array of UPDATE statements. Then the inline async function runs and results in the error above.
I have also attempted to run multiple UPDATE statements in a single query. This results in an UNKNOWN error unless the updates array has only a single member.
This is running against SQL Server 2000.
const doQuery = async (pool, sqlStr) => {
return await pool.request().query(sqlStr);
};
const updateResidental = async args => {
let toUpdateSql = `SELECT * FROM blah WHERE blah=blah`;
const toUpdate = (await doQuery(args.pool, toUpdateSql)).recordset;
const sqlStrings = ['UPDATE blah1;','UPDATE blah2;','UPDATE blah3;'];
(async pool => {
return await Promise.all(sqlStrings.map(async sqlStr => {
return await doQuery(pool, sqlStr);
})).then(results => {
console.log(results);
}).catch(err => {
console.log(err)
});
})(args.pool);
}