I'm trying to read a CSV file into a JSON array with Node.js and then update MongoDB from it. Is there a way to loop over the data and update the database based on the JSON using this code? If not, is there a way to do it asynchronously in Node? I keep running into promise issues.
Here is my code:
import csvtojson from "csvtojson";
import { MongoClient } from "mongodb";

const csvFilePath = "./data.csv";
console.log(" ");
const uri = "mongodb://localhost:27017";

async function listDatabases(client) {
  const databasesList = await client.db().admin().listDatabases();
  console.log("Databases:");
  databasesList.databases.forEach((db) => console.log(` - ${db.name}`));
}

async function main() {
  const client = new MongoClient(uri);
  try {
    await client.connect();
    await listDatabases(client);
  } catch (e) {
    console.error(e);
  } finally {
    await client.close();
  }
}

main().catch(console.error);
async function updateData() {
  const client = new MongoClient(uri);
  const data = await csvtojson().fromFile(csvFilePath);
  console.log(data.length);
  data.map((datas) => {
    // console.log(datas);
    console.log(Object.keys(datas), Object.values(datas));
  });
  try {
    await client.connect();
    const db = client.db("nabatiform");
    const stuff = db.collection("users");
    data.map((datas) => {
      // console.log(datas);
      console.log(Object.keys(datas), Object.values(datas));
      const result = await stuff.findOneAndUpdate({})
    });
    console.log(result);
  } catch (e) {
    console.error(e);
  } finally {
    await client.close();
  }
}

updateData().catch(console.error);
Here is the JSON I read from the CSV:
[
  {
    NIK: '22000028',
    Plant: 'Majalasgka',
    fullname: 'FERI FsaYAH',
    Group_Shift: 'R1',
    Gedung_Zona: 'Gas A',
    Sector: 'SEKTOas 08',
    SPV: 'TasI SUasWATI'
  },
  {
    NIK: '22000330',
    Plant: 'Majaaka',
    fullname: 'AYasdMAYANTI',
    Group_Shift: 'NSHT',
    Gedung_Zona: 'GEDU',
    Sector: 'SE-08',
    SPV: 'TI'
  }
]
Here is what a document looks like in MongoDB:
{
  "_id": {
    "$oid": "6369b17b11e02557349d8de5"
  },
  "fullname": "EGA PERMANA SAPUTRA",
  "password": "$2b$10$TuKKwzIxmqvnJfR8LRV/zu1s.Gqpt4yANLAcNNFQ6pqTuLL82.00q",
  "NIK": "17000691",
  "status": "active",
  "department": "Prodaucasdfation",
  "position": "Foreasdman",
  "Group_Shift": "R1",
  "role": "user",
  "__v": 0,
  "createdAt": 1667871099,
  "updatedAt": 1669025651,
  "userInformation": {},
  "plant": "Majasangka"
}
Use a forEach() loop to push each findOneAndUpdate() promise into an array, then execute them all concurrently with Promise.all(). This is much faster than awaiting each call individually inside a map() or for loop.
async function updateData() {
  const promises = [];
  const client = new MongoClient(uri);
  const data = await csvtojson().fromFile(csvFilePath);
  console.log(data.length);
  try {
    await client.connect();
    const db = client.db("nabatiform");
    const stuff = db.collection("users");
    data.forEach((datas) => {
      console.log(Object.keys(datas), Object.values(datas));
      // Push each promise to the array without awaiting it
      promises.push(stuff.findOneAndUpdate({})); // supply your real filter and update here
    });
    // Execute all promises concurrently
    const results = await Promise.all(promises);
    console.log(results);
  } catch (e) {
    console.error(e);
  } finally {
    await client.close();
  }
}
Use await when calling updateData() (top-level await works here because the file is an ES module):
await updateData().catch(console.error);
I would suggest a for loop instead of map, because async/await inside a map callback just won't pause the surrounding function, while a for loop does not have that problem.
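A minimal sketch of the difference, using a hypothetical doWork helper (not part of the code above):

const doWork = async (item) => item; // stand-in for a real async task

async function processAll(items) {
  // map() starts every call immediately and returns pending promises;
  // an await inside the callback does not pause processAll itself.
  const promises = items.map((item) => doWork(item));
  await Promise.all(promises); // you still have to collect them yourself

  // A for...of loop, by contrast, awaits each call before starting the next.
  for (const item of items) {
    await doWork(item);
  }
}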
I'm no JavaScript wizard, but hopefully this is helpful. Perhaps a little modification of your updateData function will make the updates; I don't really have a way to test it. Note that the loop below is a for...of rather than a map or forEach, so that await works inside it.
async function updateData() {
  const client = new MongoClient(uri);
  const data = await csvtojson().fromFile(csvFilePath);
  console.log(data.length);
  try {
    await client.connect();
    const db = client.db("nabatiform");
    const stuff = db.collection("users");
    // Below here is what I modified.
    for (const element of data) {
      // Use the NIK field as the filter...
      const filter = Object.fromEntries(Object.entries(element).filter(elem =>
        elem[0] == "NIK"
      ));
      // ...and every other field as the update.
      const updateFields = Object.fromEntries(Object.entries(element).filter(elem =>
        elem[0] != "NIK"
      ));
      const update = { $set: updateFields };
      const result = await stuff.findOneAndUpdate(filter, update);
      console.log(result);
    }
  } catch (e) {
    console.error(e);
  } finally {
    await client.close();
  }
}
Related
I have created a Node.js application organised as modules. The problem I am facing is that a MongoDB insertion returns undefined from one of my controllers. The issue I found is that my async function doesn't wait for the MongoDB operation to complete, but I could not find a solution. My route and controller code is given below.
route.js
const {
  createEvent, editEvent
} = require('./controller');

router.post("/event/create", validateEventManage, isRequestValidated, async (req, res) => {
  let data = {};
  data.body = req.body;
  try {
    let event = await createEvent(req.body);
    console.log(event); // returned undefined
    data.event = event;
    res.status(200).json(data);
  } catch (error) {
    console.log(error);
    res.status(200).json({ error: error });
  }
});
controller.js
exports.createEvent = async (data) => {
  // return "test" // This works correctly
  const eventObj = {
    name: data.name,
    description: data.desc,
    type: data.type,
    startDate: new Date()
  };
  const event = await new Event(eventObj);
  await event.save((error, event) => {
    if (error) {
      return error;
    }
    if (event) {
      return event;
    }
  });
};
You should not await the new Event constructor. Also, since you are using async/await, you can remove the callback from save() and try ... catch the error to handle it. (In your original code, the return statements inside the save callback return from that callback, not from createEvent, which is why createEvent resolves to undefined.)
exports.createEvent = async (data) => {
  // return "test" // This works correctly
  const eventObj = {
    name: data.name,
    description: data.desc,
    type: data.type,
    startDate: new Date(),
  };
  try {
    const event = new Event(eventObj);
    await event.save();
    return event;
  } catch (error) {
    return error;
  }
};
I created a plugin for simple queries with caching and connection pooling. When I respond with that plugin (function), the response is slower than before, so I wonder if I got the plugin idea wrong. Is this a correct use, or am I making a mistake somewhere?
db.js
const fp = require('fastify-plugin');
const oracledb = require('oracledb');

oracledb.outFormat = oracledb.OUT_FORMAT_OBJECT;
oracledb.autoCommit = true;

module.exports = fp(async function (fastify, opts) {
  fastify.decorate('simpleSelectWithCache', async function (key, ttl, sql) {
    let cached = await fastify.cache.get(key);
    if (cached) {
      console.log('Cached:', cached.item);
      return cached.item;
    } else {
      let connection;
      try {
        connection = await oracledb.getConnection();
        const data = await connection.execute(sql);
        fastify.cache.set(key, data.rows, ttl);
        console.log('Real:', data.rows);
        return data.rows;
        // oracledb.getPool()._logStats(); // show pool statistics. _enableStats must be true
      } catch (error) {
        console.error(error);
      } finally {
        if (connection) await connection.close();
      }
    }
  });
});
api.js
module.exports = async function (fastify, opts) {
  fastify.get(
    '/cached',
    {
      schema: {
        description: 'Shared Api',
        tags: ['Shared'],
      },
    },
    async function (req, reply) {
      const data = await fastify.simpleSelectWithCache('shared-cached', 60 * 1000, 'SELECT id FROM users WHERE id < 50');
      reply.send(data);
    }
  );
};
Is this a correct use or am I making a mistake somewhere?
Creating a connection is a heavy operation, and for every query a new connection (a new socket) is created between your server and the DB. To optimize your plugin, create the connection pool once at startup:
module.exports = fp(async function (fastify, opts) {
  await oracledb.createPool({
    user: opts.user,
    password: opts.password,
    connectString: opts.connectString
  });

  fastify.decorate('simpleSelectWithCache', async function (key, ttl, sql) {
    const cached = await fastify.cache.get(key);
    if (cached) {
      console.log('Cached:', cached.item);
      return cached.item;
    } else {
      let connection;
      try {
        connection = await oracledb.getConnection();
        const data = await connection.execute(sql);
        fastify.cache.set(key, data.rows, ttl);
        console.log('Real:', data.rows);
        return data.rows;
        // oracledb.getPool()._logStats(); // show pool statistics. _enableStats must be true
      } catch (error) {
        console.error(error);
      } finally {
        if (connection) await connection.close();
      }
    }
  });

  fastify.addHook('onClose', (instance, done) => {
    oracledb.getPool().close(10)
      .then(done)
      .catch(done);
  });
});

// then register your plugin
fastify.register(myOraclePlugin, {
  user: 'ora',
  password: '1234',
  connectString: 'foo'
});
I am trying to batch commit, but when both queries return something I get the error Cannot modify a WriteBatch that has been committed. How should I solve this issue? Can anyone suggest a solution? Should I initialize two arrays, push the batch.commit() calls into them, and then resolve the promises?
module.exports = async (change) => {
  try {
    const timerSnapshot = change.before.data().timestamp;
    console.log(timerSnapshot, "before");
    const timerTimestampMinusOne = momentTz(timerSnapshot)
      .tz("Asia/Kolkata")
      .subtract(1, "days")
      .valueOf();
    const timerMinusThreeHours = momentTz(timerSnapshot)
      .tz("Asia/Kolkata")
      .subtract(3, "hours")
      .valueOf();
    const [profileSnapshot, threeHourReminder] = await Promise.all([
      db
        .collection("Profiles")
        .where("lastQueryFrom", ">", timerTimestampMinusOne)
        .get(),
      db
        .collection("Profiles")
        .where("lastQueryFrom", ">", timerMinusThreeHours)
        .get(),
    ]);
    const batch = db.batch();
    const batchArray = [];
    console.log(profileSnapshot.docs.length, "profile");
    console.log(threeHourReminder.docs.length, "three");
    profileSnapshot.forEach((doc) => {
      batch.set(
        db.collection("Timers").doc(getISO8601Date()),
        {
          timestamp: momentTz().tz("Asia/Kolkata").valueOf(),
        },
        { merge: true }
      );
      batchArray.push(batch.commit());
    });
    threeHourReminder.forEach((doc) => {
      batch.set(
        db.collection("Timers").doc(getISO8601Date()),
        {
          timestamp: momentTz().tz("Asia/Kolkata").valueOf(),
        },
        { merge: true }
      );
      batchArray.push(batch.commit());
    });
    await Promise.all(batchArray);
  } catch (error) {
    console.error(error);
  }
};
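The error appears because batch.commit() is called inside each forEach iteration and set() is then called again on the same, already-committed batch. A minimal sketch of one way around it, reusing db, profileSnapshot, threeHourReminder, getISO8601Date, and momentTz from the code above (this fragment would replace the two forEach loops and the Promise.all in the try block): queue every write first, then commit exactly once.

const batch = db.batch();
const queueTimerWrite = () => {
  batch.set(
    db.collection("Timers").doc(getISO8601Date()),
    { timestamp: momentTz().tz("Asia/Kolkata").valueOf() },
    { merge: true }
  );
};
profileSnapshot.forEach(queueTimerWrite);
threeHourReminder.forEach(queueTimerWrite);
// Commit once, after all writes are queued (a single batch caps at 500 operations).
await batch.commit();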
I have a function that takes an array of jobs as a parameter. The function checks the existence of each job in the database by its id. If a job is not present in the database, that particular job needs to be pushed into an array called latestJobs. I'm calling this function in my main.js file, but the code breaks and stops.
Below is my main.js code:
module.exports.app = async () => {
  try {
    ...
    const jobs = await getJobsForCountries(body);
    const latestJobs = await filterPreDraftedJobs(jobs);
    console.log('latestJobs', latestJobs);
  } catch (e) {
    console.error('Error:- ', e); // Comes to here
  }
};
My checker function looks like:
module.exports = async (jobs) => {
  let latestJobs = [];
  for (const job of jobs) {
    const params = {
      TableName: process.env.DYNAMODB_TABLE,
      Key: {
        id: job.Id
      }
    };
    await dynamoDb.get(params, (err, data) => {
      if (err) {
        latestJobs.push(job);
        console.log('Job not found in DB');
      }
    }).promise();
  }
  return latestJobs;
};
How can I fix this issue? I want latestJobs to contain the jobs that are not present in the database. Is there a DynamoDB function that can do this for me?
You are mixing callback, promise, and await styles. I would do it like this:
module.exports = async (jobs) => {
  let latestJobs = [];
  for (const job of jobs) {
    const params = {
      TableName: process.env.DYNAMODB_TABLE,
      Key: {
        id: job.Id
      }
    };
    try {
      const result = await dynamoDb.get(params).promise();
      // get() resolves with an empty object when the item is missing,
      // so check result.Item, and continue (not return) so the rest of
      // the jobs are still checked
      if (result && result.Item) {
        continue;
      }
      latestJobs.push(job);
    } catch (err) {
      latestJobs.push(job);
    }
  }
  return latestJobs;
};
Also, make sure that the table is created and that the region and table name you are passing are correct.
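For example, a minimal sketch of that client setup with the AWS SDK v2 (the region is a placeholder; use the one your table actually lives in):

const AWS = require("aws-sdk");

// The client region must match the table's region, and DYNAMODB_TABLE must
// hold the exact table name used above.
const dynamoDb = new AWS.DynamoDB.DocumentClient({ region: "us-east-1" });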
I am not very familiar with DynamoDB, but looking at the above conversation the code should be something like this. I have tried to improve performance while keeping the code modular and readable.
async function addUpdateJobs(jobs) {
  const paramsArray = [];
  for (const job of jobs) {
    const jobParams = {
      params: {
        TableName: process.env.DYNAMODB_TABLE,
        Key: {
          id: job.Id
        }
      },
      job: job
    };
    paramsArray.push(jobParams);
  }
  return getJobs(paramsArray);
}

async function getJobs(paramsArray) {
  const latestJobs = [];
  // Fire all the lookups in parallel and wait for every one to settle;
  // a plain forEach would return before the async callbacks finished.
  await Promise.all(paramsArray.map(async (jobParam) => {
    try {
      const result = await dynamoDb.get(jobParam.params).promise();
      if (result && result.Item) return; // job already exists in the table
      latestJobs.push(jobParam.job);
    } catch (err) {
      latestJobs.push(jobParam.job);
    }
  }));
  return latestJobs;
}
PS: I was also going through error handling in Amazon DynamoDB.
To illuminate the problem I'm having getting a nodejs/mssql application working, I've attempted to code two functionally equivalent versions of a simple (prepared) INSERT statement wrapped in a transaction.
The callbacks version works - it inserts a row into my SQL Server DB.
The async / await version throws an error -
TransactionError: Can't commit transaction. There is a request in progress.
I have tried many variations of the failing version (statement reordering where plausible), but the version included below is the version that most closely mimics the logic of the working, callbacks version.
Thank you!
var sql = require('mssql'); // mssql: 4.1.0; tedious: 2.2.4; node: v8.4.0

var cfg = {
  "db": "sqlserver",
  "domain": "XXXXXX",
  "user": "cseelig",
  "password": "xxxxxx",
  "server": "xxxxxx.xxxxxx.xxxxxx.xxxxxx",
  "port": 1433,
  "stream": false,
  "options": {
    "trustedConnection": true
  },
  "requestTimeout": 900000,
  "connectionTimeout": 30000,
  "pool": {
    "max": 3,
    "min": 0,
    "idleTimeoutMillis": 30000
  }
};
var statement = "insert into wng_dw.dbo.D_LIB_Google_Search_Query (query, LastUpdateDate) values (#query, GetDate())";
// I only run one or the other -
main1("12347"); // fails
main2("98765:); // works
async function main1(val) {
  try {
    const conn = await new sql.connect(cfg);
    const transaction = new sql.Transaction();
    await transaction.begin();
    const ps = new sql.PreparedStatement(transaction);
    ps.input('query', sql.VarChar(200));
    await ps.prepare(statement);
    await ps.execute({ "query": val });
    await ps.unprepare();
    await transaction.commit();
    sql.close();
  } catch (err) {
    console.log("Error: " + err);
  }
  process.exit(0);
}
async function main2(val) {
  sql.connect(cfg, err => {
    const transaction = new sql.Transaction();
    transaction.begin(err => {
      const ps = new sql.PreparedStatement(transaction);
      ps.input('query', sql.VarChar(200));
      ps.prepare(statement, err => {
        ps.execute({ "query": val }, (err, result) => {
          ps.unprepare(err => {
            transaction.commit(err => {
              sql.close();
            });
          });
        });
      });
    });
  });
}
transaction.begin does not return a Promise. You could simply promisify it, something like the following:
await new Promise(resolve => transaction.begin(resolve));
const request = new sql.Request(transaction);
//...
await transaction.commit();
After a commit or rollback, the request object can no longer be used; reusing it will raise an error saying the transaction has not begun.
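For instance, a sketch assuming a connected pool (pool here stands in for the result of sql.connect(cfg)): create a fresh Request per transaction and discard it after the commit.

const transaction = new sql.Transaction(pool);
await new Promise((resolve, reject) =>
  transaction.begin((err) => (err ? reject(err) : resolve()))
);
const request = new sql.Request(transaction); // bound to this transaction only
await request.query(
  "insert into wng_dw.dbo.D_LIB_Google_Search_Query (query, LastUpdateDate) values ('test', GetDate())"
);
await transaction.commit();
// Don't reuse `request` after this point; create a new one for the next transaction.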
Hope this helps.
Before you can commit or roll back a transaction, all statements have to be unprepared.
You have to await the unprepare call too, otherwise the request is still in progress and the execute promise hasn't resolved yet.
Use a little wrapper to make things easy:
import * as dotenv from 'dotenv'
import mssql from 'mssql'

dotenv.config()

const sqlServerConfig = {
  server: process.env.SQL_SERVER,
  user: process.env.QS_USER,
  password: process.env.QS_PASS,
  options: { enableArithAbort: false },
}

let pool: mssql.ConnectionPool
let transaction: mssql.Transaction
const statements: mssql.PreparedStatement[] = []

export const connect = async (): Promise<void> => {
  pool = new mssql.ConnectionPool({ ...sqlServerConfig, database: process.env.DATABASE })
  await pool.connect()
}

export const disconnect = async (): Promise<void> => {
  if (typeof pool == 'undefined') return
  if (pool.connected) await pool.close()
}

export const begin = async (): Promise<void> => {
  transaction = new mssql.Transaction(pool)
  await transaction.begin()
}

export const unprepare = async (statement: mssql.PreparedStatement): Promise<void> => {
  if (typeof statement == 'undefined') return
  if (statement.prepared) await statement.unprepare()
}

export const commit = async (): Promise<void> => {
  await transaction.commit()
}

export const rollback = async (): Promise<void> => {
  for (const statement of statements) {
    await unprepare(statement)
  }
  if (typeof transaction == 'undefined') return
  await transaction.rollback()
}

export const createStatement = (): mssql.PreparedStatement => {
  const statement = new mssql.PreparedStatement(transaction)
  statements.push(statement)
  return statement
}
Usage:
try {
  await connect()
  await begin()
  const myStatement = createStatement()
  // ..... bind parameters
  // ..... prepare statement
  for ( ..... ) {
    await myStatement.execute( ..... )
  }
  await unprepare(myStatement)
  await commit()
  await disconnect()
  exit(0)
}
catch (e) {
  log.error(e)
  await rollback()
  await disconnect()
  exit(1)
}
You create a prepared statement with createStatement(). createStatement keeps track of the statements, so if you roll back, they will be unprepared for you when rollback() runs.