How can I make this simple for-loop work? - nestjs

Basically, I'm trying to create a many-to-one relationship between 'Needs' and 'Pgroup'. However, for my use case, the create requests arrive with all the data required to create one Pgroup together with all of its Needs, in one shot, like so...
This is the code I have worked on so far (a little background):
async new(data: PgroupEntity) {
  // const pgp = await this.pgrouprepository.create(
  // await this.pgrouprepository.save(pgp);
  const pp = await this.pgrouprepository.findOne({ where: { id: 'c682620d-9717-4d3c-bef9-20a31d743a99' } });
  // This is where the code starts
  for (let item in data.needs) {
    const need = await this.needrepository.create({ ...data.needs[item], pgroup: pp });
    await this.needrepository.save(need);
    return need;
  }
}
For some reason, this for-loop doesn't work: it only iterates once. The code below works:
const need = await this.needrepository.create({...data.needs[2], pgroup: pp});
await this.needrepository.save(need);
But I'm unable to save more than one need at a time.

Try this. The return need inside your loop exits the function on the first iteration, which is why only one need is ever saved. Move the return outside the loop and await each save:
async new(data: PgroupEntity) {
  // const pgp = await this.pgrouprepository.create(data);
  // await this.pgrouprepository.save(pgp);
  const pp = await this.pgrouprepository.findOne({ where: { id: 'bad6eb03-655b-4e29-8d70-7fd63a7fe7d7' } });
  for (const item of data.needs) {
    // create() is synchronous; await only the save so every need is persisted before returning
    const need = this.needrepository.create({ ...item, pgroup: pp });
    await this.needrepository.save(need);
  }
  return data;
}
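As a side note, a minimal sketch under the same assumptions: TypeORM's save() also accepts an array of entities, so you can create all the needs first and persist them in a single call:
const needs = data.needs.map(item => this.needrepository.create({ ...item, pgroup: pp }));
// save() accepts an array and persists all the entities in one call
await this.needrepository.save(needs);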

Related

async function doesn't wait for the inner await in nodejs

I am implementing a function, monthlyRevenue.
Simply put, it returns the total monthly revenue, and it takes as arguments an array of stations (which generate the revenue), a month, and a year.
Problem
Inside this function I have getStationPortions, which fetches each user's revenue portion.
I would like it to return an object like this:
stationPortions = { station1: 30, station2: 20 }
In monthlyRevenue:
const stationPortions = await getStationPortions(stations)
console.log("portion map", stationPortions) // logged at the very beginning, with an empty object
getStationPortions:
const getStationPortions = async (stations) => {
  let stationPortions = {}
  stations.map(async (value) => {
    const doc = await fdb.collection('Stations').doc(value).get()
    if (!doc.exists) {
      console.log("NO DOC")
    } else {
      stationPortions[value] = doc.data().salesPortion
      console.log(stationPortions) // logged last
    }
  })
  return stationPortions
}
I thought an async function would wait for the result, but it does not. I am confused about whether my understanding is wrong.
Thank you.
(By the way, fdb is the Firebase Admin SDK's Firestore instance.)
Working code. stations.map(async ...) starts all the lookups but returns an array of pending promises that nothing waits for, so stationPortions is returned while it is still empty. Wrapping the array in Promise.all and awaiting it fixes that:
const getStationPortions = async (stations) => {
  let stationPortions = {}
  await Promise.all(stations.map(async (value) => {
    const doc = await fdb.collection('Stations').doc(value).get()
    if (!doc.exists) {
      console.log("NO DOC")
    } else {
      stationPortions[value] = doc.data().salesPortion
      console.log(stationPortions)
    }
  }))
  return stationPortions
}
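A variant sketch under the same assumption (fdb is an initialized Firestore Admin instance): build [key, value] pairs in parallel and assemble the object afterwards, which avoids mutating shared state inside the mapper:
const getStationPortions = async (stations) => {
  const entries = await Promise.all(stations.map(async (value) => {
    const doc = await fdb.collection('Stations').doc(value).get()
    // keep only the stations whose document exists
    return doc.exists ? [value, doc.data().salesPortion] : null
  }))
  return Object.fromEntries(entries.filter(Boolean))
}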

Get all documents in collection using Cloud Firestore

I have read the documentation several times, but I don't understand why I need an extra layer (forEach) in my code when reading all of the data inside a collection using Firebase (Cloud Firestore).
Here is the original documentation:
https://firebase.google.com/docs/firestore/query-data/get-data#get_all_documents_in_a_collection
Here is my code:
async loadUsers(): Promise<User[]> {
  const users = new Array<User>();
  const snapshot = await this.firestore.collection('users').get();
  snapshot.forEach((collection) => {
    collection.docs.forEach(doc => {
      users.push(doc.data() as User);
    });
  });
  return users;
}
As I understand it, it should work like this:
async loadUsers(): Promise<User[]> {
  const users = new Array<User>();
  const snapshot = await this.firestore.collection('users').get();
  snapshot.forEach(doc => {
    users.push(doc.data() as User);
  });
  return users;
}
Error message:
"Property 'data' does not exist on type 'QuerySnapshot'."
.collection().get() does NOT return an array; it returns a QuerySnapshot, which has a property .docs, which is an array of QueryDocumentSnapshot, each of which has a method .data(), which returns the data read from the document.
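To make that structure concrete, a minimal sketch (v8/compat-style API, assuming an initialized firestore handle):
const snapshot = await firestore.collection('users').get(); // QuerySnapshot
const docs = snapshot.docs;                                 // array of QueryDocumentSnapshot
const first = docs[0]?.data();                              // plain object with the document's fields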
Documentation
https://firebase.google.com/docs/reference/js/firebase.firestore.CollectionReference
In the new modular Firebase Firestore (version 9+) it looks like this (note that orderBy must be imported too):
import { getFirestore, collection, query, orderBy, getDocs } from 'firebase/firestore/lite'
async readAll() {
  const firestore = getFirestore()
  const collectionRef = collection(firestore, '/users')
  let q = query(collectionRef, orderBy('createTimestamp', 'desc'))
  const querySnapshot = await getDocs(q)
  const items = []
  querySnapshot.forEach(document => {
    items.push(document.data())
  })
  return items
}
Iterating with forEach here works much like it does with onSnapshot. Note, though, that querySnapshot.docs still exists in version 9 if you want the whole array at once, as the last snippet below shows.
Based on @LeadDreamer's answer, I managed to simplify the code:
async loadUsers(): Promise<User[]> {
  const users = new Array<User>();
  // In AngularFire, get() returns an Observable; convert it to a Promise before awaiting.
  // Awaiting a Subscription (as in .subscribe(...)) resolves immediately, before any data arrives.
  const querySnapshot = await this.firestore.collection('users').get().toPromise();
  querySnapshot.docs.forEach(doc => {
    users.push(doc.data() as User);
  });
  return users;
}
There seems to be no other way but to iterate.
const q = query(collection(db, "item"));
getDocs(q).then(response => {
  const result = response.docs.map(doc => ({
    id: doc.id,
    ...doc.data(),
  }))
  console.log(result);
}).catch(err => console.log(err))
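The same thing with async/await, for consistency with the earlier examples (a sketch, assuming db is an initialized modular Firestore instance):
const loadItems = async () => {
  const snapshot = await getDocs(query(collection(db, "item")));
  // map directly over .docs to get an array of plain objects
  return snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() }));
};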

How to make Mongoose update work with await?

I'm creating a NodeJS backend where a process reads in data from a source, checks for changes compared to the current data, makes those updates to MongoDB and reports the changes made. Everything works, except I can't get the changes reported, because I can't get the Mongoose update action to await.
The returned array from this function is then displayed by a Koa server. It shows an empty array, and in the server logs, the correct values appear after the server has returned the empty response.
I've dug through the Mongoose docs and Stack Overflow questions – there are quite a few on the topic – but with no success. None of the solutions provided seem to help. I've isolated the issue to this part: if I remove the Mongoose part, everything works as expected.
const parseJSON = async xmlData => {
  const changes = []
  const games = await Game.find({})
  const gameObjects = games.map(game => {
    return new GameObject(game.name, game.id, game)
  })
  let jsonObj = require("../sample.json")
  Object.keys(jsonObj.items.item).forEach(async item => {
    const game = jsonObj.items.item[item]
    const gameID = game["#_objectid"]
    const rating = game.stats.rating["#_value"]
    if (rating === "N/A") return
    const gameObject = await gameObjects.find(
      game => game.bgg === parseInt(gameID)
    )
    if (gameObject && gameObject.rating !== parseInt(rating)) {
      try {
        const updated = await Game.findOneAndUpdate(
          { _id: gameObject.id },
          { rating: rating },
          { new: true }
        ).exec()
        changes.push(
          `${updated.name}: ${gameObject.rating} -> ${updated.rating}`
        )
      } catch (error) {
        console.log(error)
      }
    }
  })
  return changes
}
Everything works – the changes are found and the database is updated, but the reported changes are returned too late, because the execution doesn't wait for Mongoose.
I've also tried this instead of findOneAndUpdate():
const updated = await Game.findOne()
.where("_id")
.in([gameObject.id])
.exec()
updated.rating = rating
await updated.save()
The same result here: everything else works, but the execution still doesn't wait, because the awaits sit inside the forEach callback.
As @Puneet Sharma mentioned, you'll have to use map instead of forEach to get an array of promises, then await those promises (using Promise.all for convenience) before returning changes, which will by then have been populated:
const parseJSON = async xmlData => {
  const changes = []
  const games = await Game.find({})
  const gameObjects = games.map(game => {
    return new GameObject(game.name, game.id, game)
  })
  const jsonObj = require("../sample.json")
  const promises = Object.keys(jsonObj.items.item).map(async item => {
    const game = jsonObj.items.item[item]
    const gameID = game["#_objectid"]
    const rating = game.stats.rating["#_value"]
    if (rating === "N/A") return
    const gameObject = await gameObjects.find(
      game => game.bgg === parseInt(gameID)
    )
    if (gameObject && gameObject.rating !== parseInt(rating)) {
      try {
        const updated = await Game.findOneAndUpdate(
          { _id: gameObject.id },
          { rating: rating },
          { new: true }
        ).exec()
        changes.push(
          `${updated.name}: ${gameObject.rating} -> ${updated.rating}`
        )
      } catch (error) {
        console.log(error)
      }
    }
  })
  await Promise.all(promises)
  return changes
}
(The diff, for convenience:
9,10c9,10
< let jsonObj = require("../sample.json")
< Object.keys(jsonObj.items.item).forEach(async item => {
---
> const jsonObj = require("../sample.json")
> const promises = Object.keys(jsonObj.items.item).map(async item => {
33a34
> await Promise.all(promises)
)
EDIT: a further refactoring would be to use that array of promises for the change descriptions themselves. Basically changePromises is an array of Promises that resolve to a string or null (if there was no change), so a .filter with the identity function will filter out the falsy values.
This method also has the advantage that changes will be in the same order as the keys were iterated over; with the original code, there's no guarantee of order. That may or may not matter for your use case.
I also flipped the if/elses within the map function to reduce nesting; it's a matter of taste really.
PS: that await Game.find({}) will become a problem once you have a large collection of games (a sketch of a narrower query follows after the code below).
const parseJSON = async xmlData => {
  const games = await Game.find({});
  const gameObjects = games.map(game => new GameObject(game.name, game.id, game));
  const jsonGames = require("../sample.json").items.item;
  const changePromises = Object.keys(jsonGames).map(async item => {
    const game = jsonGames[item];
    const gameID = game["#_objectid"];
    const rating = game.stats.rating["#_value"];
    if (rating === "N/A") {
      // Rating from data is N/A, we don't need to update anything.
      return null;
    }
    // Array.prototype.find is synchronous, so there is no need to await it.
    const gameObject = gameObjects.find(game => game.bgg === parseInt(gameID));
    if (!(gameObject && gameObject.rating !== parseInt(rating))) {
      // Game not found or its rating is already correct; no change.
      return null;
    }
    try {
      const updated = await Game.findOneAndUpdate(
        { _id: gameObject.id },
        { rating: rating },
        { new: true },
      ).exec();
      return `${updated.name}: ${gameObject.rating} -> ${updated.rating}`;
    } catch (error) {
      console.log(error);
    }
  });
  // Await for the change promises to resolve, then filter out the `null`s.
  return (await Promise.all(changePromises)).filter(c => c);
};
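On that PS, a minimal sketch of narrowing the initial query instead of loading every game. It assumes the Game schema stores the BoardGameGeek id in a field, hypothetically named bgg here:
// Hypothetical field name `bgg`; adjust to however your schema stores the id.
const ids = Object.values(jsonGames).map(g => parseInt(g["#_objectid"]));
const games = await Game.find({ bgg: { $in: ids } });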

Parse Server edit Relations on Object very slow

I've got the following function, which works as expected in Parse Server cloud code; however, it's painfully slow.
The nested for-loops, which internally call queries and save functions, are undoubtedly the root cause.
How can I refactor this code so that there is some async processing, or better yet, what methods are there to remove/edit the relations on an object? The documentation around this is very poor.
ClientLabels.applyClientLabels = async (req, res) => {
  const { clients, labels } = req.params;
  const user = req.user;
  const objectIds = clients.map((client) => client.objectId);
  const clientSaveList = [];
  const clientClass = Parse.Object.extend('Clients');
  const query = new Parse.Query(clientClass);
  query.containedIn("objectId", objectIds);
  const queryResult = await query.find({ sessionToken: user.getSessionToken() })
  try {
    for (const client of queryResult) {
      const labelRelation = client.relation('labels');
      const relatedLabels = await labelRelation.query().find({ sessionToken: user.getSessionToken() });
      labelRelation.remove(relatedLabels);
      for (const label of labels) {
        label.className = "ClientLabels";
        const labelRelationObj = Parse.Object.fromJSON(label)
        labelRelation.add(labelRelationObj);
      }
      clientSaveList.push(client);
    }
    const saved = await Parse.Object.saveAll(clientSaveList, { sessionToken: user.getSessionToken() })
    res.success(saved);
  } catch (e) {
    res.error(e);
  }
}
Explanation of some weirdness:
I am having to call Parse.Object.fromJSON in order to make the client-side label object a ParseObjectSubclass and allow operations on it, such as adding relations.
You cannot use include on a relation query as you would with a pointer, so there needs to be a query for the relations all on its own. An array of pointers was ruled out, as there will be an unknown number of labels applied.
There are a few things that can be done:
(1) The creation of labels in the inner loop is invariant relative to the outer loop, so it can be done once, at the start.
(2) There's no need to query the relation if you're just going to remove the related objects; use unset() and add() to replace the relations.
(3) This won't save much computation, but clientSaveList is superfluous; we can just save the query result...
ClientLabels.applyClientLabels = async (req, res) => {
  const { clients, labels } = req.params;
  const objectIds = clients.map((client) => client.objectId);
  let labelObjects = labels.map(label => {
    label.className = "ClientLabels";
    return Parse.Object.fromJSON(label)
  });
  const query = new Parse.Query('Clients');
  query.containedIn("objectId", objectIds);
  const sessionToken = req.user.getSessionToken(); // call the method; referencing it without () was a bug
  const queryResult = await query.find({ sessionToken: sessionToken })
  try {
    for (const client of queryResult) {
      client.unset('labels');
      client.relation('labels').add(labelObjects);
    }
    const saved = await Parse.Object.saveAll(queryResult, { sessionToken: sessionToken })
    res.success(saved);
  } catch (e) {
    res.error(e);
  }
}
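If you do need the per-client relation queries (for example, to remove only some of the existing labels), they can at least run in parallel rather than serially. A sketch under the same assumptions as the refactored code above (sessionToken and labelObjects as defined there):
await Promise.all(queryResult.map(async (client) => {
  const relation = client.relation('labels');
  // fetch this client's current labels, then replace them
  const related = await relation.query().find({ sessionToken });
  relation.remove(related);
  relation.add(labelObjects);
}));
const saved = await Parse.Object.saveAll(queryResult, { sessionToken });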

Is this the proper way to write a multi-statement transaction with Neo4j?

I am having a hard time interpreting the Neo4j documentation about transactions. It seems to indicate a preference for doing it this way rather than explicitly calling tx.commit() and tx.rollback().
Does this look like best practice with respect to multi-statement transactions and the neo4j-driver?
const register = async (container, user) => {
  const session = driver.session()
  const timestamp = Date.now()
  const saltRounds = 10
  const pwd = await utils.bcrypt.hash(user.password, saltRounds)
  try {
    // Start registration transaction
    const registerUser = session.writeTransaction(async (transaction) => {
      const initialCommit = await transaction
        .run(`
          CREATE (p:Person {
            email: '${user.email}',
            tel: '${user.tel}',
            pwd: '${pwd}',
            created: '${timestamp}'
          })
          RETURN p AS Person
        `)
      const initialResult = initialCommit.records
        .map((x) => {
          return {
            id: x.get('Person').identity.low,
            created: x.get('Person').properties.created
          }
        })
        .shift()
      // Generate serial
      const data = `${initialResult.id}${initialResult.created}`
      const serial = crypto.sha256(data)
      const finalCommit = await transaction
        .run(`
          MATCH (p:Person)
          WHERE p.email = '${user.email}'
          SET p.serialNumber = '${serial}'
          RETURN p AS Person
        `)
      const finalResult = finalCommit.records
        .map((x) => {
          return {
            serialNumber: x.get('Person').properties.serialNumber,
            email: x.get('Person').properties.email,
            tel: x.get('Person').properties.tel
          }
        })
        .shift()
      // Merge both results for complete person data
      return Object.assign({}, initialResult, finalResult)
    })
    // Commit or rollback transaction
    return registerUser
      .then((commit) => {
        session.close()
        return commit
      })
      .catch((rollback) => {
        console.log(`Transaction problem: ${JSON.stringify(rollback, null, 2)}`)
        throw [`reg1`]
      })
  } catch (error) {
    session.close()
    throw error
  }
}
Here is the reduced version of the logic:
const register = (user) => {
  const session = driver.session()
  const performTransaction = session.writeTransaction(async (tx) => {
    const statementOne = await tx.run(queryOne)
    const resultOne = statementOne.records.map((x) => x.get('node')).slice()
    // Do some work that uses data from statementOne
    const statementTwo = await tx.run(queryTwo)
    const resultTwo = statementTwo.records.map((x) => x.get('node')).slice()
    // Do final processing
    return finalResult
  })
  return performTransaction.then((commit) => {
    session.close()
    return commit
  }).catch((rollback) => {
    throw rollback
  })
}
Neo4j experts, is the above code the correct use of the neo4j-driver?
I would rather do this, because it's more linear and synchronous-looking:
const register = async (user) => { // must be async for the awaits below
  const session = driver.session()
  const tx = session.beginTransaction()
  const statementOne = await tx.run(queryOne)
  const resultOne = statementOne.records.map((x) => x.get('node')).slice()
  // Do some work that uses data from statementOne
  const statementTwo = await tx.run(queryTwo)
  const resultTwo = statementTwo.records.map((x) => x.get('node')).slice()
  // Do final processing
  const finalResult = { obj1, ...obj2 }
  let success = true
  if (success) {
    tx.commit()
    session.close()
    return finalResult
  } else {
    tx.rollback()
    session.close()
    return false
  }
}
I'm sorry for the long post, but I cannot find any references anywhere, so the community needs this data.
After much more work, this is the syntax we have settled on for multi-statement transactions:
Start a session
Start a transaction
Use a try/catch block after that (to get proper scope in the catch block)
Perform the queries in the try block
Roll back in the catch block
const someQuery = async () => {
  const session = Neo4J.session()
  const tx = session.beginTransaction()
  try {
    const props = {
      one: 'Bob',
      two: 'Alice'
    }
    const tx1 = await tx
      .run(`
        MATCH (n:Node)-[r:REL]-(o:Other)
        WHERE n.one = $props.one
        AND n.two = $props.two
        RETURN n AS One, o AS Two
      `, { props })
      .then((result) => {
        return {
          data: '...'
        }
      })
      .catch((err) => {
        throw 'Problem in first query. ' + err
      })
    // Do some work using tx1
    const updatedProps = {
      _id: 3,
      four: 'excellent'
    }
    const tx2 = await tx
      .run(`
        MATCH (n:Node)
        WHERE id(n) = toInteger($updatedProps._id)
        SET n.four = $updatedProps.four
        RETURN n AS One
      `, { updatedProps })
      .then((result) => {
        return {
          data: '...'
        }
      })
      .catch((err) => {
        throw 'Problem in second query. ' + err
      })
    // Do some work using tx2
    if (problem) throw 'Rollback ASAP.' // `problem` stands in for whatever failure condition you detect
    await tx.commit() // commit must be called and awaited, or the work is lost when the session closes
    session.close()
    return Object.assign({}, tx1, { tx2 })
  } catch (e) {
    await tx.rollback()
    session.close()
    throw 'someQuery# ' + e
  }
}
I will just note that if you are passing numbers into Neo4j, you should wrap them in toInteger() inside the Cypher query so that they are parsed correctly.
I also included examples of query parameters and how to use them; I find they clean up the code a little.
Besides that, you can basically chain as many queries inside the transaction as you want, but keep in mind two things:
Neo4j write-locks all involved nodes during a transaction, so if you have several processes all performing operations on the same node, you will see that only one process can complete a transaction at a time. We built our own business logic to handle write contention and opted not to use transactions at all. It has been working very well so far, writing 100,000 nodes and creating 100,000 relationships in about 30 seconds, spread over 10 processes; inside a transaction it took 10 times longer. We experience no deadlocking or race conditions using UNWIND (a minimal sketch of the UNWIND pattern follows at the end of this answer).
You have to await the tx.commit() or it won't commit before it nukes the session.
My opinion is that this type of transaction works great if you are using polyglot persistence (multiple databases) and need to create a node, then write a document to MongoDB, and then set the Mongo ID on the node.
It's very easy to reason about, and extend as needed.
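As mentioned above, a minimal sketch of the UNWIND batch-write pattern (assumptions: an open session, and the Person label and name property are illustrative):
const rows = [{ name: 'Bob' }, { name: 'Alice' }]
// One statement writes every row; the database iterates the list server-side,
// avoiding one round trip per node.
await session.run(
  `UNWIND $rows AS row
   MERGE (p:Person { name: row.name })`,
  { rows }
)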
