I'm using MikroORM in my newest project and I was wondering: is it possible to return the id of a newly created record?
Thank you for your help
It will be available on the entity once you flush:
const user = new User();
em.persist(user);
await em.flush();
console.log(user.id);
I have a Node.js project with a MongoDB database, and I would like to create a brand-new deployment of the project for another customer, so I need the same database structure without the old data.
What should I do?
Do I have to create all collections manually, or is there a way to script just the database schema?
EDIT
I have been told that Mongoose will automatically create a collection if it doesn't exist when the connection to the database is opened.
I always prefer using the MVC pattern.
I will share one of my model files with you so you can see how to write reusable database schema code.
const expenseCollection = require('../db').db().collection("expenses");
const ObjectID = require('mongodb').ObjectID;
const Money = require('./Money');

let Expense = function (data, userId) {
  this.data = data;
  this.authorId = new ObjectID(userId);
  this.errors = [];
};
// Below are the fields you may want to include in your schema
Expense.prototype.cleanUp = function () {
  this.data = {
    amount: Number(this.data.amountSpent),
    item: this.data.item,
    purchaseDate: new Date(this.data.purchaseDate),
    notes: this.data.notes,
    status: false,
    expType: this.data.expType,
    authorId: this.authorId,
  };
};
module.exports = Expense
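The cleanUp idea above can also be sketched as a standalone function, independent of MongoDB; the field names mirror the Expense model, and this is just an illustration of coercing raw form input into typed fields:

```javascript
// Standalone sketch of the cleanUp pattern: coerce raw form input
// (all strings) into typed fields before persisting. Field names
// mirror the Expense model above; no database is involved.
function cleanUpExpense(raw, authorId) {
  return {
    amount: Number(raw.amountSpent),
    item: raw.item,
    purchaseDate: new Date(raw.purchaseDate),
    notes: raw.notes,
    status: false,
    expType: raw.expType,
    authorId: authorId,
  };
}

const cleaned = cleanUpExpense(
  { amountSpent: '45', item: 'Keyboard', purchaseDate: '2020-09-28', notes: '', expType: 'office' },
  'user-1'
);
console.log(cleaned.amount); // 45 (a number, not the string '45')
```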
Also, I will share how you can pass the data into this constructor (from one of the controller files):
const Money = require('../models/Money');
const Expense = require('../models/Expense');

exports.saveExpense = async function (req, res) {
  let data = {
    balance: 0,
  };
  let money = new Money(data);
  await money.addMoney(req.session.user._id);
  await money.subtractBal(req.body.amountSpent, req.session.user._id);
  // Focus on the line below:
  let expense = new Expense(req.body, req.session.user._id);
  await expense.saveExpense();
  res.redirect('/');
};
Primarily, the model is a file in which you write the reusable schema script.
Consider creating a separate model for each collection, i.e. 1 collection = 1 model file in your code.
Also, with the code I shared above, the collection is created automatically if it does not already exist in MongoDB.
Is it possible in MikroORM to validate the current SQL database schema (Postgres)?
I would like to run this validation on application start, as Hibernate does with hbm2ddl.auto=validate.
You can use the schema generator programmatically; this way you can easily check whether the schema is up to date after init:
const orm = await MikroORM.init();
const diff = await orm.schema.getUpdateSchemaSQL();

if (diff) {
  throw new Error('schema is out of sync');
}
I will consider adding something similar to v6, at least some syntax sugar in the schema generator.
I am trying to add/update a record in DynamoDB. I am sure that this method will add a record, but I am confused whether it will also update the record if one already exists, or whether I need to write a separate update method.
const lookUpTableModel = require('../src/modelGenerator');
const { LookUpTable } = lookUpTableModel({ table: personLookUpTable, entity: 'PersonLookUP' });

const item = {
  neo4jkey: personObj.key,
  saleforceId: body.Id,
};

LookUpTable.put(item);
If this method does not handle updates, please guide me.
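For reference, DynamoDB's PutItem fully replaces any existing item with the same primary key, while UpdateItem merges attribute changes into it. A toy in-memory illustration of that difference (this is not DynamoDB code, just the semantics):

```javascript
// Toy illustration of PutItem vs UpdateItem semantics, using a Map
// keyed by the primary key. NOT real DynamoDB code.
const table = new Map();

function putItem(item) {
  // PutItem: replaces the whole item that has the same key
  table.set(item.neo4jkey, { ...item });
}

function updateItem(key, changes) {
  // UpdateItem: merges attribute changes into the existing item
  table.set(key, { ...(table.get(key) || { neo4jkey: key }), ...changes });
}

putItem({ neo4jkey: 'p1', saleforceId: 'SF-1', note: 'first' });
putItem({ neo4jkey: 'p1', saleforceId: 'SF-2' }); // replaces the whole item
console.log(table.get('p1').note);                // undefined: 'note' is gone

updateItem('p1', { note: 'restored' });           // merges, keeps saleforceId
console.log(table.get('p1'));
```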
Existing Cosmos DB documents need to be updated with a new property, and existing documents in other collections need to be updated with the same new property along with its value.
Is there a recommended way or tool to update existing documents in Cosmos DB, or is writing a custom C# application/PowerShell script using the Cosmos DB SDK the only option?
Example:
Existing user document
{
  "id": "user1#mail.com",
  "name": "abc",
  "country": "xyz"
}
Updated user document
{
  "id": "user1#mail.com",
  "name": "abc",
  "country": "xyz",
  "guid": "4334fdfsfewr" // new field
}
Existing order document of the user
{
  "id": "user1#mail.com",
  "user": "user1#mail.com",
  "date": "09/28/2020",
  "amt": "$45"
}
Updated order document of the user
{
  "id": "user1#mail.com",
  "user": "user1#mail.com",
  "userid": "4334fdfsfewr", // new field, same value as in the user document
  "date": "09/28/2020",
  "amt": "$45"
}
I'd probably go with:
Update user documents through a script
Have an Azure Function with a Cosmos DB trigger that listens to changes on user documents and updates orders accordingly
[UPDATE]
Use whatever type of script you feel best with: PowerShell, C#, Azure Functions...
Now, what do you mean they need to be altered with the new property "at the same time"? I'm not sure that's possible in any way. If you want such an effect, then I guess your best bet is:
create new collection/container for users
have an Azure Function that listens to a change feed for your existing users container (so, with StartFromBeginning option)
update your documents to have new field and store them in a newly created container
once done, switch your application to use new container
It's your choice how you update the other collections (orders): using the change feed and Azure Functions from either the old or the new users container.
PS.
Yes, whichever flow I'd go with, it would still be Azure Functions with a Cosmos DB trigger.
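The propagation step the trigger would perform can be sketched as a pure function; the field names (guid, user, userid) follow the example documents above, and the actual Cosmos DB reads and write-backs are omitted:

```javascript
// Sketch of the change-feed handler logic: given a batch of changed
// user documents, produce the patched order documents. The actual
// Cosmos DB write-back is intentionally omitted.
function propagateGuidToOrders(changedUsers, orders) {
  const guidByUser = new Map(changedUsers.map(u => [u.id, u.guid]));
  return orders
    .filter(o => guidByUser.has(o.user))
    .map(o => ({ ...o, userid: guidByUser.get(o.user) }));
}

const users = [{ id: 'user1#mail.com', guid: '4334fdfsfewr' }];
const orders = [{ id: 'order1', user: 'user1#mail.com', amt: '$45' }];
const patched = propagateGuidToOrders(users, orders);
console.log(patched);
// [ { id: 'order1', user: 'user1#mail.com', amt: '$45', userid: '4334fdfsfewr' } ]
```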
Here is a solution for .NET Core 3.0 or higher.
// You can apply any filter to the query
var result = _container.GetItemLinqQueryable<MessageNoteModel>()
    .Where(d => d.id == Id && d.work_id.ToLower() == workId.ToLower())
    .ToFeedIterator();

while (result.HasMoreResults)
{
    var existingDocuments = (await result.ReadNextAsync()).ToList();

    foreach (var document in existingDocuments)
    {
        // Create the partition key of the document
        var partitionKey = new PartitionKey(document?.work_id);
        document.IsConversation = true;

        // Insert/update the message in the Cosmos DB container
        await _container.ReplaceItemAsync(document, document.document_id, partitionKey);
    }
}
We had the same issue of updating the Cosmos DB schema for existing documents. We were able to achieve this through a custom JsonSerializer.
We created CosmosJsonDotNetSerializer, inspired by the one in the Cosmos DB SDK. CosmosJsonDotNetSerializer exposes a FromStream method that lets us deal with the raw JSON. You can update the FromStream method to migrate the document schema to your latest version. Here is the pseudo-code:
public override T FromStream<T>(Stream stream)
{
    using (stream)
    {
        if (typeof(Stream).IsAssignableFrom(typeof(T)))
        {
            return (T)(object)stream;
        }

        using (var sr = new StreamReader(stream))
        using (var jsonTextReader = new JsonTextReader(sr))
        {
            var jsonSerializer = GetSerializer();
            return UpdateSchemaVersionToCurrent<T>(jsonSerializer.Deserialize<JObject>(jsonTextReader));
        }
    }
}

private T UpdateSchemaVersionToCurrent<T>(JObject jObject)
{
    // Add logic to update the JObject to the latest version, e.g.:
    jObject["guid"] = Guid.NewGuid().ToString();
    return jObject.ToObject<T>();
}
You can set Serializer to CosmosJsonDotNetSerializer in CosmosClientOptions while creating CosmosClient.
var cosmosClient = new CosmosClient("<cosmosDBConnectionString>",
    new CosmosClientOptions
    {
        Serializer = new CosmosJsonDotNetSerializer()
    });
This way, you always deal with the latest Cosmos document throughout the code, and when you save the entity back to Cosmos, it is persisted with the latest schema version.
You can take this further by running schema migration as a separate process, for example, inside an Azure function, where you load old documents, convert them to the latest version and then save it back to Cosmos.
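The same upgrade-on-read idea can be sketched in JavaScript; the schemaVersion field and the single migration step here are hypothetical, mirroring what UpdateSchemaVersionToCurrent does above:

```javascript
// Sketch of upgrade-on-read: run every migration step newer than the
// document's version. The schemaVersion field and the migration steps
// are hypothetical illustrations, not part of any Cosmos DB SDK.
const migrations = {
  // version 1 -> 2: backfill the new guid field
  1: doc => ({ ...doc, guid: doc.guid || 'generated-guid', schemaVersion: 2 }),
};

function upgradeToCurrent(doc) {
  // Documents written before versioning are treated as version 1
  let current = { schemaVersion: 1, ...doc };
  while (migrations[current.schemaVersion]) {
    current = migrations[current.schemaVersion](current);
  }
  return current;
}

const upgraded = upgradeToCurrent({ id: 'user1#mail.com', name: 'abc' });
console.log(upgraded.schemaVersion); // 2
```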
I also wrote a post on Cosmos document schema update that explains this in detail.
I'm trying to create a new Mongoose document first:
let newTrade = new TradeModel({
  userId: userId,
  symbol: symbol
})
Then I need to send this item to another server to get the other details:
let orderReceived = await sendOrderToServer(newTrade);
And then I want to merge this into the new document and save:
newTrade = {...newTrade, ...orderReceived}
But once I alter the original document, I lose access to the .save() method. I can't run .save() first because it's missing required fields. I really just need the Trade._id before sending to the other server, which is why I'm doing it this way. Any suggestions? Thanks.
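For context on why .save() disappears: object spread copies only an object's own enumerable properties, so methods inherited from a prototype, like a Mongoose document's .save(), are lost. A minimal illustration with a plain class:

```javascript
// Spread copies own enumerable properties only; prototype methods
// are lost. The same happens to a Mongoose document's .save().
class Doc {
  constructor(fields) { Object.assign(this, fields); }
  save() { return 'saved'; }
}

const doc = new Doc({ userId: 'u1' });
console.log(typeof doc.save);      // 'function'

const merged = { ...doc, symbol: 'AAPL' };
console.log(typeof merged.save);   // 'undefined' — merged is a plain object
```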
You can use the mongoose.Types.ObjectId() constructor to create an id, send that to your server, and when the response comes back, create the document based on it.
EDIT: Adding a few examples for clarity
let newTradeId = new mongoose.Types.ObjectId(); // ObjectId can be called with or without "new"
let orderReceived = await sendOrderToServer(newTradeId);
let newTrade = new TradeModel({ _id: newTradeId, ...orderReceived }); // Create newTrade with the pre-generated id, spreading in the received order
// TADA! We are done!