I have some strange behaviour happening with my Redis-OM application. I understand that this is still a very beta version of the software, but I just wanted to make sure I wasn't doing something dumb (which I might be).
I am setting up an application to keep a playlist of video IDs within a Room that I store temporarily in a Redis Cloud database.
I have a function that creates a Room, one that fetches the Room details (everything currently in the room), and one that adds a new video to the playlist within that room (see below). NOTE: the data variable within createRoom(data) is just a string of the Room ID.
class Room extends Entity {}

let schema = new Schema(
  Room,
  {
    code: { type: 'string' },
    playlist: {
      type: 'array',
      videos: {
        type: 'object',
      },
    },
  },
  {
    dataStructure: 'JSON',
  }
);
export async function createRoom(data) {
  await connect();
  const repository = new Repository(schema, client);
  const room = repository.createEntity(data);
  const id = await repository.save(room);
  await client.execute(['EXPIRE', `Room:${id}`, 43200]);
  return id;
}
export async function getRoom(code) {
  await connect();
  const repository = new Repository(schema, client);
  const room = await repository
    .search()
    .where('code')
    .equals(code)
    .returnFirst();
  return room;
}
export async function addVideoToRoom(code, videoDetails) {
  const room = await getRoom(code);
  await room.playlist.push(videoDetails);
  await connect();
  const repository = new Repository(schema, client);
  const id = await repository.save(room);
  return id;
}
The primary issue that I'm having is adding a second video to the playlist. What happens is:
1. Create Room - adds a new room to the DB
2. Search for a video
3. Click to add the video to the DB (video is successfully added)
4. Click to add a second video to the DB (fails because getRoom(code) fails - returns null)
This was working yesterday, but I'm not sure why it no longer works.
If anyone has any ideas why that might be, please let me know. I have a feeling it may be how I'm handling clients or indexes with Redis, so I have popped my functions for those below too.
const client = new Client();

async function connect() {
  if (!client.isOpen()) {
    await client.open(process.env.REDIS_URL);
  }
}
export async function createIndex() {
  await connect();
  const repository = new Repository(schema, client);
  await repository.dropIndex();
  await repository.createIndex();
}
Thanks very much, programmers of Stack. If I'm being super dumb, I do apologise.
Redis OM for Node.js supports neither nested objects nor a type of 'object' within the Schema. Valid types are 'string', 'number', 'boolean', and 'array'. Arrays are only arrays of strings. The rest are self-explanatory.
If you want a Room that has multiple Videos, you need to define a Room entity, perhaps with a playlist that is defined as an array not of objects but of video IDs.
Details on this can be found in the README.
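For illustration, here is a minimal sketch of a schema that fits those constraints, storing the playlist as an array of video ID strings (the field names simply mirror the question):

class Room extends Entity {}

let schema = new Schema(
  Room,
  {
    code: { type: 'string' },
    // an array of plain video ID strings, e.g. ['dQw4w9WgXcQ', '9bZkp7q19f0']
    playlist: { type: 'array' },
  },
  { dataStructure: 'JSON' }
);

Each playlist entry is then just a video ID, and any richer video details would live elsewhere, for example in a separate Video entity keyed by that ID.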
Related
This is my first time using bulkWrite to carry out updates via Mongoose. I am building a blog application, and I am using it to learn the MERN stack. I have a Post model with a field that is an array. This is an example of it:
const PostSchema = new mongoose.Schema(
  {
    postLikes: {
      type: Array,
      default: []
    }
  }
)
The postLikes array contains the MongoDB ObjectIds of users who liked a post.
I have logic for deleting selected users, or all users, as an admin. The like system does not have a Like model of its own; I simply use an array inside the Post model. After deleting a user, I would like to update all Post documents that contain likes from the selected users. Some users may have likes across several different posts.
In my Node code, I created a variable like this:
const {selectedIds} = req.body;
The selectedIds come from React like this:
const [selectedUsers, setSelectedUsers] = useState([]);

const arrayOfSelectedUserId = (userId) => {
  setSelectedUsers(prevArray => [...prevArray, userId]);
};
For the request, I did it like this:
const response = await axiosPrivate.post(`/v1/users/deleteSelected`, selectedIds, {
  withCredentials: true,
  headers: { authorization: `Bearer ${auth.token}` }
});
In Node.js, the selected users' IDs are passed to this variable:
const {selectedIds} = req.body;
I created the logic this way:
const findIntersection = (array1, array2) => {
  return array1.filter((elem) => {
    return array2.indexOf(elem) !== -1;
  });
};

const filteredPost = posts.filter((singleFilter) => {
  const intersection = findIntersection(selectedIds, singleFilter.postLikes);
  return singleFilter.postLikes.length !== 0 && intersection.length !== 0;
});
const updatedPosts = filteredPost.map((obj) => {
  const intersection = findIntersection(selectedIds, obj.postLikes);
  console.log(intersection);
  return {
    updateOne: {
      filter: { _id: obj._id },
      update: { $pull: { postLikes: { $in: intersection } } },
    },
  };
});

Post.bulkWrite(updatedPosts).then((res) => {
  console.log("Documents Updated", res.modifiedCount);
});
The console.log shows the "Documents Updated" text and the number of documents updated. However, when I check my database, the update is not reflected; the selected users' IDs are still in the array.
Is there a better method? What am I doing wrong?
I want to create common CRUD operations in Express.js with Sequelize.
I have created a getAll function as below.
exports.getAll = (module, res, next) => {
  module
    .findAndCountAll({
      where: {
        CreatedBy: 1,
        isDeleted: false,
        isActive: true
      },
      offset: 0,
      limit: 10,
    })
    .then((result) => {
      res.status(200).json({
        message: "data Fetched from database",
        statusCode: 200,
        result: result,
      });
    })
    .catch((error) => {
      console.log(error);
    });
}
and I am calling this common function in a controller function as below, by passing the model, e.g. category:
crudOperations.getAll(category, res);
It is working fine, but how do I create a function for posting data?
For posting data, I want to use Sequelize's magic methods (because one user can be associated with many categories, as below):
user.hasMany(category, {
  foreignKey: 'CreatedBy'
});
For example, I want to add a category with respect to a user, so I want to use the magic method as below.
req.user
  .createCategory({
    name: name,
  })
How should I pass user and createCategory as parameters to the common function?
How do I pass data to the function?
Is it good practice to create a common function for CRUD, or should I write a function for each module?
I work on application architecture and design. When it comes to creating common methods or services, it completely depends on the use case and the types of operations we are performing.
Sequelize already has basic methods that perform the common operations. The following is my idea, which might shed more light on your approach. Please refer to the pseudo-code below.
Answer 1.
The following is a way you might organize your functions. It is my basic thought, so there might be room for error.
BaseModel.helper.js
class BaseModelHelper {
  static async find(params) {
    const { model, where = {}, attributes = [], include = [], offset = 0, limit = 10 } = params;
    let objFind = {};
    if (Object.keys(where).length > 0) {
      objFind = { ...objFind, where };
    }
    if (attributes.length > 0) {
      objFind = { ...objFind, attributes };
    }
    if (include.length > 0) {
      objFind = { ...objFind, include };
    }
    return model.findAll({
      ...objFind,
      offset,
      limit
    }).catch((err) => {
      throw new Error(err);
    });
  }

  static async create(params) {
    const { model, properties } = params;
    return model.create(properties)
      .catch((err) => {
        throw new Error(err);
      });
  }

  static async update(params) {
    const { model, newData, where } = params;
    return model.update(newData, {
      where
    }).catch((err) => {
      throw new Error(err);
    });
  }

  static async delete(params) {
    const { model, where } = params;
    return model.destroy({ where })
      .then((count) => {
        return count > 0;
      })
      .catch((err) => {
        throw new Error(err);
      });
  }
}
module.exports = BaseModelHelper;
Data.services.js
const BaseModelHelper = require("./BaseModel.helper.js");
// Owner, Pet and User are assumed to be Sequelize models imported from your model definitions

class DataServices {
  static async add() {
    // Owner table entry
    const { id } = await BaseModelHelper.create({
      model: Owner,
      properties: {
        name: 'Loren',
        role: 'admin',
      },
    });
    // Cat table entry
    const { id: petId } = await BaseModelHelper.create({
      model: Pet,
      properties: {
        owner_id: 'c0eebc45-9c0b',
        type: 'cat',
      }
    });
  }

  static async findWithRelations() {
    const arrData = await BaseModelHelper.find({
      model: User,
      where: {
        role: 'admin'
      },
      attributes: ['id', 'username', 'age'],
      include: [{
        model: Pet,
        through: {
          attributes: ['createdAt', 'startedAt', 'finishedAt'],
          where: { completed: true }
        }
      }]
    });
  }

  static async findBelongs() {
    const arrData = await BaseModelHelper.find({
      model: User,
      where: {
        role: 'admin'
      },
      attributes: ['id', 'username', 'age']
    });
  }

  static async update() {
    const arrData = await BaseModelHelper.update({
      model: Pet,
      newData: {
        name: 'lina',
      },
      where: {
        name: 'suzi',
      }
    });
  }

  static async delete() {
    const arrData = await BaseModelHelper.delete({
      model: Pet,
      where: {
        name: 'lina',
      }
    });
  }
}
module.exports = DataServices;
One benefit of the approach I have described above is that you don't have to do error handling in every place, as long as you manage a centralized error handler. When an error occurs, it is thrown and caught by Express's centralized error handler.
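As a rough sketch of that idea (the statusCode field and the assumption that your routes forward errors with next(err) are mine, not part of the original answer), a centralized Express error handler is just a four-argument middleware registered after all the routes:

// Centralized error handler: anything thrown by the helpers above and
// passed to next(err) by a route ends up here
app.use((err, req, res, next) => {
  console.error(err);
  res.status(err.statusCode || 500).json({ message: err.message || 'Internal server error' });
});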
The above class of common DB operations is built based on my experience and the frequency of the operations we have performed. I know there is room for more possibilities than we expect, but with the method I have suggested you might not face many obstacles.
Answer 2. You should build the fully qualified query object at your service level. Only once it is built at the service level do you pass it to the BaseModelHelper.
Answer 3. Similarly, you should create your data object at the service level. If the data object is built from multiple tables, first encapsulate it at the service level, then pass it to the BaseModelHelper method.
Answer 4. Yes, I also favor the same. But one thing to keep in mind is that there is always room for improvement. If you create a method for each module, you would have to copy this BaseModelHelper everywhere; instead, I suggest inheriting from it at the service level.
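A rough sketch of that inheritance idea (CategoryService and addForUser are hypothetical names; createCategory is the Sequelize association method generated by the question's user.hasMany(category)):

const BaseModelHelper = require('./BaseModel.helper.js');

// The service inherits the generic CRUD helpers instead of copying them
class CategoryService extends BaseModelHelper {
  static async addForUser(user, name) {
    // Sequelize "magic" method generated by user.hasMany(category)
    return user.createCategory({ name });
  }
}

module.exports = CategoryService;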
All the database operation objects are built at the service level, not the controller level. Your database service and your object preparation service might be separate, which gives a clearer picture. The object preparation service might be scoped to include several other services.
Again, the above approach is my thought process, suggested based on my experience. There is more room for improvement that you might build on.
I'm building an API with Restify, which is based on Express. I'm using TypeORM, and I'm wondering what the best way is to update different properties that come from user input.
Essentially I have a route like this:
server.put('/users/:id', errorHandler(update));
which fires this method:
const update = async (req: Request, res: Response) => {
  const user = { ...req.body, id: req.params.id } as User;
  res.send(await userService.update(user));
}
As you can see, I used the spread operator to create a User entity. Then, inside userService.update, I have the following:
export const update = async (user: User): Promise<User> => {
  const repository = getRepository(User);
  const entity = await repository.findOne({ id: user.id });
  if (!entity) throw new errors.ResourceNotFoundError(`There is no user with id of ${user.id}`);

  Object.assign(entity, user, { id: entity.id, chat_id: entity.chat_id, project_id: entity.project_id, deleted: false });
  return await repository.save(entity);
}
As you can see, I want to prevent the data provided by the API consumer from replacing some important properties like id, chat_id, project_id and deleted, so I used Object.assign to achieve this.
Is this a good way? What do you suggest to improve it?
You can use TypeORM's update method like this; it will partially update the record with the values you give as the second argument.
// this will find a user with id ${user.id} and will only
// change the fields that are specified in the user object
await repository.update(user.id, user);

// check whether it updated, for debugging
const updatedUser = await repository.findOne(user.id);
console.log(JSON.stringify(updatedUser, null, 2));
If you want to create a new record from the existing user in the DB, then you only need to change its id. To do that:
1. Deep clone the object so there is another user object with a new reference.
2. Remove the id field from the deep clone and use insert afterwards.
// Deep clone user object
const clonedUser = JSON.parse(JSON.stringify(user));

// Delete id field from the deep clone
delete clonedUser.id;

// create a new user with a different id
await repository.insert(clonedUser);
You can filter out your important properties and pass the user id to the update method of your userService.
// strip the protected fields from the body; only the rest goes into user
const { id: bodyId, chat_id, project_id, deleted, ...user } = req.body;
const { id } = req.params;

res.send(await userService.update(id, user));
This will make sure the user object doesn't contain the important properties.
And you can change your update method like below:
export const update = (userId: string, user: User): Promise<User> => {
  return getRepository(User).update(userId, user);
}
I need to retrieve an object and also get the relations and nested relations.
So, I have the three models below:
User model:
module.exports = {
  attributes: {
    name: {
      type: 'string'
    },
    pets: {
      collection: 'pet',
      via: 'owner',
    }
  }
};
Pet model:
module.exports = {
  attributes: {
    name: {
      type: 'string'
    },
    owner: {
      model: 'user'
    },
    vaccines: {
      collection: 'vaccine',
      via: 'pet',
    }
  }
};
Vaccine model:
module.exports = {
  attributes: {
    name: {
      type: 'string'
    },
    pet: {
      model: 'pet'
    }
  }
};
Calling User.findOne({ name: 'everton' }).populate('pets').exec(...) I get the user and the associated pets. How can I also get the vaccines associated with each pet? I didn't find references about this in the official documentation.
I've run into this issue as well, and as far as I know, nested association queries are not built into Sails yet (as of this post).
You can use promises to handle the nested population for you, but this can get rather hairy if you are populating many levels.
Something like:
User.findOne({ name: 'everton' })
  .populate('pets')
  .then(function (user) {
    user.pets.forEach(function (pet) {
      // load this pet's vaccines
    });
  });
This has been a widely discussed topic on sails.js and there's actually an open pull request that adds the majority of this feature. Check out https://github.com/balderdashy/waterline/pull/1052
While Kevin Le's answer is correct, it can get a little messy, because you're executing async functions inside a loop. Of course it works, but say you want to return the user with all pets and vaccines once everything has finished - how do you do that?
There are several ways to solve this problem. One is to use the async library, which offers a bunch of utility functions for working with async code. The library is already included in Sails, and you can use it globally by default.
User.findOneByName('TestUser')
  .populate('pets')
  .then(function (user) {
    var pets = user.pets;
    // async.each() performs a for-each loop and executes
    // a final callback after the last iteration has finished
    async.each(pets, function (pet, cb) {
      Vaccine.find({ pet: pet.id })
        .then(function (vaccines) {
          // I didn't find a way to reuse the attribute name
          pet.connectedVaccines = vaccines;
          cb();
        });
    }, function () {
      // this callback is executed once all vaccines have been received
      return res.json(user);
    });
  });
There is an alternative approach that solves this issue with Bluebird promises, which are also part of Sails. It's probably more performant than the previous one, because it fetches all the vaccines with just one database request. On the other hand, it's harder to read...
User.findOneByName('TestUser')
  .populate('pets')
  .then(function (user) {
    var pets = user.pets,
        petsIds = [];
    // to avoid looping over the async function,
    // all pet ids get collected first...
    pets.forEach(function (pet) {
      petsIds.push(pet.id);
    });
    // ... to fetch all vaccines with one db call
    var vaccines = Vaccine.find({ pet: petsIds })
      .then(function (vaccines) {
        return vaccines;
      });
    // with bluebird this array...
    return [user, vaccines];
  })
  // ... is passed here as soon as the vaccines have finished loading
  .spread(function (user, vaccines) {
    // for the same output as before, the vaccines get attached to
    // the corresponding pet object
    user.pets.forEach(function (pet) {
      // as seen above, the attribute name can't be used
      // to store the data
      pet.connectedVaccines = vaccines.filter(function (vaccine) {
        return vaccine.pet == pet.id;
      });
    });
    // then the user with all nested data can be returned
    return res.json(user);
  });
I want to give users the ability to create collections in my Node app. I have really only seen examples of hard-coded collections with Mongoose. Does anyone know if it's possible to create collections dynamically with Mongoose? If so, an example would be very helpful.
Basically I want to be able to store data for different 'events' in different collections.
i.e.
Events:
event1,
event2,
...
eventN
Users can create their own custom event and store data in that collection. In the end, each event might have hundreds or thousands of rows. I would like to give users the ability to perform CRUD operations on their events. Rather than storing everything in one big collection, I would like to store each event's data in a different collection.
I don't really have an example of what I have tried, as I have only created 'hard-coded' collections with Mongoose. I am not even sure I can create a new collection in Mongoose dynamically, based on a user request.
var mongoose = require('mongoose');
mongoose.connect('localhost', 'events');

var schema = mongoose.Schema({ name: 'string' });
var Event1 = mongoose.model('Event1', schema);

var event1 = new Event1({ name: 'something' });
event1.save(function (err) {
  if (err) // ...
  console.log('meow');
});
The above works great if I hard-code 'Event1' as a collection. I'm just not sure how to create a dynamic collection:
var mongoose = require('mongoose');
mongoose.connect('localhost', 'events');
...
var userDefinedEvent = // get this from a client-side request
...
var schema = mongoose.Schema({ name: 'string' });
var userDefinedEvent = mongoose.model(userDefinedEvent, schema);
Can you do that?
I believe that this is a terrible idea to implement, but a question deserves an answer. You need to define a schema with a dynamic name that allows information of 'Any' type in it. A function to do this may look a little like this:
var establishedModels = {};
function createModelForName(name) {
  if (!(name in establishedModels)) {
    var Any = new Schema({ any: Schema.Types.Mixed });
    establishedModels[name] = mongoose.model(name, Any);
  }
  return establishedModels[name];
}
Now you can create models that allow information without any kind of restriction, including the name. I'm going to assume an object defined like this, {name: 'hello', content: {x: 1}}, which is provided by the 'user'. To save this, I can run the following code:
var stuff = { name: 'hello', content: { x: 1 } }; // Define info.
var Model = createModelForName(stuff.name);       // Create the model.
var model = Model(stuff.content);                 // Create a model instance.
model.save(function (err) {                       // Save.
  if (err) {
    console.log(err);
  }
});
Queries are very similar, fetch the model and then do a query:
var stuff = { name: 'hello', query: { x: { '$gt': 0 } } }; // Define info.
var Model = createModelForName(stuff.name);                // Fetch the model.
Model.find(stuff.query, function (err, entries) {
  // Do something with the matched entries.
});
You will have to implement code to protect your queries. You don't want the user to blow up your db.
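As a minimal sketch of that kind of protection (the sanitizeQuery helper and the whitelist are my own assumptions, not part of the original answer), you could drop operator keys and unexpected fields from user-supplied queries before handing them to Mongoose:

// Naive sanitizer: keep only whitelisted top-level fields and drop any key
// starting with '$', so user input can't inject operators such as $where.
// A real implementation would also validate nested values recursively.
function sanitizeQuery(query, allowedFields) {
  var clean = {};
  Object.keys(query || {}).forEach(function (key) {
    if (key.charAt(0) !== '$' && allowedFields.indexOf(key) !== -1) {
      clean[key] = query[key];
    }
  });
  return clean;
}

var safeQuery = sanitizeQuery(stuff.query, ['x']);
Model.find(safeQuery, function (err, entries) {
  // Do something with the matched entries.
});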
From the MongoDB docs on data modeling:
In certain situations, you might choose to store information in several collections rather than in a single collection.
Consider a sample collection logs that stores log documents for various environment and applications. The logs collection contains documents of the following form:
{ log: "dev", ts: ..., info: ... } { log: "debug", ts: ..., info: ...}
If the total number of documents is low you may group documents into collection by type. For logs, consider maintaining distinct log collections, such as logs.dev and logs.debug. The logs.dev collection would contain only the documents related to the dev environment.
Generally, having large number of collections has no significant performance penalty and results in very good performance. Distinct collections are very important for high-throughput batch processing.
Say I have 20 different events, and each event has 1 million entries... As such, if this is all in one collection, I will have to filter the collection by event for every CRUD op.
I would suggest you keep all events in the same collection, especially if event names depend on client code and are thus subject to change. Instead, index the name and user reference.
mongoose.Schema({
  name: { type: String, index: true },
  user: { type: mongoose.Schema.Types.ObjectId, ref: 'User', index: true }
});
Furthermore, I think you came at the problem a bit backwards (but I might be mistaken). Are you finding events within the context of a user, or finding users within the context of an event name? I have a feeling it's the former, and that you should be partitioning on the user reference, not the event name, in the first place.
If you do not need to find all events for a user and just need to deal with the user and event name together, you could go with a compound index:
schema.index({ user: 1, name: 1 });
If you are dealing with millions of documents, make sure to turn off auto index:
schema.set('autoIndex', false);
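With autoIndex off, Mongoose will no longer build the declared indexes at startup, so you trigger the build yourself at a time of your choosing, for example from a one-off script or a deploy step (the Event model name below is just an assumption for illustration):

// Build the schema's declared indexes explicitly instead of at app startup
Event.ensureIndexes(function (err) {
  if (err) console.error('index build failed', err);
});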
This post has interesting stuff about naming collections and using a specific schema as well:
How to access a preexisting collection with Mongoose?
You could try the following:
var createDB = function (name) {
  var connection = mongoose.createConnection(
    'mongodb://localhost:27017/' + name);

  connection.on('open', function () {
    connection.db.collectionNames(function (error) {
      if (error) {
        return console.log("error", error);
      }
    });
  });

  connection.on('error', function (error) {
    return console.log("error", error);
  });
}
It is important that you get the collection names with connection.db.collectionNames, otherwise the database won't be created.
This method works best for me. This example creates a dynamic collection for each user; each collection will hold only the corresponding user's information (login details). First, declare the function dynamicModel in a separate file, for example model.js:
/* model.js */
'use strict';

var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

function dynamicModel(suffix) {
  var addressSchema = new Schema(
    {
      "name": { type: String, default: '', trim: true },
      "login_time": { type: Date },
      "location": { type: String, default: '', trim: true },
    }
  );
  return mongoose.model('user_' + suffix, addressSchema);
}

module.exports = dynamicModel;
In the controller file, for example user.js, the first function creates the dynamic collection and the second function saves data to a particular collection:
/* user.js */
var mongoose = require('mongoose'),
    path = require('path');

function CreateModel(user_name) { // function to create a collection; the user_name argument contains the collection name
  var Model = require(path.resolve('./model.js'))(user_name);
}

function save_user_info(user_name, data) { // function to save user info; the data argument contains the user info
  var UserModel = mongoose.model('user_' + user_name); // must match the name registered in model.js
  var usermodel = UserModel(data);
  usermodel.save(function (err) {
    if (err) {
      console.log(err);
    } else {
      console.log("\nSaved");
    }
  });
}
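For example (the values here are made up), the two functions above could be used together like this:

// Register the per-user model, then write a login record into the user_jane collection
CreateModel('jane');
save_user_info('jane', {
  name: 'Jane',
  login_time: new Date(),
  location: 'London'
});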
Yes, we can do that. I have tried it and it's working.
Reference code:
app.post("/",function(req,res){
var Cat=req.body.catg;
const link= req.body.link;
const rating=req.body.rating;
Cat=mongoose.model(Cat,schema);
const item=new Cat({
name:link,
age:rating
});
item.save();
res.render("\index");
});
I tried Magesh varan's reference code, and this code works for me:
router.post("/auto-create-collection", (req, res) => {
var reqData = req.body; // {"username":"123","password":"321","collectionName":"user_data"}
let userName = reqData.username;
let passWord = reqData.password;
let collectionName = reqData.collectionName;
// create schema
var mySchema = new mongoose.Schema({
userName: String,
passWord: String,
});
// create model
var myModel = mongoose.model(collectionName, mySchema);
const storeData = new myModel({
userName: userName,
passWord: passWord,
});
storeData.save();
res.json(storeData);
});
Create a dynamic.model.ts and access it from somewhere to achieve this feature.
import mongoose, { Schema } from "mongoose";

export default function dynamicModelName(collectionName: any) {
  var dynamicSchema = new Schema({ any: Schema.Types.Mixed }, { strict: false });
  return mongoose.model(collectionName, dynamicSchema);
}
Create a dynamic model:
import dynamicModelName from "../models/dynamic.model";

var stuff = { name: 'hello', content: { x: 1 } };
var Model = await dynamicModelName('test2');

let response = await new Model(stuff).save();
return res.send(response);
Get the value from the dynamic model
var Model = dynamicModelName('test2');
let response = await Model.find();
return res.send(response);