I'm trying to create an Express web server that does the following -
Register and Sign in
Once the user signs in, they are redirected to a "Control panel", where they choose specifications for a Docker container.
Mongoose schema:
const userSchema = new mongoose.Schema({
  email: String,
  password: String,
  docker: {
    site: String,
    adminpwd: String,
    hostPort: Number,
    cpus: Number,
    memory: Number,
    storage: Number
  }
})
The docker object is appended to the email and password fields after a POST from the control panel page.
app.post('/control-setup/:userid',
  //validation
  [
    body('site').isURL({allow_underscores: true}),
    body('admin').isLength({ min: 6 })
  ], (req, res) => {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({ errors: errors.array() });
    }
    const userid = req.params.userid
    //Setup or Edit
    var site = req.body.site
    var adminpwd = req.body.admin
    var cpus = req.body.cpus
    var memory = req.body.memory
    var storage = req.body.storage
    //Needs a function to get the next available port
    var hostPort = findhostPort()
    //Cutting https:// from input
    var site_name = site.slice(8)
    //Save or Edit in DB
    Users.updateOne({_id: userid}, {docker: {site: site, adminpwd: adminpwd, hostPort: hostPort, cpus: cpus, memory: memory, storage: storage}}, {upsert: true, new: true},
      function (err, result) {
        if (err) throw err
        console.log('Docker Parameters Setup in User' + result)
        res.redirect(userid)
      })
  })
The problem is the hostPort variable: I'm trying to create a function that will go through MongoDB and find an available port.
I tried using for loops but it wouldn't work.
I'm clearly missing the correct logic here.
Would really appreciate some help, as I'm a newbie to Node.js and development as a whole.
You can use distinct to find all the ports already in use, and then pick a free port number.
const ALL_POSSIBLE_PORT_NUMBERS = [...]
async function findhostPort () {
  const usedPorts = await User.distinct('docker.hostPort')
  for (const port of ALL_POSSIBLE_PORT_NUMBERS) {
    if (!usedPorts.includes(port)) {
      return port
    }
  }
}
Some caveats:
You will need to make docker.hostPort a unique field so the DB throws an error when a duplicate hostPort is saved; otherwise simultaneous requests to this method can get the same port number.
I'm assuming you have an array of all possible port numbers, as some port ranges are restricted and you will probably want to keep some for internal use.
Using distinct will get slower as you accumulate more documents. If it becomes a problem, you can consider caching the list of unused ports in memory and removing a port from it every time a new one is assigned.
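The port-picking step itself is pure list logic, so it can be sketched (and tested) without touching the DB. The 40000-40100 range below is only a placeholder assumption; substitute whatever ports your hosts actually allow:

```javascript
// Pure helper: given the full candidate range and the ports already in use,
// return the first free port, or null when the range is exhausted.
function pickFreePort(allPorts, usedPorts) {
  const used = new Set(usedPorts); // Set lookup avoids repeated O(n) scans
  const free = allPorts.find((port) => !used.has(port));
  return free === undefined ? null : free;
}

// Hypothetical range; adjust to your environment.
const ALL_POSSIBLE_PORT_NUMBERS = Array.from({ length: 101 }, (_, i) => 40000 + i);

console.log(pickFreePort(ALL_POSSIBLE_PORT_NUMBERS, [40000, 40001])); // 40002
```

On the schema side, one way to get the duplicate-key error mentioned in the first caveat is a sparse unique index, e.g. userSchema.index({ 'docker.hostPort': 1 }, { unique: true, sparse: true }), so documents without a docker block don't collide on the missing value.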
Related
I am storing parking details with a merchant id in the mongoose schema, since a parking belongs to a certain merchant user and it cannot be empty or null.
Here is the model:
const parkingSchema = new mongoose.Schema({
  merchantId: {
    type: mongoose.Schema.Types.ObjectId,
    required: true,
    ref: "Merchant",
  },
  //other details
})
merchant model is something like this:
const merchantSchema = new mongoose.Schema({
  merchantId: {
    type: mongoose.Schema.Types.ObjectId,
    ref: "Auth",
  },
  //other details
})
And finally the auth schema:
const authSchema = new mongoose.Schema({
  accountType: {
    type: String,
    required: true,
    trim: true,
    default: "user",
    enum: ["merchant", "user", "provider"],
  },
  //other details
})
I want to update the parking data only if the request comes from the user who created it; otherwise I want to throw an error.
I am using jsonwebtoken to authenticate users.
Here is the query to update the data:
exports.updateParking = async (req, res) => {
  try {
    const { parkingName, price, address, name, phoneNumber, about } = req.body;
    const { parkingImage } = req.files;
    const check_exist = await Auth.findById(req.data.id);
    if (!check_exist) return res.status(404).json({ error: "User not found" });
    console.log(req.data.id);
    const updateData = await Parking.findByIdAndUpdate(
      { _id: req.params.id, merchantId: req.data.id }, // I think here is the problem
      {
        $set: {
          parkingName,
          price,
          address,
          ...
        },
      }
    );
    return res.status(200).json({
      success: true,
      msg: "Parking has updated successfully",
    });
  } catch (error) {
    return error.message;
  }
};
However, the issue is that other users can currently update another user's data, which I want to prevent.
Below is the route with its middleware:
routing.patch("/parking/update/:id", middleware.authenticateToken, merchant.updateParking)
You should show each user only the parkings they have created or that belong to them.
const myParkings = async (req, res) => {
  // always await code in a try/catch block
  const parkings = await Parkings.find({ user: req.user._id })
  // ...then populate the fields that you want to show
  res.status(200).json({
    success: true,
    parkings,
  });
};
You have to set this req.user._id when the user logs in. You could create a session.
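Complementing the listing approach above, the update route itself can enforce ownership in one query. findByIdAndUpdate only uses the id, so the extra merchantId in the question's first argument is silently ignored; findOneAndUpdate with a compound filter (sketched below with the question's names, untested against a live DB) only matches documents the requesting merchant actually owns:

```javascript
// Pure helper so the compound filter is easy to test in isolation.
// Both values come from the question: req.params.id and req.data.id.
function ownershipFilter(parkingId, merchantId) {
  return { _id: parkingId, merchantId: merchantId };
}

// Hypothetical usage inside updateParking:
// const updated = await Parking.findOneAndUpdate(
//   ownershipFilter(req.params.id, req.data.id),
//   { $set: { parkingName, price, address } },
//   { new: true }
// );
// if (!updated) return res.status(403).json({ error: "Not allowed" });
```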
I think what you're looking for is something like CASL Mongoose (or a similar package), and more specifically, the "conditions" section of the CASL docs.
What you're dealing with here is the distinction between 2 concepts:
AuthN (authentication) - determines who someone is and whether they are "authenticated" to make an API request
AuthZ (authorization) - determines what the authenticated user is allowed to do
In your app, middleware.authenticateToken is responsible for the AuthN piece of the equation. It makes sure that only users that have created an account are able to make requests to your API routes.
What you still need to solve for is the AuthZ piece, which can be done in a bunch of different ways, but one popular one is to use CASL, which is a Node AuthZ library that allows you to utilize your ORM's native query syntax to limit actions based on the authenticated (AuthN) user's attributes.
In other words, you can do something like, "Only allow user with ID 1 to update Parking entities that he/she owns". Below is generally what you're looking for (not tested for your use case, but the general idea is here):
const casl = require('@casl/ability');
// Define what an `Auth` (user) can do based on their database ID
function defineMerchantAbilities(merchantUser) {
  const abilities = casl.defineAbility((allow, deny) => {
    // Allow merchant to update a parking record that they own
    allow('update', 'Parking', { merchantId: merchantUser.id })
  })
  return abilities
}
exports.updateParking = async (req, res) => {
  const userId = req.data.id
  const parkingId = req.params.id
  // Find your merchant user in DB (see my comments at end of post)
  const merchantUser = await Auth.findById(userId)
  // Find your parking record
  const parking = await Parking.findById(parkingId)
  // Pass user to your ability function
  const ability = defineMerchantAbilities(merchantUser)
  // This will throw an error if a user who does not own this Parking record
  // tries to update it
  casl.ForbiddenError
    .from(ability)
    .throwUnlessCan('update', casl.subject('Parking', parking))
  // If you make it here, you know this user is authorized to make the change
  Parking.findByIdAndUpdate( ...your code here )
}
Additional comments/notes:
I would recommend removing your try/catch handler and using Express's default error handler, as it will reduce the boilerplate you have to write for each route.
I would also recommend writing a middleware that finds a user by ID in the database and attaches it to a custom property called req.user so you always have req.user available to you in your authenticated routes.
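That second suggestion might look roughly like the sketch below. It assumes the question's naming (the Auth model, and req.data.id set by authenticateToken); the lookup function is injected so the middleware logic can be exercised without a database:

```javascript
// Hypothetical middleware: load the authenticated user once and expose it
// as req.user for every route that runs after it.
function attachUser(findById) {
  return async function (req, res, next) {
    try {
      const user = await findById(req.data.id);
      if (!user) return res.status(404).json({ error: 'User not found' });
      req.user = user; // downstream handlers can now rely on req.user
      next();
    } catch (err) {
      next(err); // hand DB errors to the Express error handler
    }
  };
}

// Usage sketch, after token authentication:
// routing.patch('/parking/update/:id',
//   middleware.authenticateToken,
//   attachUser((id) => Auth.findById(id)),
//   merchant.updateParking);
```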
I'm building an application (MVC) that will use 4 collections in a DB. When I add new clients to the application they will get their separate DB. So if I have 10 customers, there will be 10 DBs and 40 collections (1 db -> 4 collections)
This way each customer's data is separated from the other customers', which is crucial here.
So far I've built the app and everything is working nicely, except one thing.
If Company A logs in and starts using the app, everything works fine. But when Company A is logged in and Company B logs in, both Company A AND B will be directed to Company B's DB.
I've looked through my code and I guess this is my own fault, because I use .connect (mongoose). When a company logs in, the route fires off a controller that opens a new connection (which overrides the old one), redirecting all open connections to that specific DB.
controller.dBlogin.js
mongoose.connect(dbConfig.url + id, options)
  .then(() => {
    console.log("Successfully connected to the database");
    next();
  }).catch(err => {
    console.log('Could not connect to the database. Exiting now...');
    process.exit();
  });
The id is fetched from req.params.id (example: http://webapp.com/login/:id).
As far as I've figured, .connect only allows one connection at a time, so I need something that simply allows many connections. So I'm thinking I could use .createConnection for this:
https://mongoosejs.com/docs/connections.html#multiple_connections
But I just can't get it to work.
I changed controller.dBlogin.js to
mongoose.createConnection(dbConfig.url + id, options)
  .then(() => {
    console.log("Successfully connected to the database");
    next();
  }).catch(err => {
    console.log('Could not connect to the database. Exiting now...');
    process.exit();
  });
but that only leads to a timeout when the company logs in. How do I use .createConnection? How do you go from .connect to .createConnection?
Here are examples of the routes, the controller, and a model (user schema).
routes.js
// connect to db, check auth
app.post('/login/:id', dbController.connectDB, dbController.login)
controller.dbLogin.js
exports.connectDB = (req, res, next) => {
  const id = req.params.id;
  // Get Mongoose to use the global promise library
  mongoose.Promise = global.Promise;
  // Options for the MongoDB connection
  const options = {
    useNewUrlParser: true,
  };
  // Connecting to the database
  mongoose.connect(dbConfig.url + id, options)
    .then(() => {
      console.log("Successfully connected to the database");
      next();
    }).catch(err => {
      console.log('Could not connect to the database. Exiting now...');
      process.exit();
    });
};
exports.login = (req, res, next) => {
  passport.authenticate('local-login', {
    successRedirect: '/start', // redirect to the secure profile section
    failureRedirect: '/login', // redirect back to the signup page if there is an error
    failureFlash: true // allow flash messages
  })(req, res, next);
};
Example of a model user.js
const mongoose = require('mongoose');
const bcrypt = require('bcrypt-nodejs');
const Company = require('../models/company.js');
// define the schema for our user model
const userSchema = mongoose.Schema({
  local : {
    name : {
      type: String,
      required : true
    },
    email : {
      type : String,
      unique : true,
      required : true
    },
    password : String,
    active : Boolean,
    company : {
      type: mongoose.Schema.Types.ObjectId,
      ref: 'Company'
    }
  }
});
// generating a hash for password
userSchema.methods.generateHash = function(password) {
  return bcrypt.hashSync(password, bcrypt.genSaltSync(8), null);
};
// checking if password is valid
userSchema.methods.validPassword = function(password) {
  return bcrypt.compareSync(password, this.local.password);
};
// create the model for users and expose it to our app
module.exports = mongoose.model('User', userSchema);
So, for those who find themselves in the same spot:
After reviewing my app and the data it will keep, I came to the conclusion that there is no need to split into multiple tenants. I reworked the app so that when users fetch or write data they only touch "their" data, and this is controlled in the backend.
But I did make a post on the mongoose GitHub repo and got an answer.
See post here: https://github.com/Automattic/mongoose/issues/7386
The author went into great detail and linked what seems to be quite a nice implementation of this with Express and mongoose: http://nmajor.com/posts/multi-tenancy-with-expressmongoose
I hope this is of some help to you. If you manage to find a solution where you can show some simple code, please post it, as there seem to be a lot of people asking about this.
Cheers.
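For anyone who lands here wanting a concrete starting point: the connection-per-tenant pattern from the linked post can be sketched as a small cache. The makeConnection function is injected, so the caching logic itself has no mongoose dependency; in the real app it would be something like (id) => mongoose.createConnection(dbConfig.url + id, options), with models then created off each connection:

```javascript
// Sketch: keep one connection per tenant and reuse it across requests,
// instead of calling mongoose.connect() (which rewires the single
// default connection for everyone currently logged in).
function connectionCache(makeConnection) {
  const connections = new Map();
  return function getConnection(tenantId) {
    if (!connections.has(tenantId)) {
      // First request for this tenant: open (and remember) a connection.
      connections.set(tenantId, makeConnection(tenantId));
    }
    return connections.get(tenantId);
  };
}

// Hypothetical usage:
// const getConnection = connectionCache((id) =>
//   mongoose.createConnection(dbConfig.url + id, { useNewUrlParser: true }));
// const conn = getConnection(req.params.id);
// const User = conn.model('User', userSchema); // model bound to this tenant's DB
```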
I am facing memory issues with my Node app. I took some heap dumps and saw a lot of Mongo objects being held in memory, which is causing the Node app to run out of memory.
I have the following setup for my app.
MongoDB 3.4.13
Mongoose 4.11.10 (tried 4.13.11 and 5.0.7 also)
Node 8.9.4
config.js
const clientUID = require('./env').clientUID;
module.exports = {
  // Secret key for JWT signing and encryption
  secret: 'mysecret',
  // Database connection information
  database: `mongodb://localhost:27017/app_${clientUID}`,
  // Setting port for server
  port: process.env.PORT || 3000,
}
I have several models in the app. Every model is defined in the following manner (just listing one of the models here):
models/card.js
const mongoose = require('mongoose');
const Schema = mongoose.Schema;
const CardSchema = new Schema({
  name: {
    type: String,
    unique: true,
    required: true
  },
  macId: {
    type: String,
    unique: true,
    required: true
  },
  cardTypeId: {
    type: mongoose.Schema.Types.ObjectId,
    ref: 'CardType',
    required: true
  },
},
{
  timestamps: true
});
module.exports = mongoose.model('Card', CardSchema);
In the app I require the model and perform some actions as follows:
const Card = require('./models/card');
...require other models
const config = require('./config');
mongoose.connect(config.database);
function fetchCardByMacId(macId) {
  return Card.findOne({ macId }).lean().exec();
}
function updateTrackerByMacId(macId, x, y, nodeId) {
  const data = { x, y, lastNodeId: nodeId };
  fetchCardByMacId(macId)
    .then(card => {
      Tracker.findOneAndUpdate({ cardId: card._id }, data, { upsert: true, new: true }).exec((error, tracker) => {
        if (error) {
          return console.log('update tracker error', error);
        }
        TrackerHistory.findOne({ trackerId: tracker._id }).exec((err, trackerHistory) => {
          if (err) {
            return console.log('fetch trackerHistory error', err);
          }
          if (trackerHistory) {
            trackerHistory.trackers.push({ x, y, timestamp: moment().format(), nodeId });
            TrackerHistory.findOneAndUpdate({ _id: trackerHistory._id }, trackerHistory, (er, trackerHis) => {
              if (er) {
                return console.log('trackerHistory change update error', er);
              }
            })
          } else {
            const trackerHistoryNew = new TrackerHistory({
              trackerId: tracker._id,
              trackers: [{ x, y, timestamp: moment().format(), nodeId }]
            });
            trackerHistoryNew.save((er, trackerHis) => {
              if (er) {
                return console.log('trackerHistory create error', er);
              }
            });
          }
        });
      });
    }).catch(error => {
      console.log('updateTrackerByMacId error', error);
    });
}
Like this there are many other functions that read and update data.
Every 5 seconds I get new data that needs to be inserted into the DB (no more than a few hundred KB), and some of the old data also gets updated based on this new data (seemingly fairly straightforward DB ops: read, manipulate, and update back).
From the index.js I spawn 2 child processes that take the load of processing this new data and updating the db based on the business logic. When new data is received in the index.js using event listeners, I send it to child process 1 to insert/update the db. child process 2 runs on a 10s timer to read this updated data and then do some further updates to the db.
Running this on my local MacBook Pro is no issue (the logged heap memory usage never goes above 40-50 MB). When I load it onto a DO Ubuntu 16.04 server (4 GB / 2 CPUs) I am facing memory issues: the child processes exit after hitting the memory threshold for the process (~1.5 GB), which seems very odd to me.
I also tried this with Docker containers and saw the same results: on the Mac it runs without issues, but on the server it eats up memory.
Generating heap dumps shows a lot of Mongo objects in the heap.
I would like some help in understanding what I am doing wrong here and what is the issue with mongo eating up this much memory on the server.
So there was a big issue with the way the TrackerHistory collection was modelled. TrackerHistory had an array, and every time a new object had to be added to that array the whole TrackerHistory document was loaded into memory; at the given frequency of real-time updates, memory was bloating faster than it was being GC'd.
Fixed it by moving the trackers array into a new collection and adding a foreign-key reference back to TrackerHistory.
Reference article that helped me identify this issue:
https://www.mongodb.com/blog/post/6-rules-of-thumb-for-mongodb-schema-design-part-1
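A rough sketch of that rework, with assumed names: each reading becomes its own small document pointing back at the tracker, so appending no longer loads the whole history into memory.

```javascript
const mongoose = require('mongoose');

// One document per reading instead of one ever-growing array
// inside a single TrackerHistory document.
const trackerHistoryEntrySchema = new mongoose.Schema({
  trackerId: { type: mongoose.Schema.Types.ObjectId, ref: 'Tracker', required: true },
  x: Number,
  y: Number,
  nodeId: String,
  timestamp: { type: Date, default: Date.now },
});
const TrackerHistoryEntry = mongoose.model('TrackerHistoryEntry', trackerHistoryEntrySchema);

// Appending a reading is now a small insert instead of a
// read-modify-write of the whole history document:
// new TrackerHistoryEntry({ trackerId, x, y, nodeId }).save();
```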
Here is the code:
const { MongoClient } = require('mongodb')
const db = MongoClient.connect('mongodb://172.17.0.2:27017/test')
db
  .then(
    async dataBase => {
      const eduDb = dataBase.db('edu-service-accounts')
      const accounts = eduDb.collection('accounts')
      await accounts.createIndex({ email: 1 }, { unique: true })
      accounts.insertOne({ email: '123' })
    }
  )
The code above creates an index, but it is not unique. I have already read the official docs for the native MongoDB driver, but can't get it to work.
And yes, I deleted all old indexes before testing this code.
Can someone please show code that really creates a unique index?
I mean not a fragment of the official docs or something like that; I need code that works.
NOTE: I tested this code with a local DB and with mlab; same result.
As the documentation says, db.createIndex(collectionname, index[, options], callback) returns the index name via the callback. Try logging the result in the callback; maybe you are getting an error from the DB.
Try something like:
// your connection stuff
accounts.createIndex({ email: 1 }, { unique: true }, function(err, result) {
  if (err) {
    console.log(err);
  } else {
    console.log(result);
  }
});
After that, please share the logs.
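One way to confirm the index actually enforces uniqueness is to insert a duplicate on purpose and look for MongoDB's duplicate-key error, code 11000 (the familiar E11000 message). Only the small helper below runs without a live DB; the commented usage assumes the accounts collection from the question:

```javascript
// MongoDB reports unique-index violations with error code 11000 (E11000).
function isDuplicateKeyError(err) {
  return !!err && err.code === 11000;
}

// Hypothetical async/await usage against the question's collection:
// await accounts.createIndex({ email: 1 }, { unique: true });
// await accounts.insertOne({ email: '123' });
// try {
//   await accounts.insertOne({ email: '123' }); // second insert should fail
// } catch (err) {
//   console.log('unique index works:', isDuplicateKeyError(err));
// }
```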
What I want is to have an arbitrary number of databases (50, for example) with the same collections (same schemas, exact models, different data) and one Node.js (Express + mongoose) web app.
Example simplified case:
I have:
a single web application (expressjs + mongoose) with User model.
50 domains and 50 databases, each with a users collection.
What behaviour I want to achieve:
a GET /api/users/ HTTP request comes in to one of the domains (test-domain-39.myapp.com)
the app gets the requested domain name (test-domain-39) and somehow mongoose understands that it should query database-39 when I just do User.find({isActive: true}) in users.controller
So I just want an abstraction. I pass db name to mongoose and continue to work with the User model (as we all usually do when having single DB connection) and mongoose, if needed, creates connection (if it's the first request to the specific db), keeps it alive for next requests in connection pool and etc.
What's the most simple and efficient way to accomplish that?
Thanks in advance!
IMHO, while this is possible with MongoDB, I wouldn't advise maintaining a separate database for each domain, especially if you are expecting to have a huge number of them. Have you considered a multi-tenant model instead?
The sample code below adds a user 'Alex' into two different databases, "domainOne" and "domainTwo". Hope this helps.
var mongoose = require('mongoose');
var personSchema = { name: String, domain: String };
var baseUri = 'mongodb://localhost/';
var domains = ['domainOne', 'domainTwo'];
domains.forEach((domain) => {
  var conn = mongoose.createConnection(baseUri + domain, (error) => {
    if (error) {
      console.log('Oops! Database connection failed!');
      return;
    }
    // Use the connection object to create your models,
    // instead of the mongoose object,
    // so that our data is saved into the database
    // associated with this connection
    var Person = conn.model('Person', personSchema);
    // Let's add user 'Alex' into the database
    (new Person({ name: 'Alex', domain: domain })).save((error) => {
      if (error) {
        console.log('Oops! Could not save person');
      } else {
        conn.close();
      }
    });
  });
});
This is how I implemented my project:
// config/db.ts
import {createConnection} from 'mongoose'
const MONGO_URI = process.env.MONGO_URI
if (!MONGO_URI)
  throw new Error(
    'Please define the MONGO_URI environment variable inside .env'
  )
const connections: any = {}
async function db(dbName: string) {
  if (connections[dbName]) {
    return connections[dbName]
  } else {
    connections[dbName] = createConnection(`${MONGO_URI}/${dbName}`)
    return connections[dbName]
  }
}
export default db
// models/Test.ts
import { Schema } from 'mongoose'
export interface ITest {
  _id: Schema.Types.ObjectId
  name: string
  createdAt?: Date
}
const testSchema = new Schema<ITest>(
  {
    name: { type: String, required: true },
  },
  { timestamps: true }
)
export default testSchema
// pages/api/test.ts
import nc from 'next-connect'
import db from '../../config/db'
import testSchema from '../../models/Test'
const handler = nc()
handler.get(
  async (req: NextApiRequestExtended, res: NextApiResponseExtended) => {
    try {
      const conn = await db(String(req.headers['x-db-key']))
      const model = conn.model('Test', testSchema)
      const data = await model.find({})
      res.send(data)
    } catch (error: any) {
      res.status(500).json({ error: error.message })
    }
  }
)
export default handler