best way to save images on a mongoose domain - node.js

I'm new to node.js and I'm trying to make an application which saves photos of users, just like a normal application. A user can set a profile picture and can also add other pictures to their wall.
I'm done with the other parts of my application, but I'm trying to figure out the best way to save those images, since my application should be able to scale to a large number of users.
I referenced:
How to upload, display and save images using node.js and express (to save images on the server)
and also: http://blog.robertonodi.me/managing-files-with-node-js-and-mongodb-gridfs/ (to save images on mongo via grid-fs)
and I'm wondering what would be the best option.
So, could you please suggest which approach I should go for?
Thanks,

It depends on your application's needs. One thing I've done for a similar application was to create an abstraction over the file storage logic of the server application.
var DiskStorage = require('./disk');
var S3Storage = require('./s3');
var GridFS = require('./gridfs');

function FileStorage(opts) {
  if (opts.type === 'disk') {
    this.storage = new DiskStorage(opts);
  }
  if (opts.type === 's3') {
    this.storage = new S3Storage(opts);
  }
  if (opts.type === 'gridfs') {
    this.storage = new GridFS(opts);
  }
}

FileStorage.prototype.store = function(opts, callback) {
  this.storage.store(opts, callback);
};

FileStorage.prototype.serve = function(filename, stream, callback) {
  this.storage.serve(filename, stream, callback);
};

module.exports = FileStorage;
Basically you will have a different implementation for each way of storing user-uploaded content, and when you need to you can scale from local file storage or Mongo GridFS to S3. For a seamless transition, when you store the user-file relationship in your database you can also store the file provider (local, GridFS or S3).
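For example, here is a minimal sketch of what that relationship might look like with Mongoose (the schema and field names are illustrative assumptions, not something from my actual application):
var mongoose = require('mongoose');

// Hypothetical schema: record where the file lives so reads can be
// routed to the right storage backend later on.
var UserFileSchema = new mongoose.Schema({
  user: { type: mongoose.Schema.Types.ObjectId, ref: 'User' },
  filename: String,
  provider: { type: String, enum: ['disk', 'gridfs', 's3'], default: 'disk' }
});

module.exports = mongoose.model('UserFile', UserFileSchema);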
Saving images directly to the local file system can sometimes get a little complicated when there is a lot of uploaded content; you can easily run into limitations like "How many files can I put in a directory?". GridFS should not have such a problem. I've had a pretty good experience using MongoDB for file storage, but this varies from application to application.

Related

Should I use GridFS or some other method to create a file sharing app?

I am currently beginning work on a file sharing app for my company: a simple form to upload a file, after which the user is given a download URL that they can pass on to anyone so they can download the file (similar to products such as WeTransfer).
However, I am struggling to decide how to do it. I have been playing with MongoDB and GridFS, and I have successfully used multer and multer-gridfs-storage to upload files directly into my database. I am struggling to get them to download, as I don't know that much about GridFS.
const crypto = require('crypto');
const path = require('path');
const multer = require('multer');
// Note: newer versions export the class as { GridFsStorage }.
const GridFsStorage = require('multer-gridfs-storage');

const storage = new GridFsStorage({
  url: 'mongodb://localhost:27017/fileUpload',
  file: (req, file) => {
    return new Promise((resolve, reject) => {
      // Generate a random, collision-resistant filename, keeping the original extension.
      crypto.randomBytes(16, (err, buf) => {
        if (err) {
          return reject(err);
        }
        const filename = buf.toString('hex') + path.extname(file.originalname);
        const fileInfo = {
          filename: filename,
          bucketName: 'uploads'
        };
        resolve(fileInfo);
      });
    });
  }
});

const upload = multer({ storage });
But it got me thinking: is this the best way of doing this, or would there be a better way of serving those files for download (to a user's computer)?
Any advice is greatly appreciated!
GridFS is a specification for storing and retrieving files that exceed the BSON document size limit of 16 MB. It is a convention, implemented by all MongoDB drivers, that stores binary data across many smaller documents: the binaries are split into chunks and the chunks are stored in collections created by GridFS.
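Since you mention struggling with the download side, here is a minimal sketch of streaming a stored file back to the client, assuming Express, the native driver's GridFSBucket (driver 3.x+, where connect() resolves to a client), and the 'uploads' bucket name from your upload configuration:
const express = require('express');
const { MongoClient, GridFSBucket } = require('mongodb');

const app = express();

MongoClient.connect('mongodb://localhost:27017').then((client) => {
  const db = client.db('fileUpload');
  // Must match the bucketName used by multer-gridfs-storage ('uploads').
  const bucket = new GridFSBucket(db, { bucketName: 'uploads' });

  app.get('/files/:filename', (req, res) => {
    // Stream the file's chunks straight to the HTTP response.
    bucket.openDownloadStreamByName(req.params.filename)
      .on('error', () => res.status(404).end('File not found'))
      .pipe(res);
  });

  app.listen(3000);
});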
Having said that, given the presented use cases I would highly recommend using a media server for storage, as for this application landscape it makes for a more economical, viable and scalable solution.
In general, I would avoid putting BLOBs in the database when there are other storage options that cost less, as using a database as a BLOB store is usually not a cost-optimised solution.
Sure, there are valid reasons for storing BLOBs in the database, but given this application's use case (it being media intensive), use a media server for file storage and the database for data structures.
In such cases it is easy to become "cost unoptimised" over time. Plus, the database size will keep growing, bringing its own challenges with RAM (WiredTiger cache) management.
All in all, if it were me, I would use media storage for BLOB-intensive applications rather than relying on the database.

export firebase data node to pdf report

What I have is a mobile app emergency message system which uses Firebase as a backend. When an emergency event ends, I would like to capture the message log in a PDF document. I have not been able to find any report editors that work with Firebase, which means I may have to export the data to PHP/MySQL. The Firebase PHP SDK looks to be overkill for this task. I have been googling "php get from firebase" and most responses have to do with using the Firebase PHP SDK. Is this the only way it can be accomplished?
You could use PDFKit on Cloud Functions (it's all Node.js; there is no PHP available there).
On npmjs.com there are several packages for firebase-ops, googleapis and @google-cloud.
In order to read from Firebase and write to a Storage bucket or Datastore, such an example script would still require a database reference and a storage destination, so it can render the PDF content (possibly from a template) and put it where it belongs. Also see firebase/functions-samples (especially the package.json, which defines the dependencies). npm install -g firebase-tools installs the tools required for deployment; the required modules also need to be installed locally so they are known there (much like Composer), while remotely they are resolved during the deployment process.
You'd need to a) use the Firebase event onUpdate() as the trigger, b) check the endTime of the returned DeltaSnapshot for a value, and c) then render and store the PDF document. The code may vary; this is just to provide a coarse idea of how it works within the given environment:
'use strict';

const admin = require('firebase-admin');
const functions = require('firebase-functions');
const PDFDocument = require('pdfkit');
const gcs = require('@google-cloud/storage')();
const bucket = gcs.bucket('some-bucket');
const fs = require('fs');

// TODO: obtain a handle to the delta snapshot

// TODO: render the report
var pdf = new PDFDocument({
  size: 'A4',
  info: { Title: 'Title of File', Author: 'Author' }
});
pdf.text('Emergency Incident Report');

pdf.pipe(
  // TODO: figure out how / where to store the file
  fs.createWriteStream('./path/to/file.pdf')
).on('finish', function () {
  console.log('PDF closed');
});
pdf.end();
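A coarse sketch of the surrounding trigger, using the legacy DeltaSnapshot API mentioned above, might look like this (the /incidents/{incidentId} path and the endTime field are assumptions for illustration):
// Hypothetical database path and field names, for illustration only.
exports.incidentReport = functions.database.ref('/incidents/{incidentId}')
  .onUpdate(function (event) {
    const incident = event.data.val();
    // Only render the report once the emergency event has ended.
    if (!incident.endTime) {
      return null;
    }
    // ...render the PDF as shown above and pipe it into the storage bucket.
    return null;
  });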
Externally running PHP code is, in this case, not running on the server side. The problem with it is that an external server won't get any realtime trigger, and therefore the file will not appear instantly upon a timestamp update (as one would expect from a Realtime Database). One could also add external web-hooks (or interface them with PHP), e.g. to fetch these PDF files over HTTPS (or even have them generated upon an HTTPS request, for externally triggered generation). For local testing one can use the command firebase serve, which saves a lot of time compared to firebase deploy.
The point is that one can teach a Cloud Function how the PDF files should look, when they should be created and where to put them, as a micro-service which does nothing but render these files. Writing one such script should still be within an acceptable range of effort, given all the clues provided.

Best NodeJS Workflow for team development

I'm trying to implement NodeJS and Socket.io for real time communication between two devices (PC & Smartphones) in my company product.
Basically what I want to achieve is sending a notification to all online users when somebody changes something in a file.
All the basic functionality for saving the updates is already there, so when everything is stored and calculated I send a POST request to my Node server saying that something changed and it needs to notify the users.
The problem is that, as long as I work alone, when I want to change some code in the NodeJS scripts I can just upload the new files via FTP and restart the pm2 service; but when my colleagues start working with me on this we will have problems merging our changes without overwriting each other.
Launching a local server is also not possible because we need the connection between our current server and the node machine and since our server is online it cannot access our localhosts.
Is there a way for a team to work together on the same Node server without overwriting each other's changes?
Implement changes using some option other than FTP. For example:
You can use webdav-fs in authenticated or non-authenticated mode:
// Using authentication:
var wfs = require("webdav-fs")(
  "http://example.com/webdav/",
  "username",
  "password"
);

wfs.readdir("/Work", function(err, contents) {
  if (!err) {
    console.log("Dir contents:", contents);
  } else {
    console.log("Error:", err.message);
  }
});
putFileContents(remotePath, data [, options])
Writes data to a remote file at remotePath. data is a Buffer or a String. options has a property called format, which can be "binary" (default) or "text".
var fs = require("fs");
var imageData = fs.readFileSync("someImage.jpg");
client
.putFileContents("/folder/myImage.jpg", imageData, { format: "binary" })
.catch(function(err) {
console.error(err);
});
And use callbacks to notify your team, or lock the files via the callback (see the lockfile sketch below).
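A minimal sketch of locking with the lockfile package listed in the references (the lock file name and the wait time are arbitrary choices here):
var lockFile = require('lockfile');

// Acquire a lock before uploading the shared script, so two developers
// don't overwrite each other's changes.
lockFile.lock('server.js.lock', { wait: 5000 }, function (err) {
  if (err) return console.error('File is locked by someone else:', err);

  // ...upload the file via webdav-fs here...

  lockFile.unlock('server.js.lock', function (err) {
    if (err) console.error('Could not release the lock:', err);
  });
});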
References
webdav-fs
webdav
lockfile
Choosing Secure Passwords

simple Azure website (built with nodejs), how to log http get request

I have a simple Azure website (free or shared) which is built with nodejs/expressjs. There is no local or database storage.
I'd like to save incoming HTTP GET requests for further analysis. I guess I can't just save req to a local drive/JSON temp file.
Is there a way to save to some log file that I can ftp download later?
The simpler and cheaper, the better.
Something like this:
var fs = require('fs');

function homePage(req, res) {
  var d = new Date();
  var logDate = d.getTime();
  // The req object itself isn't serializable, so log the parts we care about.
  var logEntry = JSON.stringify({ method: req.method, url: req.url, headers: req.headers });
  fs.writeFile(logDate + '.txt', logEntry, function (err) {
    if (err) return console.log(err);
    console.log('Logged');
  });
}
In the first line we require Node's file system module. Then we write a homepage route that creates a date-based value to use as the log's file name. After that we use fs to write the request details to the file.
You'll need to do some tinkering to optimise readability, but this will get you started. Files shouldn't overwrite each other since we used the time, but they might if you get huge traffic.

How can I structure my express app where I only need to open a mongodb connection once?

Note: Please read the edited portion of this post before answering, it might save you time and answers one of my questions.
The problem I'm having is pretty simple but I'm pretty new to this overall and I'm having issues figuring out how to implement a mongodb database connection properly in a node/express app.
I'm using express 3.x and am basing my app layout around this project supplied by the author of Express:
https://github.com/expressjs/express/tree/d8caf209e38a214cb90b11ed59fd15b717b3f9bc/examples/blog (now removed from repo)
I have no interest in making a blog however the way the app is structured appears to be quite nice. The routes are separated and everything is organized nicely.
My problem is I might have 5-6 different route js files, and each route js file might have anywhere between 1 and 15 routes; anywhere from 1 to 15 of those routes might want to access the db.
So my problem is it seems like a really terrible idea to do a db.open(...) every single time I want to query the db. I should mention at this point that I'm using the native MongoDB driver (npm install mongodb).
I would also need to include a file like this:
http://pastebin.com/VzFsPyax
...in all of those route files and all of my model files. Then I'm also dealing with dozens upon dozens of open connections.
Is there a way I can structure my app in such a way where I only make 1 connection and it stays open for the duration of the session (having a new one made every request would be bad too)?
If so, how can I do this? If you know the answer please post a code sample using tj's blog app (the one linked earlier in this post) structure as a base guide. Basically have a way where the routes and models can use the db freely while being in separate files than the db open code.
Thanks.
EDIT
I made some progress on solving one of my issues. If you look at tj's blog example he initializes his routes in the app.js like so:
require('./routes/site')(app);
require('./routes/post')(app);
And in the routes js file it starts like this:
module.exports = function(app){
I stumbled on a project earlier today where I saw someone pass two variables in the module.exports call: function(app, db). I figured, wow, could it be that easy? Do I just need to adjust my routes to take (app, db) too? Yeah, it seems so.
So now part 1 of the problem is solved. I don't have to require a mongo.js file with the connection boilerplate in every route file. At the same time it's flexible enough that I can pick and choose which route files get a db reference. This is standard and has no downside, right?
Part 2 of the problem (the important one unfortunately) still exists though.
How can I bypass having to do a db.open(...) around every query I make and ideally only make a connection once per session?
Another solution is to pass the database to the router via the request, like this:
app.js
var express = require('express');
var Users = require('./users');

var db = openDatabase(); // however you open your MongoDB connection
var app = express();

// Attach the open connection to every request before the routes run.
app.all('*', function(request, response, next) {
  request.database = db;
  next();
});

app.get('/api/user/:id', Users.getByID);
users.js
var Users = {
  getByID: function(request, response) {
    // findOne is asynchronous, so send the response from its callback.
    request.database.collection('users').findOne({ /* query by id */ }, function(err, user) {
      response.send(user);
    });
  }
};

module.exports = Users;
I made a very simple module, hub, for this case; it replaces the use of the global space.
In app.js you can create db connection once:
var hub = require('hub');
hub.db = new Db('foobar', new Server('10.0.2.15', 27017, {}), {native_parser: false});
And use it from any other file:
var hub = require('hub');
// hub.db - here link to db connection
This method uses a feature of require: a module is only loaded the first time, and all subsequent calls get a reference to the already loaded instance.
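If you'd rather not depend on a package, the hub module itself can be as small as a shared empty object; a minimal sketch relying only on Node's module cache:
// hub.js - the module cache guarantees every require('./hub') returns this same object.
module.exports = {};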
UPDATE
Here's what I mean:
In the main file (e.g. app.js) we create the Db connection, open it and store it in the hub:
app.js:
var hub = require('hub');
hub.mongodb = require('mongodb');
hub.mongodbClient = new hub.mongodb.Db('foobar', new hub.mongodb.Server('10.0.2.15', 27017, {}), {native_parser: false});
hub.mongodbClient.open(function(error) {
  console.log('opened');
});
Now in any other file (message.js, for example) we have access to the opened connection and can simply use it:
message.js:
var hub = require('hub');
var collection = new hub.mongodb.Collection(hub.mongodbClient, 'message');
module.exports.count = function(cb) {
  collection.count({}, function(err, count) {
    cb(err, count);
  });
};
Really silly. From the documentation it seems like db.open has to wrap whatever is using it, but in reality you don't have to put everything inside its callback.
So the answer is to just do a db.open() in your database connection module, app.js file, or wherever you decide to set up your db server/connection.
As long as you pass a reference to the db into the files that use it, you'll have access to an "opened" db connection ready to be queried.
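For illustration, a minimal sketch of such a connection module, assuming the legacy native-driver API (Db/Server/open) used elsewhere in this thread:
// db.js - open the connection once at startup and share the instance.
var mongodb = require('mongodb');

var db = new mongodb.Db('mydb', new mongodb.Server('localhost', 27017, {}), { native_parser: false });

// Queries issued before the open completes will error, so open early.
db.open(function (err) {
  if (err) throw err;
  console.log('MongoDB connection opened');
});

module.exports = db;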
