Best NodeJS Workflow for team development

I'm trying to implement Node.js and Socket.io for real-time communication between two devices (PCs & smartphones) in my company's product.
Basically, what I want to achieve is sending a notification to all online users when somebody changes something in a file.
All the basic functionality for saving the updates is already there, so when everything is stored and calculated, I send a POST request to my Node server saying that something changed and it needs to notify the users.
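For reference, a minimal sketch of that flow (the endpoint name, port and event name are only illustrative, assuming Express 4.16+ and an older socket.io-style setup):
// Node server: receives the "something changed" POST and broadcasts it
var express = require('express');
var app = express();
var http = require('http').createServer(app);
var io = require('socket.io')(http);

app.use(express.json());

// Hypothetical endpoint hit by the main application after a file is updated
app.post('/notify', function (req, res) {
    // Push the change to every connected client
    io.emit('file-changed', req.body);
    res.sendStatus(200);
});

http.listen(3000);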
The problem now is that when I want to change some code in the Node.js scripts, as long as I work alone, I can just upload the new files via FTP and restart the pm2 service; but when my colleagues start working on this with me, we will have problems merging our changes without overwriting each other's work.
Launching a local server is also not possible, because we need the connection between our current server and the Node machine, and since our server is online it cannot reach our localhosts.
Is there a way for a team to work together on the same Node server without overwriting each other's changes?

Implement changes using some option other than FTP. For example:
You can use webdav-fs in authenticated or non-authenticated mode:
// Using authentication:
var wfs = require("webdav-fs")(
    "http://example.com/webdav/",
    "username",
    "password"
);

wfs.readdir("/Work", function(err, contents) {
    if (!err) {
        console.log("Dir contents:", contents);
    } else {
        console.log("Error:", err.message);
    }
});
putFileContents(remotePath, data [, options])
Put some data in a remote file at remotePath. data is a Buffer or a String. options has a property called format, which can be "binary" (default) or "text".
var fs = require("fs");
var imageData = fs.readFileSync("someImage.jpg");
client
.putFileContents("/folder/myImage.jpg", imageData, { format: "binary" })
.catch(function(err) {
console.error(err);
});
Then use the callbacks to notify your team, or lock the files from within the callback so two people don't overwrite each other's uploads.
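A minimal sketch of that locking idea, using the lockfile package from the references below (the lock file name is illustrative; for a real team the lock would need to live somewhere shared, e.g. on the WebDAV server itself):
var lockFile = require("lockfile");

// Acquire a lock before uploading, so two people don't deploy at once
lockFile.lock("deploy.lock", { wait: 10000 }, function (err) {
    if (err) {
        return console.error("Someone else is deploying:", err.message);
    }
    // ... upload the changed files via webdav here ...
    lockFile.unlock("deploy.lock", function (err) {
        if (err) {
            console.error("Could not release lock:", err.message);
        }
    });
});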
References
webdav-fs
webdav
lockfile
Choosing Secure Passwords

Node.js Cluster Shared Cache

I'm using node-cache to create a local cache; however, the problem is that when running the application with PM2, which creates an application cluster, the cache is created multiple times, one for each process. This isn't too much of a problem, as the cached data is small, so memory isn't the issue.
The real problem is that I have an API call into my application to flush the cache; however, when this API is called, it only flushes the cache for the particular process that handled the call.
Is there a way to signal all workers to perform a function?
I did think about using Redis for the cache instead, as that would make it simpler to have only one cache. The problem I have with Redis is that I'm not sure of the best way to scale it: I currently have 50 applications and wouldn't want to set up a new Redis database for each one. The alternative was to use ioredis and its transparent key prefixing for each application, but this could introduce security issues if one application were to accidentally read data from another client's application. I also don't believe there is a way to delete all keys for just a particular prefix (i.e. one app/client), as FLUSHALL removes all keys.
What are the best practices for sharing a cache between clustered Node instances when there are also many instances of the application, e.g. a SaaS application?
Currently, my workaround for this issue is using node-cron to clear the cache every 15 minutes. However, there are items in the cache that rarely ever change, and there are other items that should be updated as soon as an external tool signals the application to flush the cache via an API call.
For anyone looking at this: for my use case, the best method was to use IPC.
I implemented an IPC messenger to pass messages to all processes. I read the process name from the pm2 config file (app.json) to make sure the message is sent to the correct application.
// Sender
// The sender can run inside or outside of pm2
var pm2 = require('pm2');
var cfg = require('../app.json');

exports.IPCSend = function (topic, message) {
    pm2.connect(function () {
        // Find the IDs of the processes you want to send to
        pm2.list(function (err, processes) {
            for (var i in processes) {
                if (processes[i].name == cfg.apps[0].name) {
                    console.log('Sending Message To Id:', processes[i].pm_id, 'Name:', processes[i].name);
                    pm2.sendDataToProcessId(processes[i].pm_id, {
                        data: {
                            message: message
                        },
                        topic: topic
                    }, function (err, res) {
                        console.log(err, res);
                    });
                }
            }
            // If the sender is a short-lived script, call pm2.disconnect() once done.
        });
    });
};
// Receiver
// No need to require('pm2'); however, the receiver must be running inside of pm2
process.on('message', function (packet) {
    console.log(packet);
});
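Tying this back to the original question, the receiver could filter on the topic and flush its local node-cache copy; a rough sketch (the topic name is only illustrative):
// Receiver (runs inside pm2, one copy per worker)
var NodeCache = require('node-cache');
var cache = new NodeCache();

process.on('message', function (packet) {
    // Only react to the topic used by the sender above
    if (packet && packet.topic === 'flush-cache') {
        cache.flushAll();
        console.log('Local cache flushed for worker', process.pid);
    }
});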

How to create multiple instances of nodejs server

I am building a notification application in which, every time an event is triggered, the associated user gets a notification. I am using localStorage in Node.js to store the information of the logged-in user. The problem is that when two users are logged in, the localStorage values are overwritten by the newer user's values.
I need to create multiple instances of the Node.js server so that every user has their own localStorage.
For example, if two users log in with credentials
{ userName: "name1" }
and
{ userName: "name2" }
then two separate localStorage stores should be created, one holding userName: "name1" and the other holding userName: "name2".
I have tried all the solutions available online, but I am unable to create multiple states of the Node.js server.
Thanks in advance.
You do not have to create a new server for each user. Use the client's IP address and port instead: together they uniquely identify each connected user. You can then simply name each user's file after the client variable.
Here is some example code:
var net = require('net');

net.createServer(function (netSocket) {
    netSocket.on('data', function (data) {
        // Uniquely identifies this client, e.g. "203.0.113.4:51234"
        var client = netSocket.remoteAddress + ':' + netSocket.remotePort;
        // ... store or look up this user's data under "client" ...
    });
}).listen(8124);
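As a rough illustration of "naming the files after the client" (the directory, file format and helper name are only an example and assume a users/ directory exists):
var fs = require('fs');

// Persist each user's data in its own file, named after the client id from above
function saveClientData(client, data) {
    var fileName = 'users/' + client.replace(/:/g, '_') + '.json';
    fs.writeFile(fileName, JSON.stringify(data), function (err) {
        if (err) console.error('Could not save data for', client, err);
    });
}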
I wasn't able to create multiple Node.js instances, so instead I stored the user data in the browser's sessionStorage and passed it to the Node.js server with every request.

What is the most efficient way of sending files between NodeJS servers?

Introduction
Say that on the same local network we have two Node.js servers set up with Express: Server A for the API and Server F for the form.
Server A is an API server: it takes the request and saves it to a MongoDB database (files are stored as a Buffer and their details as other fields).
Server F serves up a form, handles the form post and sends the form's data to Server A.
What is the most efficient way to send files between two Node.js servers where the receiving server is an Express API? And where does the file size start to matter?
1. HTTP Way
If the files I'm sending are PDF files (that won't exceed 50 MB), is it efficient to send the whole contents as a string over HTTP?
The algorithm is as follows:
Server F handles the file upload using https://www.npmjs.com/package/multer and saves the file,
then Server F reads this file and makes an HTTP request via https://github.com/request/request along with some details about the file.
Server A receives this request, turns the file contents from a string back into a Buffer and saves a record in MongoDB along with the file details.
In this algorithm, both Server A (when storing into MongoDB) and Server F (when sending it over to Server A) have read the whole file into memory, and the request between the two servers was about the same size as the file. (Are 50 MB requests alright?)
However, one thing to consider is that, with this method, I would be using the Express.js style of API for the whole process, and it would be consistent with the rest of the app, where the /list and /details requests are also defined in the routes. I like consistency.
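For reference, the HTTP way doesn't have to read the whole file into memory on Server F; with the request library the multer-saved file can be streamed as multipart form data (a sketch, with the URL and field names made up for illustration):
var fs = require('fs');
var request = require('request');

// Stream the multer-saved file to Server A instead of buffering it all in memory
request.post({
    url: 'http://server-a.local/api/files',
    formData: {
        name: 'some details about the file',
        file: fs.createReadStream('./uploads/somefile.pdf')
    }
}, function (err, httpResponse, body) {
    if (err) return console.error('Upload failed:', err);
    console.log('Server A responded:', body);
});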
2. Socket.IO Way
In contrast to this algorithm, I've explored the https://github.com/nkzawa/socket.io-stream way, which breaks away from the consistency of the HTTP API on Server A (since the handlers for the socket.io events are defined not in the routes but in the file that has var server = http.createServer(app);).
Server F handles the form data like this in routes/some_route.js:
router.post('/', multer({dest: './uploads/'}).single('file'), function (req, res) {
    var api_request = {};
    api_request.name = req.body.name;
    //add other fields to api_request ...
    var has_file = req.hasOwnProperty('file');

    var io = require('socket.io-client');
    var transaction_sent = false;
    var socket = io.connect('http://localhost:3000');

    socket.on('connect', function () {
        console.log("socket connected to 3000");
        if (transaction_sent === false) {
            var ss = require('socket.io-stream');
            var stream = ss.createStream();
            ss(socket).emit('transaction new', stream, api_request);
            if (has_file) {
                var fs = require('fs');
                var filename = req.file.destination + req.file.filename;
                console.log('sending with file: ', filename);
                fs.createReadStream(filename).pipe(stream);
            }
            if (!has_file) {
                console.log('sending without file.');
            }
            transaction_sent = true;
            //get the response via socket
            socket.on('transaction new sent', function (data) {
                console.log('response from 3000:', data);
                //there might be a better way to close socket. But this works.
                socket.close();
                console.log('Closed socket to 3000');
            });
        }
    });
});
I said I'd be dealing with PDF files that are < 50 MB. However, if I use this program to send larger files in the future, is socket.io a better way to handle 1 GB files, since it uses streams?
This method does send the file and the details across, but I'm new to this library and don't know whether it should be used for this purpose or whether there is a better way of utilizing it.
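For completeness, the receiving side on Server A (the part that lives next to var server = http.createServer(app);) could look roughly like this; the event names match the client snippet above, while the target path is made up:
// Server A, next to: var server = http.createServer(app);
var io = require('socket.io')(server);
var ss = require('socket.io-stream');
var fs = require('fs');
var path = require('path');

io.on('connection', function (socket) {
    ss(socket).on('transaction new', function (stream, api_request) {
        // Stream the incoming file straight to disk instead of buffering it in memory
        var target = path.join('./uploads', Date.now() + '.pdf');
        stream.pipe(fs.createWriteStream(target));
        stream.on('end', function () {
            // ... save api_request and the file reference to MongoDB here ...
            socket.emit('transaction new sent', { saved: target });
        });
    });
});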
Final thoughts
What alternative methods should I explore?
Should I send the file over SCP and make an HTTP request with the file details, including where I've sent it, thus separating the protocols for files and for API requests?
Should I always use streams, because they don't load the whole file into memory? (That's how they work, right?)
What about https://github.com/liamks/Delivery.js?
References:
File/Data transfer between two node.js servers - this got me to try the socket-stream way.
transfer files between two node.js servers over http - for the HTTP way.
There are plenty of ways to achieve this, but not so many to do it right!
Socket.io and WebSockets are efficient when you use them with a browser, but since you don't, there is no need for them.
The first method you can try is the built-in net module of Node.js: basically it will make a TCP connection between the servers and pass the data over it.
You should also keep in mind that you need to send chunks of data, not the entire file at once; the socket.write method of the net module seems to be a good fit for your case, see https://nodejs.org/api/net.html
But depending on the size of your files and the concurrency, memory consumption can be quite large.
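A minimal sketch of that idea, streaming a file over a raw TCP socket in chunks (host, port and file names are placeholders):
var net = require('net');
var fs = require('fs');

// Receiving server (Server A): write the incoming bytes straight to disk
net.createServer(function (socket) {
    socket.pipe(fs.createWriteStream('received.pdf'));
}).listen(5000);

// Sending side (Server F): pipe the file into the socket, chunk by chunk
var client = net.connect({ host: 'server-a.local', port: 5000 }, function () {
    fs.createReadStream('somefile.pdf').pipe(client);
});
Piping handles the chunking and backpressure, so neither side holds the whole file in memory.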
If you are running Linux on both servers you could even send the files at the OS level with a simple command called scp:
nohup scp -rpC /var/www/httpdocs/* remote_user@remote_domain.com:/var/www/httpdocs &
You can even do this from Windows to Linux, or the other way around; the scp client for Windows is pscp.exe, from PuTTY:
http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
Hope this helps!

Should I use a server method or Collection.insert() to insert a record using Meteor?

I'm trying to decide between two methods for inserting a new document into a collection from the client using Meteor.js: calling a server method, or using the db API directly.
So, I can either access the db api directly on the client:
MyCollection.insert(doc)
Or, I can create a new Server Method (under the /server dir):
Meteor.methods({
    createNew: function(doc) {
        check(doc, etc);
        var id = MyCollection.insert(doc);
        return id;
    }
});
And then call it from the client like this:
Meteor.call('createNew', doc, function(error, result){
    // Carry on
});
Both work, but as far as I can see from testing, I only benefit from latency compensation (the local cache updating and showing on the screen before the server responds) if I hit the db API directly, not if I use a server method, so my preference is for doing things that way. But I also get the impression that the most secure approach is to use a method on the server (mainly because Emily Stark gave it as an example in her video here), yet the db API is available on the client no matter what, so why would a server method be better?
I've seen both approaches taken when reading source code elsewhere, so I'm stumped.
Note: in both cases I have suitable allow/deny rules in place:
MyCollection.allow({
    insert: function(userId, doc){
        return isAllowedTo.createDoc(userId, doc);
    },
    update: function(userId, doc){
        return isAllowedTo.editDoc(userId, doc);
    },
    remove: function(userId, doc){
        return isAllowedTo.removeDoc(userId, doc);
    }
});
In short: Which is recommended and why?
The problem was that I had the method declarations under the /server folder, so they were not available to the client, and this broke latency compensation (where the client creates stubs of these methods to simulate the action; in my case it could not, because it couldn't see them). After moving them out of this folder, I am able to use server methods in a clean, safe and latency-compensated manner (even with all my allow/deny rules set to false - they do nothing and only apply to direct db API access from the client, not the server).
In short: don't use the db API on the client or allow/deny rules on the server; forget they ever existed, and just write server methods. Make sure they're accessible to both client and server, and use those for CRUD instead.
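A minimal sketch of that layout, with the method defined in shared code so the client gets a stub for latency compensation (the file path is only an example):
// lib/methods.js - loaded on both client and server
Meteor.methods({
    createNew: function (doc) {
        check(doc, Object);
        // Runs as a stub on the client (instant UI update) and for real on the server
        return MyCollection.insert(doc);
    }
});

// client code
Meteor.call('createNew', doc, function (error, result) {
    // Carry on
});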

Using NodeJs with Firebase - Security

Due to the need to run some server-side code - mainly sending emails - I have decided to use Node.js & Express for the server-side element, along with Firebase to hold the data (partly as a learning experience).
My question is: what's the best approach to using the client-side Firebase library and the Node.js library when doing authentication with the Simple Email & Password API? If I do the authentication client-side and then subsequently call a different route on the Node.js side, will the authentication for that user be carried across in the request? What would be the approach to verify that the user is authenticated within Node?
One approach I assume is to get the current user's username & password from Firebase, post these to Node.js, and then use the Firebase security API on the server to test them.
Essentially the problem here is that you need to securely convey to your Node.js server who the client is authenticated as with Firebase. There are several ways you could go about this, but the easiest is probably to have all of your client <-> Node.js communication go through Firebase itself.
So instead of having the client hit a REST endpoint served by your Node.js server, have the client write to a Firebase location that your Node.js server is monitoring. Then you can use Firebase Security Rules to validate the data written by the client, and your server can trust it.
For example, if you wanted to let users send arbitrary emails through your app (with your Node.js server taking care of actually sending the emails), you could have an /emails_to_send location with rules something like this:
{
    "rules": {
        "emails_to_send": {
            "$id": {
                ".write": "!data.exists() && newData.child('from').val() == auth.email",
                ".validate": "newData.hasChildren(['from', 'to', 'subject', 'body'])"
            }
        }
    }
}
Then in the client you can do:
ref.child('emails_to_send').push({
    from: 'my_email@foo.com',
    to: 'joe@example.com',
    subject: 'hi',
    body: 'Hey, how\'s it going?'
});
And in your NodeJS code you could call .auth() with your Firebase Secret (so you can read and write everything) and then do:
ref.child('emails_to_send').on('child_added', function(emailSnap) {
    var email = emailSnap.val();
    sendEmailHelper(email.from, email.to, email.subject, email.body);
    // Remove it now that we've processed it.
    emailSnap.ref().remove();
});
This is going to be the easiest as well as the most correct solution. For example, if the user logs out via Firebase, they'll no longer be able to write to Firebase, so they'll no longer be able to make your Node.js server send emails, which is most likely the behavior you'd want. It also means that if your server is temporarily down, it will "catch up" on sending emails when you start it back up, and everything will continue to work.
The above seems like a roundabout way of doing things; I would use something like https://www.npmjs.com/package/connect-session-firebase and keep Firebase as the model, handling all routes through Express. This is easier if your Express server is rendering templates and not just behaving as a JSON API.
If you are using Firebase Authentication, the client side can import the Firebase library (e.g. for JavaScript) and authenticate directly with the library itself:
import firebase from 'firebase/app';
const result = await firebase.auth().signInWithEmailAndPassword(_email, _password);
After that, the client can obtain the ID token; this token is then sent along with each request made to the server (e.g. as a header).
const sendingIdToken = await firebase.auth().currentUser.getIdToken();
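For example, the client might attach the token to a request like this (the endpoint is hypothetical; the 'auth-token' header name is simply what the server snippet below reads):
// Send the ID token with each request to the Node.js server
const response = await fetch('/api/send-email', {
    method: 'POST',
    headers: {
        'auth-token': sendingIdToken,
        'Content-Type': 'application/json'
    },
    body: JSON.stringify({ to: 'joe@example.com', subject: 'hi', body: '...' })
});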
On the Node.js server side, you can install the Firebase Admin SDK to verify that the user is authenticated, like so:
// Let's suppose the client sent the token as a header
const receivingIdToken = req.headers['auth-token'];

admin.auth().verifyIdToken(receivingIdToken, true)
    .then((decodedIdToken) => { /* proceed to send emails, etc. */ }, (error) => { /* ... */ });
The Firebase Admin SDK gives full permissions to the database, so keep the credentials safe.
You should also configure the Security Rules on Firestore (or the Firebase Realtime Database) so the client side can still perform specific operations directly against the database (e.g. listening for realtime changes on a collection); alternatively, you can restrict all access if you want the client to interact only with the Node.js server.
For more details, I developed an example of a Node.js server that uses the Firestore database and handles security and more.
