How to get files from a client's local directory? - node.js

I want to create an upload form that will send an image to my hosted server, but I can't find a clear answer on how Node.js interacts with the client side of things.
A lot of the file upload examples I can find use a simple fs read from the temp directory. But when I run code like this on my server:
var os = require('os');
var ostemp = os.tmpdir(); // note: os.tmpDir() is deprecated in favor of os.tmpdir()
console.log("Temp directory", ostemp);
It obviously returns a server file path in the logs when I visit, not my Windows temp directory. That makes sense, since Node.js is purely server-side, so how is this usually done?
EDIT:
I think a related problem I'm having is that my host (GANDI) only allows SFTP file transfer, which might be preventing me from sending files via a form submit, though I might be confused about that too. Either way, I'd appreciate being set straight...
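From what I've pieced together so far, the usual pattern is: the browser submits a multipart/form-data form (<form method="post" enctype="multipart/form-data"> with an <input type="file">), the file travels in the body of the HTTP request, and the server writes it to its own disk. A minimal sketch of the receiving end using multer (every route, port and field name here is a placeholder):
// server.js - a sketch of the receiving end, not working code I have
var express = require('express');
var multer = require('multer');
var app = express();
app.post('/upload', multer({dest: './uploads/'}).single('image'), function (req, res) {
    // req.file.path is a path on the *server's* filesystem, not the client's
    console.log('received file at', req.file.path);
    res.send('ok');
});
app.listen(3000);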

Related

Make a logger for Node.js

I have a Node.js project that runs on port 3000 (I access that localhost port from my browser through ngrok), plus a server on port 3001 that makes requests to a MariaDB database. The project is built in React and the server with Express.
I want to save the application logs (errors, warnings, etc.) in a log file so that I can look at them whenever I want.
My intention was to use winston, and while I have no problem on the server side (3001), when I try to adapt it to the main project I get an error that it cannot save files (the reason given is that the code runs in the browser, which can't create such a file because it has no access to the project folders).
Can anyone give me some advice? Am I wrong to use winston, and should I use something else?
Greetings and thanks
I've never used winston before and I couldn't find anything online about your error. In the past I've always just used node's fs module to create a log of errors and restarts.
const fs = require('fs')
Node's File System Documentation: https://nodejs.dev/learn/the-nodejs-fs-module
Short YouTube Tutorial: https://www.youtube.com/watch?v=U57kU311-nE
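For what it's worth, here's the rough shape of what I mean (a sketch; the file name and line format are just my own habits):
const fs = require('fs')
// append one timestamped line per event; the file is created on first write
function logToFile (level, message) {
    const line = new Date().toISOString() + ' [' + level + '] ' + message + '\n'
    fs.appendFile('app.log', line, (err) => {
        if (err) console.error('could not write to log file:', err)
    })
}
logToFile('error', 'something went wrong')
Keep in mind this only works where Node's fs is available (your 3001 server). Code running in the browser will hit the same wall you saw with winston, so browser-side logs usually have to be sent to the server over HTTP first and written there.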

How to separate express server code from Express business logic code?

All the Node.js tutorials that I have followed put everything in one file: importing of libraries, routing, database connection and starting of the server with, say, Express:
var express = require('express');
var app = express();
app.get('/somePath', function (req, res) { /* handler */ });
app.listen(...);
Now, I have 4 node servers behind an Nginx load balancer. It then becomes very difficult to have the source code updated on all the four servers.
Is there a way to keep the source code out of the server creation code in such a way that I can deploy the source code on the servers as one package? The server creation code should not know anything about routing or database connections. It should only be listening to changes in a folder and the moment a new module meta file appears, it starts hosting that web application.
Much like how we deploy Java code packaged as a WAR by Maven into the webapps folder of Tomcat, where the Tomcat instantiation is not part of the source code. In Node.js, it seems the server is also part of the source code.
For now, the packaging is not my concern. My concern is how to separate the logic, and how to point all my servers to one source code base.
Node.js, or JavaScript for that matter, doesn't have a concept like WAR. But it does have something similar. To achieve something WAR-like, you would essentially bundle the code into one source file using something like webpack. However, this will probably not work with Node.js core modules like http (which Express uses), since they likely call or rely on native V8/C++ functions/libraries.
You could also use Docker and think of the Docker containers as WARs.
Here is what I figured out as a work around:
Keep the servers under a folder, say "server_clusters", and put the different Node servers there: node1.js, node2.js, node3.js, node4.js, etc. (I know that in the real world the clusters would be different VMs or CPUs altogether, but for now I simply want to separate the server-creation logic from the source code). These files would have this code snippet:
var constants = require('./prop');
var appBasePath = constants.APP_BASE_DIR;
var appFilePath = appBasePath + "/main";
var app = require(appFilePath);
//each server would have just a different port number while everything else stays the same
app.listen(8080, function () {
    console.log("server started up");
});
Create a properties file that holds the path to the source code and exports it as an object. That simple. This is what is required on line #1 of the code above.
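For illustration, prop.js could be as small as this (the directory is just a placeholder):
// prop.js - point APP_BASE_DIR at wherever the app source lives on this machine
module.exports = {
    APP_BASE_DIR: '/opt/myapp/src'
};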
Create the source directory project wherever you want on the machine and just update its home directory in the constants file above. The source code directory can export one landing file that provides the Express app for the servers to start:
var express = require('express');
var app = express();
module.exports = app;
With this, there are multiple servers that are pointing to the same source code.
Hope this helps to those who are facing the same problem.
Other approaches are welcome.

What is the most efficient way of sending files between NodeJS servers?

Introduction
Say that on the same local network we have two Node.js servers set up with Express: Server A for the API and Server F for the form.
Server A is an API server that takes the request and saves it to a MongoDB database (files are stored as a Buffer and their details as other fields).
Server F serves up a form, handles the form POST and sends the form's data to Server A.
What is the most efficient way to send files between two NodeJS servers where the receiving server is Express API? Where does the file size matter?
1. HTTP Way
If the files I'm sending are PDF files (that won't exceed 50 MB), is it efficient to send the whole contents as a string over HTTP?
The algorithm is as follows:
Server F handles the file upload using https://www.npmjs.com/package/multer and saves the file,
then Server F reads this file and makes an HTTP request via https://github.com/request/request along with some details about the file,
Server A receives this request, turns the file contents from string back into a Buffer and saves a record in MongoDB along with the file details.
In this algorithm, both Server A (when storing into MongoDB) and Server F (when sending it over to Server A) have read the whole file into memory, and the request between the two servers was about the same size as the file. (Are 50 MB requests alright?)
However, one thing to consider is that -with this method- I would be using the ExpressJS style of API for the whole process and it would be consistent with the rest of the app where the /list, /details requests are also defined in the routes. I like consistency.
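A rough sketch of the HTTP way on Server F's side, using the request library's multipart support (the URL, route and field names are placeholders):
// Server F, after multer has saved the upload to disk
var fs = require('fs');
var request = require('request');
request.post({
    url: 'http://server-a.local:3000/transactions', // placeholder address and route
    formData: {
        name: 'some details about the file',
        file: fs.createReadStream('./uploads/somefile.pdf')
    }
}, function (err, httpResponse, body) {
    if (err) return console.error('upload to Server A failed:', err);
    console.log('Server A responded:', body);
});
One nice property of passing a read stream to formData is that request streams the file into the multipart body, rather than needing the whole file read into memory as one big string first.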
2. Socket.IO Way
In contrast to this algorithm, I've explored the https://github.com/nkzawa/socket.io-stream way, which breaks away from the consistency of the HTTP API on Server A (the handlers for socket.io events are defined not in the routes but in the file that has var server = http.createServer(app);).
Server F handles the form data as such in routes/some_route.js:
router.post('/', multer({dest: './uploads/'}).single('file'), function (req, res) {
    var api_request = {};
    api_request.name = req.body.name;
    //add other fields to api_request ...
    var has_file = req.hasOwnProperty('file');
    var io = require('socket.io-client');
    var transaction_sent = false;
    var socket = io.connect('http://localhost:3000');
    socket.on('connect', function () {
        console.log("socket connected to 3000");
        if (transaction_sent === false) {
            var ss = require('socket.io-stream');
            var stream = ss.createStream();
            ss(socket).emit('transaction new', stream, api_request);
            if (has_file) {
                var fs = require('fs');
                var filename = req.file.destination + req.file.filename;
                console.log('sending with file: ', filename);
                fs.createReadStream(filename).pipe(stream);
            }
            if (!has_file) {
                console.log('sending without file.');
            }
            transaction_sent = true;
            //get the response via socket
            socket.on('transaction new sent', function (data) {
                console.log('response from 3000:', data);
                //there might be a better way to close socket. But this works.
                socket.close();
                console.log('Closed socket to 3000');
            });
        }
    });
});
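For reference, the receiving side on Server A follows socket.io-stream's documented pattern, roughly like this (a sketch; the event names mirror the emits above, the destination path is a placeholder):
// Server A, in the file that has: var server = http.createServer(app);
var io = require('socket.io')(server);
var ss = require('socket.io-stream');
var fs = require('fs');
io.on('connection', function (socket) {
    ss(socket).on('transaction new', function (stream, api_request) {
        // pipe the incoming file straight to disk (or into MongoDB/GridFS, etc.)
        stream.pipe(fs.createWriteStream('./received/' + api_request.name));
        socket.emit('transaction new sent', {status: 'ok'});
    });
});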
I said I'd be dealing with PDF files that are < 50 MB. However, if I use this program to send larger files in the future, is socket.io a better way to handle 1 GB files, since it's using a stream?
This method does send the file and the details across but I'm new to this library and don't know if it should be used for this purpose or if there is a better way of utilizing it.
Final thoughts
What alternative methods should I explore?
Should I send the file over SCP and make an HTTP request with the file details, including where I've sent it, thus separating the protocol for files from the protocol for API requests?
Should I always use streams because they don't store the whole file into memory? (that's how they work, right?)
This https://github.com/liamks/Delivery.js ?
References:
File/Data transfer between two node.js servers this got me to try socket-stream way.
transfer files between two node.js servers over http for HTTP way
There are plenty of ways to achieve this, but not so many ways to do it right!
Socket.IO and WebSockets are efficient when you use them with a browser, but since you don't, there is no need for them.
The first method you can try is the built-in net module of Node.js: basically it will make a TCP connection between the servers and pass the data.
You should also keep in mind that you need to send chunks of data, not the entire file; the socket.write method of the net module seems to be a good fit for your case. Check it out: https://nodejs.org/api/net.html
But depending on the size of your files and the concurrency, memory consumption can be quite large.
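A rough sketch of the net approach (ports, hostnames and paths are all placeholders):
// receiving server: accept a TCP connection and stream it to disk
var net = require('net');
var fs = require('fs');
net.createServer(function (socket) {
    // data arrives in chunks; piping writes each chunk as it comes,
    // so the whole file is never held in memory at once
    socket.pipe(fs.createWriteStream('./received.pdf'));
}).listen(4000);
// sending server: stream the file over the connection chunk by chunk
var connection = net.connect(4000, 'server-a.local', function () {
    fs.createReadStream('./uploads/somefile.pdf').pipe(connection);
});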
If you are running Linux on both servers you could even send the files outside of Node entirely, with a simple Linux command called scp:
nohup scp -rpC /var/www/httpdocs/* remote_user@remote_domain.com:/var/www/httpdocs &
You can even do this from Windows to Linux or the other way around.
http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
The scp client for Windows is pscp.exe.
Hope this helps!

I can't seem to get ImageMagick working on Amazon Web Services

Basically I've been struggling for a while now on getting ImageMagick and/or GraphicsMagick to run properly with my Node app. So far I followed the installation instructions from http://www.imagemagick.org/script/install-source.php#unix
Details on my server include
nodejs
mongodb
mongoose
imagemagick
graphicsmagick
express
and it seems the installation went well. However, when I run the code through Node with gm, I don't get any errors but the files don't get written. Here's an example of the image post function:
var gm = require('gm'); // the gm module (GraphicsMagick/ImageMagick wrapper)
var newRoute = '/some/user/url/';
var files = req.files;
gm(files.file.path).resize(1126).compress('JPEG').quality(quality)
    .write(newRoute, function (err) {
        //do some stuff to save changes on db
    });
Now this is currently working perfectly on my local device. However, on the server it won't budge. Does anyone have any ideas on what's going on?
The app creates folders and the folder modes are supposed to be 0777, though when I log in with ssh it seems like they might be 0755. I'm not sure it's all down to permissions, though, since I've got mp3 uploads working fine. It's when imagemagick and graphicsmagick come into play that this happens. Any ideas?
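One thing worth checking in a situation like this (a debugging sketch, not a confirmed fix): the gm module shells out to the gm/convert binaries, so if the Node process on the server can't find them, or if gm is defaulting to GraphicsMagick when only ImageMagick is installed, operations can fail without an obvious error. You can point gm at ImageMagick explicitly and verify the binary is reachable (the image path below is a placeholder):
var gm = require('gm');
// gm uses the GraphicsMagick binary by default; switch to ImageMagick explicitly
var im = gm.subClass({ imageMagick: true });
// identify() will error out if the binary isn't reachable from this process
im('/path/to/any/image.jpg').identify(function (err, data) {
    if (err) return console.error('imagemagick not reachable:', err);
    console.log('identify ok:', data.format);
});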

Copy files from Samba share in node.js

I need to copy files from a Samba share in my application. The paths are in smb://host/filename format. How do I do this in Node.js? fs.createReadStream refuses to open these paths. I need to do this on both Windows and *nix.
Assuming a Linux host (since you mentioned "Samba" and not "MS SMB"), you'll first need to mount the remote server with smbmount. This forum post has an overview of how to do that; then you just read the files as if they were local to your server.
Alternatively, smbget lets you fetch single files without mounting the remote host, but it isn't efficient for a large number of file requests.
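Once the share is mounted, reading it is ordinary fs access (the mount point below is a placeholder):
var fs = require('fs');
// after mounting, e.g., smb://host/share at /mnt/share
fs.createReadStream('/mnt/share/filename')
    .pipe(fs.createWriteStream('/tmp/localcopy'));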
Another edit; some example code:
var remoteFile = require('child_process').spawn('smbget', ['--stdout', 'smb://host/filename']);
remoteFile.stdout.on('data', function(chunk) {
    //handle chunk of data
});
remoteFile.on('exit', function() {
    //file loaded completely, continue doing stuff
});
