Copy files from Samba share in node.js

I need to copy files from a Samba share in my application. The paths are in smb://host/filename format. How do I do this in Node.js? fs.createReadStream refuses to open these paths. I need this to work on both Windows and *nix.

Assuming a Linux host (since you mentioned "Samba" and not "MS SMB"), you'll first need to mount the remote share with smbmount. This forum post has an overview of how to do that; once it's mounted, you just read the files as if they were local to your server.
Alternatively, smbget lets you acquire single files without mounting the remote host, but isn't efficient for a large number of file requests.
Edit: some example code:
var spawn = require('child_process').spawn;

// smbget fetches a single file over SMB and writes it to stdout
var remoteFile = spawn('smbget', ['--stdout', 'smb://host/filename']);

remoteFile.stdout.on('data', function (chunk) {
    //handle chunk of data
});

remoteFile.on('exit', function (code) {
    //file loaded completely (code === 0 means success), continue doing stuff
});
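If all you need is to copy the remote file to a local path, you can also pipe smbget's stdout straight into a write stream. A minimal sketch; the local destination /tmp/filename is an assumption:
var spawn = require('child_process').spawn;
var fs = require('fs');

var smbget = spawn('smbget', ['--stdout', 'smb://host/filename']);

// /tmp/filename is a hypothetical local destination
smbget.stdout.pipe(fs.createWriteStream('/tmp/filename'));

smbget.on('exit', function (code) {
    if (code !== 0) {
        console.error('smbget failed with exit code', code);
    }
});
On Windows, where smbget isn't available, opening the share through a UNC path (e.g. fs.createReadStream('\\\\host\\filename')) may work instead.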

Related

Does Node.js fs.createWriteStream(downloadDirectory) create the file on the server's file system or on the user's device?

I'm trying to download files from MongoDB to the user's device (the user's local file system).
I'm using gridfs-stream to create a read stream for files stored in MongoDB, and pipe it into Node.js fs.createWriteStream(downloadDirectory) (downloadDirectory is the directory the user chose).
Right now I'm coding on localhost, so it downloads to wherever I point it.
My question is: after I deploy my backend and frontend, will my approach download the file to the downloadDirectory on my server or to the downloadDirectory on the user's device?
const readstream = gfs.createReadStream({
    _id: file._id
})
const fileStream = fs.createWriteStream(downloadDirectory + file.name)
const writeStream = readstream.pipe(fileStream)
writeStream.on('finish', () => {
    // note: 'finish' fires with no arguments, so respond with the file's metadata
    res.json(file)
})
Solution:
I spent over 12 hours debugging how to download attachment files from the Node.js server on the client side.
This post saved my life.
Node.js - createWriteStream is writing a different file than writeFile
For those of you still struggling with a downloaded attachment that won't open, or a downloaded file that doubled in size and got corrupted: check the encoding!
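To make the distinction concrete, here is a sketch of the usual pattern (not from the original answer; the route and filename are assumptions): fs.createWriteStream always writes on the machine running Node, so to put the file on the user's device you pipe the GridFS stream into the HTTP response and let the browser save it:
app.get('/download/:id', function (req, res) {
    var readstream = gfs.createReadStream({ _id: req.params.id });
    // 'attachment' prompts the browser to save the file; the filename is illustrative
    res.set('Content-Disposition', 'attachment; filename="myfile.pdf"');
    res.set('Content-Type', 'application/octet-stream');
    readstream.pipe(res);
});
The browser then decides where the file lands on the user's device.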

What is the most efficient way of sending files between NodeJS servers?

Introduction
Say that on the same local network we have two Node.js servers set up with Express: Server A for the API and Server F for the form.
Server A is an API server that takes the request and saves it to a MongoDB database (files are stored as a Buffer and their details as other fields).
Server F serves up a form, handles the form POST, and sends the form's data to Server A.
What is the most efficient way to send files between two Node.js servers where the receiving server is an Express API? And at what file size does the choice start to matter?
1. HTTP Way
If the files I'm sending are PDF files (that won't exceed 50 MB), is it efficient to send the whole contents as a string over HTTP?
The algorithm is as follows:
Server F handles the file upload using https://www.npmjs.com/package/multer and saves the file,
then Server F reads this file and makes an HTTP request via https://github.com/request/request, along with some details on the file,
Server A receives this request, turns the file contents from a string into a Buffer, and saves a record in MongoDB along with the file details.
In this algorithm, both Server A (when storing into MongoDB) and Server F (when sending it over to Server A) have read the whole file into memory, and the request between the two servers was about the same size as the file. (Are 50 MB requests alright?)
However, one thing to consider is that, with this method, I would be using the Express style of API for the whole process, and it would be consistent with the rest of the app, where the /list and /details requests are also defined in the routes. I like consistency.
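One note on the memory concern: the request module can stream a multipart upload directly from disk, so Server F doesn't have to read the whole file into memory first. A sketch under assumed names (the URL, route, and field names are made up):
var fs = require('fs');
var request = require('request');

request.post({
    url: 'http://server-a.local/api/files',  // Server A's endpoint (assumed)
    formData: {
        details: JSON.stringify({ name: 'report.pdf' }),   // file details (assumed fields)
        file: fs.createReadStream('./uploads/report.pdf')  // streamed, not buffered
    }
}, function (err, httpResponse, body) {
    if (err) return console.error('upload failed:', err);
    console.log('Server A responded:', body);
});
On the receiving side, Server A would still need to parse the multipart body (e.g. with multer) before saving it to MongoDB.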
2. Socket.IO Way
In contrast to this algorithm, I've explored the https://github.com/nkzawa/socket.io-stream way, which breaks away from the consistency of the HTTP API on Server A (the handlers for socket.io events are defined not in the routes but in the file that contains var server = http.createServer(app);).
Server F handles the form data as follows in routes/some_route.js:
router.post('/', multer({dest: './uploads/'}).single('file'), function (req, res) {
    var api_request = {};
    api_request.name = req.body.name;
    //add other fields to api_request ...
    var has_file = req.hasOwnProperty('file');

    var io = require('socket.io-client');
    var transaction_sent = false;
    var socket = io.connect('http://localhost:3000');

    socket.on('connect', function () {
        console.log("socket connected to 3000");
        if (transaction_sent === false) {
            var ss = require('socket.io-stream');
            var stream = ss.createStream();
            ss(socket).emit('transaction new', stream, api_request);
            if (has_file) {
                var fs = require('fs');
                var filename = req.file.destination + req.file.filename;
                console.log('sending with file: ', filename);
                fs.createReadStream(filename).pipe(stream);
            } else {
                console.log('sending without file.');
            }
            transaction_sent = true;
            //get the response via socket
            socket.on('transaction new sent', function (data) {
                console.log('response from 3000:', data);
                //there might be a better way to close socket. But this works.
                socket.close();
                console.log('Closed socket to 3000');
            });
        }
    });
});
I said I'd be dealing with PDF files smaller than 50 MB. However, if I use this program to send larger files in the future, is socket.io a better way to handle 1 GB files, since it uses streams?
This method does send the file and the details across, but I'm new to this library and don't know whether it should be used for this purpose or whether there is a better way of utilizing it.
Final thoughts
What alternative methods should I explore?
Should I send the file over SCP and make an HTTP request with the file details, including where I've sent it, thus separating the protocol for files from the protocol for API requests?
Should I always use streams, because they don't store the whole file in memory? (That's how they work, right?)
Should I use https://github.com/liamks/Delivery.js?
References:
File/Data transfer between two node.js servers (this got me to try the socket-stream way)
transfer files between two node.js servers over http (for the HTTP way)
There are plenty of ways to achieve this, but not so many ways to do it right!
Socket.IO and WebSockets are efficient when you use them with a browser, but since you don't have one here, there is no need for them.
The first method you can try is the built-in net module of Node.js: it opens a TCP connection between the servers and passes the data over it.
You should also keep in mind that you need to send chunks of data, not the entire file at once; the socket.write method of the net module seems to be a good fit for your case. Check it out: https://nodejs.org/api/net.html
But depending on the size of your files and the concurrency, memory consumption can grow quite large.
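A minimal sketch of that idea (host, port, and file paths are made up): piping a read stream into a TCP socket calls socket.write chunk by chunk under the hood, so the whole file never sits in memory.
// receiver (Server A): write incoming bytes straight to disk
var net = require('net');
var fs = require('fs');

net.createServer(function (socket) {
    socket.pipe(fs.createWriteStream('./received/report.pdf'));
}).listen(7000);

// sender (Server F): stream the file over the TCP connection
var client = net.connect({ host: 'server-a.local', port: 7000 }, function () {
    fs.createReadStream('./uploads/report.pdf').pipe(client); // ends the socket when done
});
In practice you would also need some framing (file name, length) on top of this, since TCP only gives you a raw byte stream.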
If you are running Linux on both servers, you could even copy the files directly with a simple command called scp:
nohup scp -rpC /var/www/httpdocs/* remote_user@remote_domain.com:/var/www/httpdocs &
You can even do this from Windows to Linux, or the other way around; the scp client for Windows is pscp.exe, available from http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
Hope this helps!

Best NodeJS Workflow for team development

I'm trying to implement Node.js and Socket.IO for real-time communication between two devices (PC & smartphones) in my company's product.
Basically, what I want to achieve is sending a notification to all online users when somebody changes something in a file.
All the basic functionality for saving the updates is already there, so when everything is stored and calculated, I send a POST request to my Node server saying that something changed and it needs to notify the users.
The problem is that as long as I work alone, when I want to change some code in the Node.js scripts I can just upload the new files via FTP and restart the pm2 service; but when my colleagues start working with me on this story, we will have problems merging our changes without overwriting each other's work.
Launching a local server is also not possible, because we need the connection between our current server and the Node machine, and since our server is online it cannot reach our localhosts.
Is there a way for a team to work together on the same Node server without overwriting each other's changes?
Implement changes using some option other than FTP. For example, you can use webdav-fs in authenticated or non-authenticated mode:
// Using authentication:
var wfs = require("webdav-fs")(
    "http://example.com/webdav/",
    "username",
    "password"
);

wfs.readdir("/Work", function(err, contents) {
    if (!err) {
        console.log("Dir contents:", contents);
    } else {
        console.log("Error:", err.message);
    }
});
putFileContents(remotePath, data [, options])
Put some data in a remote file at remotePath. data is a Buffer or a String. options has a property called format, which can be "binary" (default) or "text".
var fs = require("fs");
var imageData = fs.readFileSync("someImage.jpg");

client
    .putFileContents("/folder/myImage.jpg", imageData, { format: "binary" })
    .catch(function(err) {
        console.error(err);
    });
And use the callbacks to notify your team, or to lock files while someone is editing them.
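For the locking part, a minimal sketch with the lockfile package from the references below (the lock name and wait time are assumptions):
var lockFile = require('lockfile');

// take a lock before editing a shared script; wait up to 10s for others to release it
lockFile.lock('app.js.lock', { wait: 10000 }, function (err) {
    if (err) return console.error('file is locked by someone else:', err);

    // ... edit and upload the file here ...

    lockFile.unlock('app.js.lock', function (err) {
        if (err) console.error('failed to release lock:', err);
    });
});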
References
webdav-fs
webdav
lockfile
Choosing Secure Passwords

How to get files from a client's local directory?

I want to create an upload form that will send an image to my hosted server, but I can't find a clear answer on how Node.js interacts with the client's side of things.
A lot of the file upload examples I can find use a simple fs get from the temp directory. But when I run code like this on my server:
var os = require('os');
var ostemp = os.tmpdir();
console.log("Temp directory", ostemp);
it obviously logs a server file path when I visit the page, not my Windows temp directory. That makes sense, as Node.js is purely server-side; so how is it usually done?
EDIT:
I think a related problem I'm having is that my host (GANDI) only allows SFTP file transfer, which might be preventing me from sending files via a form submit, though I might be confused about that too. Either way, I'd appreciate being set straight...
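(For context, the usual pattern, sketched under the assumption of an Express server using multer: the browser posts the file as multipart/form-data, and the server parses it out of the request body; Node never reads the client's file system directly.)
var express = require('express');
var multer = require('multer');
var app = express();

// the form's <input type="file" name="image"> arrives as multipart/form-data
app.post('/upload', multer({ dest: './uploads/' }).single('image'), function (req, res) {
    // req.file describes the uploaded file, now stored under ./uploads/ on the server
    res.send('Uploaded ' + req.file.originalname);
});

app.listen(3000);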

node.js - stream file without saving it temporarily

So this is my setup:
I have a client from which files are uploaded to a Node.js server (serverA), and from there I want to stream the files to another server (serverB) without saving the file temporarily on serverA.
What is the simplest and the best way to achieve this?
I am able to upload the file to serverA but I don't want the temporary file to be stored.
Update:
It's a simple AJAX file upload to serverA. The idea is to transfer the file byte-wise, so that even if the connection drops, you can resume from that particular byte.
I am using Express.js on serverA, and the client is Backbone.js, which I use for the AJAX uploads. For now there's no connection between A and B as such; they communicate through endpoints. serverA is running on port 4000 and serverB on port 5000. I want to somehow pipe the file from serverA to an endpoint on serverB.
Since the incoming HTTP request is a stream, you can use the request module to pipe the current request into the other endpoint inside your Express route:
var request = require('request');

app.post('/myroute', function (req, res) {
    // forward the incoming upload to serverB and relay serverB's response back
    req.pipe(request.post('http://localhost:5000/my/path')).pipe(res);
});
Would that approach work?
