NodeJS - Check whether an SFTP remote file exists using "Sequest"

I'm new to NodeJS and I'm using the "Sequest" package to read the contents of a remote file over SFTP. It works great. However, if the file I'm trying to read does not exist, it throws an exception and the app stops responding.
So I want to check whether the file exists before trying to read it. Since I'm using a library function (sequest.get), I'm unable to handle the exception that the library throws when the specified file is absent.
Below is my code:
var reader = sequest.get('xyz@abc', fileName, opts);
reader.setEncoding('utf8');
reader.on('data', function(chunk) {
  return res.send(chunk);
});
reader.on('end', function() {
  console.log('there will be no more data.');
});
Ref: https://github.com/mikeal/sequest#gethost-path-opts
Sequest (https://github.com/mikeal/sequest) is a wrapper around SSH2 (https://github.com/mscdex/ssh2).
Any help is greatly appreciated. Thank you.

You can listen for the 'error' event to handle such cases.
var reader = sequest.get('xyz@abc', fileName, opts);
reader.setEncoding('utf8');
reader.on('data', function(chunk) {
  return res.send(chunk);
});
reader.on('end', function() {
  console.log('there will be no more data.');
});
reader.on('error', function(err) {
  console.log('file not found or some other error:', err);
});
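If you would rather check for the file up front instead of reacting to the error, one option is to drop down to the underlying SSH2 library, whose SFTP session exposes a stat() call. A minimal sketch, assuming you can reuse the host and credentials you already pass to sequest (the values below are placeholders, not from the question):
// Sketch: check file existence over SFTP with the underlying ssh2 library.
// Host, credentials and remote path below are placeholders.
var Client = require('ssh2').Client;

var fileName = '/remote/path/file.txt';
var conn = new Client();
conn.on('ready', function () {
  conn.sftp(function (err, sftp) {
    if (err) throw err;
    sftp.stat(fileName, function (err, stats) {
      if (err) {
        // ENOENT here means the remote file does not exist.
        console.log('Remote file not found:', fileName);
      } else {
        // Safe to start the sequest/ssh2 read now.
        console.log('Remote file exists, size:', stats.size);
      }
      conn.end();
    });
  });
}).connect({ host: 'abc', username: 'xyz', password: 'secret' });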

Related

Formidable using "end" event with file upload

I am using Formidable with Express in Node.js in an attempt to build a simple single-file upload scheme. I have confirmed that a file is actually sent over from the client side, but it seems to run into trouble on the server side.
index.js
const formidable = require('formidable');
const fs = require('fs');

app.post('/', (req, res) => {
  const form = formidable();
  form.on('file', (filename, file) => {
    fs.rename(file.path, `./data/nodes.csv`, err => {
      if (err) {
        console.log(`There was an error in downloading a CSV file: ${err}`);
        return null;
      } else {
        console.log("CSV file has been uploaded correctly.");
      }
    });
  });
  form.on('error', err => {
    console.log(`There was an error in downloading a CSV file: ${err}`);
    return null;
  });
  form.on('end', () => {
    console.log(fs.readFileSync('./data/nodes.csv')); // test to see if file exists
    const nodes = assignMetrics();
    console.log(nodes);
    return nodes;
  });
  form.parse(req);
});
The main trouble seems to be that the form.on('end', ...) event does not wait for the file to finish uploading before it fires. I have confirmed this by trying to read the file inside the event handler, but at that point it doesn't exist. The documentation, though, suggests it should only fire "after all files have been flushed".
There appear to be no other events that would wait until the file has been uploaded. I also don't want to start adding layers of promises and such unless it is the only option, as each new layer of promises is another chance for unintended effects.
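One likely explanation is that fs.rename() is asynchronous: 'end' fires when formidable has finished parsing the request body, not when the rename callback has run, so the handler can read the destination path before the file has actually been moved. A rough sketch of one way to coordinate the two, reusing the question's handler (assignMetrics and the Express app are taken from the question; newer formidable versions name the property file.filepath rather than file.path):
const formidable = require('formidable');
const fs = require('fs');

app.post('/', (req, res) => {
  const form = formidable();
  let renamed = Promise.resolve();

  form.on('file', (name, file) => {
    // Capture the rename as a promise so the 'end' handler can wait for it.
    renamed = new Promise((resolve, reject) => {
      fs.rename(file.path, './data/nodes.csv', err => (err ? reject(err) : resolve()));
    });
  });

  form.on('end', () => {
    renamed
      .then(() => {
        console.log(fs.readFileSync('./data/nodes.csv', 'utf8')); // file is in place now
        res.send(assignMetrics());
      })
      .catch(err => {
        console.log(`There was an error moving the CSV file: ${err}`);
        res.sendStatus(500);
      });
  });

  form.parse(req);
});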

Node.js Streaming/Piping Error Handling (Change Response Status on Error)

I have millions of rows in my Cassandra db that I want to stream to the client in a zip file (I don't want a potentially huge zip file in memory). I am using the stream() function from the Cassandra-Node driver, piping to a Transformer which extracts the one field from each row that I care about and appends a newline, and piping that to archiver, which pipes to the Express Response object. This seems to work fine, but I can't figure out how to properly handle errors during streaming. I have to set the appropriate headers/status for the client before streaming, but if there is an error during the streaming, on the dbStream for example, I want to clean up all of the pipes and reset the response status to something like 404. But if I try to reset the status after the headers are set and the streaming has started, I get "Can't set headers after they are sent". I've looked all over and can't find how to properly handle errors in Node when piping/streaming to the Response object. How can the client tell whether valid data was actually streamed if I can't send a proper response code on error? Can anyone help?
function streamNamesToWriteStream(query, res, options) {
  return new Promise((resolve, reject) => {
    let success = true;
    const dbStream = db.client.stream(query);
    const rowTransformer = new Transform({
      objectMode: true,
      transform(row, encoding, callback) {
        try {
          const vote = row.name + '\n';
          callback(null, vote);
        } catch (err) {
          callback(null, err.message + '\n');
        }
      }
    });
    // Handle res events
    res.on('error', (err) => {
      logger.error(`res ${res} error`);
      return reject(err);
    });
    dbStream.on('error', function(err) {
      res.status(404).send() // Can't set headers after they are sent.
      logger.debug(`dbStream error: ${err}`);
      success = false;
      //res.end();
      //return reject(err);
    });
    res.writeHead(200, {
      'Content-Type': 'application/zip',
      'Content-disposition': 'attachment; filename=myFile.zip'
    });
    const archive = archiver.create('zip');
    archive.on('error', function(err) { throw err; });
    archive.on('end', function(err) {
      logger.debug(`Archive done`);
      //res.status(404).end()
    });
    archive.pipe(res, {
      //end:false
    });
    archive.append(dbStream.pipe(rowTransformer), { name: 'file1.txt' });
    archive.append(dbStream.pipe(rowTransformer), { name: 'file1.txt' });
    archive.finalize();
  });
}
Obviously it's too late to change the headers, so there's going to have to be application logic to detect a problem. Here are some ideas:
Write an unambiguous sentinel of some kind at the end of the stream when an error occurs. The consumer of the zip file then needs to look for that value to check for a problem.
Perhaps more simply, have the consumer verify the integrity of the zip archive; presumably, if the stream fails, the zip will be corrupted. A sketch of that approach follows.
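A minimal sketch of the server side of that second idea, reusing the question's dbStream, logger and reject: rather than trying to change the status after the headers have gone out, deliberately cut the connection so the client is left with a truncated archive that fails any integrity check.
// Sketch: on a mid-stream database error, abort the response instead of
// trying to change the status code. The client ends up with a truncated,
// invalid zip, which its integrity verification will flag.
dbStream.on('error', function (err) {
  logger.debug(`dbStream error: ${err}`);
  res.destroy();      // drop the connection; no further headers can or need to be sent
  return reject(err); // let the caller know the export failed
});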

How to gunzip stream in nodejs?

I'm trying to accomplish quite an easy task, but I'm a little bit confused and got stuck using zlib in Node.js. I'm building functionality that involves downloading a gzipped file from AWS S3, unzipping it, and reading it line by line. I want to accomplish all of this using streams, as I believe it is possible in Node.js.
Here is my current code base:
//downloading zipped file from aws s3:
//params are configured correctly to access my aws s3 bucket and file
s3.getObject(params, function(err, data) {
  if (err) {
    console.log(err);
  } else {
    //trying to unzip the received data:
    //data.Body is a buffer from s3
    zlib.gunzip(data.Body, function(err, unzippedStream) {
      if (err) {
        console.log(err);
      } else {
        //reading the unzipped data line by line:
        var lineReader = readline.createInterface({
          input: unzippedStream
        });
        lineReader.on('line', function(lines) {
          console.log(lines);
        });
      }
    });
  }
});
I get an error saying:
readline.js:113
input.on('data', ondata);
^
TypeError: input.on is not a function
I believe the problem might be in the unzipping process, but I'm not too sure what is wrong; any help would be appreciated.
The error happens because zlib.gunzip() hands its callback a Buffer, not a stream, while readline.createInterface() expects a readable stream as its input. I don't have an S3 account to test with, but reading the docs suggests that s3.getObject() can return a stream, in which case I think this might work:
var lineReader = readline.createInterface({
  input: s3.getObject(params).pipe(zlib.createGunzip())
});
lineReader.on('line', function(lines) {
  console.log(lines);
});
EDIT: looks like the API may have changed, and you're now required to instantiate a stream object manually before you can pipe it through anything else:
s3.getObject(params).createReadStream().pipe(...)
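Putting the pieces together with explicit error handling (readline will not surface errors from the source streams on its own), here is a sketch assuming the same aws-sdk v2 style s3 client and params as in the question:
// Full streaming pipeline sketch: S3 object -> gunzip -> readline.
var zlib = require('zlib');
var readline = require('readline');

var s3Stream = s3.getObject(params).createReadStream();
var gunzip = zlib.createGunzip();

s3Stream.on('error', function (err) { console.log('S3 error:', err); });
gunzip.on('error', function (err) { console.log('gunzip error:', err); });

var lineReader = readline.createInterface({
  input: s3Stream.pipe(gunzip)
});

lineReader.on('line', function (line) {
  console.log(line);
});

lineReader.on('close', function () {
  console.log('done reading');
});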

Sending parameters with file - Delivery.js and socket.io

I'm trying to upload a file using Node.js, socket.io and delivery.js. The file upload is working fine for me, but I need to send some parameters along with the file, and I don't see how I can do that. Here's my code so far.
Client.html
var socket = io.connect();
var delivery = new Delivery(socket);

function sendFile(){
  delivery.on('delivery.connect', function(delivery){
    var file = document.getElementById("file").files[0];
    var extraParams = {foo: 'bar'};
    delivery.send(file, extraParams); //trying to send the params with the file
    return false;
  });
  delivery.on('send.success', function(fileUID){
    alert("file was successfully sent.");
  });
}
and my server.js
io.sockets.on('connection', function(socket){
  var delivery = dl.listen(socket);
  delivery.on('receive.success', function(file){
  // delivery.on('receive.success', function(file, params){ // <-- tried this also
    var params = file.params;
    console.log(params);
    fs.writeFile(file.name, file.buffer, function(err){
      if(err){
        console.log('File could not be saved.');
      }else{
        console.log('File saved.');
      }
    });
  });
});
But my console.log(params) prints undefined. I would be really grateful if anyone could point me in the right direction.
Update: solved the issue by cloning delivery.js from GitHub; the code on npm seems to be outdated.

Node pipe stops working

My client sends an image file to the server. It works 5 times and then it suddenly stops. I am pretty new to using streams and pipe, so I am not sure what I am doing wrong.
Server Code
http.createServer(function(req, res) {
  console.log("File received");
  // This opens up the writeable stream to `output`
  var name = "./test" + i + ".jpg";
  var writeStream = fs.createWriteStream(name);
  // This pipes the POST data to the file
  req.pipe(writeStream);
  req.on('end', function () {
    console.log("File saved");
    i++;
  });
  // This is here in case any errors occur
  writeStream.on('error', function (err) {
    console.log(err);
  });
}).listen(3000);
Client code
var request = require('request');
var fs = require('fs');

setInterval(function () {
  var readStream = fs.createReadStream('./test.jpg');
  readStream.on('open', function () {
    // This pipes the read stream into a POST request to the server
    readStream.pipe(request.post('http://192.168.1.100:3000/test'));
    console.log("Send file to server");
  });
}, 1000);
Behaves like a resource exhaustion issue. Not sure which calls throw errors and which just return. Does the server connect on the 6th call? Does the write stream open? Does the pipe open?
Try ending the connection and closing the pipe after the image is saved. Maybe close the write stream too; I don't remember whether Node garbage-collects file descriptors.
I had to do the following on the server side to make this work:
res.statusCode = 200;
res.end();
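For context, a sketch of the server handler with that fix applied (same counter and file naming as in the question): each response is explicitly ended once the upload has been written, so the client's requests no longer pile up on connections that never complete.
// Server sketch with the fix applied: end each response after the upload is saved,
// so repeated client requests do not exhaust open connections after a few uploads.
var http = require('http');
var fs = require('fs');
var i = 0;

http.createServer(function (req, res) {
  console.log('File received');
  var writeStream = fs.createWriteStream('./test' + i + '.jpg');
  // Pipe the POST body into the file
  req.pipe(writeStream);
  req.on('end', function () {
    console.log('File saved');
    i++;
    res.statusCode = 200;
    res.end(); // without this, the connection stays open and the client eventually stalls
  });
  writeStream.on('error', function (err) {
    console.log(err);
  });
}).listen(3000);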
