Node.js file upload configurations

No matter how much I've searched, changed and toyed over the last 24 hours, I simply cannot find the right combination of settings that allows Node.js to upload multiple files.
My setup is quite simple - I have a form interface that posts multipart content (files) to an endpoint, let's call it /theendpoint, and this endpoint is supposed to parse the multiple files. However, during the parsing there are various events that need to be called once each file is uploaded.
I'm currently using express.bodyParser({ uploadDir: '/tmp', keepExtensions: true, defer: true }); in the app configuration.
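For context, that middleware would sit in the Express 3-era app configuration, roughly like this (a minimal sketch of the surrounding setup):
var express = require('express');
var app = express();

// Express 3.x-style body parser with deferred multipart handling.
app.use(express.bodyParser({
    uploadDir: '/tmp',
    keepExtensions: true,
    defer: true
}));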
Using the following method, I am trying to parse the files, but the problem is:
Only 2 files will begin uploading, and they do not complete (i.e. the progress bar hangs near the end without finishing).
The other files submitted by the form (item 3+) do not even begin to upload to the server.
It seems to be some sort of asynchronous holdup, but I can't properly interpret the problem. Some of the code used at the upload endpoint is as follows:
// This applies to the /theendpoint route. Using Express.
var fs = require('fs');

// uploadPath and newFilePath are defined elsewhere (elided).
exports.theendpoint = function (req, res) {
    console.log(req.files);
    fs.readFile(uploadPath, function (err, data) {
        if (err) throw err;
        fs.writeFile(newFilePath, data, function (err) {
            // Series of checks and definitions
            // Database connection
            // Conditional executions
            fs.unlink(req.files.file.path, function (err) {
                if (err) throw err;
                console.log('Deleted');
            });
        });
    });
};
Obviously I've left out some of the code here. If anyone can help - is this structure workable?
You should know that the items in the commented section, i.e. the DB connection etc., are asynchronous tasks.

After
fs.unlink(req.files.file.path, function (err) {
    if (err) throw err;
    console.log('Deleted');
});
add
res.redirect("back");
and it works!
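Put together, the tail of the writeFile callback then reads roughly like this (a sketch; presumably the redirect matters because the request finally gets a response, which lets the hanging uploads complete):
fs.unlink(req.files.file.path, function (err) {
    if (err) throw err;
    console.log('Deleted');
});
res.redirect("back"); // respond to the client so the upload can finish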

Related

Unwanted caching in Node.js fs.readFile

I'm having a strange issue in my Node.js app. I'm using
function getFile(filename, cb) {
    fs.readFile(filename, 'utf8', (err, data) => {
        if (err) throw err;
        return cb(data);
    });
}
to open an HTML file, get its contents and send them to the client for different modal screens. However, the first HTML file it opens seems to be getting cached, somehow. I know there's no real caching in the Node.js fs module, so I'm unsure where the problem could be stemming from. I've ensured that the correct values are in fact being passed into the function, but the output is always the first file opened.
How can I stop this from happening?
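As an aside, and not a fix for the caching symptom itself: throwing inside the callback crashes the process rather than reaching the caller, so a common restructuring follows Node's error-first callback convention. A sketch (the file name in the usage is hypothetical):
const fs = require('fs');

function getFile(filename, cb) {
    fs.readFile(filename, 'utf8', (err, data) => {
        if (err) return cb(err); // hand the error to the caller
        cb(null, data);          // error-first: (err, result)
    });
}

// Hypothetical usage:
getFile('modal.html', (err, html) => {
    if (err) return console.error(err);
    // send html to the client here
});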

How to write logs in Node.js

Recently I was given the task of writing log messages in my Node.js project. I am not sure what exactly a log message means; generally for a function we write 2 cases like below:
exports.inserttopic = function (req, res) {
    var topics = new Topics(req.body);
    console.log(topics);
    topics.save(function (err, result) {
        if (err) {
            console.log(err);
            return err;
        }
        var data;
        if (result) {
            data = { status: true, error_code: 0, result: result, message: 'Inserted successfully' };
        }
        res.json(data);
    });
};
From the above code, I put console.log(err) for the error case. Is this a log message? If not, how is a log message different? I heard that log messages should be written to a file. How can I do that? I searched Google but didn't reach an understanding, and it really troubled me. Can anyone offer some help and post some good articles? Thanks.
A "log message" is only some Text Information which is offered by a program.
The message can be written to different output channels.
E.g. you are using the Console channel which is bound on the running program. This means when the program ends the log message may get lost if you don't save it explicitly (e.g. with a text-editor in a file).
The better way is to log into a so called "log-file".
You can write your own function which writes to a file or you can use some logging-framework.
The benefit on a logging framework is, that it mostly offers you the ability to choose, which output channel you prefer (for example also Database!), how the logging message has to look like (e.g. Date and Time at the beginning of each line) and that it offers you different severities.
Severities can be for example of type:
Error
Info
Debug
The Logging Framework (or your configuration) then decides how to handle the different severities.
Write the severities in different Logfiles (debug.log, error.log)
Write only messages over the configured Severity Level (e.g. Level Info skips debug messages)
...
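To make this concrete, here's a minimal sketch using the winston package (one popular logging framework; any comparable library would do, and the file names are just examples):
var winston = require('winston');

var logger = winston.createLogger({
    level: 'info', // messages below this severity (e.g. debug) are skipped
    format: winston.format.combine(
        winston.format.timestamp(), // date and time on each entry
        winston.format.json()
    ),
    transports: [
        // Error-level messages get their own file...
        new winston.transports.File({ filename: 'error.log', level: 'error' }),
        // ...and everything at or above 'info' also goes here.
        new winston.transports.File({ filename: 'combined.log' })
    ]
});

logger.info('Inserted successfully');
logger.error('Could not save topic');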

How do I post a file to Apache Solr using Node.js?

I am kind of asking about best practice here, since I've looked everywhere but can't find any. I'm creating a Solr-based application using Node.js, particularly solr-client to connect to Solr. My application is supposed to be able to receive files in multiple formats (json, xml, csv, etc., which is possible in Solr), and I am using Node.js for that.
On the terminal I can use a command like bin/solr post -c core-name path_to_file to post any format readable by Solr, but I have trouble doing the same in JavaScript. The following code just doesn't work:
client.add(thisData, function (err, obj) {
    if (err) {
        console.log("Failed add file");
        console.log(err.stack);
        // throw err;
    } else {
        // console.log(obj);
        console.log("Successful");
    }
});
client.commit(function (err, res) {
    if (err) {
        console.log(err);
    }
    if (res) {
        console.log(res);
    }
});
As a note, thisData is a variable containing the file content that I have read with the fs module. I don't understand why I can post any file format from the terminal but can't do the same programmatically. Is it not allowed in Solr?
Some friends suggested converting all the file contents (from xls, csv, json, etc.) to JSON and then sending the JSON to Solr. It works, but I realize it's a lot of work: I have to use many Node.js packages to convert those files, and I don't find them very efficient. I have been looking on GitHub for the best way to do this but found nothing to reference. Can someone give me some hints? I am sorry if I am not being clear; I will explain more if you need me to.
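For what it's worth, one difference is that bin/solr post sends the raw file to Solr's update handlers, while (as far as I can tell) solr-client's add() expects JavaScript objects, so the file content has to be parsed before adding. A minimal sketch of the JSON case (the core and file names are hypothetical):
var fs = require('fs');
var solr = require('solr-client');

var client = solr.createClient({ core: 'core-name' });

fs.readFile('data.json', 'utf8', function (err, contents) {
    if (err) throw err;
    var docs = JSON.parse(contents); // an array of Solr documents
    client.add(docs, function (err) {
        if (err) return console.log(err.stack);
        client.commit(function (err, res) {
            if (err) return console.log(err);
            console.log(res);
        });
    });
});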

How can I stream multiple remote images to a zip file and stream that to the browser with ExpressJS?

I've got a small web app built in ExpressJs that allows people in our company to browse product information. A recent feature request requires that users be able to download batches of images (potentially hundreds at a time). These are stored on another server.
Ideally I think I need to stream the batch of files into a zip file and stream that to the end user's browser as a download, all preferably without having to store the files on the server. The idea is that I want to reduce load on the server as much as possible.
Is it possible to do this or do I need to look at another approach? I've been experimenting with the 'request' module for the initial download.
If anyone can point me in the right direction or recommend any NPM modules that might help it would be very much appreciated.
Thanks.
One useful module for this is archiver, but I'm sure there are others as well.
Here's an example program that shows:
how to retrieve a list of URLs (I'm using async to handle the requests, and also to limit the number of concurrent HTTP requests to 3);
how to add the responses for those URLs to a ZIP file;
how to stream the final ZIP file somewhere (in this case to stdout, but with Express you can pipe to the response object; see the usage sketch further below).
Example:
var async = require('async');
var request = require('request');
var archiver = require('archiver');

function zipURLs(urls, outStream) {
    var zipArchive = archiver.create('zip');

    // Pipe the archive to its destination before appending entries.
    zipArchive.pipe(outStream);

    async.eachLimit(urls, 3, function (url, done) {
        var stream = request.get(url);
        stream.on('error', function (err) {
            return done(err);
        }).on('end', function () {
            return done();
        });
        // Use the last part of the URL as a filename within the ZIP archive.
        zipArchive.append(stream, { name: url.replace(/^.*\//, '') });
    }, function (err) {
        if (err) throw err;
        zipArchive.finalize();
    });
}

zipURLs([
    'http://example.com/image1.jpg',
    'http://example.com/image2.jpg',
    ...
], process.stdout);
Do note that although this doesn't require the image files to be locally stored, it does build the ZIP file entirely in memory. Perhaps there are other ZIP modules that would allow you to work around that, although (AFAIK) the ZIP file format isn't really great in terms of streaming, as it depends on metadata being appended to the end of the file.
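For the Express case mentioned above, a hypothetical route could pass the response object in as outStream, since it is a writable stream (the route path and imageUrls variable are made up for illustration):
app.get('/download-images', function (req, res) {
    // attachment() sets Content-Disposition so the browser treats
    // the response as a download named images.zip.
    res.attachment('images.zip');
    zipURLs(imageUrls, res);
});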

Custom Module DB Variable Export - NodeJS

I've got a small script that runs continuously on my server.
I'd like to connect it to a DB to get some data for the script's operation. So, I required the custom module I built, and Node recognises it - it does not send back a "module not found" error.
The DB script works, and pulls exactly what I need from the DB. This is how it looks (assume the connection is open - this is just the query & response code):
connection.query(sql, function (err, rows) {
    // Handle errors
    if (err) {
        throw err;
    }
    // If successful - log results for now (change to export results)
    else {
        // console.log(rows);
        rows = exports.rows;
    }
});
Unfortunately, whenever I access DB.rows in my main script, it returns undefined.
Am I exporting it wrong, or does the problem lie deeper within?
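For reference: the snippet assigns in the wrong direction (rows = exports.rows rather than exports.rows = rows), and even with that reversed the rows only exist once the query has completed, so the main script may still read undefined. A common pattern is to export a function that takes a callback; a sketch, assuming connection and sql are set up as in the question:
// db.js - deliver the rows asynchronously instead of via a property.
exports.getRows = function (cb) {
    connection.query(sql, function (err, rows) {
        if (err) return cb(err);
        cb(null, rows);
    });
};

// Main script (hypothetical usage):
// var DB = require('./db');
// DB.getRows(function (err, rows) {
//     if (err) throw err;
//     console.log(rows);
// });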
