Nodejs Express Send File

I am trying to send a file's contents to the client in my request, but the only documentation Express has is its download function, which requires a physical file; the file I am trying to send comes from S3, so all I have is the filename and content.
How do I go about sending the appropriate headers for content type and filename, along with the file's content?
For example:
files.find({_id: id}, function (e, o) {
  client.getObject({Bucket: config.bucket, Key: o.key}, function (error, data) {
    res.send(data.Body);
  });
});

The content type depends on the file, obviously. Have a look at this:
http://en.wikipedia.org/wiki/Internet_media_type
If you know exactly what your file is, then assign one of these to the response (not mandatory, though). You should also add the length of the file to the response (if possible, i.e. if it is not a stream). And if you want it to be downloadable as an attachment, then add a Content-Disposition header. So all in all, you only need to add this:
var filename = "myfile.txt";
res.set({
  "Content-Disposition": 'attachment; filename="' + filename + '"',
  "Content-Type": "text/plain",
  "Content-Length": data.Body.length
});
NOTE: I'm using Express 3.x.
EDIT: Actually Express is smart enough to compute the content length for you, so you don't have to add the Content-Length header.
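Putting it together with the handler from the question, a minimal sketch might look like this (files, client, and config.bucket are the names from the question; the content type is an assumption, so use your file's real media type):

files.find({_id: id}, function (e, o) {
  client.getObject({Bucket: config.bucket, Key: o.key}, function (error, data) {
    res.set({
      "Content-Disposition": 'attachment; filename="' + o.key + '"',
      "Content-Type": "text/plain" // assumed; substitute the actual media type
    });
    res.send(data.Body);
  });
});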

This is a great situation to use streams. Use the knox library to simplify things; knox should take care of setting the needed headers to pipe files to the client.
var inspect = require('eyespect').inspector();
var knox = require('knox');
var client = knox.createClient({
  key: 's3KeyHere',
  secret: 's3SecretHere',
  bucket: 's3BucketHere'
});

/**
 * @param {Stream} response is the response handler provided by Express
 **/
function downloadFile(request, response) {
  var filePath = 's3/file/path/here';
  client.getFile(filePath, function (err, s3Response) {
    s3Response.pipe(response);
    s3Response.on('error', function (err) {
      inspect(err, 'error downloading file from s3');
    });
    s3Response.on('progress', function (data) {
      inspect(data, 's3 download progress');
    });
    s3Response.on('end', function () {
      inspect(filePath, 'piped file to remote client successfully at s3 path');
    });
  });
}
npm install knox eyespect
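To wire the handler above into an Express app, something like this should work (the route path and port are assumptions):

var express = require('express');
var app = express();
// any GET on /download streams the S3 file to the client
app.get('/download', downloadFile);
app.listen(3000);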

Related

Send multiple files from one API to another with NodeJS, multer and Axios

I have an API in one Node.js project, as below, which receives multiple attachments from the UI:
const upload = multer()
router.post('/upload', upload.array("attachments"),controller.getSomething);
getSomething is supposed to call another POST API using Axios only, which is in another Node.js project that accepts these attachments and processes them. It also accepts multiple files via multer.
I am unsure how I could send multiple files as a request from one Node.js project to another at once. Could you please advise?
I had to set up FormData as below:
const formData = new FormData();
for (let file of req.files) {
  formData.append("attachments", file.buffer, file.originalname);
}
And passed the FormData to the other API via Axios.
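For instance, a minimal sketch of that forwarding step, assuming the form-data npm package and a hypothetical target URL:

const axios = require('axios');
const FormData = require('form-data');

async function forwardAttachments(req) {
  const formData = new FormData();
  for (const file of req.files) {
    // file.buffer is available because multer() defaults to memory storage
    formData.append('attachments', file.buffer, file.originalname);
  }
  // getHeaders() supplies the multipart boundary the receiving multer needs
  return axios.post('http://other-service/upload', formData, {
    headers: formData.getHeaders()
  });
}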
You can do the following steps:
1. When you upload the temporary files (coming from the UI), save them in a temporary folder.
2. Pass all the file names to the POST API using Axios.
3. In the POST API, read all the files from the temporary folder and stream them to the destination.
controller.getSomething = (req, res, next) => {
  // get the file names
  const fileNames = req.files.map(file => file.filename);
  // now post these file names to the other API
  axios.post('/pathname', {fileNames})
  // or, you can use a GET request
}
Reading the files in the POST request:
var promises = ['file1.css', 'file2.css'].map(function (_path) {
  return new Promise(function (resolve, reject) {
    fs.readFile(_path, 'utf8', function (err, data) {
      if (err) {
        console.log(err);
        resolve(""); // following the same code flow
      } else {
        resolve(data);
      }
    });
  });
});

Promise.all(promises).then(function (results) {
  // Put your callback logic here
  response.writeHead(200, {"Content-Type": "text/css"});
  results.forEach(function (content) { response.write(content); });
  response.end();
});
Copied from this link; you should check the different answers there, which may help you.
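Tying the steps together, a hypothetical receiving endpoint might read the posted file names from the temporary folder and stream them out one by one (the route, temp-folder path, and helper are assumptions for illustration):

const path = require('path');
const fs = require('fs');

app.post('/pathname', (req, res) => {
  const { fileNames } = req.body;
  // stream each temporary file to the response in sequence
  (function streamNext(i) {
    if (i >= fileNames.length) return res.end();
    const src = fs.createReadStream(path.join('/tmp/uploads', fileNames[i]));
    src.on('end', () => streamNext(i + 1));
    src.pipe(res, { end: false }); // keep the response open between files
  })(0);
});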

Sending Zip file from server to client browser with Express and Archiver

I am a beginner with Node, and I am trying to figure out how to create a zip file on the server, send it to the client, and download it in the user's browser. I am using the Express framework, and I am using Archiver to actually do the zipping. My server code is the following, taken from Dynamically create and stream zip to client:
router.get('/image-dl', function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'application/zip',
    'Content-Disposition': 'attachment; filename=myFile.zip'
  });
  var zip = archiver('zip');
  // Send the file to the page output.
  zip.pipe(res);
  // Create zip with some files. Two dynamic, one static. Put #2 in a sub folder.
  zip.append('Some text to go in file 1.', { name: '1.txt' })
    .append('Some text to go in file 2. I go in a folder!', { name: 'somefolder/2.txt' })
    .finalize();
});
So it's zipping two text files and returning the result. On the client side, I am using the following function in a service to call that endpoint:
downloadZip() {
  const headers = new Headers({'Content-Type': 'application/json'});
  const token = localStorage.getItem('token')
    ? '?token=' + localStorage.getItem('token')
    : '';
  return this.http.get(this.endPoint + '/job/image-dl' + token, {headers: headers})
    .map((response: Response) => {
      const result = response;
      return result;
    })
    .catch((error: Response) => {
      this.errorService.handleError(error.json());
      return Observable.throw(error.json());
    });
}
Then I have another function, which calls downloadZip() and actually downloads the zip file to the user's local browser:
testfunc() {
  this.jobService.downloadZip().subscribe(
    (blah: any) => {
      var blob = new Blob([blah], {type: "application/zip"});
      FileSaver.saveAs(blob, "helloworld.zip");
    }
  );
}
When testfunc() is called, a zip file is downloaded to the user's browser, but when I try to unzip it, it creates a zip.cpgz file, which then turns back into a zip file when clicked, in an infinite loop, which indicates that some kind of corruption happened. Can anyone see where I went wrong here?

Uploaded file differs from the original

I'm trying to upload a file with node using this simple code:
UpdateController.prototype.uploadUpdateFile = function (req, res) {
  var w = fs.createWriteStream(settings.uploadFolder + settings.updateFile);
  req.pipe(w);
  w.on('finish', function () {
    res.send(JSON.stringify({
      status: 0,
      filename: settings.uploadFolder + settings.updateFile
    }));
  });
  // on() takes a single listener; failures are reported by the 'error' event
  w.on('error', function () {
    res.send(JSON.stringify({
      status: 1,
      message: "error during file upload, operation failed"
    }));
  });
};
The file is uploaded correctly, but it differs from the original because a multipart header (------WebKitForm ... /octet-stream....) and footer (..------WebKitFormBoundary9gOZjMubs9GivcUQ--..) are added around the content.
How do I get only the file content?
You would have to look at the headers of the client request to understand how the client decided to send you the file (how the file was encoded).
You will probably end up using busboy or another package that depends on it: https://www.npmjs.com/package/busboy
Such a package will "decode" the data sent by the browser.
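For example, a minimal sketch of the handler above rewritten with busboy's classic (pre-1.0) callback API; it reuses the settings object from the question, and everything else is an assumption to adapt:

var Busboy = require('busboy');
var fs = require('fs');

UpdateController.prototype.uploadUpdateFile = function (req, res) {
  var busboy = new Busboy({ headers: req.headers });
  busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
    // `file` is a stream of the decoded contents only, without the
    // multipart header and footer
    var w = fs.createWriteStream(settings.uploadFolder + settings.updateFile);
    file.pipe(w);
  });
  busboy.on('finish', function () {
    res.send(JSON.stringify({
      status: 0,
      filename: settings.uploadFolder + settings.updateFile
    }));
  });
  req.pipe(busboy);
};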

Pushing binary data to Amazon S3 using Node.js

I'm trying to take an image and upload it to an Amazon S3 bucket using Node.js. In the end, I want to be able to push the image up to S3, and then be able to access that S3 URL and see the image in a browser. I'm using a Curl query to do an HTTP POST request with the image as the body.
curl -kvX POST --data-binary "@test.jpg" 'http://localhost:3031/upload/image'
Then on the Node.js side, I do this:
exports.pushImage = function (req, res) {
  var image = new Buffer(req.body);
  var s3bucket = new AWS.S3();
  s3bucket.createBucket(function () {
    var params = {Bucket: 'My/bucket', Key: 'test.jpg', Body: image};
    // Put the object into the bucket.
    s3bucket.putObject(params, function (err) {
      if (err) {
        res.writeHead(403, {'Content-Type': 'text/plain'});
        res.write("Error uploading data");
        res.end();
      } else {
        res.writeHead(200, {'Content-Type': 'text/plain'});
        res.write("Success");
        res.end();
      }
    });
  });
};
My file is 0 bytes, as shown on Amazon S3. How do I make it so that I can use Node.js to push the binary file up to S3? What am I doing wrong with binary data and buffers?
UPDATE:
I found out what I needed to do. The curl query is the first thing that should be changed. This is the working one:
curl -kvX POST -F foobar=@my_image_name.jpg 'http://localhost:3031/upload/image'
Then, I added a line to convert to a Stream. This is the working code:
exports.pushImage = function (req, res) {
  var image = new Buffer(req.body);
  var s3bucket = new AWS.S3();
  s3bucket.createBucket(function () {
    var bodyStream = fs.createReadStream(req.files.foobar.path);
    var params = {Bucket: 'My/bucket', Key: 'test.jpg', Body: bodyStream};
    // Put the object into the bucket.
    s3bucket.putObject(params, function (err) {
      if (err) {
        res.writeHead(403, {'Content-Type': 'text/plain'});
        res.write("Error uploading data");
        res.end();
      } else {
        res.writeHead(200, {'Content-Type': 'text/plain'});
        res.write("Success");
        res.end();
      }
    });
  });
};
So, in order to upload a file to an API endpoint (using Node.js and Express) and have the API push that file to Amazon S3, first you need to perform a POST request with the "files" field populated. The file ends up on the API side, where it probably resides in some tmp directory. Amazon's S3 putObject method requires a Stream, so you need to create a read stream by giving the 'fs' module the path where the uploaded file exists.
I don't know if this is the proper way to upload data, but it works. Does anyone know if there is a way to POST binary data inside the request body and have the API send that to S3? I don't quite know what the difference is between a multipart upload and a standard POST to the body.
I believe you need to pass the Content-Length in the header, as documented in the S3 docs: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
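To illustrate, a minimal sketch of accepting a raw binary body (the original --data-binary approach) and pushing it to S3 with an explicit length; the bucket and key names are assumptions, and no body parser is assumed, so the raw chunks are collected by hand:

exports.pushImage = function (req, res) {
  var chunks = [];
  req.on('data', function (chunk) { chunks.push(chunk); });
  req.on('end', function () {
    var image = Buffer.concat(chunks);
    var s3bucket = new AWS.S3();
    var params = {
      Bucket: 'my-bucket', // assumed bucket name
      Key: 'test.jpg',
      Body: image,
      ContentLength: image.length
    };
    s3bucket.putObject(params, function (err) {
      if (err) {
        res.writeHead(403, {'Content-Type': 'text/plain'});
        res.end("Error uploading data");
      } else {
        res.writeHead(200, {'Content-Type': 'text/plain'});
        res.end("Success");
      }
    });
  });
};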
After spending quite a bit of time working on pushing assets to S3, I ended up using the AwsSum library with excellent results in production:
https://github.com/awssum/awssum-amazon-s3/
(See the documentation on setting your AWS credentials)
Example:
var fs = require('fs');
var bucket_name = 'your-bucket-name'; // AwsSum also has the API for this if you need to create the buckets
var img_path = 'path_to_file';
var filename = 'your_new_filename';

// using stat to get the size to set ContentLength
fs.stat(img_path, function (err, file_info) {
  var bodyStream = fs.createReadStream(img_path);
  var params = {
    BucketName: bucket_name,
    ObjectName: filename,
    ContentLength: file_info.size,
    Body: bodyStream
  };
  s3.putObject(params, function (err, data) {
    if (err) {
      // handle the error
      return;
    }
    var aws_url = 'https://s3.amazonaws.com/' + DEFAULT_BUCKET + '/' + filename;
  });
});
UPDATE
So, if you are using something like Express or Connect, which are built on Formidable, then you don't have access to the file stream, as Formidable writes files to disk. So, depending on how you upload the file on the client side, the image will either be in req.body or req.files. In my case, I use Express, and on the client side I post other data as well, so the image has its own parameter and is accessed as req.files.img_data. However you access it, that param is what you pass in as img_path in the above example.
If you need or want to stream the file, that is trickier, though certainly possible, and if you aren't manipulating the image, you may want to look at taking a CORS approach and uploading directly to S3, as discussed here: Stream that user uploads directly to Amazon s3

mongodb gridfs encoding picture base64

I am trying to read out an image saved in MongoDB via GridFS (without a temporary file).
It should be sent directly to ajax, which injects it into HTML.
When I use my current functions, a large bit string is formed and sent to the client (it is saved in the ajax response var), but by the time it reaches the client, the bits aren't correct anymore.
So I am looking for a way to encode the picture (into base64) before it is sent.
(Or is there any other way?)
Server side - JavaScript, GridFS:
exports.readFileFromDB = function (req, res, profile, filename, callback) {
  console.log('Find data from Profile ' + JSON.stringify(profile));
  var GridReader = new GridStore(db, filename, "r");
  GridReader.open(function (err, gs) {
    var streamFile = gs.stream(true);
    streamFile.on("end", function () {
    });
    // Pipe out the data
    streamFile.pipe(res);
    GridReader.close(function (err, result) {
    });
  });
};
Client side - JavaScript ajax call:
function imgUpload() {
  var thumb = $("#previewPic");
  $('#uploadForm').ajaxSubmit({
    beforeSend: function () {
      //launchpreloader();
    },
    error: function (xhr) {
      //status('Error: ' + xhr.status);
    },
    success: function (response) {
      console.log(response);
      var imageData = $.base64Encode(response);
      console.log(imageData);
      // note the comma after "base64", required for a valid data URI
      thumb.attr("src", "data:image/png;base64," + imageData);
      $("#spanFileName").html("File Uploaded");
    }
  });
}
I'm doing something similar for a current project, but when the upload is complete, I return a JSON object containing the URL for the uploaded image:
{ success : true, url : '/uploads/GRIDFSID/filename.ext' }
I have a route in Express that handles the /uploads route to retrieve the file from GridFS and stream it back to the client, so I can use the above URL in an IMG SRC. This is effectively what appears in the DOM:
<img src="/uploads/GRIDFSID/filename.ext">
The route handler looks something like this (it uses node-mime and gridfs-stream):
app.get(/^\/uploads\/([a-f0-9]+)\/(.*)$/, function(req, res) {
var id = req.params[0];
var filename = req.params[1];
// Set correct content type.
res.set('Content-type', mime.lookup(filename));
// Find GridFS file by id and pipe it to the response stream.
gridfs
.createReadStream({ _id : id })
.on('error', function(err) {
res.send(404); // or 500
})
.pipe(res);
});
Whether my solution works for you obviously depends on your exact setup.