image upload from one server to other in nodejs [duplicate] - node.js

This question already has answers here:
Nodejs POST request multipart/form-data
(5 answers)
Closed 5 years ago.
File upload using multer is not working.
My code:
The file is read from the HTML form and passed to the external URL:
router.post('/fileUpload', function (req, res) {
  request({
    uri: 'http://example.com/upload', // url of the other server
    method: "POST",
    form: {
      "name": req.body.name,
      "image": ? // image from the HTML form - no idea how to pass req.files here
    }
  }, function (error, response, body) {
    // ------------
    // ------------
  });
});
Other server URL: /example.com/upload
This is the code that uploads the image using multer:
app.post('/upload', function (req, res) {
  var storage = multer.diskStorage({
    destination: function (req, file, callback) {
      callback(null, 'uploads');
    },
    filename: function (req, file, callback) {
      callback(null, file.fieldname + '-' + Date.now());
    }
  });
  var upload = multer({ storage: storage }).array('productImage');
  upload(req, res, function (err) {
    if (err) {
      return res.json({ 'success': 0, 'result': {}, 'errorMessage': 'Unknown Error' });
    }
    return res.json({ 'success': 1, 'result': {}, 'errorMessage': '' });
  });
});

Create a read stream from the uploaded file and pipe it to the other server. Check out the request package (https://www.npmjs.com/package/request); you will easily get this done.

The simple answer would be to create a read stream from the uploaded file and pipe it to the second server, like so:
fs.createReadStream(req.files[0].path).pipe(request.post('http://example.com/upload'))
However, there are many ways to make this work. The most efficient is to keep the upload as a binary stream from the initial request all the way through to the second server, so you avoid using your first server as storage for all of the uploads.
Here's how I would do it instead:
Use jQuery File Upload for the client-side upload
(or any other library/approach that uploads the raw binary stream):
$('#fileupload').fileupload({
  url: 'https://server1.com/upload'
});
Use Formidable (or another library like multer) to handle the upload server-side.
(This will allow you to read the binary stream in parts, and handle each part as it comes.)
app.post('/upload', function (req, res) {
  var form = new formidable.IncomingForm();
  form.onPart = function (part) {
    // each part is already a readable stream, so pipe it straight to the second server
    part.pipe(request.post('http://example.com/upload'));
  };
  form.parse(req);
});
When each part of the binary upload is received, you can stream it directly to the second server via pipe(), so nothing has to be stored on the first server at all.
The key to making this work is to treat the file upload and transfer as a stream of binary bits: when the user uploads a file, create a read stream from it and pipe the binary through request.post().
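For completeness, here is a minimal non-streaming sketch of the original question's flow, assuming multer stores the incoming upload on the first server and the request package's formData option forwards it. The URL and field names mirror the code above; treat this as a sketch rather than the definitive fix.
const fs = require('fs');
const multer = require('multer');
const request = require('request');

const upload = multer({ dest: 'uploads/' }); // temporary storage on the first server

router.post('/fileUpload', upload.single('image'), function (req, res) {
  request.post({
    url: 'http://example.com/upload', // the second server
    formData: {
      name: req.body.name,
      // stream the stored file onward; the field name matches .array('productImage') above
      productImage: fs.createReadStream(req.file.path)
    }
  }, function (error, response, body) {
    if (error) return res.status(500).send('Upload to the second server failed');
    res.send(body);
  });
});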

Related

Send multiple files from one API to another with NodeJS , multer and Axios

I have an API in one Node.js project, shown below, which receives multiple attachments from the UI:
const upload = multer()
router.post('/upload', upload.array("attachments"), controller.getSomething);
getSomething is supposed to call another POST API using Axios only, which lives in another Node.js project that accepts these attachments and processes them. That API also accepts multiple files via multer.
I am unsure how I could send multiple files in a single request from one Node.js project to another. Could you please help?
I had to set up the FormData as below:
const formData = new FormData();
for (let file of req.files) {
  formData.append("attachments", file.buffer, file.originalname);
}
Then I passed the FormData to the other API via Axios.
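For reference, a minimal sketch of that approach; the target URL is a placeholder, and form-data here is the npm package of the same name.
const FormData = require('form-data');
const axios = require('axios');

controller.getSomething = async (req, res) => {
  const formData = new FormData();
  for (const file of req.files) {
    // multer() with no storage option keeps uploads in memory, so file.buffer is available
    formData.append('attachments', file.buffer, file.originalname);
  }
  const response = await axios.post('http://other-project/upload', formData, {
    headers: formData.getHeaders() // sets the multipart Content-Type with boundary
  });
  res.json(response.data);
};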
You can do the following steps:
When the files are uploaded (coming from the UI), save them in a temporary folder.
Pass all the file names to the POST API using Axios.
In the POST API, read all the files from the temporary folder and stream them to the destination.
controller.getSomething = (req, res, next) => {
  // get the file names (as written to the temporary folder by multer's disk storage)
  const fileNames = req.files.map(file => file.filename);
  // now post these file names to the other API
  axios.post('/pathname', { fileNames });
  // or, you can use a GET request
};
Reading files in the post Request:
var promises = ['file1.css', 'file2.css'].map(function (_path) {
  return new Promise(function (resolve, reject) {
    fs.readFile(_path, 'utf8', function (err, data) {
      if (err) {
        console.log(err);
        resolve(''); // resolve with empty content to keep the same code flow
      } else {
        resolve(data);
      }
    });
  });
});

Promise.all(promises).then(function (results) {
  // Put your callback logic here
  response.writeHead(200, { "Content-Type": "text/css" });
  results.forEach(function (content) { response.write(content); });
  response.end();
});
(Copied from this link; you should check the different answers there, which can help you.)

Firebase Cloud Function Serving Local File for Download

I created a cloud function that generates an xlsx file, and I need the user to download that file after it's generated.
Method 1: Upload to Bucket, then redirect
So far I've tried uploading the file to a bucket using this API and then redirecting the user to the bucket file URL. I also double-checked the bucket name using this API, but I get the same error every time:
{"error":{"code":500,"status":"INTERNAL","message":"function crashed","errors":["socket hang up"]}}
Portion of the code that contains uploading to bucket:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
await storage.bucket('bucket-name').upload('myfile.xlsx', {
  gzip: false,
});
Portion of the code that proves file exists:
fs.access('myfile.xlsx', fs.constants.F_OK, (err) => {
  console.log(`myfile.xlsx ${err ? 'does not exist' : 'exists'}`);
});
I also checked whether the "@google-cloud/storage" library reads the file, and it reads it correctly and gets the file size right.
Method 2: Direct Download
Download the file directly. The problem is that every Node.js doc online about sending a local file to the user for download sets up a custom server, but I'm using Firebase, so I'm not in control of that server.
Just wanted to add more detail to the answer: there's no need to write the data into a file and read it back in order to download it. Simply take the data and send it, using the few lines below.
res.setHeader('Content-Type', 'application/vnd.openxmlformats');
res.setHeader("Content-Disposition", "attachment; filename=" + fileName);
res.end(fileData, 'binary');
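To make that concrete, here is a minimal sketch of a cloud function that builds the workbook in memory and sends it straight back. It assumes the exceljs package (the question doesn't say which library generates the xlsx), so adjust it to whatever you actually use.
const functions = require('firebase-functions');
const ExcelJS = require('exceljs'); // assumption: any library that yields a Buffer works

exports.getExcelFile = functions.https.onRequest(async (req, res) => {
  const workbook = new ExcelJS.Workbook();
  const sheet = workbook.addWorksheet('Report');
  sheet.addRow(['Hello', 'world']); // placeholder content

  const fileData = await workbook.xlsx.writeBuffer(); // Buffer; never touches disk

  res.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  res.setHeader('Content-Disposition', 'attachment; filename=myfile.xlsx');
  res.end(fileData, 'binary');
});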
If your Excel file is created and should be returned to the client as a response to an HTTP request (a call to an API endpoint), then this is how you can do it:
export const getExcelFile = functions.https.onRequest(async (request, response) => {
  // ...
  // Create your file and such
  // ...
  await storage.bucket('bucket-name').upload('myfile.xlsx', {
    gzip: false,
  });
  response.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  response.send(fs.readFileSync('myfile.xlsx'));
  return null;
});
Otherwise, if the Excel file is created in response to an event and you want the user to download the file at another time, then you create a download link and serve it to the user in any way you want.
// ...
// Create your file and such
// ...
const [file] = await storage.bucket('bucket-name').upload('myfile.xlsx', {
  gzip: false,
});
const [downloadUrl] = await file.getSignedUrl({
  action: 'read',
  expires: '20-03-2019' // Link expiry date: DD-MM-YYYY
});
console.log(downloadUrl);

How can I get multiple upload stream with multer storage engine?

I am making a multer storage engine that creates a stream connection between the client and the S3 server.
In the middle of the stream, my code examines the chunks and sends them to S3.
I can get a single file stream from the Node.js server, but when I request a file-array upload, the Node inspector shows only one stream. What should I do?
Stream engine snippet
CustomStreamEngine.prototype._handleFile = function _handleFile(req, file, cb) {
  // for inspection
  req.files.length // 1
  file;
};
request controller
var streamStorage = multer({
  storage: streamEngine()
});
dev.post('/rec_test', streamStorage.array('source'), (req, res, next) => {
});
I just published this streaming multipart/form-data parser on npm as form-parser:
You should be able to do the following:
const parser = require('form-parser');

dev.post('/rec_test', async (req, res, next) => {
  // Parse request
  await parser(req, async ({ fieldType, fieldName, fieldContent }) => {
    // Log all fields
    console.log({ fieldType, fieldName, fieldContent });
    // Handle 'source' file fields
    if (fieldType === 'file' && fieldName === 'source[]') {
      // Get file info
      const { fileName, fileType, fileStream } = fieldContent;
      // Upload fileStream to S3 :-)
    }
  });
});
Hope it's helpful.
I think you can add some logs to https://github.com/expressjs/multer/blob/master/lib/make-middleware.js to check.
Currently, I use axios on the client to send multiple files to the server with multer, and I can see all of the files in busboy.on('file', function (fieldname, fileStream, filename, encoding, mimetype)). However, there is only one file at a time, and that function calls the _handleFile function of the custom storage for each one, which I think is the reason for your issue.
Hope it can help you
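To make that per-file behaviour concrete, here is a minimal sketch of a streaming storage engine: _handleFile is invoked once for each file in the 'source' array, and file.stream is that file's stream. The actual S3 upload is left as a placeholder.
const multer = require('multer');

function StreamEngine() {}

StreamEngine.prototype._handleFile = function (req, file, cb) {
  // called once per uploaded file; examine/forward the chunks as they arrive
  let size = 0;
  file.stream.on('data', chunk => {
    size += chunk.length; // inspect the chunk and send it to S3 here
  });
  file.stream.on('end', () => cb(null, { size }));
  file.stream.on('error', cb);
};

StreamEngine.prototype._removeFile = function (req, file, cb) {
  cb(null); // nothing written locally, so nothing to clean up
};

const streamStorage = multer({ storage: new StreamEngine() });
dev.post('/rec_test', streamStorage.array('source'), (req, res) => res.end());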

File Upload using node js without multer

I just want simple file upload functionality. I have used fs-path, which serves my purpose of creating a dynamic folder structure and the file at the upload location. However, I am not able to stream the request file that has to be uploaded. My code is as follows:
fsPath.writeFile(path, req.body, /* <-- what should go here instead? */ function (err) {
  if (err) {
    throw err;
  } else {
    console.log('File Upload successful...');
    res.send({ code: 200, message: `File Upload successful` });
  }
});
I need some insight into how I can pass the request file as input in the above code snippet. How do I stream my request file so that it gets written to the respective upload location?
If you want to stream the request body, then instead of using a body parser or multer you should use the req stream directly. Remember that the request object is a stream, and you can use it as such:
req.on('data', data => {
  console.log(data);
});
You can also pipe it to some other stream, like a writable stream created with fs.createWriteStream etc.
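A minimal sketch of that idea, assuming no body-parsing middleware is mounted on the route so req is still the raw stream (the destination path is a placeholder):
const fs = require('fs');
const path = require('path');

app.post('/upload', (req, res) => {
  const target = path.join('uploads', 'upload-' + Date.now());
  const out = fs.createWriteStream(target);

  req.pipe(out); // stream the raw request body straight to disk

  out.on('finish', () => {
    res.send({ code: 200, message: 'File Upload successful' });
  });
  out.on('error', () => {
    res.status(500).send({ code: 500, message: 'Upload failed' });
  });
});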

Pushing binary data to Amazon S3 using Node.js

I'm trying to take an image and upload it to an Amazon S3 bucket using Node.js. In the end, I want to be able to push the image up to S3 and then access that S3 URL and see the image in a browser. I'm using a curl command to do an HTTP POST request with the image as the body:
curl -kvX POST --data-binary "#test.jpg" 'http://localhost:3031/upload/image'
Then on the Node.js side, I do this:
exports.pushImage = function(req, res) {
  var image = new Buffer(req.body);
  var s3bucket = new AWS.S3();
  s3bucket.createBucket(function() {
    var params = { Bucket: 'My/bucket', Key: 'test.jpg', Body: image };
    // Put the object into the bucket.
    s3bucket.putObject(params, function(err) {
      if (err) {
        res.writeHead(403, { 'Content-Type': 'text/plain' });
        res.write("Error uploading data");
        res.end();
      } else {
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        res.write("Success");
        res.end();
      }
    });
  });
};
My file is 0 bytes, as shown on Amazon S3. How do I make it so that I can use Node.js to push the binary file up to S3? What am I doing wrong with binary data and buffers?
UPDATE:
I found out what I needed to do. The curl query is the first thing that should be changed. This is the working one:
curl -kvX POST -F foobar=#my_image_name.jpg 'http://localhost:3031/upload/image'
Then, I added a line to convert to a Stream. This is the working code:
exports.pushImage = function(req, res) {
  var image = new Buffer(req.body);
  var s3bucket = new AWS.S3();
  s3bucket.createBucket(function() {
    var bodyStream = fs.createReadStream(req.files.foobar.path);
    var params = { Bucket: 'My/bucket', Key: 'test.jpg', Body: bodyStream };
    // Put the object into the bucket.
    s3bucket.putObject(params, function(err) {
      if (err) {
        res.writeHead(403, { 'Content-Type': 'text/plain' });
        res.write("Error uploading data");
        res.end();
      } else {
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        res.write("Success");
        res.end();
      }
    });
  });
};
So, in order to upload a file to an API endpoint (using Node.js and Express) and have the API push that file to Amazon S3, you first need to perform a POST request with the "files" field populated. The file ends up on the API side, where it probably resides in some tmp directory. Amazon's S3 putObject method requires a Stream, so you need to create a read stream by giving the 'fs' module the path where the uploaded file exists.
I don't know if this is the proper way to upload data, but it works. Does anyone know if there is a way to POST binary data inside the request body and have the API send that to S3? I don't quite know what the difference is between a multipart upload and a standard POST to the body.
I believe you need to pass the content-length in the header as documented on the S3 docs: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
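A hedged sketch of that idea with the plain aws-sdk: if no body-parsing middleware touches the route, req is still the raw stream, and passing the request's Content-Length lets putObject accept it (the bucket and key are placeholders):
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

// no body parser on this route, so `req` is the raw binary stream
app.post('/upload/image', function (req, res) {
  var params = {
    Bucket: 'my-bucket', // placeholder bucket name
    Key: 'test.jpg',
    Body: req, // stream the request body straight through
    ContentLength: parseInt(req.headers['content-length'], 10)
  };
  s3.putObject(params, function (err) {
    if (err) return res.status(500).send('Error uploading data');
    res.send('Success');
  });
});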
After spending quite a bit of time working on pushing assets to S3, I ended up using the AwsSum library with excellent results in production:
https://github.com/awssum/awssum-amazon-s3/
(See the documentation on setting your AWS credentials)
Example:
var fs = require('fs');

var bucket_name = 'your-bucket-name'; // AwsSum also has an API for this if you need to create the bucket
var img_path = 'path_to_file';
var filename = 'your_new_filename';

// using stat to get the size to set ContentLength
fs.stat(img_path, function(err, file_info) {
  var bodyStream = fs.createReadStream(img_path);
  var params = {
    BucketName    : bucket_name,
    ObjectName    : filename,
    ContentLength : file_info.size,
    Body          : bodyStream
  };
  // `s3` is an AwsSum Amazon S3 client configured with your credentials
  s3.putObject(params, function(err, data) {
    if (err) {
      // handle the error
      return;
    }
    var aws_url = 'https://s3.amazonaws.com/' + bucket_name + '/' + filename;
  });
});
UPDATE
So, if you are using something like Express or Connect, which are built on Formidable, then you don't have access to the file stream, as Formidable writes files to disk. Depending on how you upload the file on the client side, the image will be in either req.body or req.files. In my case, I use Express, and on the client side I post other data as well, so the image has its own parameter and is accessed as req.files.img_data. However you access it, that param is what you pass in as img_path in the example above.
If you need or want to stream the file, that is trickier, though certainly possible, and if you aren't manipulating the image you may want to look at taking a CORS approach and uploading directly to S3, as discussed here: Stream that user uploads directly to Amazon s3
