Store custom image from HTTP call with Multer in NodeJS

I have an HTTP call carrying a Buffer image, sent from NodeJS Server 1 (running at localhost:5000) to NodeJS Server 2 (running at localhost:5001).
Here is the Server 1 axios call:
await axios.post('http://localhost:5001/', {
  buffer: fileBuffer,
  ...otherData // additional fields spread into the JSON body
})
I read that Multer picks the image up from req.files, but in my case it arrives as req.body.buffer.
How can I handle this?

Solved
I ended up not using Multer.
On Server 2, use Buffer.from() to get a Buffer back from the HTTP call (the buffer arrives as a JSON-serialized array), then write the file to disk with fs.writeFile() or fs.writeFileSync().
This is what I used (Server 1):
await axios.post('http://localhost:5001/', {
  buffer: fileBuffer,
  name: fileName
})
Server 2:
const { body } = req // The parsed request body; it contains buffer and name
const data = Buffer.from(body.buffer) // Convert the serialized array back into a Buffer
const path = './public/' + body.name // e.g. "./public/sample.jpg"
fs.writeFileSync(path, data) // writeFile and writeFileSync both accept a Buffer
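For context, here is a minimal sketch of how that snippet could sit inside a Server 2 route; the express.json() middleware and the raised body-size limit are assumptions on my part, not part of the original answer:
const express = require('express')
const fs = require('fs')

const app = express()
// Parse JSON bodies; the default 100kb limit is raised because the payload carries an image (assumed value)
app.use(express.json({ limit: '10mb' }))

app.post('/', (req, res) => {
  const { body } = req
  // A JSON-serialized Buffer arrives as { type: 'Buffer', data: [...] }, which Buffer.from() accepts
  const data = Buffer.from(body.buffer)
  fs.writeFileSync('./public/' + body.name, data)
  res.sendStatus(200)
})

app.listen(5001)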

Related

how to send a sub-buffer to a server with nodejs + axios

I want to send part of a big file to a server using pure nodejs + axios + form-data.
nodejs v16.14.0
But I don't know how to send the buffer directly to the server.
import fs from 'fs'
import { Readable } from 'stream'
import FormData from 'form-data'

const buffer = fs.readFileSync(filePath)
const chunk = buffer.slice(start, end)
// some code ...

// the code below works,
// but it has to write the chunk to disk first
fs.writeFileSync(chunkPath, chunk)
formData.append('file', fs.createReadStream(chunkPath))

// the code below doesn't work
formData.append('file', Readable.from(chunk).read())
I don't want to save the buffer to disk; I just want to send it to the server.
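This question has no answer in the thread, but one hedged way to avoid the temporary file (based on form-data's documented support for Buffer values, not on the original thread) is to append the chunk directly with a filename option; filePath, start, end, and uploadUrl stand in for the question's own values:
import fs from 'fs'
import axios from 'axios'
import FormData from 'form-data'

const buffer = fs.readFileSync(filePath)
const chunk = buffer.slice(start, end)

const formData = new FormData()
// form-data accepts a Buffer directly; the filename is only a placeholder for the multipart part
formData.append('file', chunk, { filename: 'chunk.bin' })

// getHeaders() supplies the multipart boundary the server needs to parse the body
await axios.post(uploadUrl, formData, { headers: formData.getHeaders() })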

Pass file uploaded via HTTP POST to another API

I have a Node.js (16.13.1) REST API using Express and one of my endpoints receives one or more uploaded files. The client (web app) uses FormData into which the files are appended. Once they're submitted to my API, the code there uses multer to grab the files from the request object.
Now I'm having trouble trying to send those same files to another API. multer attaches the files to req.files, and each file object in that array has several properties, one of which is buffer. I tried using the stream package's Duplex object to convert this buffer to a stream so that I could append the file to another FormData object, but when the server the second API is running on receives the request, the web server returns an error saying "a potentially dangerous request.form value was detected from the client".
Any suggestions?
I am working on a Nest project and was facing this issue as well. After some research I found that we need to create a Readable from the file's Buffer, and it's working for me.
// Controller
@UseInterceptors(FileInterceptor('file'))
async uploadFile(@UploadedFile() file: Express.Multer.File) {
  return this.apiservice.uploadFile(file);
}

// Service
uploadFile(file: Express.Multer.File) {
  // Recreate a stream from the in-memory buffer that multer attached to the file object
  const readstream = Readable.from(file.buffer);
  const form = new FormData();
  form.append('file', readstream, { filename: file.originalname });
  const url = `api_endpoint`;
  const config: AxiosRequestConfig = {
    // form.getHeaders() includes the multipart boundary in the Content-Type header
    headers: form.getHeaders(),
  };
  return axios.post(url, form, config);
}
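One detail worth calling out: when posting a form-data instance with axios from Node, the headers should come from form.getHeaders(), because the multipart Content-Type must carry the boundary that the form instance generated. A hard-coded 'multipart/form-data' header has no boundary, and many servers will reject such a request.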

How to upload file (image) to NodeJS (and from NodeJS)

I have the following code:
const formData = new FormData();
formData.append("filedata", myImage);
fetch("/extension/api/uploadtofb", {
  method: "POST",
  body: formData
})
If myImage is a base64 string representation (dataURL) of an image, multer treats it not as a file but as text. Therefore it ends up in req.body, not in req.file. How can I upload this image to my NodeJS (Express) server so it can be recognized and read as an image? (It doesn't need to use multer, I just followed the instructions I've found so far.)
I was able to work around it by converting the base64 string to a Blob and sending that instead. The image can now be correctly found in req.file. But I also need to send a POST request with this image from the NodeJS (Express) server, and FormData doesn't accept req.file or req.file.buffer in append(). How can I convert the received blob into a format that FormData accepts, so it can be sent with a POST request?
I think that if the first problem can be solved, the second problem will also solve itself.
You can directly upload the base64 image data and write it to disk on the server:
var fs = require('fs');
var path = require('path');

var base64Data = JSON.parse(res.body).data;
var filename = "test.png";
// Convert URL-safe base64 characters back to the standard alphabet
var base64_attachement = base64Data.replace(/-/g, '+').replace(/_/g, '/').replace(/ /g, '+');
var filepath = path.join(__dirname, '../../public/temp');
if (!fs.existsSync(filepath)) {
  fs.mkdirSync(filepath, 0o744); // Set folder permission
}
filepath = filepath + '/' + filename; // Temp file path
fs.writeFile(filepath, base64_attachement, 'base64', function (err) {
  if (err) console.log(err);
  return cb(filepath);
});
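For the second part of the question (forwarding the received file from Express to another server), here is a hedged sketch assuming multer's default memory storage and the form-data package; the target URL is a placeholder:
const express = require('express')
const multer = require('multer')
const FormData = require('form-data')
const axios = require('axios')

const app = express()
const upload = multer() // no options: files are kept in memory, so the data ends up in req.file.buffer

app.post('/extension/api/uploadtofb', upload.single('filedata'), async (req, res) => {
  const form = new FormData()
  // form-data accepts the raw Buffer; originalname and mimetype come from multer's file object
  form.append('filedata', req.file.buffer, {
    filename: req.file.originalname,
    contentType: req.file.mimetype
  })
  const response = await axios.post('https://example.com/upload', form, { headers: form.getHeaders() })
  res.sendStatus(response.status)
})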

nodejs handling arraybuffers

Suppose I make a multipart, application/octet-stream request with responseType set to 'arraybuffer', receive the response in nodejs, and try to write it to a file. How can I handle this so that I don't corrupt the contents?
My current approach is something like this
var req = restler.post(url, opts)
  .on('data', function (data) {
    console.log('receiving data...');
    console.log(data);
  })
  .on('complete', function (data) {
    var buff = new Buffer(data) // this is prolly incorrect, but I can't figure this out at all
    fs.writeFile(file_name, buff.toString('binary'), function (err) {
      console.log('done!')
    });
  });
Here I write the contents into file_name.
Suppose I fetch a Microsoft Word file: fetching it only gives me a corrupt file. I'm also using the restler package for this.
According to the restler documentation, you can set decoding: 'buffer' in your opts and it will keep the binary data intact as a Buffer instead of the default utf8-encoded string. From there it's just a matter of passing the buffer directly to fs.writeFile() without calling buffer.toString().
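A hedged sketch of what that could look like (url, opts, and file_name come from the question; the decoding option is the one the answer references):
var restler = require('restler');
var fs = require('fs');

restler.post(url, Object.assign({}, opts, { decoding: 'buffer' }))
  .on('complete', function (data) {
    // data arrives as a Buffer, so write it to disk as-is
    fs.writeFile(file_name, data, function (err) {
      if (err) return console.error(err);
      console.log('done!');
    });
  });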

hitting a multipart url in nodejs

I have client code using the form-data module to hit a URL that returns a content-type of image/jpeg. Below is my code:
var FormData = require('form-data');
var fs = require('fs');
var form = new FormData();
//form.append('POLICE', "hello");
//form.append('PAYSLIP', fs.createReadStream("./Desert.jpg"));
console.log(form);
//https://fbcdn-profile-a.akamaihd.net/hprofile-ak-xfp1/v/t1.0- 1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a4703494c6c1&oe=55ADC7C8&__gda__=1436921313_bf58cbf91270adcd7b29241838f7d01a
form.submit({
  protocol: 'https:',
  host: 'fbcdn-profile-a.akamaihd.net',
  path: '/hprofile-ak-xfp1/v/t1.0-1/c8.0.50.50/p50x50/10934065_1389946604648669_2362155902065290483_n.jpg?oh=13640f19512fc3686063a3494c6c1&oe=55ADCC8&__gda__=1436921313_bf58cbf91270adcd7b2924183',
  method: 'get'
}, function (err, res) {
  var data = "";
  res.on("data", function (chunks) {
    data += chunks;
  });
  res.on("end", function () {
    console.log(data);
    console.log("Response Headers - " + JSON.stringify(res.headers));
  });
});
I'm getting some chunked data, and the response headers I received were:
{"last-modified":"Thu, 12 Feb 2015 09:49:26 GMT","content-type":"image/jpeg","timing-allow-origin":"*","access-control-allow-origin":"*","content-length":"1443","cache-control":"no-transform, max-age=1209600","expires":"Thu, 30 Apr 2015 07:05:31 GMT","date":"Thu, 16 Apr 2015 07:05:31 GMT","connection":"keep-alive"}
I am now stuck on how to turn the response I received into a proper image. I tried base64 decoding, but it seemed to be the wrong approach. Any help will be much appreciated.
I expect that data, once the file has been completely downloaded, contains a Buffer.
If that is the case, you should write the buffer as is, without any decoding, to a file:
fs.writeFile('path/to/file.jpg', data, function onFinished (err) {
  // Handle possible error
})
See fs.writeFile() documentation - you will see that it accepts either a string or a buffer as data input.
Extra awesomeness by using streams
Since the res object is a readable stream, you can simply pipe the data directly to a file without keeping it in memory. This has the added benefit that if you download a really large file, Node.js will not have to keep the whole file in memory (as it does now), but will write it to the filesystem continuously as it arrives.
form.submit({
  // ...
}, function (err, res) {
  // res is a readable stream, so let's pipe it to the filesystem
  var file = fs.createWriteStream('path/to/file.jpg')
  res.pipe(file) // Send the incoming file to the filesystem
  file.on('finish', function writeDone () {
    // File has been fully written to disk
  })
})
The chunk you got is the raw image. Do whatever it is you want with the image, save it to disk, let the user download it, whatever.
So if I understand your question clearly, you want to download a file from an HTTP endpoint and save it to your computer, right? If so, you should look into using the request module instead of using form-data.
Here's a contrived example for downloading things using request:
var fs = require('fs');
var request = require('request')
request('http://www.example.com/picture.jpg')
  .pipe(fs.createWriteStream('picture.jpg'))
Where 'picture.jpg' is the location to save to disk. You can open it up using a normal file browser.
