I'm trying to send an image with axios (Node.js) to an Express server that uses formidable.
This is the axios client script:
const axios = require('axios');
const fs = require('fs');
const FormData = require('form-data');

var img = fs.readFileSync("C:/Users/alessio/Documents/FX/screenshot.png", 'utf8');
console.log(img);

let data = new FormData();
data.append('img', img, "img.png");
console.log(data);

axios.post('http://192.168.43.193:3000/testAPI/upload_img_mt', data, {
  headers: {
    'accept': 'application/json',
    'Accept-Language': 'en-US,en,q=0.8',
    'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
    'timeout': 999999
  },
})
.then(function (response) {
  //console.log(response);
});
And this is the server-side code, with Express and the request handled by formidable:
router.post('/upload_img_mt', function (req, res, next) {
  console.log(req);
  var form = new formidable.IncomingForm();
  form.uploadDir = "fxdiary";
  form.encoding = 'utf8';
  form.on('fileBegin', function (name, file) {
    console.log(form.uploadDir + "/" + file.name);
  });
  form.parse(req, function (err, fields, files) {
    console.log(files);
    console.log(err);
    console.log(fields);
  });
  res.sendStatus(200);
});
The image file is saved, but it is not a valid PNG. The file size is wrong and changes randomly between uploads: for example, the original file is 33 KB but arrives as 900 bytes, 54 KB, or some other value.
What is happening? Where is the problem in this code?
You don't need to pass the boundary in the Content-Type header, as it is added automatically by the HTTP client; setting it yourself may break the default boundary mechanism.
If you still see a wrong file size, try the multer module for file handling on the Node.js side.
I have the code shown below that uploads files from the browser and saves them on the server. Once a file has been saved, I want the server to connect to the Pinata API so the file can also be saved to an IPFS node.
let data = new FormData();
const fileBuffer = Buffer.from(`./public/files/${fileName}`, 'utf-8');
data.append('file', fileBuffer, `${fileName}`);

axios.post('https://api.pinata.cloud/pinning/pinJSONToIPFS', data, {
  headers: {
    'Content-Type': `multipart/form-data; boundary= ${data._boundary}`,
    'pinata_api_key': pinataApiKey,
    'pinata_secret_api_key': pinataSecretApiKey
  }
}).then(function (response) {
  console.log("FILE UPLOADED TO IPFS NODE", fileName);
  console.log(response);
}).catch(function (error) {
  console.log("FILE WASNT UPLOADED TO IPFS NODE", fileName);
  console.log(error);
});
The issue I'm having is that after creating a buffer of my file and wrapping it in a FormData, the Pinata API returns an error:
data: {
  error: 'This API endpoint requires valid JSON, and a JSON content-type'
}
If I convert the data to a string with JSON.stringify(data) and change the Content-Type to application/json, the file buffer uploads successfully, but as a string.
I hope I've explained it well enough to get a solution. Thanks.
It looks like you're attempting to upload a file to the pinJSONToIPFS endpoint, which is intended purely for JSON passed in via a request body.
In your situation I would recommend using Pinata's pinFileToIPFS endpoint.
Here's some example code based on their documentation that may be of help:
// imports needed for this function
const axios = require('axios');
const fs = require('fs');
const FormData = require('form-data');

export const pinFileToIPFS = (pinataApiKey, pinataSecretApiKey) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
  // we gather a local file for this example, but any valid readStream source will work here.
  let data = new FormData();
  data.append('file', fs.createReadStream('./yourfile.png'));
  return axios.post(url, data, {
    maxContentLength: Infinity, // needed to prevent axios from erroring out with large files
    headers: {
      'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
      'pinata_api_key': pinataApiKey,
      'pinata_secret_api_key': pinataSecretApiKey
    }
  }).then(function (response) {
    // handle response here
  }).catch(function (error) {
    // handle error here
  });
};
The proper code to pin any file to IPFS is below.
Apparently, even Pinata support staff didn't know this.
You need to pass an object with a filepath property as the last parameter. The value doesn't matter: it can be a duplicate, the same as others, or unique.
const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";
const fileContents = Buffer.from(bytes);
const data = new FormData();
data.append("file", fileContents, { filepath: "anyname" });

const result = await axios.post(url, data, {
  maxContentLength: -1,
  headers: {
    "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
    "pinata_api_key": userApiKey,
    "pinata_secret_api_key": userApiSecret,
    "path": "somename"
  }
});
Code to upload a file to IPFS using Pinata.
There are two ways to upload files/images to Pinata: the Pinata SDK and the pinFileToIPFS endpoint.
If you are uploading files from Next.js (client side), you cannot convert your image to binary using fs.createReadStream or Buffer.from; those APIs only exist on the Node side. So if you want to upload a file to Pinata from Next.js, you can use this code:
// convert file into binary
const data = new FormData();
data.append("title", file.name);
data.append("file", file);

const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";

// pass binary data into post request
const result = await axios.post(url, data, {
  maxContentLength: -1,
  headers: {
    "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
    pinata_api_key: "your_pinata_key",
    pinata_secret_api_key: "your_pinata_secret",
    path: "somename",
  },
});
console.log("RESULT", result);
This will upload a file to IPFS under the path ipfs://{cid}/images/{fileId}:
const PINATA_BASE_URL = "https://api.pinata.cloud";
const PINATA_PIN_URI = "/pinning/pinFileToIPFS";

const fileExt = file.type.split("/")[1];
let nftId = 1;

// creates a 64-character hex string '0000...0001' to follow the ERC-1155 standard
const paddedId = createPaddedHex(nftId);
const ipfsFileId = `${paddedId}.${fileExt}`;
const ipfsImageFilePath = `/images/${ipfsFileId}`;

const fileUploadData = new FormData();
// this uploads the file and renames the uploaded file to the path created above
fileUploadData.append("file", file, ipfsImageFilePath);
fileUploadData.append(
  "pinataOptions",
  '{"cidVersion": 1, "wrapWithDirectory": true}'
);
fileUploadData.append(
  "pinataMetadata",
  `{"name": "${ipfsImageFilePath}", "keyvalues": {"company": "Pinata"}}`
);

const pinataUploadRes = await axios.post(
  PINATA_BASE_URL + PINATA_PIN_URI,
  fileUploadData,
  {
    headers: {
      Authorization: `Bearer ${PINATA_JWT}`,
    },
  }
);

const ipfsCID = pinataUploadRes.data.IpfsHash;
I want to copy data from one stream to another. In Java I do it as below:
ByteStreams.copy( inputStream, outputStream );
In Node.js I am trying to find out how to do the same:
// Making an ajax call to get the video file
const getVideo = async (req, res) => {
  try {
    axios.get('Video URL')
      .then(function (videoResponse) {
        res.setHeader("Content-Type", "video/mp4");
        // copy the input stream of videoResponse
        // to the output stream of res
      });
  } catch (error) {
    console.log(error);
  }
};
Can anyone suggest how to do that? Thank you for your help.
You need to set the responseType from axios to 'stream', then you can pipe data from the input stream to the response stream using .pipe().
// Making an ajax call to get the video file
const getVideo = async (req, res) => {
  try {
    axios({
      method: 'get',
      url: 'Video URL',
      responseType: 'stream'
    }).then((videoResponse) => {
      res.setHeader("Content-Type", "video/mp4");
      videoResponse.data.pipe(res);
    });
  } catch (err) {
    console.log(err);
  }
};
For more information about the .pipe() function, please read the Node.js docs.
For more information about axios request configuration options, please read the axios docs.
The simplest example, reading from and writing to the file system, would be:
const fs = require('fs')
const input = fs.createReadStream('input_file')
const output = fs.createWriteStream('output_file')
input.pipe(output)
Check the File System docs for Node.js.
The input and output stream can be any ReadStream and WriteStream, like an HTTP response or S3 for example.
From the axios GitHub README, here is an example that looks very similar to what you are trying to do (please use the original; I had to change the URL here):
// GET request for remote image in node.js
axios({
  method: 'get',
  url: 'https://lh3.googleusercontent.com/iXm....',
  responseType: 'stream'
})
.then(function (response) {
  response.data.pipe(fs.createWriteStream('ada_lovelace.jpg'));
});
The example below streams video output. I am assuming you need something similar. Try something like this and modify your code based on it:
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();

app.get('/', function (req, res) {
  res.sendFile(path.join(__dirname + '/index.html'));
});

app.get('/video', function (req, res) {
  const path = 'assets/sample.mp4'; // This can be replaced by axios.get('Video URL')
  const stat = fs.statSync(path);
  const fileSize = stat.size;
  const range = req.headers.range;
  if (range) {
    const parts = range.replace(/bytes=/, "").split("-");
    const start = parseInt(parts[0], 10);
    const end = parts[1] ? parseInt(parts[1], 10) : fileSize - 1;
    const chunksize = (end - start) + 1;
    const file = fs.createReadStream(path, { start, end });
    const head = {
      'Content-Range': `bytes ${start}-${end}/${fileSize}`,
      'Accept-Ranges': 'bytes',
      'Content-Length': chunksize,
      'Content-Type': 'video/mp4',
    };
    res.writeHead(206, head);
    file.pipe(res);
  } else {
    const head = {
      'Content-Length': fileSize,
      'Content-Type': 'video/mp4',
    };
    res.writeHead(200, head);
    fs.createReadStream(path).pipe(res);
  }
});

app.listen(3000, function () {
  console.log('App is running on port 3000');
});
In the above code: at the top, the needed npm packages are included. A GET handler on route '/' serves the HTML file, and a GET handler on '/video' is called from that file. The '/video' handler first determines the file size with fs.statSync, then streams the video to the client in chunks. With every new request from the client, the start and end values change to fetch the next chunk. The 206 status code in the response header signals that only a partial response (a chunk of the video) is being sent.
Credit - Example source
I have a .png file in my Node project folder, and I want to read that file and send it to a remote REST API that accepts form-data and returns the image URL after uploading it to S3.
I have previously used the same API for image upload on the front end in JavaScript. In my JS application I used an input of type file to upload an image, which gave me the image as a File object, and I passed that to the API after adding it to a FormData object like this:
let formData = new FormData();
formData.append("content_file", file)
but when I try to do the same in Node.js, I'm not able to read the file into a File object, so the API rejects the request body.
I'm new to Node.js and not even sure I'm reading the file the right way. Please help!
var express = require('express');
var fs = require('fs');
var path = require('path');
var Request = require("request");
var FormData = require('form-data');

var app = express();

// for reading image from local
app.get('/convertHtml2image1', function (req, res) {
  fs.readFile(`image_path`, (err, data) => {
    if (err) res.status(500).send(err);
    let extensionName = path.extname(`banner.png`);
    let base64Image = new Buffer(data, 'binary').toString('base64');
    let imgSrcString = `data:image/${extensionName.split('.').pop()};base64,${base64Image}`;
    // for converting it to formData
    let formData = new FormData();
    formData.append("content_file", data);
    // for calling remote REST API
    Request.post({
      "headers": { "token": "my_token" },
      "url": "api_url",
      "body": formData
    }, (error, response, body) => {
      if (error) {
        return console.log(error);
      }
      let result = JSON.parse(body);
      res.send("image_url: " + result.url);
    });
  });
});

app.listen(5000);
As far as I understand, you are trying to create a Blob in Node.js. Blob is not defined there, but it is basically an ArrayBuffer plus file information; you can use an external npm package to create one.
Instead of fs.readFile(), use fs.createReadStream() and pass the stream to the request library's formData option together with a filename and content type, so the library builds a proper multipart body; otherwise you are uploading the raw data instead of a file part.
var express = require('express');
var fs = require('fs');
var path = require('path');
var Request = require("request");
var FormData = require('form-data');

var app = express();

app.get('/convertHtml2image', function (req, res) {
  var formData = {
    name: 'content_file',
    content_file: {
      value: fs.createReadStream('banner.png'),
      options: {
        filename: 'banner.png',
        contentType: 'image/png'
      }
    }
  };
  Request.post({
    "headers": {
      "Content-Type": "multipart/form-data",
      "token": my_token
    },
    "url": api_url,
    "formData": formData
  }, (error, response, body) => {
    if (error) {
      return console.log("Error: ", error);
    }
    let result = JSON.parse(body);
    res.send("image_url: " + result.url);
  });
});

app.listen(5000);
I'm trying to upload images to AWS S3 via a signed URL from a Node.js server (not from a browser). The image to upload was generated by Node.js. I get the signed URL from AWS and succeed in uploading to S3.
But my image is corrupted: for some reason, S3 is adding some headers to it (see the attached comparison).
What am I doing wrong?
getting the signed url:
try {
  var params = {
    Bucket: bucketName,
    Key: 'FILE_NAME.png',
    Expires: 60
  };
  const url = await s3.getSignedUrlPromise('putObject', params);
  return url;
} catch (err) {
  throw err;
}
uploading to s3
var stats = fs.statSync(filePath);
var fileSizeInBytes = stats["size"];
const imageBuffer = fs.readFileSync(filePath);

var formData = {
  'file': {
    value: imageBuffer,
    options: {
      filename: 'FILE_NAME.png'
    }
  }
};

request({
  method: 'put',
  url,
  headers: {
    'Content-Length': fileSizeInBytes,
    'Content-MD': md5(imageBuffer)
  },
  formData
}, function (err, res, body) {
  console.log('body', body);
});
Compare between the actual image and the uploaded image to s3. S3 added some headers:
I know this is old, but I struggled with the same issue for a while. When uploading via a pre-signed URL, DO NOT use new FormData().
One thing I noticed: all of my files on S3 were exactly 2 KB larger than the originals.
<input type="file" id="upload"/>
var upload = document.getElementById('upload');
var file = upload.files[0];

// COMMENTED OUT BECAUSE IT WAS CAUSING THE ISSUE
// const formData = new FormData();
// formData.append("file", file);

// Assuming axios
const config = {
  onUploadProgress: function (progressEvent) {
    var percentCompleted = Math.round(
      (progressEvent.loaded * 100) / progressEvent.total
    );
    console.log(percentCompleted);
  },
  headers: {
    'Content-Type': file.type
  }
};

axios.put(S3SignedPutURL, file, config)
  .then(async res => {
    callback({ res, key });
  })
  .catch(err => {
    console.log(err);
  });
I followed the above solution in React.
What I was doing wrong before uploading an image was passing it through URL.createObjectURL and then passing the result to the API body.
if (e.target.files && e.target.files[0]) {
  let img = e.target.files[0];
  setImage(URL.createObjectURL(img)); // wrong: stores a blob URL string instead of the file
}
Correct way:
if (e.target.files && e.target.files[0]) {
  let img = e.target.files[0];
  setImage(img); // store the File object itself
}
Works for me. Thanks, Sam Munroe.
Came here in 2023; I was facing the same problem using FormData, but in Postman, before handing it to the front-end department.
To handle it in Postman, set the type of the request body to binary:
And don't forget to add the proper headers.
Try specifying the content type in the request as Content-Type: multipart/form-data.
I've spent hours trying to find the solution for something which should be quite simple: uploading a file to the server from the client. I am using React.js on the frontend, Express on the backend, and multer for the image uploads.
When I try to upload a file, nothing happens. An uploads/ directory is created, but no file goes there. req.file and req.files are undefined. req.body.file is empty. The form data exists before it is sent.
If I set the Content-Type header to "multipart/form-data" I get a boundary error from multer.
Input
<input
  onChange={this.sendFile}
  name="avatar"
  placeholder="Choose avatar"
  type="file"
/>
sendFile
sendFile = e => {
  const data = new FormData();
  const file = e.target.files[0];
  data.append("file", file);
  this.props.sendFile(data);
};
Redux action
export default file => async dispatch => {
  const res = await axios.post("/api/upload/", { file });
};
Express
const multer = require("multer");
const upload = multer({ dest: "uploads/" });
router.post("/upload/", upload.single("avatar"), (req, res) => {
  return res.sendStatus(200);
});
I tried to reproduce it and made it work with this method:
sendFile = e => {
  const data = new FormData();
  const file = e.target.files[0];
  data.append("avatar", file); // <-- use "avatar" instead of "file" here
  axios({
    method: 'post',
    url: 'http://localhost:9000/api/upload',
    data: data,
    // note: axios has no "config" option; headers go at the top level
    headers: { 'Content-Type': 'multipart/form-data' }
  });
};
Try to set the content-type header to multipart/form-data in the axios request and send the full FormData object as the second parameter.
Like this:
const config = {
  headers: {
    'content-type': 'multipart/form-data'
  }
};
axios.post('/api/upload/', file, config); // here `file` holds the FormData built in sendFile