I want to copy data from one stream to another. In Java I do it like this:
ByteStreams.copy( inputStream, outputStream );
In Node.js I am trying to find out how to do the same:
// Making an ajax call to get the video file
const getVideo = async (req, res) => {
  try {
    axios.get('Video URL')
      .then(function (videoResponse) {
        res.setHeader("Content-Type", "video/mp4");
        // copy the input stream of videoResponse
        // to the output stream of res
      })
  } catch (error) {
    console.log(error);
  }
};
Can anyone suggest how to do that? Thank you for your help.
You need to set the axios responseType to 'stream'; then you can pipe data from the input stream to the response stream using .pipe().
// Making an ajax call to get the video file
const getVideo = async (req, res) => {
  try {
    // awaiting the request lets the surrounding try/catch actually catch errors
    const videoResponse = await axios({
      method: 'get',
      url: 'Video URL',
      responseType: 'stream'
    });
    res.setHeader("Content-Type", "video/mp4");
    videoResponse.data.pipe(res);
  } catch (err) {
    console.log(err);
  }
};
For more information about the .pipe() function, please read the Node.js docs.
For more information about axios request configuration options, please read the axios docs.
The simplest example of reading from and writing to the file system would be:
const fs = require('fs')
const input = fs.createReadStream('input_file')
const output = fs.createWriteStream('output_file')
input.pipe(output)
Check the File System docs for Node.js.
The input and output streams can be any readable and writable streams, such as an HTTP response or an S3 object stream, for example.
From the Axios GitHub README you have this example, which looks very similar to what you are trying to do (please use the original one; I had to change the URL here):
// GET request for remote image in node.js
axios({
  method: 'get',
  url: 'https://lh3.googleusercontent.com/iXm....',
  responseType: 'stream'
})
  .then(function (response) {
    response.data.pipe(fs.createWriteStream('ada_lovelace.jpg'))
  });
The example below streams video output. I am assuming you need something similar. Try something like this and modify your code based on the example:
const express = require('express')
const fs = require('fs')
const path = require('path')
const app = express()

app.get('/', function(req, res) {
  res.sendFile(path.join(__dirname + '/index.html'))
})

app.get('/video', function(req, res) {
  const path = 'assets/sample.mp4' // This can be replaced by axios.get('Video URL')
  const stat = fs.statSync(path)
  const fileSize = stat.size
  const range = req.headers.range
  if (range) {
    const parts = range.replace(/bytes=/, "").split("-")
    const start = parseInt(parts[0], 10)
    const end = parts[1]
      ? parseInt(parts[1], 10)
      : fileSize - 1
    const chunksize = (end - start) + 1
    const file = fs.createReadStream(path, {start, end})
    const head = {
      'Content-Range': `bytes ${start}-${end}/${fileSize}`,
      'Accept-Ranges': 'bytes',
      'Content-Length': chunksize,
      'Content-Type': 'video/mp4',
    }
    res.writeHead(206, head)
    file.pipe(res)
  } else {
    const head = {
      'Content-Length': fileSize,
      'Content-Type': 'video/mp4',
    }
    res.writeHead(200, head)
    fs.createReadStream(path).pipe(res)
  }
})

app.listen(3000, function () {
  console.log('App is running on port 3000')
})
In the above code:
At the top I have included the needed NPM packages. After that we have a GET handler on route '/' that serves the HTML file. Then there is a GET handler on route '/video', which is called from the HTML file. In this handler, the file size is first detected with the statSync method of fs. After that, the video is streamed to the client in chunks. With every new request from the client, the values of start and end change to fetch the next chunk of the video. The 206 status code in the response header indicates that only a newly made stream (a chunk of the video) is being sent.
Credit - Example source
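The start/end arithmetic is the core of the handler; to make it easier to follow (and test), here is the same logic pulled into a standalone pure function. The function name and shape are mine, not from the original post:

```javascript
// Parse an HTTP Range header like "bytes=0-1023" into absolute byte offsets,
// mirroring the start/end/chunksize logic in the /video handler above.
function parseRange(rangeHeader, fileSize) {
  const parts = rangeHeader.replace(/bytes=/, '').split('-');
  const start = parseInt(parts[0], 10);
  // an open-ended range like "bytes=4096-" means "from 4096 to the end of file"
  const end = parts[1] ? parseInt(parts[1], 10) : fileSize - 1;
  return { start, end, chunksize: end - start + 1 };
}

console.log(parseRange('bytes=0-1023', 5000)); // { start: 0, end: 1023, chunksize: 1024 }
console.log(parseRange('bytes=4096-', 5000));  // { start: 4096, end: 4999, chunksize: 904 }
```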
Edit
No solutions worked for me, so I ended up doing the following:
// in handler
return reply.sendFile('my_script.sql', 'scripts');
// on the client
const { data } = await axios({
  url: '/api/generate',
  method: 'GET',
  responseType: 'blob'
})
const url = window.URL.createObjectURL(new Blob([data]));
const link = document.createElement('a');
link.href = url;
link.setAttribute('download', 'my_script.sql');
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
This works like a charm.
Original post:
I have a simple endpoint as follows:
fastify.get('/generate', async (request, reply) => {
  console.log('endpoint hit'); // this line is printed
  const filePath = path.join(__dirname, '..', '..', 'scripts', 'my_script.sql');
  reply.sendFile(filePath, 'scripts');
})
This just sends the contents of the SQL file as plain text.
I know that express has a res.download method, but I couldn't find an analog with fastify.
What am I doing wrong here? How can I download this file?
You can use fastify-static, or send the right headers yourself:
fastify.get('/', (request, reply) => {
  const filePath = require('path').join(
    __dirname,
    '../../scripts/my_script.sql'
  )
  const stream = require('fs').createReadStream(filePath)
  reply.header(
    'Content-Disposition',
    'attachment; filename=foo.sql'
  )
  reply.type('application/sql').code(200).send(stream)
})
I have the code shown below, which uploads files from the browser and saves them on the server. Once a file has been saved to the server, I want the server to connect to the Pinata API so the file can also be saved to an IPFS node.
let data = new FormData();
const fileBuffer = Buffer.from(`./public/files/${fileName}`, 'utf-8');
data.append('file', fileBuffer, `${fileName}`);
axios.post('https://api.pinata.cloud/pinning/pinJSONToIPFS',
  data,
  {
    headers: {
      'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
      'pinata_api_key': pinataApiKey,
      'pinata_secret_api_key': pinataSecretApiKey
    }
  }
).then(function (response) {
  console.log("FILE UPLOADED TO IPFS NODE", fileName);
  console.log(response);
}).catch(function (error) {
  console.log("FILE WASNT UPLOADED TO IPFS NODE", fileName);
  console.log(error);
});
The issue I'm having is that after creating a buffer of my file and wrapping it in a FormData, the Pinata API returns an error:
data: {
  error: 'This API endpoint requires valid JSON, and a JSON content-type'
}
If I convert the data to a string with JSON.stringify(data) and change the content type to application/json, the file buffer is uploaded successfully, but as a string.
I hope I explained it well enough to get a solution. Thanks.
It looks like you're attempting to upload a file to the pinJSONToIPFS endpoint, which is intended to purely be used for JSON that is passed in via a request body.
In your situation I would recommend using Pinata's pinFileToIPFS endpoint.
Here's some example code based on their documentation that may be of help:
// imports needed for this function
const axios = require('axios');
const fs = require('fs');
const FormData = require('form-data');

export const pinFileToIPFS = (pinataApiKey, pinataSecretApiKey) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
  // we gather a local file for this example, but any valid readStream source will work here
  let data = new FormData();
  data.append('file', fs.createReadStream('./yourfile.png'));
  return axios.post(url,
    data,
    {
      maxContentLength: Infinity, // needed to prevent axios from erroring out with large files
      headers: {
        'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
        'pinata_api_key': pinataApiKey,
        'pinata_secret_api_key': pinataSecretApiKey
      }
    }
  ).then(function (response) {
    // handle response here
  }).catch(function (error) {
    // handle error here
  });
};
The proper code to pin any file to IPFS is below.
Apparently, even Pinata support staff didn't know this.
You need to pass an object with a filepath property as the last parameter to append. The name doesn't matter; it can be a duplicate, the same as others, or unique.
const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";
const fileContents = Buffer.from(bytes);
const data = new FormData();
data.append("file", fileContents, {filepath: "anyname"});
const result = await axios
  .post(url, data, {
    maxContentLength: -1,
    headers: {
      "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
      "pinata_api_key": userApiKey,
      "pinata_secret_api_key": userApiSecret,
      "path": "somename"
    }
  });
Code to upload a file to IPFS using Pinata.
There are two methods available to upload files/images to Pinata: one is the Pinata SDK and the second is the pinFileToIPFS endpoint.
If you are uploading files from Next.js (client side), you cannot convert your image into binary using fs.createReadStream or Buffer.from, because those APIs are only available on the Node side. So if you want to upload a file from Next.js to Pinata, you can use this code:
// put the file into form data
const data = new FormData();
data.append("title", file.name);
data.append("file", file);
const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";
// pass the form data in a post request
const result = await axios.post(url, data, {
  maxContentLength: -1,
  headers: {
    "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
    pinata_api_key: "your_pinata_key",
    pinata_secret_api_key: "your_pinata_secret",
    path: "somename",
  },
});
console.log("RESULT", result);
This will upload a file to IPFS under the path ipfs://{cid}/images/{fileId}:
const PINATA_BASE_URL = "https://api.pinata.cloud";
const PINATA_PIN_URI = "/pinning/pinFileToIPFS";

const fileExt = file.type.split("/")[1];
let nftId = 1;

// creates a 64-character hex string '0000...0001' to follow the ERC-1155 standard
const paddedId = createPaddedHex(nftId);
const ipfsFileId = `${paddedId}.${fileExt}`;
const ipfsImageFilePath = `/images/${ipfsFileId}`;

const fileUploadData = new FormData();
// this uploads the file and renames the uploaded file to the path created above
fileUploadData.append("file", file, ipfsImageFilePath);
fileUploadData.append(
  "pinataOptions",
  '{"cidVersion": 1, "wrapWithDirectory": true}'
);
fileUploadData.append(
  "pinataMetadata",
  `{"name": "${ipfsImageFilePath}", "keyvalues": {"company": "Pinata"}}`
);

const pinataUploadRes = await axios.post(
  PINATA_BASE_URL + PINATA_PIN_URI,
  fileUploadData,
  {
    headers: {
      Authorization: `Bearer ${PINATA_JWT}`,
    },
  }
);

const ipfsCID = pinataUploadRes.data.IpfsHash;
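createPaddedHex isn't shown in the snippet above; a plausible implementation (my own sketch, assuming the usual ERC-1155 convention of a 64-character, zero-padded, lowercase hex token id) would be:

```javascript
// Hypothetical sketch of the createPaddedHex helper referenced above:
// ERC-1155 metadata URIs conventionally use the token id as a 64-character,
// zero-padded, lowercase hex string.
function createPaddedHex(id) {
  return id.toString(16).padStart(64, '0');
}

console.log(createPaddedHex(1)); // 63 zeros followed by "1"
```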
I'm trying to send an image with axios (Node.js) to an Express server that uses formidable.
This is the code of axios script:
const axios = require('axios');
const fs = require('fs');
const FormData = require('form-data');

var img = fs.readFileSync("C:/Users/alessio/Documents/FX/screenshot.png", 'utf8');
console.log(img);

let data = new FormData();
data.append('img', img, "img.png");
console.log(data);

axios.post('http://192.168.43.193:3000/testAPI/upload_img_mt', data, {
  headers: {
    'accept': 'application/json',
    'Accept-Language': 'en-US,en,q=0.8',
    'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
    'timeout': 999999
  },
})
  .then(function (response) {
    //console.log(response);
  })
And this is the server-side code with Express, with the request handled by formidable:
router.post('/upload_img_mt', function(req, res, next) {
  console.log(req)
  var form = new formidable.IncomingForm();
  form.uploadDir = "fxdiary";
  form.encoding = 'utf8';
  form.on('fileBegin', function(name, file) {
    console.log(form.uploadDir + "/" + file.name);
  });
  form.parse(req, function(err, fields, files) {
    console.log(files);
    console.log(err);
    console.log(fields);
  });
  res.sendStatus(200);
});
The image file is saved, but it is not a correct PNG. The size of the image is wrong and sometimes changes randomly. For example, the original file size is 33 KB, but the saved file becomes 900 bytes, 54 KB, or some other value.
What is happening? Where is the problem in this code?
You don't need to pass the boundary in the Content-Type header, as it is added automatically; you might be breaking the default boundary mechanism.
If you still face the file size issue, then try the multer module for file handling on the Node.js side.
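For context on why the boundary matters: a multipart/form-data body embeds the boundary string between its parts, and the Content-Type header must advertise exactly that string, so hard-coding a mismatched boundary corrupts parsing. A minimal hand-rolled sketch (illustrative only, not production code; field values are invented):

```javascript
// Build a minimal multipart/form-data body by hand, to show how the boundary
// advertised in the Content-Type header must match the separators in the body.
function buildMultipartBody(boundary, fieldName, filename, content) {
  return [
    `--${boundary}`,
    `Content-Disposition: form-data; name="${fieldName}"; filename="${filename}"`,
    'Content-Type: application/octet-stream',
    '',
    content,
    `--${boundary}--`,
    '' // multipart bodies end with a trailing CRLF
  ].join('\r\n');
}

const body = buildMultipartBody('XyZ123', 'img', 'img.png', 'rawbytes');
console.log(body.startsWith('--XyZ123\r\n')); // true
```

In practice the form-data module generates the boundary and the matching header for you, which is why hand-setting the header is unnecessary.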
I would like to get upload progress information during an HTTP POST request using Node.js http.
My code is simple and it works, but I am only notified when there is an error or when the upload is completed.
I need to show a progress bar to the user, and to do that I need to check the progress every second or so.
Basically, I need the upload speed and how much of the file has been uploaded at a given time.
const http = require('http');
const fs = require('fs');
const FormData = require('form-data');

function myUploadFunction(authToken, uploadPath, file) {
  const form = new FormData();
  form.append('file', fs.createReadStream(file.path));
  form.append('filename', file.name);
  form.append('parent_dir', '/');
  const requestHeaders = {
    Accept: '*/*',
    Authorization: `Token ${authToken}`,
  };
  const options = {
    host: 'myHost',
    port: 'myPort',
    path: uploadPath,
    method: 'POST',
    headers: requestHeaders,
  };
  form.submit(options, (error, response) => {
    response.setEncoding('utf8');
    response.on('data', (chunk) => {
      // this runs only when the upload is completed; the server just sends some
      // info about the file, and only sends it in chunks if that info is large
      logger.debug(`BODY: ${chunk}`);
    });
    response.on('end', () => {
      logger.debug('No more data in response.');
    });
  });
}
When I upload a file, nothing happens until the upload is completed; then I get the response, and it executes the 'data' handler and after that the 'end' handler.
You just need to add another listener to handle the progress:
form.on('progress', (bytesReceived, bytesExpected) => {
  logger.warn('progress bytesReceived: ', bytesReceived);
  logger.warn('progress bytesExpected: ', bytesExpected);
});
I have used the Winston module to create a daily log file for my offline app. I now need to be able to send or upload that file to a remote server via POST (that part already exists).
I know I need to write the file in chunks so it doesn't hog memory, so I'm using fs.createReadStream; however, I seem to only get a 503 response, even when sending just sample text.
EDIT
I worked out that the receiver was expecting the data to be named 'data'. I have removed the createReadStream, as I could only get it to work with 'application/x-www-form-urlencoded' and a synchronous fs.readFileSync. If I change this to 'multipart/form-data' on the PHP server, would I be able to use createReadStream again, or is that only if I switch to physically uploading the JSON file?
I've only been learning Node for the past couple of weeks, so any pointers would be gratefully received.
var http = require('http'),
  fs = require('fs');

var post_options = {
  host: 'logger.mysite.co.uk',
  path: '/',
  port: 80,
  timeout: 120000,
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded'
  }
}

var sender = http.request(post_options, function(res) {
  if (res.statusCode < 399) {
    var text = ""
    res.on('data', function(chunk) {
      text += chunk
    })
    res.on('end', function(data) {
      console.log(text)
    })
  } else {
    console.log("ERROR", res.statusCode)
  }
})

var POST_DATA = 'data={['
POST_DATA += fs.readFileSync('./path/file.log').toString().replace(/\,+$/, '')
POST_DATA += ']}'

console.log(POST_DATA)
sender.write(POST_DATA)
sender.end()
After a gazillion trials and failures, this worked for me: using FormData with node-fetch. Oh, and request was deprecated two days ago, by the way.
const FormData = require('form-data');
const fetch = require('node-fetch');

function uploadImage(imageBuffer) {
  const form = new FormData();
  form.append('file', imageBuffer, {
    contentType: 'image/jpeg',
    filename: 'dummy.jpg',
  });
  return fetch('https://myserver.cz/upload', { method: 'POST', body: form });
}
In place of imageBuffer there can be numerous things. I had a buffer containing the image, but you can also pass the result of fs.createReadStream('/foo/bar.jpg') to upload a file from disk.
copied from https://github.com/mikeal/request#forms
var fs = require('fs'),
  path = require('path'),
  request = require('request');

var r = request.post('http://service.com/upload', function optionalCallback(err, httpResponse, body) {
  if (err) {
    return console.error('upload failed:', err);
  }
  console.log('Upload successful! Server responded with:', body);
})

var form = r.form()
form.append('my_field1', 'my_value23_321')
form.append('my_field2', '123123sdas')
form.append('my_file', fs.createReadStream(path.join(__dirname, 'doodle.png')))
Have a look at the request module.
It gives you the ability to stream a file in POST requests.