Node.js can't open PDF file after download

I am using NestJS. I have converted an HTML file and saved it as a PDF on my localhost; when I open the PDF file from its save location it is fine,
but when I download it through the API I am unable to open the file.
My API controller reports that the file downloaded successfully.
async exportPDF(@Res({ passthrough: true }) res: Response, @Body() dto: ExportReadingsPDFDto) {
  const stream = await this.metersService.exportPDF(dto);
  const filename = stream.replace(/^.*[\\\/]/, "");
  const fileReadStream = fs.createReadStream(stream, { encoding: "base64" });
  const stat = fs.statSync(stream);
  res.set({
    "Content-Type": "application/pdf",
    "Content-Length": stat.size,
    "Content-Disposition": 'attachment; filename="' + filename + '"'
  });
  fileReadStream.pipe(res);
}
Please help; I couldn't find any other example of creating PDF files and sending them to the user.

You can simply create the PDF file on the server, then use this piece of code to send it back to the user as a download:
const { createReadStream } = require('fs');

const filename = '123.pdf';
res.setHeader('Content-Type', 'application/pdf');
res.setHeader('Content-Disposition', 'attachment; filename=' + filename);
const filestream = createReadStream('files/' + filename);
filestream.pipe(res);
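In case it helps future readers, here is a minimal sketch of the same controller using Nest's StreamableFile helper (available in @nestjs/common v8+). The route path, class names and import paths are assumptions, not from the question; also note that the question's { encoding: "base64" } option makes the stream emit base64 text, which no longer matches the binary Content-Length that was set.

// A minimal sketch, not the original controller: stream the raw bytes so the
// body matches the Content-Length and Content-Type headers.
import { Body, Controller, Post, Res, StreamableFile } from "@nestjs/common";
import { Response } from "express";
import * as fs from "fs";
import * as path from "path";
// These two import paths are assumed from the question's project layout.
import { MetersService } from "./meters.service";
import { ExportReadingsPDFDto } from "./dto/export-readings-pdf.dto";

@Controller("meters")
export class MetersController {
  constructor(private readonly metersService: MetersService) {}

  @Post("export-pdf")
  async exportPDF(@Res({ passthrough: true }) res: Response, @Body() dto: ExportReadingsPDFDto): Promise<StreamableFile> {
    const filePath = await this.metersService.exportPDF(dto);
    const filename = path.basename(filePath);
    res.set({
      "Content-Type": "application/pdf",
      "Content-Disposition": `attachment; filename="${filename}"`,
    });
    // No encoding option here: createReadStream must emit raw bytes, and
    // StreamableFile lets Nest handle piping and stream errors.
    return new StreamableFile(fs.createReadStream(filePath));
  }
}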

Related

IPFS Pinata service not accepting file

I have code, shown below, that uploads files from the browser and saves them on the server. Once a file has been saved to the server, I want the server to connect to the Pinata API so the file can also be saved to an IPFS node.
let data = new FormData();
const fileBuffer = Buffer.from(`./public/files/${fileName}`, 'utf-8');
data.append('file', fileBuffer, `${fileName}`);

axios.post('https://api.pinata.cloud/pinning/pinJSONToIPFS',
  data,
  {
    headers: {
      'Content-Type': `multipart/form-data; boundary= ${data._boundary}`,
      'pinata_api_key': pinataApiKey,
      'pinata_secret_api_key': pinataSecretApiKey
    }
  }
).then(function (response) {
  console.log("FILE UPLOADED TO IPFS NODE", fileName);
  console.log(response);
}).catch(function (error) {
  console.log("FILE WASNT UPLOADED TO IPFS NODE", fileName);
  console.log(error);
});
The issue I'm having is that after creating a buffer of my file and wrapping it in a FormData, the Pinata API returns an error:
data: {
  error: 'This API endpoint requires valid JSON, and a JSON content-type'
}
If I convert the data to a string with JSON.stringify(data) and change the content type to application/json, the file buffer is uploaded successfully, but only as a string.
I hope I explained it well enough to get a solution. Thanks.
It looks like you're attempting to upload a file to the pinJSONToIPFS endpoint, which is intended purely for JSON passed in via a request body.
In your situation I would recommend using Pinata's pinFileToIPFS endpoint instead.
Here's some example code based on their documentation that may be of help:
//imports needed for this function
const axios = require('axios');
const fs = require('fs');
const FormData = require('form-data');

export const pinFileToIPFS = (pinataApiKey, pinataSecretApiKey) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;

  //we gather a local file for this example, but any valid readStream source will work here.
  let data = new FormData();
  data.append('file', fs.createReadStream('./yourfile.png'));

  return axios.post(url,
    data,
    {
      maxContentLength: 'Infinity', //this is needed to prevent axios from erroring out with large files
      headers: {
        'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
        'pinata_api_key': pinataApiKey,
        'pinata_secret_api_key': pinataSecretApiKey
      }
    }
  ).then(function (response) {
    //handle response here
  }).catch(function (error) {
    //handle error here
  });
};
The proper code to pin any file to IPFS is below.
Apparently, even Pinata support staff didn't know this.
You need to pass an object with a filepath property as the last argument to data.append. The value you give it doesn't matter: it can be a duplicate, the same as other files', or unique.
const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";
const fileContents = Buffer.from(bytes);
const data = new FormData();
data.append("file", fileContents, { filepath: "anyname" });

const result = await axios.post(url, data, {
  maxContentLength: -1,
  headers: {
    "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
    "pinata_api_key": userApiKey,
    "pinata_secret_api_key": userApiSecret,
    "path": "somename"
  }
});
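As a small aside (not from the original answer): instead of reaching into the private data._boundary field, the form-data package exposes getHeaders(), which fills in the multipart boundary for you, so the request options could also be written like this:

const result = await axios.post(url, data, {
  maxContentLength: Infinity, // same purpose as -1: don't cap large uploads
  headers: {
    // getHeaders() returns { 'content-type': 'multipart/form-data; boundary=...' }
    ...data.getHeaders(),
    pinata_api_key: userApiKey,
    pinata_secret_api_key: userApiSecret
  }
});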
Code to upload a file to IPFS using Pinata.
There are two methods available for uploading files/images to Pinata: one is the Pinata SDK and the second is the pinFileToIPFS endpoint.
If you are uploading files from Next.js, you cannot convert your image into binary using fs.createReadStream or Buffer.from, because those APIs are only available on the Node side. So if you want to upload a file to Pinata from Next.js, you can use this code:
// convert file into binary
const data = new FormData();
data.append("title", file.name);
data.append("file", file);

const url = "https://api.pinata.cloud/pinning/pinFileToIPFS";

// pass binary data into post request
const result = await axios.post(url, data, {
  maxContentLength: -1,
  headers: {
    "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
    pinata_api_key: "your_pinata_key",
    pinata_secret_api_key: "your_pinata_secret",
    path: "somename",
  },
});
console.log("RESULT", result);
This will upload a file to IPFS under the path ipfs://{cid}/images/{fileId}:
const PINATA_BASE_URL = "https://api.pinata.cloud";
const PINATA_PIN_URI = "/pinning/pinFileToIPFS";

const fileExt = file.type.split("/")[1];
let nftId = 1;

// creates a 64-character hex string '0000...0001' to follow the ERC-1155 standard
const paddedId = createPaddedHex(nftId);
const ipfsFileId = `${paddedId}.${fileExt}`;
const ipfsImageFilePath = `/images/${ipfsFileId}`;

const fileUploadData = new FormData();
// this uploads the file and renames the uploaded file to the path created above
fileUploadData.append("file", file, ipfsImageFilePath);
fileUploadData.append(
  "pinataOptions",
  '{"cidVersion": 1, "wrapWithDirectory": true}'
);
fileUploadData.append(
  "pinataMetadata",
  `{"name": "${ipfsImageFilePath}", "keyvalues": {"company": "Pinata"}}`
);

const pinataUploadRes = await axios.post(
  PINATA_BASE_URL + PINATA_PIN_URI,
  fileUploadData,
  {
    headers: {
      Authorization: `Bearer ${PINATA_JWT}`,
    },
  }
);
const ipfsCID = pinataUploadRes.data.IpfsHash;
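Since wrapWithDirectory is enabled above, the returned hash is the CID of the wrapping directory, so the full URI for the uploaded image can be assembled from the pieces already defined (a small illustrative addition, not from the original answer):

// e.g. ipfs://{cid}/images/0000...0001.png
const ipfsImageUri = `ipfs://${ipfsCID}${ipfsImageFilePath}`;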

Uploading PDF to Firebase Storage via Node.js, corrupted file

I'm building an admin panel for my Firebase app with Node.js.
Now I want to upload a PDF file to Firebase/Google Cloud Storage.
My JavaScript client, which uploads those files from the computer, looks like this:
$("#upload-pdf").on("change", function(e) {
file = e.target.files[0];
document.getElementById("pdfName").value = file.name;
if (file) {
var reader = new FileReader();
reader.readAsDataURL(e.target.files[0]);
reader.onload = function (event) {
var fileContent = reader.result;
$('#pdfPreview').attr('src', event.target.result);
document.getElementById("pdfContent").value = fileContent;
}
}
});
The content of the file (the value of pdfContent) is then sent to my Node.js server via AJAX:
...
$.ajax({
  url: '/editMenucard',
  type: 'POST',
  data: JSON.stringify(output),
  contentType: 'application/json',
  success: function(data) {
    window.location.assign('/restaurant/overview');
  }
  // ...
}).catch(function(error) {
  // Handle error
});
On my Node.js server I have implemented the following:
var bucket = admin.storage().bucket('myBucketName');

const contents = data.PDF.PDFContent;
mimeType = contents.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)[1];
fileName = data.PDF.PDFName + '-original.' + mimeTypes.detectExtension(mimeType);
base64EncodedPDFString = contents.replace("data:application/pdf;base64,", '');
PDFBuffer = Buffer.from(base64EncodedPDFString, 'base64');

const file = bucket.file('restaurant-menucards/' + fileName);
console.log(mimeType);

file.save(PDFBuffer, {
  contentType: mimeType + ";charset=utf-8",
  gzip: false,
  metadata: { contentType: mimeType + ";charset=utf-8" },
  public: true,
  validation: 'md5'
}, function (error) {
  if (!error) {
    console.log("PDF successfully written");
  }
});
The file is successfully uploaded to my Firebase storage, but unfortunately, when I want to open it (or embed it in my HTML view via a signed URL I created for it), it doesn't work.
This error appears:
Fehler
Fehler beim Laden des PDF-Dokuments.
In English it means something like "Error: failed to load PDF document".
So I think my PDF file is corrupted. I also tested it in Safari, and the PDF didn't appear either.
Is there anything I'm doing wrong with my upload?
I'm using, as you can see, the Node.js method save from the file object.
Does anyone have suggestions for me?
I've tried different browsers and it doesn't work.
The base64 string itself is not corrupted: I tried it in an online base64 converter and the PDF displayed correctly.
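One cheap sanity check (purely a debugging sketch, not part of the original code) is to verify the PDF signature right after decoding, before handing the buffer to bucket.file(...).save():

// A valid PDF starts with the ASCII signature "%PDF-".
const header = PDFBuffer.subarray(0, 5).toString("ascii");
if (header !== "%PDF-") {
  console.warn("Decoded buffer does not look like a PDF; header is:", header);
}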

How to send a zip file from server to client + Meteor

I have the following code on the server side to download a zip file (destPath). The zip file gets downloaded on the client, but when we try to open it, it shows invalid zip content.
this.route('download', {
  where: 'server',
  path: '/download/:_id',
  action: function() {
    destPath = "/home/rootuser/botbuilder/botBuilderdevelo/Python.zip";
    if (fs.existsSync(destPath)) {
      filetext = fs.readFileSync(destPath, "utf-8"); // tried encoding binary
    }
    var headers = {
      'Content-Type': 'application/octet-stream', // tried application/zip
      'Content-Disposition': "attachment; filename=" + "pyth" + '.zip'
    };
    this.response.writeHead(200, headers);
    return this.response.end(filetext);
    // i tried this.response.end(destpath);
  }
});
Specifying the encoding when calling res.end() solved the problem:
this.response.end(filetext, "binary");
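An alternative that avoids string encodings altogether (a sketch, not part of the answer above) is to read the zip as a Buffer inside the same route action and send that directly:

// fs.readFileSync without an encoding returns a Buffer,
// which response.end() sends without any re-encoding.
var zipBuffer = fs.readFileSync(destPath);
this.response.writeHead(200, headers);
return this.response.end(zipBuffer);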

File download giving corrupt file in Node.js

I am using the request module in Node.js to read data from AWS S3. When I download a file (docx, image, or PDF) using the code below, it gives me an invalid/corrupted file. But when I download a .txt file it doesn't get corrupted and I am able to view it in Notepad.
I did a bit of googling and, as suggested, also tried setting the encoding to binary, but it still doesn't give the required result.
File upload is working fine, and I am able to see the uploaded file in the AWS console.
File download code
var s3 = new properties.AWS.S3();
var params = { Bucket: properties.AWS_BUCKET, Key: req.headers['x-org'] + "/" + "talk" + "/" + req.body.fileName };

s3.getSignedUrl('getObject', params, function (err, URL) {
  if (err) {
    console.log("Error inside the S3");
    console.log(err, err.stack); // an error occurred
    res.send(null);
  } else {
    console.log("After getObject:-" + URL);
    request({
      url: URL, //URL to hit
      method: 'GET',
      encoding: 'binary'
    }, function (error, response, body) {
      if (error) {
        console.log(error);
      } else {
        //console.log(response.statusCode, body);
        res.set('content-disposition', 'attachment; filename=' + req.body.fileName);
        res.send(body);
      }
    });
  }
});
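With the request module, another commonly used option (a sketch that assumes the surrounding handler stays as in the question) is encoding: null, which makes request hand the body back as a Buffer instead of decoding it into a string:

request({
  url: URL,
  method: 'GET',
  encoding: null // null (not 'binary') returns the body as a Buffer
}, function (error, response, body) {
  if (error) {
    console.log(error);
  } else {
    res.set('content-disposition', 'attachment; filename=' + req.body.fileName);
    res.send(body); // body is a Buffer, so Express sends the raw bytes
  }
});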
Update:
I have narrowed down the error: I tried just sending a file read from the local file system, and even that gives corrupted files on the client.
Here's the code for that:
var filePath = path.join(__dirname, '..', '..', '..', 'downloads', req.body.fileURL);
var stat = fs.statSync(filePath);
var filename = path.basename(filePath);
var mimetype = mime.lookup(filePath);
console.log("mimetype=" + mimetype);
res.setHeader('Content-disposition', 'attachment; filename=' + filename);
res.setHeader('Content-type', mimetype + ";charset=UTF-8");
res.setHeader('Content-Length', stat.size);
var filestream = fs.createReadStream(filePath);
filestream.pipe(res);
Finally I was able to solve the problem.
I got the solution hint from this blog: https://templth.wordpress.com/2014/11/21/handle-downloads-with-angular/.
Per this blog:
When testing with binary content like zip files or images, we see that the downloaded content is corrupted. This is due to the fact that Angular automatically applies transformation on the received data. When handling binary contents, we want to get them as array buffer.
Final working code:
var filePath = path.join(__dirname, '..', '..', '..', 'downloads', req.body.fileURL);
var file = fs.createWriteStream(filePath);

s3.getObject(params)
  .on('httpData', function(chunk) {
    //console.log("inside httpData");
    file.write(chunk);
  })
  .on('httpDone', function() {
    console.log("inside httpDone");
    file.end();
    //file.pipe(res);
  })
  .send(function() {
    console.log("inside send");
    res.setHeader('Content-disposition', 'attachment; filename=' + filePath);
    res.setHeader('Content-type', mimetype);
    res.setHeader('Transfer-Encoding', 'chunked');
    var filestream = fs.createReadStream(filePath);
    filestream.pipe(res);
  });
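If writing to a temporary file first isn't required, the AWS SDK v2 request object can also be piped straight to the response (a sketch, reusing the same params and mimetype as above):

// Stream the S3 object directly to the client without a temp file.
res.setHeader('Content-disposition', 'attachment; filename=' + req.body.fileName);
res.setHeader('Content-type', mimetype);
s3.getObject(params)
  .createReadStream()
  .on('error', function (err) {
    console.log(err);
    if (!res.headersSent) res.sendStatus(500);
  })
  .pipe(res);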

Express.js - how to download base64 string as PDF file?

I have a PDF file encoded as a base64 string. How can I send this string to the browser so it downloads as a file in .pdf format?
What I have already tried:
res.set('Content-Disposition', 'attachment; filename="filename.pdf"');
res.set('Content-Type', 'application/pdf');
res.write(fileBase64String, 'base64');
I ended up decoding the PDF first and then sending it to the browser as binary, as follows
(for simplicity I use node-http here, but the functions are available in Express as well):
const http = require('http');

http
  .createServer(function(req, res) {
    getEncodedPDF(function(encodedPDF) {
      res.writeHead(200, {
        'Content-Type': 'application/pdf',
        'Content-Disposition': 'attachment; filename="filename.pdf"'
      });
      const download = Buffer.from(encodedPDF.toString('utf-8'), 'base64');
      res.end(download);
    });
  })
  .listen(1337);
What drove me nuts here was testing with Postman:
I was using the Send button instead of the Send and Download button to submit the request.
Using the plain Send button for this request causes the PDF file to become corrupted after saving.
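The same idea in plain Express (a sketch; fileBase64String stands in for however you obtain the encoded PDF):

app.get('/pdf', (req, res) => {
  // Decode once so Content-Length matches the bytes actually sent.
  const pdfBuffer = Buffer.from(fileBase64String, 'base64');
  res.set({
    'Content-Type': 'application/pdf',
    'Content-Disposition': 'attachment; filename="filename.pdf"',
    'Content-Length': pdfBuffer.length
  });
  res.end(pdfBuffer);
});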
Just a reference for Express. This answer is based on ofhouse's answer.
This solution downloads a png file. I was missing the "Content-Disposition" part, which makes the browser download the png rather than display it. Here png is the object holding the file's Body (a Buffer) along with its ContentType and ContentLength.
app.get("/image", (req, res) => {
getPng()
.then((png) => {
res.writeHead(200, {
"Content-Type": png.ContentType,
"Content-Length": png.ContentLength,
"Content-Disposition": 'attachment; filename="image.png"',
});
res.end(png.Body);
})
.catch(() => {
res.send("Couldn't load the image.");
});
});
