I am uploading images to a Node.js server using the Angular 2 HttpRequest class and tracking the upload progress as well. Once the upload completes, I return the path of the uploaded file in the response to the POST request. I am stuck on capturing the return value of that POST request. I need the URL to show a thumbnail of the image that was just uploaded.
The snippet below is from the Node.js server.
router.post('/', function(req, res, next) {
  var form = new formidable.IncomingForm();
  var oldPath;
  var newPath;
  form.parse(req, function(err, fields, files) {
    for (var file in files) {
      if (!files.hasOwnProperty(file)) {
        continue;
      }
      oldPath = files[file].path;
      newPath = './uploadedFiles/' + files[file].name;
      fs.renameSync(oldPath, newPath);
    }
    res.json(newPath);
    res.end();
  });
});

module.exports = router;
The snippet below is from Angular 2 on the client side.
uploadFiles(f, progressUpdate) {
  this.files = f;
  let req: any;
  const data = new FormData();
  for (let i = 0; i < this.files.length; i++) {
    data.append('file' + i, this.files[i], this.files[i].name);
  }
  req = new HttpRequest('POST', 'api/upload', data, {
    reportProgress: true
  });
  // The `HttpClient.request` API produces a raw event stream
  // which includes start (sent), progress, and response events.
  return this.http.request(req).pipe(
    map(event => this.getEventMessage(event, this.files[0])),
    tap(message => progressUpdate(message)),
    last()
  );
}
OK, so I figured it out myself... there is an HttpEvent fired with type HttpEventType.Response. That event object has a body property containing the response I was returning from the server... my bad...
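For reference, a minimal sketch of what that getEventMessage helper can look like; the method name comes from the question's code, but the progress message strings are purely illustrative:

import { HttpEvent, HttpEventType, HttpResponse } from '@angular/common/http';

// Sketch only: maps the raw HttpClient event stream to messages.
getEventMessage(event: HttpEvent<any>, file: File) {
  switch (event.type) {
    case HttpEventType.UploadProgress: {
      // event.total can be undefined if the server sends no Content-Length.
      const percentDone = event.total ? Math.round(100 * event.loaded / event.total) : 0;
      return `File "${file.name}" is ${percentDone}% uploaded.`;
    }
    case HttpEventType.Response:
      // event.body carries whatever the server returned via res.json(newPath),
      // i.e. the uploaded file's path needed for the thumbnail.
      return (event as HttpResponse<any>).body;
    default:
      return `File "${file.name}" upload event: ${event.type}`;
  }
}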
Related
I am able to upload files to the server with the following code:
service.component.ts
uploadFiles() {
  console.log("uploadFiles() in Service");
  const formData: FormData = new FormData();
  for (let i = 0; i < this.files_to_upload.length; i++) {
    formData.append('files', this.files_to_upload.item(i));
  }
  return this.http.post(`${this.basicUrl}/post/file`, formData).subscribe(
    res => console.log(res)
  )
}
In Express:
app.post('/post/file', (req, res) => {
  console.log("post/file");
  console.log(req.body.files)
  if (req.files) {
    console.log(req.files);
    let files = req.files.files;
    if (files.length > 1) {
      for (let i = 0; i < files.length; i++) {
        let filename = files[i].name
        files[i].mv(__dirname + '/uploads/' + filename, function(err) {
          if (err) {
            console.log(err);
          } else {
            console.log("Uploaded")
          }
        })
      }
    }
  }
})
This works perfectly.
But now I want to extend the function in my service so that I can pass a path along and upload those files into the correct folder.
uploadFiles(path) {
  console.log("uploadFiles() in Service");
  const formData: FormData = new FormData();
  for (let i = 0; i < this.files_to_upload.length; i++) {
    formData.append('files', this.files_to_upload.item(i));
  }
  return this.http.post(`${this.basicUrl}/post/file`, {files: formData, path: path}).subscribe(
    res => console.log(res)
  )
}
But then somehow I cannot access the files.
If I do the following in Express:
app.post('/post/file', (req, res) => {
  console.log("post/file");
  console.log(req.body)
  ...
The body has both properties, files and path, but files is an empty object...
How can I access the files when extending the object sent to the server?
When sending a file, the body should be encoded as multipart/form-data. You are sending a JSON object (even though one of its properties is a FormData instance), so the body gets encoded as application/json, which is why you can't access the uploaded files.
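A minimal sketch of the fix implied here, assuming the same multipart middleware as the working version (the req.files / .mv usage suggests express-fileupload): keep posting the FormData itself and append the path as an extra field, which then arrives as req.body.path on the server.

uploadFiles(path) {
  console.log("uploadFiles() in Service");
  const formData: FormData = new FormData();
  for (let i = 0; i < this.files_to_upload.length; i++) {
    formData.append('files', this.files_to_upload.item(i));
  }
  formData.append('path', path); // plain text field, readable as req.body.path
  return this.http.post(`${this.basicUrl}/post/file`, formData).subscribe(
    res => console.log(res)
  );
}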
I am not sure what I am doing wrong.
I have HTML content and want to save it as a PDF. I use html-pdf (from npm) and a download library, http://danml.com/download.html.
When I save directly to a file or return it as the result, I get the PDF without a problem. But now I call my web service method from a JS function, get a stream/buffer as the return value, and save it with the 'download' library.
Here is my code
pdf.create(html, options).toFile('./mypdf.pdf', function (err, res) {
  if (err) return console.log(err);
  console.log(res);
});

pdf.create(html, options).toBuffer(function (err, buffer) {
  if (err) return reject(err);
  return resolve(buffer);
});

//res.setHeader('Content-type', 'application/pdf');
pdf.create(html, options).toStream(function (err, stream) {
  if (err) return res.send(err);
  //res.type('pdf');
  return resolve(stream); // .pipe(res);
});
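As an aside, the resolve/reject calls in the toBuffer and toStream snippets above only make sense inside a Promise wrapper; a sketch of such a wrapper (the name CreatePdfBuffer is illustrative, not from the original post) would be:

function CreatePdfBuffer(html, options) {
  return new Promise(function (resolve, reject) {
    pdf.create(html, options).toBuffer(function (err, buffer) {
      if (err) return reject(err);
      resolve(buffer); // Node Buffer containing the rendered PDF
    });
  });
}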
I can save the content as a PDF and it works fine, but when I try to send the stream or buffer, somehow the page is empty. I opened both PDF files with Notepad and there are some differences. For example, one is 44 KB and the other 78 KB, and the empty one also contains the following lines:
%PDF-1.4 1 0 obj << /Title (��) /Creator (��) /Producer (�� Q t 5 .
5 . 1) /CreationDate (D:20190524152156)
endobj
I think the toBuffer or toStream method has a problem in my case, because the stream doesn't seem bad: at least I can see that it is a PDF file (no error, the page is just empty).
Anyway, here is my API router
let result = await routerVoucher.CreatePdfStream(req.query.VoucherCode, req.query.AppName);
res.setHeader('Content-type', 'application/pdf');
res.type('pdf');
//result.pipe(res);
res.end(result, 'binary');
and here is my js consumer
$.ajax({
  type: "GET",
  url: '/api/vouchers/GetLicensePdf',
  data: data,
  success: function (pdfFile) {
    if (!pdfFile)
      throw new Error('There is nothing to download');
    download(pdfFile, voucherCode + '.pdf', 'application/pdf');
  }
});
I've solved the problem.
First I converted the buffer to base64:
const base64 = buffer.toString('base64')
and then converted the base64 to a Blob using the following code:
function base64toBlob(base64Data, contentType) {
  contentType = contentType || '';
  var sliceSize = 1024;
  var byteCharacters = atob(base64Data);
  //var byteCharacters = decodeURIComponent(escape(window.atob(base64Data)))
  var bytesLength = byteCharacters.length;
  var slicesCount = Math.ceil(bytesLength / sliceSize);
  var byteArrays = new Array(slicesCount);

  for (var sliceIndex = 0; sliceIndex < slicesCount; ++sliceIndex) {
    var begin = sliceIndex * sliceSize;
    var end = Math.min(begin + sliceSize, bytesLength);
    var bytes = new Array(end - begin);
    for (var offset = begin, i = 0; offset < end; ++i, ++offset) {
      bytes[i] = byteCharacters[offset].charCodeAt(0);
    }
    byteArrays[sliceIndex] = new Uint8Array(bytes);
  }
  return new Blob(byteArrays, { type: contentType });
}
and then I used my download method (from the download.js library) again, as follows:
download(new Blob([base64toBlob(base64PDF,"application/pdf")]),
voucherCode + '.pdf', "application/pdf");
then everything is fine :)
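Putting the server side together, a rough sketch of the adjusted API route, assuming an Express router and that CreatePdfStream resolves with the Buffer produced by toBuffer above rather than a stream (both assumptions, not shown in the original post):

router.get('/GetLicensePdf', async (req, res) => {
  // Assumption: CreatePdfStream resolves with a Node Buffer of the rendered PDF.
  const buffer = await routerVoucher.CreatePdfStream(req.query.VoucherCode, req.query.AppName);
  const base64 = buffer.toString('base64');
  res.setHeader('Content-Type', 'text/plain');
  // The $.ajax success callback then feeds this into base64toBlob(...) and download(...).
  res.send(base64);
});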
I have a form with a file control that can select multiple images. I want to upload the multiple images to the server and the data to the database. I tried FormData.
postNewProduct(value) is called when the submit button is pressed:
postNewProduct(value) {
  const formData = new FormData();
  const files: Array<File> = this.selectedFile;
  for (let i = 0; i < files.length; i++) {
    formData.append("uploads[]", files[i], files[i]['name']);
  }
  formData.append('productName', value.productName);
  formData.append('productCategory', value.productCategory);
  this.productService.postNewProduct(formData).subscribe(
    message => alert(message));
}
onFileSelected(fileInput) is called when images are selected by form controls:
onFileSelected(fileInput) {
  let images = fileInput.target.files.length;
  if (images < 1 || images > 4) {
    alert("please select between 1 and 4 images");
  } else {
    this.selectedFile = <Array<File>>fileInput.target.files;
    console.log(this.selectedFile);
  }
}
This is the function in my service that makes calls to the API:
postNewProduct(newProduct) {
  return this.http.post(`/api/product/postproduct`, newProduct);
}
This is my function in the Node backend, but there is nothing in request.body or request.files:
exports.postProduct = function(request, response) {
  console.log(request.body);
  console.log(request.files);
  productModel.postProduct(request, (message, lastID) => {
    response.json(message, lastID);
  });
}
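For request.body and request.files to be populated, some middleware has to parse the multipart/form-data body that FormData produces. A minimal sketch using multer is shown below; multer and the direct route registration on an Express app are assumptions for illustration, not something from the original post.

const multer = require('multer');
const upload = multer({ dest: 'uploads/' }); // uploaded images land in ./uploads

// 'uploads[]' must match the field name used in formData.append(...) above.
app.post('/api/product/postproduct', upload.array('uploads[]', 4), function (request, response) {
  console.log(request.body.productName, request.body.productCategory); // text fields
  console.log(request.files); // array of uploaded files (path, originalname, etc.)
  response.json({ message: 'received' });
});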
I have a question about handling a gzip response in my client-side application. I would like the client's browser to pop up the "how do you want to handle this file?" download prompt.
My Node.js server compresses my files into gzip format and then sends them with an HTTP write response. My client receives an HTTP 200 status, but the size of the response is very small compared to my file and nothing populates my web app. I anticipated that the browser would handle this sort of response from a server sending gzip, similar to how Gmail handles downloading files. Can you help me see if I have missed anything?
server.js
var server = http.createServer(function(request, response) {
  if (request.url === '/download') {
    let data_zip = retrievedata()
    const scopedata_zip = ('./scopedata.txt.gz')
    response.writeHead(200, { 'Content-Encoding': 'gzip' });
    response.writeHead(200, { 'Content-Type': 'application/javascript' });
    response.write(scopedata_zip);
  }
})

var retrievedata = () => {
  const gzip = zlib.createGzip();
  const inp = fs.createReadStream('scopedata.txt');
  const out = fs.createWriteStream('scopedata.txt.gz');
  inp.pipe(gzip).pipe(out);
  return out
}
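For reference, what the retrievedata/response.write pair appears to be aiming for is piping the gzip stream straight into the HTTP response, rather than returning the file write stream and writing the file path string. A rough sketch of that (not the fix the answer below settles on) would be:

var server = http.createServer(function (request, response) {
  if (request.url === '/download') {
    response.writeHead(200, {
      'Content-Type': 'text/plain',
      'Content-Encoding': 'gzip'
    });
    fs.createReadStream('scopedata.txt')
      .pipe(zlib.createGzip())
      .pipe(response); // the response ends when the stream finishes
  }
});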
Client.js
var downloadData = () => {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', 'download', true);
  //xhr.setRequestHeader("Accept-Encoding", "gzip")
  xhr.setRequestHeader("Encoding", "null")
  xhr.onload = function () {
    if (this.status == 200) {
      let form = document.createElement("form");
      let element1 = document.createElement("input");
      document.body.appendChild(form);
      let response = this.responseText
      console.log(response)
      document.getElementById("state").innerHTML = 'download'
      document.getElementById("index").innerHTML = response;
      // document.getElementById("state").appendChild(form)
    }
  }
  xhr.onerror = function(err) {
    console.log("request error...", err)
  }
  xhr.send()
}
The client just populates my index div with the response, but nothing is received.
My gzip file is 327 MB.
The Chrome inspector's Network tab says this request is only 170 B, so I am not receiving my file.
Note that xhr.setRequestHeader("Accept-Encoding", "gzip") is commented out because I get this error: Refused to set unsafe header "Accept-Encoding". I have set it to null to let the browser handle this.
Any input on what I am doing wrong?
There were three things I was doing wrong. I managed to get the browser download window by creating a new anchor element, checking whether the element supports the download attribute, and setting its href to the XHR response. The second part of my issue was not receiving the zip file with the appropriate request headers; because my zip file is large, the browser has to handle the binary buffer stream as a blob (read more about XHR response types under XHR.response). The other issue was on my server side, which was using fs.readFile to read the zip as a buffer; because my zip was made up of multiple files, fs.readFile would stop reading when it hit the end of the first file.
So my client code looks like:
var xhr = new XMLHttpRequest();
document.getElementById("state").innerHTML = ' '
document.getElementById("index").innerHTML = ' ';
xhr.open('POST', 'download', true);
xhr.setRequestHeader('Content-disposition', 'attachment')
xhr.setRequestHeader("Content-type", "application/zip"); // content-type must be set
xhr.setRequestHeader("Encoding", "null") // unsure of why I need this but it doesn't work without it for me
xhr.responseType = "blob"; // this must be set, otherwise the browser interprets the buffer stream as a string instead of binary
xhr.onload = function () {
  if (this.status == 200) {
    let form = document.createElement("form");
    let element1 = document.createElement("input");
    document.body.appendChild(form);
    let response = this.response // defined as blob above
    document.getElementById("state").innerHTML = 'download'
    document.getElementById("index").innerHTML = response;
    var blob = new Blob([response], {type: "application/zip"});
    var file = URL.createObjectURL(blob);
    var filename = 'Data.zip'
    var a = document.createElement("a");
    if ("download" in a) { // check if the element supports download
      a.href = file;
      a.download = filename;
      document.body.appendChild(a);
      a.click(); // automatically trigger the browser download
      document.body.removeChild(a);
    }
  }
}
xhr.send()
Server side
else if (request.url === '/download') {
  archiveZip((data) => { // using archiveZip and adding a callback function to insert my routes XHR response
    response.setHeader('Content-Type', 'application/zip')
    response.setHeader('Content-Length', data.length) // this is important header because without it the browser might truncate the entire response especially if there are end of file characters zipped up in the buffer stream
    response.setHeader('Content-disposition', 'attachment; filename="Data.zip"');
    response.end(data);
  })
}
var archiveZip = (callback) => {
  var output = fs.createWriteStream(__dirname + '/Data.zip'); // output
  var archive = archiver('zip', {zlib: { level: 9 }});

  output.on('close', function() {
    console.log(archive.pointer() + ' total bytes');
    console.log('archiver has been finalized and the output file descriptor has closed.');
    fs.readFile('./Data.zip', function (err, content) {
      if (err) {
        // note: as posted, this assumes `response` from the route handler is reachable here
        response.writeHead(400, {'Content-type': 'text/html'})
        console.log(err);
        response.end("No such file");
      } else {
        callback(content);
      }
    });
  });

  output.on('end', function() {
    console.log('Data has been drained');
  });

  archive.on('error', function(err) {
    throw err;
  });

  archive.pipe(output);
  // append the files
  archive.file(data_files + '/parsed_scope.json', { name: 'parsed_scope.json' });
  archive.file(data_files + '/scopedata_index.json', { name: 'scopedata_index.json' });
  archive.file(data_files + '/scopedata.txt', { name: 'scopedata.txt' });
  archive.finalize();
}
There are many zip libraries; I looked at ones that can handle zipping a directory with multiple files and went with archiver. I would have liked to use the built-in zlib that comes with Node, but it only compresses single files/streams rather than archiving multiple files.
I need to send a PDF file from an AngularJS client to a Node.js service.
I wrote the AngularJS service, and when I receive the file it's a string like this:
%PDF-1.3
3 0 obj
<</Type /Page
/Parent 1 0 R
/Reso
How can I convert this string back into a PDF in Node.js?
This is the client code:
var sendByEmail = function () {
  $scope.generatingPdf = true;
  $('#budget').show();
  var pdf = new JsPDF('p', 'pt', 'letter');
  var source = $('#budget')[0];
  pdf.addHTML(source, 0, 0, function () {
    var resultPdf = pdf.output();
    BillService.sendByEmail("rbrlnx#gmail.com", resultPdf).then(function () {
    });
    $('#budget').hide();
  });
};

var sendByEmail = function (email, file) {
  var deferred = $q.defer();
  var data = {
    email: email,
    file: file
  };
  BillService.sendByEmail(data, function (result) {
    deferred.resolve(result);
  }, function () {
    deferred.reject();
  });
  return deferred.promise;
};
The server-side controller is empty:
var sendByEmail = function (req, res, next) {
  var file = req.body.file;
};
I experimented with this a while ago and came up with the following. It's not production-ready by a long shot, but maybe you'll find it useful. It's free of front-end libraries (except Angular, of course), but assumes you're using Express 4.x and body-parser.
What you're seeing:
You get a tiny Node server serving a static index.html and the Angular files, plus a POST route that receives a PDF in base64 (as delivered by the HTML FileReader API) and saves it to disk.
Instead of saving to disk, you can send it as an email attachment. See for instance here or here for some info on that.
The example below assumes a PDF uploaded by a user through a file input, but the idea is the same for all other ways of sending a document to your back-end system. The most important thing is to send the PDF data as base64, because this is the format most file writers and email packages expect (as opposed to straight-up binary, for instance). This also goes for images, documents, etc.
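Relating this to the original question's client code: one option, assuming the jsPDF build in use supports the 'datauristring' output type, is to ask jsPDF for a data URI directly, since the base64 payload after the comma is what a server like the one below writes to disk:

pdf.addHTML(source, 0, 0, function () {
  // 'datauristring' yields something like "data:application/pdf;...;base64,<payload>"
  var resultPdf = pdf.output('datauristring');
  BillService.sendByEmail("rbrlnx#gmail.com", resultPdf);
});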
How did I do that:
In your HTML:
<div pdfs>Your browser doesn't support File API.</div>
A directive called pdfs:
myApp.directive('pdfs', ['upload', function(upload) {
  return {
    replace: true,
    scope: true, // new child scope, so `files` set in the link function can be watched below
    template: '<input id="files" type="file">',
    link: function(scope, element) {
      element.bind('change', function(evt) {
        scope.$apply(function() {
          scope.files = evt.target.files;
        });
      });
    },
    controller: function($scope, $attrs) {
      $scope.$watch('files', function(files) {
        //upload.put(files)
        if (typeof files !== 'undefined' && files.length > 0) {
          for (var i = 0; i < files.length; i++) {
            readFile(files[i])
          }
        }
      }, true);

      function readFile(file) {
        var reader = new FileReader();
        reader.addEventListener("loadend", function(evt) {
          upload.post({name: file.name, data: reader.result})
        })
        if (file.type === 'application/pdf') { // check the file's type, not the reader's
          reader.readAsDataURL(file);
        }
      }
    }
  }
}]);
A tiny service:
myApp.service('upload', function($http) {
  this.post = function(file) {
    $http.post('/pdf', file);
  }
});
And a node server:
var express = require('express');
var bodyParser = require('body-parser')
var fs = require("fs");
var app = express();

app.use(express.static('.'));
app.use(bodyParser.json({limit: '1mb'}));

app.post('/pdf', function(req, res) {
  var name = req.body.name;
  var pdf = req.body.data.replace('data:application/pdf;base64,', '');
  res.send('received');
  fs.writeFile(name, pdf, 'base64', function(err) {
    console.log(err);
  });
});

var server = app.listen(3000, function() {
  console.log('Listening on port %d', server.address().port);
});