I'm using the Google Drive REST API to upload a ZIP file, but every ZIP file becomes corrupted after the upload. When I download the file and try to unzip it on macOS, I get "Unable to expand 'FILE_NAME.zip' into FOLDER. Error 79 - Inappropriate file type or format." I made sure it wasn't just my computer by having another person try to unzip it on a different machine, and they had the same problem. I also confirmed that the ZIP file wasn't corrupted before I uploaded it to Google Drive.
Below is a simplified version of my code.
const async = require('async');
const requestModule = require('request');
const fs = require('fs');

var api = {};

var tasks = {
  // first, get the zip file contents
  'getFile': function(cb) {
    fs.readFile('my_file.zip', {'encoding': 'UTF-8'}, function(err, data) {
      if (err) {
        console.error(err);
        return cb();
      }
      api.file_data = data;
      cb();
    });
  },
  // second, upload the file contents to google drive via their API
  'uploadFile': function(cb) {
    var metadata = {
      'mimeType': 'application/zip',
      'name': 'my_file.zip'
    };
    var request = {
      'url': 'https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&supportsAllDrives=true',
      'method': 'POST',
      'headers': {
        'Authorization': 'Bearer ' + GOOGLE_ACCESS_TOKEN,
        'Content-Type': 'multipart/related; boundary="SECTION"'
      },
      'body': '--SECTION\r\n' +
        'Content-Type: application/json; charset=UTF-8\r\n' +
        '\r\n' +
        JSON.stringify(metadata) + '\r\n' +
        '\r\n' +
        '--SECTION\r\n' +
        'Content-Type: application/zip\r\n' +
        'Content-Transfer-Encoding: base64\r\n' +
        '\r\n' +
        Buffer.from(api.file_data).toString('base64') + '\r\n' +
        '\r\n' +
        '--SECTION--'
    };
    requestModule(request, function(err, res, body) {
      if (err) {
        console.error(err);
        return cb();
      }
      cb();
    });
  }
};

async.series(tasks, function() {
  console.log('Done');
});
Note: I'm doing a Q&A-style post and will be answering my own question.
After a lot of trial and error, it came down to how I was reading the file before uploading it. As an artifact of a copy/paste, the encoding option on the readFile call had been left in place. When I removed {'encoding':'UTF-8'} and uploaded the file again, the resulting ZIP file unzipped perfectly.
I simply removed the encoding option from readFile, so the code now looks like this:
fs.readFile('my_file.zip', function(err, data) {
// ...
});
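For anyone wondering why that option matters: reading a binary file as UTF-8 decodes arbitrary bytes as text, and any byte sequences that aren't valid UTF-8 are replaced, so the original bytes can never be reconstructed. A minimal sketch (the file name is just an example) that shows the damage:

const fs = require('fs');

// Read the same file both ways
const asBuffer = fs.readFileSync('my_file.zip');                        // raw bytes
const asUtf8 = fs.readFileSync('my_file.zip', { encoding: 'UTF-8' });   // lossy text decode

// Round-tripping the UTF-8 string back to bytes does NOT restore the original
const roundTripped = Buffer.from(asUtf8, 'utf-8');
console.log(asBuffer.equals(roundTripped));        // almost certainly false for a ZIP file
console.log(asBuffer.length, roundTripped.length); // the lengths usually differ as well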
Related
I am using the request module in Node.js to read data from AWS S3. When I download a file (docx, image, or pdf) using the code below, I get an invalid/corrupted file. But when I download a .txt file, it isn't corrupted and I can open it in Notepad.
I did a bit of googling and, as suggested, also tried setting the encoding to binary, but it still doesn't give the required result.
File upload is working fine, and I am able to see the uploaded file in the AWS console.
File download code
var s3 = new properties.AWS.S3();
var params = {Bucket: properties.AWS_BUCKET, Key: req.headers['x-org'] + "/" + "talk" + "/" + req.body.fileName};

s3.getSignedUrl('getObject', params, function (err, URL) {
  if (err) {
    console.log("Error inside the S3");
    console.log(err, err.stack); // an error occurred
    res.send(null);
  } else {
    console.log("After getObject:-" + URL);
    request({
      url: URL, // URL to hit
      method: 'GET',
      encoding: 'binary'
    }, function (error, response, body) {
      if (error) {
        console.log(error);
      } else {
        //console.log(response.statusCode, body);
        res.set('content-disposition', 'attachment; filename=' + req.body.fileName);
        res.send(body);
      }
    });
  }
});
Update:
I have narrowed the error down: even just reading a file from the local file system and sending it gives corrupted files on the client.
Here's the code for that:
var filePath = path.join(__dirname, '..', '..', '..', 'downloads', req.body.fileURL);
var stat = fs.statSync(filePath);
var filename = path.basename(filePath);
var mimetype = mime.lookup(filePath);
console.log("mimetype=" + mimetype);
res.setHeader('Content-disposition', 'attachment; filename=' + filename);
res.setHeader('Content-type', mimetype + ";charset=UTF-8");
res.setHeader('Content-Length', stat.size);
var filestream = fs.createReadStream(filePath);
filestream.pipe(res);
Finally able to solve the problem.
I got the solution hint from this blog: https://templth.wordpress.com/2014/11/21/handle-downloads-with-angular/.
Per the blog:
When testing with binary content like zip files or images, we see that the downloaded content is corrupted. This is due to the fact that Angular automatically applies transformation on the received data. When handling binary contents, we want to get them as array buffer.
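On the client side, the blog's fix boils down to asking Angular for the raw bytes instead of letting it transform the response. A minimal AngularJS sketch (the endpoint URL is hypothetical):

// AngularJS: request binary content as an ArrayBuffer so it is not transformed
$http.get('/api/download/my_file.zip', { responseType: 'arraybuffer' })
  .then(function (response) {
    var blob = new Blob([response.data], { type: 'application/zip' });
    var url = URL.createObjectURL(blob); // e.g. attach to a download link
    console.log('Blob ready at', url);
  });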
Final working code:

var filePath = path.join(__dirname, '..', '..', '..', 'downloads', req.body.fileURL);
var file = fs.createWriteStream(filePath);

s3.getObject(params).
  on('httpData', function(chunk) {
    //console.log("inside httpData");
    file.write(chunk);
  }).
  on('httpDone', function() {
    console.log("inside httpDone");
    file.end();
    //file.pipe(res);
  }).
  send(function() {
    console.log("inside send");
    res.setHeader('Content-disposition', 'attachment; filename=' + filePath);
    res.setHeader('Content-type', mimetype);
    res.setHeader('Transfer-Encoding', 'chunked');
    var filestream = fs.createReadStream(filePath);
    filestream.pipe(res);
  });
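As a side note, if writing a temporary copy to disk isn't required, the AWS SDK (v2) can stream the object straight to the response, which avoids the intermediate file entirely. A minimal sketch, assuming the same s3, params, and Express res objects used above:

// Stream the S3 object directly to the HTTP response (no temp file)
res.setHeader('Content-disposition', 'attachment; filename=' + req.body.fileName);

s3.getObject(params)
  .createReadStream()
  .on('error', function (err) {
    console.log(err);
    res.sendStatus(500);
  })
  .pipe(res);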
I am able to successfully upload a file via an upload button to my vendor's API. The vendor's API also returns a .png file in blob format that I need to upload to Azure Blob Storage. I have tried a few approaches, but I am getting the following error in my Node console:
[Error] statusCode: 414
My front end code is in an Angular Controller which passes data back to my Node backend that contains my Azure Blob Storage calls. I have the formidable and request modules installed and required, but am not using them in my current backend code since the data I receive is already in blob format.
Here is my front end upload code. The success "result" is the blob data I am returned:
$scope.sendToProduction = function () {
  var parts = document.getElementById("file").value.split("\\");
  var uploadedfilename = parts[parts.length - 1];
  var basefilename = uploadedfilename.split(".")[0];
  var fileextension = uploadedfilename.split(".")[1];
  var filename = basefilename + '.' + fileextension;
  var file = document.getElementById("file").files[0];

  var formdata = new FormData();
  formdata.append(filename, file);

  $.ajax({
    url: 'http://myvendorsapi/fileuploadendpoint',
    type: "POST",
    data: formdata,
    mimeType: "multipart/form-data",
    processData: false,
    contentType: false,
    crossDomain: true,
    success: function (result) {
      var filename = 'Test.png';
      var file = result;
      console.log(file);
      $http.post('/postAdvanced', { filename: filename, file: file }).success(function (data) {
        console.log(data);
      }, function (err) {
        console.log(err);
      });
    },
    error: function (error) {
      console.log("Something went wrong!");
    }
  });
};
Here is my node backend for uploading to Azure Blob Storage:
app.post('/postAdvanced', function (req, res, next) {
  var filename = req.body.filename;
  var file = req.body.file;

  blobSvc.createBlockBlobFromText('blob5', file, filename, function (error, result, response) {
    if (!error) {
      console.log("Uploaded" + result);
    }
    else {
      console.log(error);
    }
  });
})
How do I upload an AJAX response into Azure Blob Storage?
The problem is that in this line of code:
blobSvc.createBlockBlobFromText('blob5', file, filename, function (error, result, response) {
you have the wrong parameter order. It should be:
blobSvc.createBlockBlobFromText('blob5', filename, file, function (error, result, response) {
HTTP status code 414 means "Request-URI Too Long". Did you pass the correct blob name into blobSvc.createBlockBlobFromText?
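Putting the corrected parameter order back into the handler, a sketch of the fixed route might look like this (assuming the azure-storage createBlockBlobFromText(container, blobName, text, callback) signature and the same 'blob5' container; the error handling is illustrative):

app.post('/postAdvanced', function (req, res) {
  var filename = req.body.filename; // blob name, e.g. 'Test.png'
  var file = req.body.file;         // blob contents

  // container, blob name, then content, in that order
  blobSvc.createBlockBlobFromText('blob5', filename, file, function (error, result, response) {
    if (error) {
      console.log(error);
      return res.status(500).send(error);
    }
    console.log("Uploaded " + filename);
    res.send(result);
  });
});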
I have a problem uploading a file using a POST request in Node.js. I have to use the request module to accomplish this (no other external npm packages). The server needs it to be a multipart request with the file field containing the file's data. What seems like it should be easy is pretty hard to do in Node.js without any extra module.
I've tried using this example but without success:
request.post({
  uri: url,
  method: 'POST',
  multipart: [{
    body: '<FILE_DATA>'
  }]
}, function (err, resp, body) {
  if (err) {
    console.log('Error!');
  } else {
    console.log('URL: ' + body);
  }
});
Looks like you're already using the request module.
In that case, all you need to do to post multipart/form-data is to use its form feature:
var req = request.post(url, function (err, resp, body) {
  if (err) {
    console.log('Error!');
  } else {
    console.log('URL: ' + body);
  }
});

var form = req.form();
form.append('file', '<FILE_DATA>', {
  filename: 'myfile.txt',
  contentType: 'text/plain'
});
But if you want to post an existing file from your file system, you may simply pass it as a readable stream:
form.append('file', fs.createReadStream(filepath));
request will extract all related metadata by itself.
For more information on posting multipart/form-data see node-form-data module, which is internally used by request.
An undocumented feature of the formData field that request implements is the ability to pass options to the form-data module it uses:
request({
  url: 'http://example.com',
  method: 'POST',
  formData: {
    'regularField': 'someValue',
    'regularFile': someFileStream,
    'customBufferFile': {
      value: fileBufferData,
      options: {
        filename: 'myfile.bin'
      }
    }
  }
}, handleResponse);
This is useful if you need to avoid calling requestObj.form() but need to upload a buffer as a file. The form-data module also accepts contentType (the MIME type) and knownLength options.
This change was added in October 2014 (so 2 months after this question was asked), so it should be safe to use now (in 2017+). This equates to version v2.46.0 or above of request.
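For illustration, the extra options mentioned above slot into the same structure. A minimal sketch (the buffer and URL are made up):

const request = require('request');

const fileBufferData = Buffer.from('hello world'); // stand-in for real file contents

request({
  url: 'http://example.com/upload', // hypothetical endpoint
  method: 'POST',
  formData: {
    customBufferFile: {
      value: fileBufferData,
      options: {
        filename: 'myfile.bin',
        contentType: 'application/octet-stream', // MIME type passed through to form-data
        knownLength: fileBufferData.length       // lets form-data compute Content-Length up front
      }
    }
  }
}, function (err, resp, body) {
  if (err) {
    return console.error('upload failed:', err);
  }
  console.log('Server responded with:', body);
});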
Leonid Beschastny's answer works, but I also had to convert the ArrayBuffer to the Buffer type used by Node's request module. After uploading a file to the server I had it in the same format that comes from the HTML5 File API (I'm using Meteor). Full code below - maybe it will be helpful for others.
function toBuffer(ab) {
  // Copy the ArrayBuffer contents into a Node Buffer
  var buffer = Buffer.alloc(ab.byteLength);
  var view = new Uint8Array(ab);
  for (var i = 0; i < buffer.length; ++i) {
    buffer[i] = view[i];
  }
  return buffer;
}

var req = request.post(url, function (err, resp, body) {
  if (err) {
    console.log('Error!');
  } else {
    console.log('URL: ' + body);
  }
});

var form = req.form();
form.append('file', toBuffer(file.data), {
  filename: file.name,
  contentType: file.type
});
You can also use the "custom options" support from the request library. This format allows you to create a multi-part form upload, but with a combined entry for both the file and extra form information, like filename or content-type. I have found that some libraries expect to receive file uploads using this format, specifically libraries like multer.
This approach is officially documented in the forms section of the request docs - https://github.com/request/request#forms
// toUpload is the name of the input file: <input type="file" name="toUpload">
let fileToUpload = req.file;

let formData = {
  toUpload: {
    value: fs.createReadStream(path.join(__dirname, '..', '..', 'upload', fileToUpload.filename)),
    options: {
      filename: fileToUpload.originalname,
      contentType: fileToUpload.mimetype
    }
  }
};

let options = {
  url: url,
  method: 'POST',
  formData: formData
};

request(options, function (err, resp, body) {
  if (err) {
    cb(err);
  }
  if (!err && resp.statusCode == 200) {
    cb(null, body);
  }
});
I did it like this:
// Open file as a readable stream
const fileStream = fs.createReadStream('./my-file.ext');

const form = new FormData();
// Pass file stream directly to form
form.append('my file', fileStream, 'my-file.ext');

const remoteReq = request({
  method: 'POST',
  uri: 'http://host.com/api/upload',
  headers: {
    'Authorization': 'Bearer ' + req.query.token,
    'Content-Type': req.headers['content-type'] || 'multipart/form-data;'
  }
})

req.pipe(remoteReq);
remoteReq.pipe(res);
I have used the Winston module to create a daily log file for my offline app. I now need to be able to send or upload that file to a remote server via POST (that part already exists).
I know I need to write the file in chunks so it doesn't hog memory, so I'm using fs.createReadStream; however, I only seem to get a 503 response, even when sending just sample text.
EDIT
I worked out that the receiver was expecting the data to be named 'data'. I have removed the createReadStream, as I could only get it to work with 'application/x-www-form-urlencoded' and a synchronous fs.readFileSync. If I change this to 'multipart/form-data' on the PHP server, would I be able to use createReadStream again, or does that only apply if I switch to physically uploading the JSON file?
I've only been learning node for the past couple of weeks so any pointers would be gratefully received.
var http = require('http'),
    fs = require('fs');

var post_options = {
  host: 'logger.mysite.co.uk',
  path: '/',
  port: 80,
  timeout: 120000,
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded'
  }
}

var sender = http.request(post_options, function(res) {
  if (res.statusCode < 399) {
    var text = ""
    res.on('data', function(chunk) {
      text += chunk
    })
    res.on('end', function(data) {
      console.log(text)
    })
  } else {
    console.log("ERROR", res.statusCode)
  }
})

var POST_DATA = 'data={['
POST_DATA += fs.readFileSync('./path/file.log').toString().replace(/\,+$/,'')
POST_DATA += ']}'

console.log(POST_DATA)

sender.write(POST_DATA)
sender.end()
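To answer the question in the edit above: if the receiving PHP endpoint is switched to multipart/form-data, the log file can be streamed again instead of read synchronously. A minimal sketch using the form-data package (the field name 'data' and the URL are carried over from the post as assumptions):

const fs = require('fs');
const FormData = require('form-data');

const form = new FormData();
// Stream the log file instead of buffering it with readFileSync
form.append('data', fs.createReadStream('./path/file.log'));

// form-data can submit itself over plain HTTP and sets the multipart headers for us
form.submit('http://logger.mysite.co.uk/', function (err, res) {
  if (err) {
    return console.error('upload failed:', err);
  }
  console.log('status:', res.statusCode);
  res.resume(); // drain the response
});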
After a gazillion rounds of trial and error, this worked for me: using FormData with node-fetch. Oh, and request was deprecated two days ago, by the way.
const FormData = require('form-data');
const fetch = require('node-fetch');

function uploadImage(imageBuffer) {
  const form = new FormData();
  form.append('file', imageBuffer, {
    contentType: 'image/jpeg',
    filename: 'dummy.jpg',
  });
  return fetch('https://myserver.cz/upload', { method: 'POST', body: form });
}
In place of imageBuffer there can be numerous things. I had a buffer containing the image, but you can also pass the result of fs.createReadStream('/foo/bar.jpg') to upload a file from disk.
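For example, a stream-based variant of the same helper might look like this (the file path and URL are placeholders):

const fs = require('fs');
const FormData = require('form-data');
const fetch = require('node-fetch');

function uploadFile(filePath) {
  const form = new FormData();
  // form-data picks up the filename from the stream's path
  form.append('file', fs.createReadStream(filePath));
  return fetch('https://myserver.cz/upload', { method: 'POST', body: form });
}

uploadFile('/foo/bar.jpg').then(res => console.log(res.status));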
copied from https://github.com/mikeal/request#forms
var r = request.post('http://service.com/upload', function optionalCallback (err, httpResponse, body) {
  if (err) {
    return console.error('upload failed:', err);
  }
  console.log('Upload successful! Server responded with:', body);
})

var form = r.form()
form.append('my_field1', 'my_value23_321')
form.append('my_field2', '123123sdas')
form.append('my_file', fs.createReadStream(path.join(__dirname, 'doodle.png')))
Have a look at the request module.
It gives you the ability to stream a file into a POST request.
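For instance, a read stream can be piped straight into a request POST as the raw request body (a minimal sketch; the file path and URL are placeholders, and this is not multipart):

const fs = require('fs');
const request = require('request');

fs.createReadStream('./path/file.log')
  .pipe(request.post('http://logger.mysite.co.uk/', function (err, httpResponse, body) {
    if (err) {
      return console.error('upload failed:', err);
    }
    console.log('Server responded with:', body);
  }));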
I am trying to send a simple HTTP POST request and retrieve the response body. Following is my code. I am getting
Error: Incorrect header check
inside the "zlib.gunzip" method. I am new to node.js and I appreciate any help.
fireRequest: function() {
  var rBody = '';
  var resBody = '';
  var contentLength;
  var options = {
    'encoding': 'utf-8'
  };

  rBody = fSystem.readFileSync('resources/im.json', options);
  console.log('Loaded data from im.json ' + rBody);

  contentLength = Buffer.byteLength(rBody, 'utf-8');
  console.log('Byte length of the request body ' + contentLength);

  var httpOptions = {
    hostname: 'abc.com',
    path: '/path',
    method: 'POST',
    headers: {
      'Authorization': 'Basic VHJhZasfasNWEWFScsdfsNCdXllcjE6dHJhZGVjYXJk',
      'Content-Type': 'application/json; charset=UTF-8',
      // 'Accept' : '*/*',
      'Accept-Encoding': 'gzip,deflate,sdch',
      'Content-Length': contentLength
    }
  };

  var postRequest = http.request(httpOptions, function(response) {
    var chunks = '';
    console.log('Response received');
    console.log('STATUS: ' + response.statusCode);
    console.log('HEADERS: ' + JSON.stringify(response.headers));
    // response.setEncoding('utf8');
    response.setEncoding(null);

    response.on('data', function(res) {
      chunks += res;
    });

    response.on('end', function() {
      var encoding = response.headers['content-encoding'];
      if (encoding == 'gzip') {
        zlib.gunzip(chunks, function(err, decoded) {
          if (err)
            throw err;
          console.log('Decoded data: ' + decoded);
        });
      }
    });
  });

  postRequest.on('error', function(e) {
    console.log('Error occurred' + e);
  });

  postRequest.write(rBody);
  postRequest.end();
}
response.on('data', ...) can receive a Buffer, not just plain strings. By concatenating the chunks onto a string you are converting the binary data incorrectly, and the result can no longer be gunzipped. You have 2 options:
1) Collect all the buffers in an array, and in the end event concatenate them using Buffer.concat(). Then call gunzip on the result (sketched below).
2) Use .pipe() and pipe the response to a gunzip object, piping the output of that to either a file stream or a string/buffer if you want the result in memory.
Both options (1) and (2) are discussed here: http://nickfishman.com/post/49533681471/nodejs-http-requests-with-gzip-deflate-compression
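A minimal sketch of option (1), keeping the rest of the request code from the question unchanged (the variable names are illustrative):

var postRequest = http.request(httpOptions, function (response) {
  var chunks = []; // collect Buffer chunks; never convert them to strings

  response.on('data', function (chunk) {
    chunks.push(chunk);
  });

  response.on('end', function () {
    var buffer = Buffer.concat(chunks);
    if (response.headers['content-encoding'] === 'gzip') {
      zlib.gunzip(buffer, function (err, decoded) {
        if (err) throw err;
        console.log('Decoded data: ' + decoded.toString('utf-8'));
      });
    } else {
      console.log('Decoded data: ' + buffer.toString('utf-8'));
    }
  });
});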
In our case, we added 'accept-encoding': 'gzip,deflate' to the headers and the code started working (solution credited to Arul Mani):
var httpOptions = {
  hostname: 'abc.com',
  path: '/path',
  method: 'POST',
  headers: {
    ...
    'accept-encoding': 'gzip,deflate'
  }
};
I had this error when looping over files with fs.readdirSync: the directory contained a .DS_Store file, so the unzip function was applied to it as well.
Be careful to pass only .zip/.gz files.
import fs from 'fs';
import gunzip from 'gunzip-file';

const unzipAll = async () => {
  try {
    const compFiles = fs.readdirSync('tmp')
    await Promise.all(compFiles.map(async file => {
      if (file.endsWith(".gz")) {
        gunzip(`tmp/${file}`, `tmp/${file.slice(0, -3)}`)
      }
    }));
  }
  catch (err) {
    console.log(err)
  }
}
In addition to Nitzan Shaked's answer, this might be related to the Node.js version.
What I experienced is that https.request (the OP uses http.request, but it may behave the same) already decompresses the data under the hood, so that once you accumulate the chunks into a buffer, all you're left with is calling buffer.toString() (assuming utf8 as an example). I experienced this myself in another answer, and it seems to be related to the Node.js version.
I'll end this answer with a live demo of similar working code, which may come in handy for future readers (it queries the StackExchange API, gets gzip-compressed chunks, and then decompresses them):
It includes code that works on 14.16.0 (the current StackBlitz version) - which, as described above, already decompresses the data under the hood - but not on Node.js 15.13.0.
It includes commented-out code that works for Node.js 15.13.0 but not for 14.16.0.