How to do a resumable upload with Google Drive in Node.js

Hey, since Google Drive changed their library I'm no longer able to upload files bigger than 5MB with the basic upload drive.files.create. The docs tell me to use resumable uploads instead, but Google doesn't provide any sample code and I can't find anything on Google either.
Maybe it's important to know that I can upload files smaller than 5MB with drive.files.create,
so there is no problem with the auth.
https://developers.google.com/drive/v3/web/resumable-upload
I wrote this POST request (also not working with PUT):
var fs = require('fs')
var request = require('request')

var file = 'C:\\test\\sample.container'
var uploadUrl = 'https://www.googleapis.com/drive/v3/files?uploadType=resumable'

var stats = fs.statSync(file)
var fileSizeInBytes = stats["size"]

fs.readFile(file, function read(e, f) {
    if (e) {
        console.log(e)
        return;
    }
    request.post({
        url: uploadUrl,
        headers: {
            'Authorization': 'xxxxxxxxxxxxxxxxxxxxxxx',
            'Content-Length': fileSizeInBytes,
            'Content-Type': 'application/octet-stream'
        },
        body: f,
    }, function (e, r, b) {
        if (e) {
            console.log(e)
            return;
        }
        console.log(`
            Response: ${JSON.stringify(r)}
            Body: ${b}
        `)
    });
});
But I get this as the body of the response:
<HTML>
<HEAD>
<TITLE>Request Entity Too Large</TITLE>
</HEAD>
<BODY BGCOLOR="#FFFFFF" TEXT="#000000">
<H1>Request Entity Too Large</H1>
<H2>Error 413</H2>
</BODY>
</HTML>
If I use this request URL instead:
https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable
I get a similar message as the body:
Request is too large.
So does anybody have working code for uploading files with the resumable upload, or maybe again with the basic upload? Or is there another way to upload big files? I'm open to alternatives! Thank you

In other API clients (e.g. the Python one), resumable uploads are created by altering the MediaFileUpload constructor with the parameter resumable=True. The Node.js API client is only in alpha, so it may not have built-in support for resumable uploads. You can try feeding drive a stream, or simply extending that example's media parameter, e.g.
media: {
    mimeType: 'some mimetype',
    body: 'some body',
    resumable: true
}
If a stream and the resumable flag above don't work, then you won't be able to use the Node.js client library to do resumable uploads, and will have to use the REST API directly.
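For reference, the raw REST flow is two requests: an initiating POST that carries only JSON metadata and returns a session URI in the Location header, then a PUT of the actual bytes to that URI. Here is a minimal sketch of that flow, assuming a valid OAuth2 access token in ACCESS_TOKEN (placeholder; note the Bearer prefix on the Authorization header) and Node 18+ for the global fetch:

const fs = require('fs')

const ACCESS_TOKEN = 'ya29.xxxx' // placeholder: a valid OAuth2 access token
const filePath = 'C:\\test\\sample.container'
const fileSize = fs.statSync(filePath).size

async function resumableUpload() {
    // Step 1: start the session. Only JSON metadata goes here, no file bytes.
    const initRes = await fetch(
        'https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable',
        {
            method: 'POST',
            headers: {
                'Authorization': 'Bearer ' + ACCESS_TOKEN,
                'Content-Type': 'application/json; charset=UTF-8',
                'X-Upload-Content-Type': 'application/octet-stream',
                'X-Upload-Content-Length': String(fileSize)
            },
            body: JSON.stringify({ name: 'sample.container' })
        }
    )
    // The session URI comes back in the Location header.
    const sessionUri = initRes.headers.get('location')

    // Step 2: PUT the actual bytes to the session URI. Buffered here for
    // brevity; the protocol also allows chunked PUTs with Content-Range.
    const uploadRes = await fetch(sessionUri, {
        method: 'PUT',
        body: fs.readFileSync(filePath)
    })
    console.log(uploadRes.status, await uploadRes.json())
}

resumableUpload().catch(console.error)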

The problem itself was related to the googleapis library for Node.js. With v27.0.0 the basic upload works again for big files as well. Related:
https://github.com/google/google-api-nodejs-client/issues/1038
This is not an answer for how to do a resumable upload with Node.js, so you may keep this topic open until somebody posts sample code for resumable uploads, because even with v27 my POST request is still not working. You may want to watch the GitHub link above, because I asked there for sample code as well.
However, my problem was only that I was not able to upload files bigger than 5MB, and that problem is now gone for me :)!
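For anyone who lands here: with the fixed library, the basic upload can stream the file content, so the whole file never sits in memory even when it is big. A minimal sketch, assuming an already-authorized OAuth2 client in auth (placeholder):

const fs = require('fs')
const { google } = require('googleapis')

const drive = google.drive({ version: 'v3', auth }) // auth: your authorized OAuth2 client

drive.files.create({
    resource: { name: 'sample.container' }, // newer releases call this `requestBody`
    media: {
        mimeType: 'application/octet-stream',
        // A stream, not a buffer, so the file is read piece by piece
        body: fs.createReadStream('C:\\test\\sample.container')
    }
}, (err, res) => {
    if (err) return console.error(err)
    console.log('Uploaded, file id:', res.data.id)
})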

Related

Pass file uploaded via HTTP POST to another API

I have a Node.js (16.13.1) REST API using Express, and one of my endpoints receives one or more uploaded files. The client (web app) uses FormData, into which the files are appended. Once they're submitted to my API, the code there uses multer to grab the files from the request object.
Now I'm having trouble trying to send those same files to another API. multer attaches the files to req.files, and each file object in that array has several properties, one of which is buffer. I tried using the stream package's Duplex object to convert this buffer to a stream so that I could append the file to another FormData object, but when the server the second API is running on receives the request, I get an error from the web server saying that "a potentially dangerous request.form value was detected from the client.".
Any suggestions?
I am working on a Nest project and was facing this issue too. I did some research and found that we need to create a Readable from that file's Buffer; this is working for me.
// Imports needed by the service
import { Readable } from 'stream';
import FormData from 'form-data';
import axios, { AxiosRequestConfig } from 'axios';

// Controller
@UseInterceptors(FileInterceptor('file'))
async uploadFile(@UploadedFile() file: Express.Multer.File) {
    return this.apiservice.uploadFile(file);
}

// Service
uploadFile(file: Express.Multer.File) {
    // Recreate a stream from the buffer multer attached to the request
    const readstream = Readable.from(file.buffer);
    const form = new FormData();
    // Append the stream (not the multer file object) and give the part a filename
    form.append('file', readstream, { filename: file.originalname });
    const url = `api_endpoint`;
    const config: AxiosRequestConfig = {
        // form.getHeaders() sets the multipart Content-Type with its boundary
        headers: form.getHeaders(),
    };
    return axios.post(url, form, config);
}

Getting a 400 Bad Request when trying to upload a zipped WKT file to the HERE Maps REST API

We have a problem with fleet.ls.hereapi.com when uploading a new layer for geofencing.
const myId = 'MYLAYER'; // just an id to check
zip.file('data.wkt', `NAME\tWKT\n${myId}\t${wkt}`);
const content = await zip.generateAsync({ type: 'nodebuffer' });

const formData = new FormData();
formData.append('zipfile', content);

await axios.post(config.HERE_MAPS_UPLOAD_ENDPOINT, formData, {
    headers: {
        'content-type': 'multipart/form-data',
    },
    params: {
        apiKey: config.HERE_MAPS_REST_API_KEY,
        layer_id: myId,
    },
});
We get a bad request without a message and do not know what the problem is. The same implementation works in the frontend (with 'blob' as the zip type). Is there a parameter to get a better error message from the API?
We got the instructions on how to implement it from this tutorial: https://www.youtube.com/watch?v=Ayw9GcS1V-8 and, as I mentioned, it works fine in the frontend. It also works if I write a file in Node and upload it via curl. Thank you for any help in advance!
Edit: I'm getting the following issue from the API: 'Multipart should contain exactly one part but contains 0'
I fixed it!
The problem was that the API needed a filename for the form data. This filename can be provided as the third parameter as described here.
So I basically changed formData.append('zipfile', content); to formData.append('zipfile', content, 'zipfile.zip'); and it worked.
Hope this will help somebody in the future!
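For reference, here is the corrected append in the context of the original snippet (the name 'zipfile.zip' is just illustrative). With the form-data package in Node, formData.getHeaders() is also a safer way to set the header, since it includes the multipart boundary that a hand-written 'multipart/form-data' header lacks:

const formData = new FormData();
// Third argument: the filename the API requires for the part
formData.append('zipfile', content, 'zipfile.zip');

await axios.post(config.HERE_MAPS_UPLOAD_ENDPOINT, formData, {
    headers: formData.getHeaders(), // includes the multipart boundary
    params: {
        apiKey: config.HERE_MAPS_REST_API_KEY,
        layer_id: myId,
    },
});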

Upload multipart/form-data to S3 from lambda (Nodejs)

I want to upload a multipart/form-data file to S3 using Node.js.
I have tried various approaches but none of them worked. I was able to write content to S3 from Lambda, but when the file was downloaded from S3 it was corrupted.
Can someone provide me a working example or steps that could help me? Please suggest an alternative if you think there is a better approach.
Thanking you in anticipation.
Following is my Lambda code:
export const uploadFile = async event => {
    const parser = require("lambda-multipart-parser");
    const result = await parser.parse(event);
    const { content, filename, contentType } = result.files[0];
    const params = {
        Bucket: "name-of-the-bucket",
        Key: filename,
        Body: content,
        ContentDisposition: `attachment; filename="${filename}";`,
        ContentType: contentType,
        ACL: "public-read"
    };
    const res = await s3.upload(params).promise();
    return {
        statusCode: 200,
        body: JSON.stringify({
            docUrl: res.Location
        })
    };
}
If you want to upload a file through Lambda, one way is to open your AWS API Gateway console.
Go to
"API" -> {YourAPI} -> "Settings"
There you will find the "Binary Media Types" section.
Add the following media type:
multipart/form-data
Save your changes.
Then go to "Resources" -> proxy method (e.g. "ANY") -> "Method Request" -> "HTTP Request Headers" and add the headers "Content-Type" and "Accept".
Finally, deploy your API.
For more info visit: https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings-configure-with-console.html
There are two possible points of failure: Lambda receives corrupted data, or you corrupt the data while sending it to S3.
Sending multipart/form-data content to Lambda is not straightforward. You can see how to do that here.
After you've done this and you're sure your data is correct in Lambda, check that you send it to S3 correctly (see the S3 docs and examples for that).
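To narrow down which of the two it is, here is a quick debugging sketch using the asker's parser (not production code): log whether API Gateway actually delivered the body as binary, and compare the parsed byte count with the original file:

const parser = require("lambda-multipart-parser");

export const uploadFile = async event => {
    // Should be true once multipart/form-data is registered as a binary media type
    console.log('isBase64Encoded:', event.isBase64Encoded);
    const result = await parser.parse(event);
    const { content, filename } = result.files[0];
    // If this byte count differs from the file on disk, the payload was
    // mangled before reaching Lambda, not while uploading to S3.
    console.log(filename, 'parsed bytes:', content.length);
    // ...continue with s3.upload as in the question
};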

How to upload a file from an ng-controller?

How can I upload a file from my Angular controller? On ng-click I call an upload_file() function declared inside my controller, and I want to use something like this:
$http.post("url", data).success().error();
url is that of a Node upload service. It works fine when I use a form with an action attribute, but I want to upload from my function without relying on the form action.
What I'm not getting is how to attach the selected file to the data here, and I also want to send some other data along with the file. Can I upload it the way I am trying? Please help me...
You can use the angular-file-upload library.
Basically you need to $http.post like this:
$http({
    method: 'POST',
    url: config.url,
    // undefined lets the browser set multipart/form-data with its boundary
    headers: { 'Content-Type': undefined },
    transformRequest: function (data) {
        var formData = new FormData();
        formData.append('file', myFile); // myFile: the File object to upload
        for (var key in myData) {        // myData: extra fields to send along
            formData.append(key, myData[key]);
        }
        return formData;
    }
}).success().error().progress()
The library supports IE with a Flash polyfill, since older IE doesn't support FormData natively and you cannot get the file object from the <input type="file"> there.

Advice: flatiron, formidable and aws s3

I'm new to server-side programming with node.js. I'm sticking together a tiny webapp with it right now and have the usual startup learning to do. The following piece of code WORKS. But I would love to know if it's more or less the right way to do a simple file upload from a form and throw it into AWS S3:
app.router.post('/form', { stream: true }, function () {
    var req = this.req,
        res = this.res,
        form = new formidable.IncomingForm();

    form.parse(req, function (err, fields, files) {
        console.log('Parsed file upload' + err);
        if (err) {
            res.end('error: Upload failed: ' + err);
        } else {
            var img = fs.readFileSync(files.image.path);
            var data = {
                Bucket: 'le-bucket',
                Key: files.image.name,
                Body: img
            };
            s3.client.putObject(data, function () {
                console.log("Successfully uploaded data to myBucket/myKey");
            });
            res.end('success: Uploaded file(s)');
        }
    });
});
Note: I had to turn buffering off in union / flatiron.plugins.http.
What I would like to learn is when to stream-load a file and when to sync-load it. It will be a really tiny webapp with little traffic.
If it's more or less good, then please consider this a token of working code, which I would also throw into a gist. It's not that easy to find documentation and working examples of this kind of stuff. I like flatiron a lot, but its small-module approach leads to docs and examples splattered all over the net, to say nothing of tutorials.
You should use a module other than formidable because, as far as I know, formidable does not have an S3 storage option, so you must save files on your server before uploading them.
I would recommend multiparty.
Use this example to upload directly to S3 without saving the file locally on your server; a sketch of that approach follows.
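A rough sketch of that approach with multiparty and the aws-sdk: each uploaded part is itself a readable stream, so it can be handed straight to s3.upload and nothing touches the local disk (bucket name and route are taken from the question; treat this as an untested outline):

var multiparty = require('multiparty')
var AWS = require('aws-sdk')
var s3 = new AWS.S3()

app.router.post('/form', { stream: true }, function () {
    var req = this.req,
        res = this.res,
        form = new multiparty.Form()

    form.on('part', function (part) {
        // Non-file fields have no filename; drain and ignore them
        if (!part.filename) { part.resume(); return }

        s3.upload({
            Bucket: 'le-bucket',
            Key: part.filename,
            Body: part // the part is a readable stream, so it is piped to S3
        }, function (err) {
            if (err) return res.end('error: Upload failed: ' + err)
            res.end('success: Uploaded file(s)')
        })
    })

    form.on('error', function (err) {
        res.end('error: Upload failed: ' + err)
    })

    form.parse(req)
})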
