I need to upload a file to an API gateway. After adding some meta information, the file should be sent on to another (micro)service as Content-Type: multipart/form-data. I am having trouble building a FormData object within the API gateway. I do not want to persist the file on the gateway, so I am basically just trying to pass it through.
To create the FormData object, I am using the form-data package.
This is what I tried:
// Controller
@Post()
@UseInterceptors(FileInterceptor('file'))
async create(@Res() res, @UploadedFile('file') file, @Body() body: any) {
  return await this.someService.create(file);
}
// Service
async create(file: any) {
  const formData = new FormData();
  formData.append('file', file);
  formData.append('key', 'value');
  const formHeaders = formData.getHeaders();
  try {
    const result = await this.httpService
      .post('http://some-other-service/import', formData, {
        headers: {
          ...formHeaders,
        },
      })
      .toPromise();
    return result.data;
  } catch (e) {
    throw new BadGatewayException();
  }
}
This results in the following error:
TypeError: source.on is not a function
at Function.DelayedStream.create (/usr/app/node_modules/delayed-stream/lib/delayed_stream.js:33:10)
at FormData.CombinedStream.append (/usr/app/node_modules/combined-stream/lib/combined_stream.js:44:37)
at FormData.append (/usr/app/node_modules/form-data/lib/form_data.js:74:3)
at ImportService.<anonymous> (/usr/app/src/import/import.service.ts:47:18)
This question is a bit old, but someone might benefit from this solution.
The problem is that you are passing the whole @UploadedFile object to the formData.append method. The @UploadedFile object contains the data from the file, but also mimetype, size, fieldname ('file' in this case), originalname (the original file name), etc.
You need to pass the actual contents of the file you are trying to upload to the formData.append method.
So to make it work, use
formData.append('file', file.buffer);
//OR
formData.append('file', file.buffer, file.originalname);
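Putting it together, the corrected service method could look like this (a minimal sketch, assuming multer's default in-memory storage so that file.buffer is populated; plain axios is used to keep the sketch self-contained, though NestJS's HttpService wraps the same client):

import FormData from 'form-data';
import axios from 'axios';

// Sketch: forward an uploaded file to another service without persisting it.
// Assumes multer memory storage, so `file.buffer` holds the raw contents.
async function forwardFile(file: Express.Multer.File): Promise<any> {
  const formData = new FormData();
  // Append the raw bytes and the original name, not the whole @UploadedFile object
  formData.append('file', file.buffer, file.originalname);
  formData.append('key', 'value');
  const result = await axios.post('http://some-other-service/import', formData, {
    headers: formData.getHeaders(), // includes the multipart boundary
  });
  return result.data;
}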
I'm working on a Node TypeScript REST API with Express. I wrote an uploadFiles middleware using multer that stores all the uploaded files on a diskStorage, and this works fine.
But then, I want to immediately send these same files to another API to process them and extract data.
I am stuck here because I can't manage to send these files with axios in a FormData.
I first tried to access my files on req.files and then send them with axios like this:
const filesMulter = req.files as Express.Multer.File[];
for await (const cvFile of filesMulter) {
  const formData = new FormData();
  formData.append("file", cvFile);
  const extractRes = await axios.post(
    `${process.env.API_EXTRACT_BASEURL}/cv/`,
    formData,
    {
      headers: {
        "Content-Type": "multipart/form-data",
      },
    }
  );
}
But I got this error in the append():
Argument of type 'File' is not assignable to parameter of type 'string | Blob'.
So I tried to convert the file to a Blob so it would be accepted by formData.append(), but then I got this axios error:
AxiosError: Data after transformation must be a string, an ArrayBuffer, a Buffer, or a Stream
I tried to convert the file to different formats (Buffer, Stream, string...) but I get either the append() error or the Axios error mentioned before...
I even tried to read the file back from disk using
import fs from "fs";
const getFile = async (filePath: string): Promise<Buffer> => {
  return await new Promise((resolve, reject) => {
    fs.readFile(filePath, (error, data) => {
      if (error != null) {
        reject(error);
      } else {
        resolve(data);
      }
    });
  });
};
export default getFile;
And then :
const cvFileOnDisk = await getFile("/path/to/my/file/hardcoded/example.pdf");
This returns a Buffer, but again I cannot add it to the FormData with append()...
Any idea how I could solve this ?
Thanks a lot !
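For what it's worth, the compile error happens because TypeScript resolves FormData to the browser/DOM type, whose append only accepts string | Blob. In Node the form-data npm package is typically used instead; it accepts Buffers and streams. A sketch under that assumption, streaming each file back from multer's disk storage inside an async Express handler (req is assumed to be in scope):

import fs from "fs";
import FormData from "form-data"; // the npm package, not the DOM type
import axios from "axios";

const filesMulter = req.files as Express.Multer.File[];
for (const cvFile of filesMulter) {
  const formData = new FormData();
  // With diskStorage the contents are not in memory; stream them from disk
  formData.append("file", fs.createReadStream(cvFile.path), cvFile.originalname);
  const extractRes = await axios.post(
    `${process.env.API_EXTRACT_BASEURL}/cv/`,
    formData,
    { headers: formData.getHeaders() } // sets the multipart boundary header
  );
}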
My API endpoint returns an Excel file streamed from S3. It works locally, but when testing on API Gateway the file is corrupted. Here is my API code:
const downloadContentFromS3 = async function(bucket, file) {
  return new Promise((resolve, reject) => {
    streamFileFromS3(bucket, file, (error, s3buffer) => {
      if (error) return reject(error);
      return resolve(s3buffer);
    });
  });
};

const streamFileFromS3 = async function(bucket, fileName, callback) {
  const params = {
    Bucket: bucket,
    Key: fileName,
  };
  const buffers = [];
  const stream = s3.getObject(params).createReadStream();
  stream.on('data', data => buffers.push(data));
  stream.on('end', () => callback(null, Buffer.concat(buffers)));
  stream.on('error', error => callback(error));
};
downloadExcelFile: async (req, res) => {
  try {
    const fileName = 'myFilename';
    const buffer = await downloadContentFromS3(
      'bucket-name',
      fileName
    );
    const workbook = xlsx.read(buffer);
    res.setHeader('Content-disposition', `attachment; filename=${fileName}`);
    res.setHeader(
      'Content-type',
      'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
    );
    const wbout = xlsx.write(workbook, { bookType: 'xlsx', type: 'buffer' });
    res.status(200).send(Buffer.from(wbout));
  } catch (error) {
    throw new OriolaError(error.message);
  }
},
What I have tried so far: setting up the binary media types in the API Gateway settings.
In addition, I tried to set the response headers Content-Type and Content-Disposition, but to no avail. The problem persists. Any ideas and help are appreciated.
EDIT: I have also tried setting the binary type */* in the settings, but this does not help either.
API Gateway and Lambda send files between themselves as base64. This is regardless of whether you set a binary media type on API Gateway, as API Gateway does the conversion between base64 and binary.
S3 getObject gets a binary file, but that is a moot point for your problem, as you are still creating a binary file with xlsx.
What you are currently doing is sending binary data untransformed as base64 data.
All you need to do is return the file as a base64 buffer instead of a binary one.
So
res.status(200).send(Buffer.from(wbout));
becomes
res.status(200).send(Buffer.from(wbout).toString('base64'));
https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings.html
So either:
set up passthrough behaviour and send a binary response body from your function, as you are already doing,
or set the contentHandling property of the IntegrationResponse resource to CONVERT_TO_BINARY (in addition to the binaryMediaTypes setting you already have) and then send a base64 response body.
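For comparison, if the function ran as a plain Lambda proxy integration (rather than behind an Express adapter), the base64 contract is explicit in the return shape. A sketch, not from the question, with the surrounding handler wiring assumed:

// Sketch: Lambda proxy response carrying the workbook as base64.
// With a matching binaryMediaTypes entry, API Gateway decodes it back to binary.
const wbout = xlsx.write(workbook, { bookType: 'xlsx', type: 'buffer' });
return {
  statusCode: 200,
  headers: {
    'Content-Type': 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
    'Content-Disposition': `attachment; filename=${fileName}`,
  },
  body: wbout.toString('base64'),
  isBase64Encoded: true,
};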
I've been searching for a way to write to a JSON file in an S3 bucket from a pre-signed URL. From my research it appears it can be done, but none of these examples are in Node:
http PUT a file to S3 presigned URLs using ruby
PUT file to S3 with presigned URL
Uploading a file to a S3 Presigned URL
Write to a AWS S3 pre-signed url using Ruby
How to create and read .txt file with fs.writeFile to AWS Lambda
Not finding a Node solution in my searches, and using a 3rd-party API, I'm trying to write the callback to a JSON file that is in an S3 bucket. I can generate the pre-signed URL with no issues, but when I try to write dummy text via the pre-signed URL I get:
Error: ENOENT: no such file or directory, open
'https://path-to-file-with-signed-url'
When I try to use writeFile:
fs.writeFile(testURL, `This is a write test: ${Date.now()}`, function(err) {
  if (err) return err
  console.log("File written to")
})
and my understanding of the documentation on file is that I can use a URL. I'm starting to believe this might be a permissions issue, but I'm not having any luck with the documentation.
After implementing node-fetch I still get an error (403 Forbidden) when writing to a file in S3 via the pre-signed URL. Here is the full code of the module I've written:
const aws = require('aws-sdk')
const config = require('../config.json')
const fetch = require('node-fetch')
const expireStamp = 604800 // 7 days
const existsModule = require('./existsModule')

module.exports = async function(toSignFile) {
  let checkJSON = await existsModule(`${toSignFile}.json`)
  if (checkJSON == true) {
    let testURL = await s3signing(`${toSignFile}.json`)
    fetch(testURL, {
      method: 'PUT',
      body: JSON.stringify(`This is a write test: ${Date.now()}`),
    }).then((res) => {
      console.log(res)
    }).catch((err) => {
      console.log(`Fetch issue: ${err}`)
    })
  }
}

async function s3signing(signFile) {
  const s3 = new aws.S3()
  aws.config.update({
    accessKeyId: config.aws.accessKey,
    secretAccessKey: config.aws.secretKey,
    region: config.aws.region,
  })
  const params = {
    Bucket: config.aws.bucket,
    Key: signFile,
    Expires: expireStamp
  }
  try {
    // let signedURL = await s3.getSignedUrl('getObject', params)
    let signedURL = await s3.getSignedUrl('putObject', params)
    console.log('\x1b[36m%s\x1b[0m', `Signed URL: ${signedURL}`)
    return signedURL
  } catch (err) {
    return err
  }
}
Reviewing the permissions, I have no issues with uploading, and write access has been set in the permissions. In Node, how can I write to a file in an S3 bucket using that file's pre-signed URL as the path?
fs is the filesystem module. You can't use it as an HTTP client.
You can use the built-in https module, but I think you'll find it easier to use node-fetch.
fetch('your signed URL here', {
  method: 'PUT',
  body: JSON.stringify(data),
  // more options and request headers and such here
}).then((res) => {
  // do something
}).catch((e) => {
  // do something else
});
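One more thing worth checking for the 403: a pre-signed PUT URL only authorizes exactly what was signed. If a ContentType is included in the getSignedUrl params, the PUT request must send a matching Content-Type header, and the URL must be signed for 'putObject' rather than 'getObject'. A sketch of both sides agreeing (the ContentType param is an assumption, not from the question; s3, config, and expireStamp as in the question's code):

// Sign for putObject and declare the content type up front
const signedURL = s3.getSignedUrl('putObject', {
  Bucket: config.aws.bucket,
  Key: `${toSignFile}.json`,
  Expires: expireStamp,
  ContentType: 'application/json',
});

// The upload must match what was signed
await fetch(signedURL, {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ test: Date.now() }),
});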
I was looking for an elegant way to transfer an S3 file to an S3 signed URL using PUT. Most examples I found were using PUT({ body: data }). I came across one suggestion to read the data into a readable stream and then pipe it to the PUT. However, I still didn't like the notion of loading large files into memory and then assigning them to the put stream; piping read to write is always better for memory and performance. Since s3.getObject().createReadStream() returns a request object, which supports pipe, all we need to do is pipe it correctly to the PUT request, which exposes a write stream.
Get object function
async function GetFileReadStream(key) {
  const params = {
    Bucket: bucket,
    Key: key
  };
  const fileSize = await s3.headObject(params)
    .promise()
    .then(res => res.ContentLength);
  return { stream: s3.getObject(params).createReadStream(), fileSize };
}
Put object function
const request = require('request');

async function putStream(presignedUrl, readStream) {
  return new Promise((resolve, reject) => {
    var putRequestWriteStream = request.put({
      url: presignedUrl,
      headers: {
        'Content-Type': 'application/octet-stream',
        'Content-Length': readStream.fileSize
      }
    });
    putRequestWriteStream.on('response', function(response) {
      var etag = response.headers['etag'];
      resolve(etag);
    })
    .on('end', () => console.log("put done"));
    readStream.stream.pipe(putRequestWriteStream);
  });
}
This works great with a very small memory footprint. Enjoy.
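A possible way to wire the two together (a usage sketch; presignedUrl and the source key are placeholders):

// Usage sketch: stream an S3 object straight into a pre-signed PUT URL
const readStream = await GetFileReadStream('path/to/source/file');
const etag = await putStream(presignedUrl, readStream);
console.log('uploaded, etag:', etag);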
I want to call a third-party API (to upload an image) on the Node side that expects a File-type object on the key file.
The front end is in Angular, so the flow is:
.ts
const _file: File = __userAvatar.files[0];
const _userAvatarInfo = { userId: this.user.id, avatar: _file };
this.userService.updateUserAvatar(_userAvatarInfo).subscribe(
UserService.ts
const _formData = new FormData();
_formData.append("avatar", _userAvatarInfo.avatar);
_formData.append("userId", _userAvatarInfo.userId);
return this.http.post(`${this.context}/userservice/user/updateuseravatar`, _formData);
Node API layer using giuseppe
#Post("/user/updateuseravatar")
updateUserAvatar(#Req() req: any): Promise<any> {
return TrusteeFacade.uploadResource({ resourceId: "some_id", resource: req.files.avatar });
}
Facade Layer
static uploadResource(__resourceInfo: any): Promise<any> {
  const _resourceData = new FormData();
  _resourceData.append("mimetype", "image/png");
  _resourceData.append("file", __resourceInfo.resource);
  // this will not get printed
  console.log("From**************", __resourceInfo.resource);
  return axios({
    method: "post",
    url: `${process.env.REST_URL}/resources/${__resourceInfo.resourceId}`,
    headers: _resourceData.getHeaders(),
    data: _resourceData
  });
}
At the facade layer it is showing:
TypeError: source.on is not a function
at Function.DelayedStream.create (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\delayed-stream\lib\delayed_stream.js:33:10)
at FormData.CombinedStream.append (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\combined-stream\lib\combined_stream.js:44:37)
at FormData.append (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\form-data\lib\form_data.js:74:3)
at Function.uploadResource (D:\QPP Workspace\ContentPlatform\webapplications\application-services\.bin\facade\trustee-facade.js:221:23)
at trustee_facade_1.TrusteeFacade.getFileResourceId.then (D:\QPP Workspace\ContentPlatform\webapplications\application-services\.bin\api\user-service.js:118:51)
at propagateAslWrapper (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\async-l
The __resourceInfo has the correct info at the facade layer, so is creating the FormData from it the cause of the error?
This is how I have handled this at the facade layer. Instead of this:
_resourceData.append("file", __resourceInfo.resource);
I created the file field using Buffer.from
_resourceData.append("file", Buffer.from(__resourceInfo.resource.data), { filename: __resourceInfo.resource.name });
There could be another solution, but this solved my problem.
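If the upstream API also cares about the content type, the form-data package accepts it in the same options object as the filename (a sketch; the values are assumptions):

// Sketch: form-data's append options accept filename and contentType together
_resourceData.append("file", Buffer.from(__resourceInfo.resource.data), {
  filename: __resourceInfo.resource.name,
  contentType: "image/png",
});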
I'm using Node.js to try to upload a CSV file via the Slack API's files.upload method, which is a POST. I'm unsure how to make this work, because if I use the content argument instead of file, I get the error:
{ ok: false, error: 'invalid_array_arg' }
If I use the file argument, I still get the error:
{ ok: false, error: 'invalid_array_arg' }
There are multiple possible fault points in this code and I've tried to test each one, but I'm sure I'm missing some information here. Here's the uploadFile method that I created:
function uploadFile(file) {
  console.log(botToken);
  axios.post('https://slack.com/api/files.upload',
      qs.stringify({ token: botToken, file: file, channels: 'testing' }))
    .then(function (response) {
      var serverMessage = response.data;
      console.log(serverMessage);
      console.log("inside file upload function");
    });
}
Here's how I call the method:
var file = fs.createReadStream(__dirname + '/' + csvFilePath); // <--make sure this path is correct
console.log(__dirname + '/' + csvFilePath);
uploadFile(file);
And finally the output:
Bot has started!
C:\Users\i502153\WebstormProjects\slackAPIProject/accessLogs/CSV/1548430592860output.csv*
{ ok: false, error: 'invalid_array_arg' }
inside file upload function
What am I doing wrong, and how can I rectify it?
Links:
https://api.slack.com/methods/files.upload
https://www.npmjs.com/package/axios
Your solution won't work because you are attempting to take a stream object (file) and stringify it into a query string, which is just going to insert the nonsense string "[object]" into the query. It won't actually stream data to Slack.
Axios, unfortunately, doesn't work in node exactly like it does in the browser, and their docs can be a little confusing.
I would suggest an approach like this (untested):
const axios = require('axios');
const FormData = require('form-data');

function uploadFile(file) {
  const form = new FormData();
  form.append('token', botToken);
  form.append('channels', 'testing');
  form.append('file', file, 'optionalfilenamehere');
  return axios.post('https://slack.com/api/files.upload', form, {
    headers: form.getHeaders()
  }).then(function (response) {
    var serverMessage = response.data;
    console.log(serverMessage);
    console.log('inside file upload function');
  });
}
I adapted this code from the suggestion in ticket https://github.com/axios/axios/issues/1006#issuecomment-320165427, there may be other helpful comments there as well if you run into issues. Good luck!
EDIT: For people reading this later, for a similar approach using request instead of axios, see related question Slack API (files.upload) using NodeJS.