I want to call a third-party API (to upload an image) from the Node side; the API expects a File-type object under the key file.
The front end is in Angular, so the flow is:
.ts
const _file: File = __userAvatar.files[0];
const _userAvatarInfo = { userId: this.user.id, avatar: _file };
this.userService.updateUserAvatar(_userAvatarInfo).subscribe(
UserService.ts
const _formData = new FormData();
_formData.append("avatar", _userAvatarInfo.avatar);
_formData.append("userId", _userAvatarInfo.userId);
return this.http.post(`${this.context}/userservice/user/updateuseravatar`, _formData);
Node API layer using giuseppe
#Post("/user/updateuseravatar")
updateUserAvatar(#Req() req: any): Promise<any> {
return TrusteeFacade.uploadResource({ resourceId: "some_id", resource: req.files.avatar });
}
Facade Layer
static uploadResource(__resourceInfo: any): Promise<any> {
const _resourceData = new FormData();
_resourceData.append("mimetype", "image/png");
_resourceData.append("file", __resourceInfo.resource);
// this never gets printed, because the append above throws
console.log("From**************", __resourceInfo.resource);
return axios({
method: "post",
url: `${process.env.REST_URL}/resources/${__resourceInfo.resourceId}`,
headers: _resourceData.getHeaders(),
data: _resourceData
});
}
At the facade layer it is showing:
TypeError: source.on is not a function
at Function.DelayedStream.create (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\delayed-stream\lib\delayed_stream.js:33:10)
at FormData.CombinedStream.append (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\combined-stream\lib\combined_stream.js:44:37)
at FormData.append (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\form-data\lib\form_data.js:74:3)
at Function.uploadResource (D:\QPP Workspace\ContentPlatform\webapplications\application-services\.bin\facade\trustee-facade.js:221:23)
at trustee_facade_1.TrusteeFacade.getFileResourceId.then (D:\QPP Workspace\ContentPlatform\webapplications\application-services\.bin\api\user-service.js:118:51)
at propagateAslWrapper (D:\QPP Workspace\ContentPlatform\webapplications\application-services\node_modules\async-l
__resourceInfo has the correct info at the facade layer, so is creating the FormData from it the cause of the error?
This is how I handled it at the facade layer. Instead of this:
_resourceData.append("file", __resourceInfo.resource);
I created the file field using Buffer.from:
_resourceData.append("file", Buffer.from(__resourceInfo.resource.data), { filename: __resourceInfo.resource.name });
There may be other solutions, but this solved my problem.
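If the upload middleware also exposes the file's MIME type (express-fileupload, for example, gives you name, data and mimetype on each parsed file — an assumption here, since the question doesn't name the middleware), it can be forwarded the same way instead of the hard-coded "image/png". A sketch:
// Sketch only; assumes the parsed upload exposes { name, data, mimetype }
_resourceData.append("file", Buffer.from(__resourceInfo.resource.data), {
  filename: __resourceInfo.resource.name,
  contentType: __resourceInfo.resource.mimetype
});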
Related
I'm working on a Node TypeScript REST API with Express. I wrote an uploadFiles middleware using multer that stores all the uploaded files with diskStorage, and this works fine.
But then I want to immediately send these same files to another API to process them and extract data.
I am stuck here because I can't manage to send these files with axios in a FormData.
I first tried to access my files on req.files and then send them with axios like this:
const filesMulter = req.files as Express.Multer.File[];
for await (const cvFile of filesMulter) {
const formData = new FormData();
formData.append("file", cvFile);
const extractRes = await axios.post(
`${process.env.API_EXTRACT_BASEURL}/cv/`,
formData,
{
headers: {
"Content-Type": "multipart/form-data",
},
}
);
}
But I got this error in the append():
Argument of type 'File' is not assignable to parameter of type 'string | Blob'.
So I tried to convert the file to a Blob so that it is accepted by formData.append(), but then I get this axios error:
AxiosError: Data after transformation must be a string, an ArrayBuffer, a Buffer, or a Stream
I tried to convert the file to different formats (Buffer, Stream, string...) but I get the append() error or the axios error mentioned before...
I even tried to read the file again from disk using:
import fs from "fs";
const getFile = async (filePath: string): Promise<Buffer> => {
return await new Promise((resolve, reject) => {
fs.readFile(filePath, (error, data) => {
if (error != null) {
reject(error);
} else {
resolve(data);
}
});
});
};
export default getFile;
And then :
const cvFileOnDisk = await getFile("/path/to/my/file/hardcoded/example.pdf");
This returns a Buffer, but again I cannot add it to the FormData with append()...
Any idea how I could solve this?
Thanks a lot!
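A sketch that might help, following the same pattern as the other answers on this page: use the form-data npm package (not the WHATWG FormData), which accepts a Buffer or a stream together with a filename, and send its getHeaders() with axios. This assumes multer's diskStorage, so each file exposes path and originalname:
import axios from "axios";
import * as FormData from "form-data"; // the npm "form-data" package
import * as fs from "fs";

// Inside the route handler, after the multer middleware has run:
const filesMulter = req.files as Express.Multer.File[];
for (const cvFile of filesMulter) {
  const formData = new FormData();
  // Stream the stored file from disk and keep its original name so the receiving API sees a file part.
  formData.append("file", fs.createReadStream(cvFile.path), cvFile.originalname);
  const extractRes = await axios.post(`${process.env.API_EXTRACT_BASEURL}/cv/`, formData, {
    // getHeaders() supplies the multipart/form-data Content-Type with the boundary.
    headers: formData.getHeaders(),
  });
}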
I have a controller in my NestJS service S1 which accepts a file using FileInterceptor; the function signature looks like this:
@UseInterceptors(FileInterceptor('file'))
@Post('uploads')
async uploadAttachment(@UploadedFile() file, @Query() queryParams: {filename: string}, @Res() res: Response, @Headers() header, @Req() req)
{
//Some logic here
}
Now I want to use this API to upload an image from a different service S2, but I only have the image's publicly accessible URL.
I am trying it like this:
import * as FormData from 'form-data';
import fetch from "node-fetch";
import { Readable } from "stream";

// imageUrl is the image's publicly accessible URL
let serv = await fetch(imageUrl);
let img = await serv.buffer();
let url = "http://localhost:8000/api/v1/ticketing/uploads?filename=checking.jpg";
const form = new FormData();
form.append("file", Readable.from(img));
return firstValueFrom(this.httpService.post(url, form, {headers: {...header, ...form.getHeaders()}}).pipe(
  map((resp) => resp.data),
  catchError((_err) => {
    throw new HttpException(
      {success: false, error: {message: "failed"}, payload: _err.response.data},
      HttpStatus.BAD_REQUEST,
    );
  })
));
But this is not working; apparently the file is undefined in the controller of service S1. Could someone please help me out with this? I am new to Node.js.
Also apologies for the poor editing.
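One thing worth checking, going by the other answers on this page: the part is appended without a filename, so multer's FileInterceptor on S1 may never treat it as a file and @UploadedFile stays undefined. A sketch of the append with the buffer and an explicit filename (an assumption, not a confirmed fix; the name is taken from the question's query string):
// Sketch: append the downloaded Buffer and give form-data a filename, so the multipart part
// carries a Content-Disposition filename that multer recognises as a file.
form.append("file", img, { filename: "checking.jpg" });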
I am using the @aws-sdk/client-lambda npm package for invoking Lambdas. I have two Lambdas, Lambda A and Lambda B. Lambda A is trying to invoke Lambda B.
Lambda A invokes Lambda B by running the following code:
const { LambdaClient, InvokeCommand } = require('@aws-sdk/client-lambda');
module.exports = {
getGitHubToken: async () => {
const client = new LambdaClient({ region: process.env.REGION });
const params = {
FunctionName: process.env.GITHUB_TOKEN_FUNCTION,
LogType: 'Tail',
Payload: '',
};
const command = new InvokeCommand(params);
try {
const { Payload } = await client.send(command);
console.log(Payload);
return Payload;
} catch (error) {
console.error(error.message);
throw error;
}
},
};
The expected response from Lambda B should look like this:
{
statusCode: 200,
body: JSON.stringify({
token: '123',
}),
};
However, Payload looks to be returning this from the line console.log(Payload):
I looked on the AWS SDK website and it looks like Payload returns a Uint8Array. I guess this is because it's from a promise?
I have tried doing Payload.toString(), however that comes back as simply a string of the values in the Uint8Array. Example being:
2021-04-13T14:32:04.874Z worker:success Payload: 123,34,115,116,97,116,117,115,67,111,100,101,34,58,50,48,48,44,34,98,111,100,121,34,58,34,123,92,34,116,111,107,101,110,92,34,58,92,34,103,104,115,95,111,114,101,51,65,109,99,122,86,85,74,122,66,52,90,68,104,57,122,122,85,118,119,52,51,50,111,67,71,48,50,75,121,79,69,72,92,34,125,34,125
My Question:
How do I convert the Uint8Array into the data I was expecting from the Lambda response, which is a JSON object?
I have confirmed the requested Lambda (Lambda B in this case) is returning the data correctly by going to CloudWatch. Thanks.
Okay, I found a way to get this working.
You have to use a text decoder:
const asciiDecoder = new TextDecoder('ascii');
Then decode it so it looks like this:
const data = asciiDecoder.decode(Payload);
I have logged an issue on their repository asking why this isn't included in the module. I will post an update on any movement on this.
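For reference, once the payload is decoded it can be parsed back into the object the question expects (a sketch; the inner JSON.parse assumes body is the stringified object shown above):
// Sketch: Uint8Array -> string -> object.
const decoded = new TextDecoder('utf-8').decode(Payload);
const response = JSON.parse(decoded);    // { statusCode: 200, body: '{"token":"123"}' }
const body = JSON.parse(response.body);  // { token: '123' }
console.log(body.token);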
I need to upload a file to an API gateway. After adding some meta information, the file should be sent to another (micro)service (as Content-Type: multipart/form-data). I am having some problems building a FormData object within the API gateway. I do not want to persist the file on the gateway, so I am basically just trying to pass it through.
For creating the FormData object, I am using form-data.
This is what I tried:
// Controller
@Post()
@UseInterceptors(FileInterceptor('file'))
async create(@Res() res, @UploadedFile('file') file, @Body() body: any) {
return await this.someService.create(file);
}
// Service
async create(file: any) {
const formData = new FormData();
formData.append('file', file);
formData.append('key', 'value');
const formHeaders = formData.getHeaders();
try {
const result = await this.httpService
.post('http://some-other-service/import', formData, {
headers: {
...formHeaders,
},
})
.toPromise();
return result.data;
} catch (e) {
throw new BadGatewayException();
}
}
This results in the following error:
TypeError: source.on is not a function
at Function.DelayedStream.create (/usr/app/node_modules/delayed-stream/lib/delayed_stream.js:33:10)
at FormData.CombinedStream.append (/usr/app/node_modules/combined-stream/lib/combined_stream.js:44:37)
at FormData.append (/usr/app/node_modules/form-data/lib/form_data.js:74:3)
at ImportService.<anonymous> (/usr/app/src/import/import.service.ts:47:18)
This question is a bit old, but someone might benefit from this solution.
The problem is that you are passing the whole @UploadedFile object to the formData.append method. The @UploadedFile object contains the data from the file, but also mimetype, size, fieldname ('file' in this case), originalname (the original file name), etc.
You need to pass the actual contents of the file you are trying to upload to the formData.append method.
So to make it work, use
formData.append('file', file.buffer);
//OR
formData.append('file', file.buffer, file.originalname);
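Putting it together, a minimal sketch of the service method with that change (the contentType option is an extra assumption, using the mimetype carried on the @UploadedFile object):
// Sketch only: forward the uploaded file's buffer with its original name and MIME type.
async create(file: any) {
  const formData = new FormData();
  formData.append('file', file.buffer, {
    filename: file.originalname,
    contentType: file.mimetype, // assumption: optional, keeps the original MIME type
  });
  formData.append('key', 'value');
  const result = await this.httpService
    .post('http://some-other-service/import', formData, { headers: formData.getHeaders() })
    .toPromise();
  return result.data;
}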
I'm new to Node.js. What I'm trying to do is stream the upload of a file from the web browser to cloud storage through my Node.js server.
I'm using 'express', 'request' and 'busboy' modules.
var express = require("express");
var request = require("request");
var BusBoy = require("busboy");
var router = express.Router();
router.post("/upload", function(req, res, next) {
var busboy = new BusBoy({ headers: req.headers });
var json = {};
busboy.on("file", function (fieldname, file, filename, encoding, mimetype) {
file.on("data", function(data) {
console.log(`streamed ${data.length}`);
});
file.on("end", function() {
console.log(`finished streaming ${filename}`);
});
var r = request({
url: "http://<my_cloud_storage_api_url>",
method: "POST",
headers: {
"CUSTOM-HEADER": "Hello",
},
formData: {
"upload": file
}
}, function(err, httpResponse, body) {
console.log("uploaded");
json.response = body;
});
});
busboy.on("field", function(name, val) {
console.log(`name: ${name}, value: ${val}`);
});
busboy.on("finish", function() {
res.send(json);
});
req.pipe(busboy);
});
module.exports = router;
But I keep getting the following error on the server. What am I doing wrong here? Any help is appreciated.
Error: Part terminated early due to unexpected end of multipart data
at node_modules\busboy\node_modules\dicer\lib\Dicer.js:65:36
at nextTickCallbackWith0Args (node.js:420:9)
at process._tickCallback (node.js:349:13)
I realize this question is some 7 months old, but I shall answer it here in an attempt to help anyone else currently banging their head against this.
You have two options, really: Add the file size, or use something other than Request.
Note: I edited this shortly after first posting it to hopefully provide a bit more context.
Using Something Else
There are some alternatives you can use instead of Request if you don't need all the baked-in features it has.
form-data can be used by itself in simple cases (see the sketch after this list), or it can be used with, say, got; request uses it internally.
bhttp advertises Streams2+ support, although in my experience Streams2+ support has not been an issue for me. There is no built-in https support; you have to specify a custom agent.
got is another slimmed-down one. It doesn't have any special handling of form data like request does, but it is trivially used with form-data or form-data2. I had trouble getting it working over a corporate proxy, though that's likely because I'm a networking newb.
needle seems pretty lightweight, but I haven't actually tried it.
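For the simple case, form-data can post the multipart body on its own via its submit() helper (a minimal sketch, assuming a file on disk and the same placeholder upload URL as above):
// Sketch: let form-data build and stream the multipart request itself.
var FormData = require("form-data");
var fs = require("fs");

var form = new FormData();
form.append("upload", fs.createReadStream("/path/to/example.png"));
form.submit("http://<my_cloud_storage_api_url>", function(err, res) {
  if (err) return console.error(err);
  console.log("uploaded, status:", res.statusCode);
  res.resume(); // drain the response
});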
Using Request: Add the File Size
Request does not (as of writing) have any support for transfer-encoding: chunked, so to upload files with it you need to supply the file's size along with the file. If you're uploading from a web client, that means the client needs to send the file size to your server in addition to the file itself.
The way I came up with to do this is to send the file metadata in its own field before the file field.
I modified your example with comments describing what I did. Note that I did not include any validation of the data received, but I recommend you do add that.
var express = require("express");
var request = require("request");
var BusBoy = require("busboy");
var router = express.Router();
router.post("/upload", function(req, res, next) {
var busboy = new BusBoy({ headers: req.headers });
var json = {};
// Use this to cache any fields which are file metadata.
var fileMetas = {};
busboy.on("file", function (fieldname, file, filename, encoding, mimetype) {
// Be sure to match this prop name here with the pattern you use to detect meta fields.
var meta = fileMetas[fieldname + '.meta'];
if (!meta) {
// Make sure to dump the file.
file.resume();
// Then, do some sort of error handling here, because you cannot upload a file
// without knowing its length.
return;
}
file.on("data", function(data) {
console.log(`streamed ${data.length}`);
});
file.on("end", function() {
console.log(`finished streaming ${filename}`);
});
var r = request({
url: "http://<my_cloud_storage_api_url>",
method: "POST",
headers: {
"CUSTOM-HEADER": "Hello",
},
formData: {
// value + options form of a formData field.
"upload": {
value: file,
options: {
filename: meta.name,
knownLength: meta.size
}
}
}
}, function(err, httpResponse, body) {
console.log("uploaded");
json.response = body;
});
});
busboy.on("field", function(name, val) {
// Use whatever pattern you want. I used (fileFieldName + ".meta").
// Another good one might be ("meta:" + fileFieldName).
if (/\.meta$/.test(name)) {
// I send an object with { name, size, type, lastModified },
// which are just the public props pulled off a File object.
// Note: Should probably add error handling if val is somehow not parsable.
fileMetas[name] = JSON.parse(val);
console.log(`file metadata: name: ${name}, value: ${val}`);
return;
}
// Otherwise, process field as normal.
console.log(`name: ${name}, value: ${val}`);
});
busboy.on("finish", function() {
res.send(json);
});
req.pipe(busboy);
});
module.exports = router;
On the client, you then need to send the metadata in the so-named field before the file itself. This can be done by placing an <input type="hidden"> control before the file input and updating its value onchange; the order of values sent is guaranteed to follow the order in which the inputs appear. If you're building the request body yourself using FormData, you can do this by appending the appropriate metadata before appending the File.
Example with <form>
<script>
function extractFileMeta(file) {
return JSON.stringify({
size: file.size,
name: file.name,
type: file.type,
lastModified: file.lastModified
});
}
function onFileUploadChange(event) {
// change this to use arrays if using the multiple attribute on the file input.
var file = event.target.files[0];
var fileMetaInput = document.querySelector('input[name="fileUpload.meta"]');
if (fileMetaInput) {
fileMetaInput.value = extractFileMeta(file);
}
}
</script>
<form action="/upload-to-cloud">
<input type="hidden" name="fileUpload.meta">
<input type="file" name="fileUpload" onchange="onFileUploadChange(event)">
</form>
Example with FormData:
function onSubmit(event) {
event.preventDefault();
var form = document.getElementById('my-upload-form');
var formData = new FormData();
// Grab the actual File object from the file input, not the input element itself.
var fileUpload = form.elements['fileUpload'].files[0];
var fileUploadMeta = JSON.stringify({
size: fileUpload.size,
name: fileUpload.name,
type: fileUpload.type,
lastModified: fileUpload.lastModified
});
// Append fileUploadMeta BEFORE fileUpload.
formData.append('fileUpload.meta', fileUploadMeta);
formData.append('fileUpload', fileUpload);
// Do whatever you do to POST here.
}
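For what it's worth, the POST itself can then be as simple as this sketch (same /upload-to-cloud path as the form example; don't set Content-Type yourself, the browser adds the multipart boundary):
// Sketch: send the FormData built above.
fetch('/upload-to-cloud', { method: 'POST', body: formData })
  .then(function(res) { return res.json(); })
  .then(function(json) { console.log('server response:', json); });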