Uploading a JSON object using s3.putObject uploads the key but not the value - Node.js

I am crafting a simple JSON object and uploading it to DigitalOcean using the s3.putObject function. There are no problems getting it to upload, but when I look at it on DigitalOcean, only the key is present in the JSON object and the value shows {}.
Here is the code that creates the JSON and uploads it:
async function sendErrorData(error) {
    var errorfile = {
        'errorLog': error
    }
    console.log(errorfile)
    const params = {
        Body: JSON.stringify(errorfile),
        Bucket: 'MyBucket',
        Key: 'errors.json',
        ContentType: "application/json"
    };
    await uploadToDO(params)
        .then((data) => console.log(JSON.stringify(data)))
        .catch((err) => console.log(JSON.stringify(err)))
    console.log(errorfile)
}

function uploadToDO(params) {
    return s3.putObject(params).promise()
}
The console logs before and after the upload show the object perfectly fine, but once uploaded the values are missing.
Console log:
{
"errorLog": ReferenceError: ....
}
Uploaded:
{
"errorLog": {}
}

{
"errorLog": ReferenceError: ....
}
is not valid JSON by the looks of things. You are asking AWS to upload the body as an application/json file, but JSON.stringify turns an Error instance into {} because its properties (name, message, stack) are not enumerable, so the value is lost before the upload even happens.
Therefore, when you construct the errorfile, serialize the error yourself:
var errorfile = {
    'errorLog': JSON.stringify(error, Object.getOwnPropertyNames(error))
}
Note: this will save the error as a string and not as a JSON object. If you need it as a JSON object, you'd need to construct it yourself.
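For example, a minimal sketch (not part of the original answer) of constructing a plain, serializable object from the error before uploading; the field names chosen here are just one possibility:
async function sendErrorData(error) {
    // Error properties are not enumerable, so copy the ones you need
    // into a plain object that JSON.stringify can handle.
    var errorfile = {
        errorLog: {
            name: error.name,
            message: error.message,
            stack: error.stack
        }
    }
    const params = {
        Body: JSON.stringify(errorfile),
        Bucket: 'MyBucket',
        Key: 'errors.json',
        ContentType: 'application/json'
    };
    await uploadToDO(params)
}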

You are awaiting the function call uploadToDO(params), but the function uploadToDO is not defined as an async function.
It should be:
async function uploadToDO(params) {
    return s3.putObject(params).promise()
}
Hope this helps.

Related

Send uploaded files with multer to another API from my own typescript API using axios

I'm working on a Node TypeScript REST API with Express. I wrote an uploadFiles middleware using multer that stores all the uploaded files with a diskStorage, and this works fine.
But then, I want to immediately send these same files to another API to process them and extract data.
I am stuck here because I can't manage to send these files with axios in a FormData.
I first tried to access my files on req.files, and then send them with axios like this:
const filesMulter = req.files as Express.Multer.File[];

for await (const cvFile of filesMulter) {
    const formData = new FormData();
    formData.append("file", cvFile);

    const extractRes = await axios.post(
        `${process.env.API_EXTRACT_BASEURL}/cv/`,
        formData,
        {
            headers: {
                "Content-Type": "multipart/form-data",
            },
        }
    );
}
But I got this error in the append():
Argument of type 'File' is not assignable to parameter of type 'string | Blob'.
So I tried to convert the file to a Blob so that it would be accepted by formData.append(), but then I get this axios error:
AxiosError: Data after transformation must be a string, an ArrayBuffer, a Buffer, or a Stream
I tried converting the file to different formats (Buffer, Stream, string...), but I get either the append() error or the Axios error mentioned before.
I even tried to read the file again from disk using:
import fs from "fs";

const getFile = async (filePath: string): Promise<Buffer> => {
    return await new Promise((resolve, reject) => {
        fs.readFile(filePath, (error, data) => {
            if (error != null) {
                reject(error);
            } else {
                resolve(data);
            }
        });
    });
};

export default getFile;
And then:
const cvFileOnDisk = await getFile("/path/to/my/file/hardcoded/example.pdf");
This returns a Buffer, but again I cannot add it to the formData with append()...
Any idea how I could solve this ?
Thanks a lot !
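One common approach, sketched here rather than taken from the post, is to use the form-data npm package together with a read stream from the path multer already wrote to disk, and let it generate the multipart headers. The env variable and the "file" field name are carried over from the snippet above:
import fs from "fs";
import FormData from "form-data"; // the Node form-data package, not the browser FormData
import axios from "axios";

const filesMulter = req.files as Express.Multer.File[];

for (const cvFile of filesMulter) {
    const formData = new FormData();
    // Stream the file from disk; form-data accepts streams, buffers and strings.
    formData.append("file", fs.createReadStream(cvFile.path), cvFile.originalname);

    const extractRes = await axios.post(
        `${process.env.API_EXTRACT_BASEURL}/cv/`,
        formData,
        // getHeaders() supplies the Content-Type with the multipart boundary.
        { headers: formData.getHeaders() }
    );
}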

@aws-sdk/client-lambda - Invoke Lambda - Payload Response in Uint8Array - Convert to String

I am using the @aws-sdk/client-lambda npm package for invoking Lambdas. I have two Lambdas, Lambda A and Lambda B. Lambda A is trying to invoke Lambda B.
Lambda A invokes Lambda B by running the following code:
const { LambdaClient, InvokeCommand } = require('@aws-sdk/client-lambda');

module.exports = {
    getGitHubToken: async () => {
        const client = new LambdaClient({ region: process.env.REGION });
        const params = {
            FunctionName: process.env.GITHUB_TOKEN_FUNCTION,
            LogType: 'Tail',
            Payload: '',
        };
        const command = new InvokeCommand(params);
        try {
            const { Payload } = await client.send(command);
            console.log(Payload);
            return Payload;
        } catch (error) {
            console.error(error.message);
            throw error;
        }
    },
};
The expected response from Lambda B should look like this:
{
    statusCode: 200,
    body: JSON.stringify({
        token: '123',
    }),
};
However, the line console.log(Payload) does not print the expected object.
I looked on the AWS SDK website and it looks like Payload is returned as a Uint8Array. I guess this is because it's from a promise?
I have tried doing Payload.toString(), however that comes back as simply a string of the values in the Uint8Array. For example:
2021-04-13T14:32:04.874Z worker:success Payload: 123,34,115,116,97,116,117,115,67,111,100,101,34,58,50,48,48,44,34,98,111,100,121,34,58,34,123,92,34,116,111,107,101,110,92,34,58,92,34,103,104,115,95,111,114,101,51,65,109,99,122,86,85,74,122,66,52,90,68,104,57,122,122,85,118,119,52,51,50,111,67,71,48,50,75,121,79,69,72,92,34,125,34,125
My question:
How do I convert the data from the Uint8Array into the data I was expecting from the Lambda response, which is a JSON object?
I have confirmed the requested Lambda (Lambda B in this case) is returning the data correctly by going to CloudWatch. Thanks.
Okay, I found a way to get this working.
You have to use a text decoder:
const asciiDecoder = new TextDecoder('ascii');
Then decode the payload like this:
const data = asciiDecoder.decode(Payload);
I have logged an issue on their repository asking why this isn't included in the module. I will post an update on any movement on this.
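For completeness, a minimal sketch (assuming Lambda B returns the shape shown above) that decodes the payload and parses it into a JSON object:
const { Payload } = await client.send(command);

// Decode the Uint8Array into a string ('utf-8' works just as well as 'ascii').
const decoded = new TextDecoder('ascii').decode(Payload);

// Parse the invocation result, then the stringified body inside it.
const result = JSON.parse(decoded);   // { statusCode: 200, body: '{"token":"123"}' }
const body = JSON.parse(result.body); // { token: '123' }
console.log(body.token);              // '123'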

Using HttpService to export as CSV: Unexpected token N in JSON

Currently, I am trying to export a CSV using Node.js as the backend and Angular as the front end.
I know that the error 'Unexpected token N in JSON' means there is a JSON parsing error, which means I need to set {responseType: "blob" as "json"}. But there is a problem, because I cannot include this in the parameters of HttpService, which only accepts HttpParams as parameters.
With respect to that, I have tried these:
1) return this.httpService.get(`/chatbots/get/${id}/download`, {responseType: 'text'}), which returns an error since HttpService only accepts HttpParams as parameters.
2) Changing HttpService from @core/services to HttpClient, but it didn't work since I have another API call inside the current API call which uses HttpService.
3) Changing it to a POST method where I can attach {responseType: "blob" as "json"}, but it didn't work, and anyway it should work as a GET?
Currently, output already shows the CSV text:
router.get('/get/:id/download', passport.authenticate('jwt', {
    session: false
}), (req, res, next) => {
    .....
    console.log(output)
    res.attachment('test.csv')
    res.send(output)
})
In the service:
import { HttpService } from '../../@core/services';

constructor(
    private httpService: HttpService,
) {}

...

downloadCSV(id: string) {
    return this.httpService.get(`/xxxx/get/${id}/download`)
}
In the component:
export() {
    if (this.id) {
        this.chatbotService.downloadChatbot(this.id)
            .subscribe((data: any) => {
                const blob = new Blob([data], { type: 'text/csv' });
                const fileName = `${this.id}-test.csv`;
                saveAs(blob, fileName);
            })
    }
}
Even though the status is 200, it says:
SyntaxError: Unexpected token N in JSON at position 0
    at JSON.parse ()
    at XMLHttpRequest.onLoad (http://localhost:8080/vendor.js:22263:51)
    at ....
By default, a request made with HttpClient is considered to have a JSON response.
To get a CSV, you can force a text response with:
downloadCSV(id: string) {
    return this.httpService.get(`/xxxx/get/${id}/download`, { responseType: 'text' });
}
You can also read this part of the documentation: https://angular.io/api/common/http/HttpRequest#responseType
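If the HttpService wrapper really cannot pass options through, here is a sketch of the equivalent call made directly with Angular's HttpClient for this one endpoint (the injected http field is an assumption, not part of the original code):
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

constructor(private http: HttpClient) {}

downloadCSV(id: string): Observable<string> {
    // responseType: 'text' skips the default JSON parsing,
    // so the CSV body arrives as a plain string.
    return this.http.get(`/xxxx/get/${id}/download`, { responseType: 'text' });
}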

Render raw image bytes to response body

I'm creating an API that makes authorized calls to Google's APIs, specifically Drive for this question. My API is working fine and uses Google's Node API to make the requests. When I fire off a request to this resource, I get back the following response:
{
    "kind": "drive#file",
    "id": "...",
    "name": "bookmobile.jpg",
    "mimeType": "image/jpeg"
}
I use the above response to determine the MIME type of the file I'm to display later. I then make a subsequent call to the same endpoint, but specifying alt=media as an option to download the file, as specified in Google's guide. If I console.log or res.send() the response, the output is the raw image bytes from the API call. How do I render these bytes to the response body properly? My code is as follows:
// DriveController.show
exports.show = async ({ query, params }, res) => {
    if (query.alt && query.alt.toLowerCase().trim() === 'media') {
        // Set to JSON as we need to get the content type of the resource
        query.alt = 'json'

        // Get the Files Resource object
        const options = createOptions(query, params.fileId)
        const filesResource = await Promise.fromCallback(cb => files.get(options, cb))

        // Grab the raw image bytes
        query.alt = 'media'
        await createAPIRequest(createOptions(query, params.fileId), 'get', res, filesResource)
    } else {
        await createAPIRequest(createOptions(query, params.fileId), 'get', res)
    }
}

async function createAPIRequest (options, method, res, filesResource = {}) {
    try {
        const response = await Promise.fromCallback(cb => files[method](options, cb))

        if (filesResource.hasOwnProperty('mimeType')) {
            // Render file resource to body here
        } else {
            res.json(response)
        }
    } catch (error) {
        res.json(error)
    }
}
Searching through various answers here, they all seem to point to the following:
res.type(filesResource.mimeType)
const image = Buffer.from(response, 'binary')
fs.createReadStream(image).pipe(res)
But this kills my Express app with the following error:
Error: Path must be a string without null bytes
How would I go about rendering those raw image bytes to the response body properly?
The Google API client returns binary data as a string by default, which corrupts image data. (The issue is discussed in this thread: https://github.com/google/google-api-nodejs-client/issues/618.) To fix it, pass the encoding: null option when requesting the file contents:
const response = await Promise.fromCallback(cb => files[method](options, { encoding: null }, cb))
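With the contents now arriving as a Buffer, here is a minimal sketch of the "Render file resource to body here" branch (this assumes response is that Buffer and that filesResource.mimeType is set, as in the question's code):
if (filesResource.hasOwnProperty('mimeType')) {
    // Set the Content-Type from the earlier metadata call, then send the raw bytes.
    res.type(filesResource.mimeType)
    res.send(response)
} else {
    res.json(response)
}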

Upload data to S3 via POST using AWS SDK for Node.js/Restify

I'm trying to figure out how to upload data to an Amazon S3 bucket via a RESTful API that I'm writing in Node.js/Restify. I think I've got the basic concepts working, but when I go to connect the body of my POST request to S3, that's when things go awry. When I set up my callback function to simply pass a string to S3, it works just fine and the file is created in the appropriate S3 bucket:
function postPoint(req, res, next) {
    var point = [
        { "x": "0.12" },
        { "y": "0.32" }
    ];

    var params = { Bucket: 'myBucket', Key: 'myKey', Body: JSON.stringify(point) };

    s3.client.putObject(params, function (perr, pres) {
        if (perr) {
            console.log("Error uploading data: ", perr);
        } else {
            console.log("Successfully uploaded data to myBucket/myKey");
        }
    });

    res.send(200);
    return next();
}

server.post('/point', postPoint);
Obviously, I eventually need to stream/pipe the data from the body of the request. I assumed all I needed to do was switch the Body of the params to the request stream:
function postPoint(req, res, next) {
    var params = { Bucket: 'myBucket', Key: 'myKey', Body: req };

    s3.client.putObject(params, function (perr, pres) {
        if (perr) {
            console.log("Error uploading data: ", perr);
        } else {
            console.log("Successfully uploaded data to myBucket/myKey");
        }
    });

    res.send(200);
    return next();
}
But that ends up logging the following message: "Error uploading data: [TypeError: path must be a string]", which gives me very little indication of what I need to do to fix the error. Ultimately, I want to be able to pipe the data, since it could be quite large (I'm not sure whether the previous examples cause the body to be stored in memory), so I thought something like this might work:
function postPoint(req, res, next) {
    var params = { Bucket: 'myBucket', Key: 'myKey', Body: req };

    req.pipe(s3.client.putObject(params));

    res.send(200);
    return next();
}
I've done something similar in a GET handler that works just fine: s3.client.getObject(params).createReadStream().pipe(res);. But that also did not work here.
I'm at a bit of a loss at this point so any guidance would be greatly appreciated!
So, I finally discovered the answer after posting on the AWS Developer Forums. It turns out that the Content-Length header was missing from my S3 requests. Loren@AWS summed it up very well:
In order to upload any object to S3, you need to provide a Content-Length. Typically, the SDK can infer the contents from Buffer and String data (or any object with a .length property), and we have special detections for file streams to get file length. Unfortunately, there's no way the SDK can figure out the length of an arbitrary stream, so if you pass something like an HTTP stream, you will need to manually provide the content length yourself.
The suggested solution was to simply pass the content length from the headers of the http.IncomingMessage object:
var params = {
    Bucket: 'bucket', Key: 'key', Body: req,
    ContentLength: parseInt(req.headers['content-length'], 10)
};
s3.putObject(params, ...);
If anyone is interested in reading the entire thread, you can access it here.
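Putting it together, a sketch of the handler from the question with the suggested ContentLength added (same bucket and key names as above; the error handling here is just one reasonable choice):
function postPoint(req, res, next) {
    var params = {
        Bucket: 'myBucket',
        Key: 'myKey',
        Body: req, // the incoming request stream
        // The SDK cannot infer the length of an arbitrary stream,
        // so pass it along from the request headers.
        ContentLength: parseInt(req.headers['content-length'], 10)
    };

    s3.client.putObject(params, function (perr, pres) {
        if (perr) {
            console.log("Error uploading data: ", perr);
            return next(perr);
        }
        res.send(200);
        return next();
    });
}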
