I am uploading a single image (upload.single('file') is the Multer middleware) using the Koa framework for Node.js. Everything works on localhost but not in the prod environment. Here is the request path for each environment:
Prod env: Client -> API Gateway -> NLB -> service -> S3 bucket.
Localhost env: Client -> service -> S3 bucket.
The problem is that the image's size increases by the time it reaches the service, before it is uploaded to the S3 bucket, and the uploaded image is distorted (237 KB becomes 418 KB). On localhost everything works fine. I used Postman as the client to upload the image.
async create(ctx: Context, values: any): Promise<any> {
  // The image size has already increased by this point, before sending to S3
  console.log('REQUEST----', values.file)
  const params = {
    Bucket: env.Bucket,
    Key: `images/${values.file.originalname}`,
    Body: values.file.buffer,
    ACL: 'public-read',
    ContentType: values.file.mimetype,
    ContentLength: values.file.size,
  }
  try {
    const data = await s3.putObject(params).promise();
    console.log('Successfully uploaded file.', data);
    return true;
  } catch (e) {
    ctx.throw(HttpStatusCodes.InternalServerError, {
      message: `Something went wrong when updating advertisement: ${e}`,
    });
  }
}
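One way to narrow down where the bytes change (a minimal debugging sketch, not from the original post) is to log the length and an MD5 hash of the buffer the service actually received, and compare them against the size and md5sum of the local file sent from Postman:

const crypto = require('crypto');

// If the size or hash differs from the local file, the bytes were
// altered somewhere between the client and the service
const hash = crypto.createHash('md5').update(values.file.buffer).digest('hex');
console.log(`received ${values.file.buffer.length} bytes, md5=${hash}`);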
Related
I've created an API to upload images to an Amazon S3 bucket with Node.js, multer, and multer-s3.
It works fine in development, returning a response with an image URL that is downloadable and accessible, but when I host my Node app on AWS Lambda (API Gateway) the API returns the same kind of response with an image URL; this time, when I open the downloaded image, it shows INVALID FILE FORMAT.
Here is my code:
const uploadImage = async (req, res) => {
  let myFile = req.file.originalname.split(".")
  const fileType = myFile[myFile.length - 1]
  const params = {
    Bucket: 'test-bucket2601',
    Key: `${Date.now()}.${fileType}`,
    Body: req.file.buffer
  }
  s3.upload(params, (error, data) => {
    if (error) {
      // Return here so the success response below is not also sent
      return res.status(500).send(error)
    }
    res.json({ data })
  })
}
Route for the upload middleware:
routes.route('/upload').post(upload, uploadImage);
In the post method, the first argument is the upload middleware.
Middleware code:
const s3 = new aws.S3({
  credentials: {
    accessKeyId: awsKeys?.accessKeyId,
    secretAccessKey: awsKeys?.secretAccessKey
  }
});

// memoryStorage keeps the file in a Buffer; unlike diskStorage it takes
// no destination option, so the original destination callback was ignored
const storage = multer.memoryStorage()

const upload = multer({ storage }).single('imageUrl')
I am trying to upload a locally stored image from my Node.js project's file structure to my AWS S3 bucket using the aws-sdk package. The upload succeeds, but the uploaded image is only a partially rendered version: only the top 1% (12 KB) of it is visible when I view the URL AWS created for the image. I've logged the file to the console and made sure it was what I thought it was, and it is. But for some reason when I upload it to S3, it's a truncated / cut-off version of the image.
All of the tutorials seem pretty straightforward, but nobody seems to mention this problem. I've been grappling with it for hours and nothing seems to work. I've tried everything I can find online, like:
Using fs.createReadStream(fileName) instead of just the file buffer but that didn't work (from Image file cut off when uploading to AWS S3 bucket via Django and Boto3)
Converting the buffer to base64 string and sending it that way
Adding the ContentLength param
Adding the ContentType to be the exact type of the image
Here's the relevant code:
const aws = require("aws-sdk")
const fs = require("fs") // this require was missing from the snippet
const { infoLogger } = require("./logger")

async function uploadCoverImage() {
  try {
    aws.config.update({
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      region: "us-east-2",
    })
    const s3 = new aws.S3()
    fs.readFile("cover.jpg", (error, image) => {
      if (error) throw error
      const params = {
        Bucket: process.env.BUCKET_NAME,
        Key: "cover.jpg",
        Body: image,
        ACL: "public-read",
        ContentType: "image/jpeg", // "image/jpg" is not a registered MIME type
      }
      s3.upload(params, (error, res) => {
        if (error) throw error
        console.log(`${JSON.stringify(res)}`)
      })
    })
  } catch (error) {
    infoLogger.error(`Error reading cover file: ${JSON.stringify(error)}`)
  }
}

module.exports = uploadCoverImage
I found out that a different part of my codebase was downloading the image via fs.createReadStream(), and the upload started before that download had finished, which is why the image was only partially loaded in S3. I never noticed because I only ever saw the fully loaded image in my local file system.
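For anyone hitting the same race, a minimal sketch of the fix (the downloadCover name and the URL-fetching details are assumptions, not from the original code) is to wait for the destination stream's finish event before uploading:

const fs = require("fs")
const https = require("https")

// Resolve only once the file has been fully flushed to disk,
// so the S3 upload never reads a half-written file
function downloadCover(url) {
  return new Promise((resolve, reject) => {
    const out = fs.createWriteStream("cover.jpg")
    https.get(url, (res) => res.pipe(out))
    out.on("finish", resolve)
    out.on("error", reject)
  })
}

// await downloadCover(coverUrl) before calling uploadCoverImage()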
I am trying to upload an image through a React app to an S3 bucket and then receive back the URL and show the image on the screen.
I am able to upload the image (sort of) and get the URL back from the S3 server, but when I download it I am unable to open it: the format is unsupported, and I can't use the img tag to show it on the webpage. I guess it has something to do with the conversion to base64, but I can't figure out why it is not working.
The frontend (React) is:
const uploadImageToBucket = async (image) => {
  console.log("fff", image)
  let image_location
  try {
    // Await the request; the original never awaited, so image_location
    // was a pending promise rather than the URL
    const response = await axios.post("http://localhost:5000/user/blogmanage/uploadimage", image)
    image_location = response.data.body
    console.log("img loc", image_location)
    return image_location
  } catch (error) {
    console.log(error)
  }
}
The backend (Node.js) is:
router.post("/blogmanage/uploadimage", async (req, res) => {
  const s3 = new AWS.S3({
    accessKeyId: process.env["AWS_ACCESS_KEY_ID"],
    secretAccessKey: process.env["AWS_SECRET_KEY"],
    region: process.env['AWS_REGION']
  });
  const BUCKET_NAME = "mrandmrseatmedia";
  // This line is the culprit (see the answer below): req.body is not a
  // base64 string here, so the decoded buffer is corrupt
  var base64data = new Buffer.from('binary', req.body);
  const params = {
    Bucket: BUCKET_NAME,
    Key: "test/test2.jpg",
    Body: base64data
  }
  s3.upload(params, function (err, data) {
    if (err) {
      console.log(err)
      res.status(404).json({ msg: err });
    } else {
      const image_location = `${data.Location}`;
      console.log(`File uploaded successfully. ${data.Location}`);
      res.status(200).json({ body: image_location });
    }
  })
});
Thanks!
After a lot of testing, retesting, and rewriting, using this repo as an example:
https://github.com/Jerga99/bwm-ng/blob/master/server/services/image-upload.js
it works.
The use of base64 was wrong in this case; it corrupts the file in some way. The multer library fixes it.
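For reference, a minimal sketch of the multer + multer-s3 approach that the repo above uses (the bucket name, key prefix, and 'image' field name are placeholders, not from the original answer):

const AWS = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

const s3 = new AWS.S3();

// multer-s3 streams the raw multipart file straight to S3,
// so no manual base64 decoding is needed
const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'mrandmrseatmedia', // placeholder bucket
    contentType: multerS3.AUTO_CONTENT_TYPE,
    key: (req, file, cb) => cb(null, `test/${Date.now()}_${file.originalname}`)
  })
});

// After the middleware runs, req.file.location holds the S3 URL
router.post("/blogmanage/uploadimage", upload.single('image'), (req, res) => {
  res.status(200).json({ body: req.file.location });
});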
Good evening.
I have this task: I have to upload an image to the S3 bucket using Node.js and generate a thumbnail on the go, not by using a Lambda trigger. Everything should be done from my local machine's terminal or a local server (Postman). I tried this code:
const AWS = require('aws-sdk'); // this require was missing from the snippet
const fs = require('fs');

const ACESS_ID = 'A**********KV';
const SECRET_ID = 'G***********0';
const BUCKET_NAME = 'node-image-bucket';

// Initializing s3 interface
const s3 = new AWS.S3({
  accessKeyId: ACESS_ID,
  secretAccessKey: SECRET_ID,
});
// File reading function to S3
const uploadFile = (fileName) => {
  // Read content from the file
  const fileContent = fs.readFileSync(fileName);
  // Setting up S3 upload parameters
  const params = {
    Bucket: BUCKET_NAME,
    Key: 'scene2.jpg',
    Body: fileContent
  };
  // Uploading files to the bucket
  s3.upload(params, function(err, data) {
    if (err) {
      throw err;
    }
    console.log(data);
    console.log(`File uploaded Successfully. ${data.Location}`);
  });
};

uploadFile('./images/bg-hd.jpg');
The above code works fine for a single image; the problem is that every time I upload a file to the S3 bucket I need to change the Key string value in the S3 params.
I want to upload multiple images at once, using a buffer for performance, and it should automatically create thumbnails in a different folder of the same bucket.
Could anyone help me out? Any help is appreciated!
You cannot upload multiple files with one S3 operation, but you can use the sharp module (https://www.npmjs.com/package/sharp) to resize your image before calling the S3 API.
const sharp = require('sharp');

async function resize(buffer, width, height) {
  return sharp(buffer).resize(width, height).toBuffer();
}

const thumbnailWidth = 200;
const thumbnailHeight = 200;
const thumbnailImage = await resize(fileContent, thumbnailWidth, thumbnailHeight)
You can then reuse your current upload function and run it once per image size you need, with different keys, and wrap those calls in Promise.all so the operation fails if any of the uploads fails:
await Promise.all([
  s3upload(image, imageKey),
  s3upload(thumbnailImage, thumbnailImageKey)
])
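The snippet above assumes a promise-returning wrapper around the asker's upload logic; a minimal sketch of such a helper (the s3upload name comes from the snippet, the body is an assumption):

// Wrap s3.upload in a promise so calls can be combined with Promise.all
function s3upload(body, key) {
  return s3.upload({
    Bucket: BUCKET_NAME,
    Key: key,
    Body: body
  }).promise();
}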
So, there are two parts to your question -
Converting the image to a thumbnail on the fly while uploading to the S3 bucket -
You can use the thumbd npm module and create a thumbd server.
Thumbd is an image thumbnailing server built on top of Node.js, SQS, S3, and ImageMagick.
Prerequisites for the thumbd server -
Thumbd requires the following environment variables to be set:
AWS_KEY the key for your AWS account (the IAM user must have access to the appropriate SQS and S3 resources).
AWS_SECRET the AWS secret key.
BUCKET the bucket to download the original images from. The thumbnails will also be placed in this bucket.
AWS_REGION the AWS Region of the bucket. Defaults to: us-east-1.
CONVERT_COMMAND the ImageMagick convert command. Defaults to convert.
REQUEST_TIMEOUT how long to wait in milliseconds before aborting a remote request. Defaults to 15000.
S3_ACL the acl to set on the uploaded images. Must be one of private, or public-read. Defaults to private.
S3_STORAGE_CLASS the storage class for the uploaded images. Must be either STANDARD or REDUCED_REDUNDANCY. Defaults to STANDARD.
SQS_QUEUE the queue name to listen for image thumbnailing.
When running locally, I set these environment variables in a .env file and execute thumbd using pm2/forever/foreman.
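For example, a .env sketch with placeholder values (the variable names come from the list above; the values are assumptions):

AWS_KEY=AKIA...placeholder
AWS_SECRET=placeholder-secret
BUCKET=node-image-bucket
AWS_REGION=us-west-2
CONVERT_COMMAND=convert
S3_ACL=public-read
SQS_QUEUE=ThumbnailCreator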
Setup -
apt-get install imagemagick
npm install thumbd -g
thumbd install
thumbd start // Run thumbd as a server
After the thumbd server is up and running, refer to the code below to convert an image to a thumbnail while uploading it to the S3 bucket.
var aws = require('aws-sdk');
var url = require("url");

var awsS3Config = {
  accessKeyId: ACESS_ID,
  secretAccessKey: SECRET_ID,
  region: 'us-west-2'
}

var BUCKET_NAME = 'node-image-bucket';
var sourceImageDirectory = "/tmp/"
var imageUploadDir = "/thumbnails/"
var imageName = 'image.jpg'
var uploadImageName = 'image.jpg'

aws.config.update(awsS3Config);
var s3 = new aws.S3();

var Client = require('thumbd').Client,
  client = new Client({
    awsKey: awsS3Config.accessKeyId,
    awsSecret: awsS3Config.secretAccessKey,
    s3Bucket: BUCKET_NAME,
    sqsQueue: 'ThumbnailCreator',
    awsRegion: awsS3Config.region,
    s3Acl: 'public-read'
  });

export function uploadAndResize(sourceImageDirectory, imageName, imageUploadDir, uploadImageName) {
  return new Promise((resolve, reject) => {
    client.upload(sourceImageDirectory + imageName, imageUploadDir + uploadImageName, function(err) {
      if (err) {
        reject(err);
      } else {
        client.thumbnail(imageUploadDir + uploadImageName, [{
          "suffix": "medium",
          "width": 360,
          "height": 360,
          "background": "white",
          "strategy": "%(command)s %(localPaths[0])s -resize %(width)sX%(height)s^ -gravity north -extent %(width)sX%(height)s %(convertedPath)s"
        }, {
          "suffix": "thumb",
          "width": 100,
          "height": 100,
          "background": "white",
          "strategy": "%(command)s %(localPaths[0])s -resize %(width)sX%(height)s^ -gravity north -extent %(width)sX%(height)s %(convertedPath)s"
        }], {
          //notify: 'https://callback.example.com'
        });
        var response = {};
        //https://s3-ap-us-west-2.amazonaws.com/node-image-bucket/1/5825c7d0-127f-4dac-b802-ca24efba2bcd-original.jpeg
        response.url = 'https://s3-' + awsS3Config.region + '.amazonaws.com/' + BUCKET_NAME + '/' + imageUploadDir;
        response.uploadImageName = uploadImageName;
        response.sourceImageName = imageName;
        resolve(response);
      }
    })
  })
} // this closing brace was missing from the original snippet
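A hypothetical call, reusing the variables defined above:

uploadAndResize(sourceImageDirectory, imageName, imageUploadDir, uploadImageName)
  .then(response => console.log('Queued for thumbnailing:', response.url))
  .catch(err => console.error(err));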
Second, you wanted to upload multiple images without changing the key string -
Loop over the method below for every file in a local path (a usage sketch follows the function) and you are good to go.
export function uploadFiles(localPath, localFileName, fileUploadDir, uploadFileName) {
  return new Promise((resolve, reject) => {
    fs.readFile(localPath + localFileName, function (err, file) {
      if (err) {
        return reject(err);
      }
      var params = {
        ACL: 'public-read',
        Bucket: BUCKET_NAME,
        Key: uploadFileName,
        Body: file
      };
      s3.upload(params, function (err, data) {
        if (err) {
          // Reject on upload failure too; the original ignored this error
          return reject(err);
        }
        fs.unlink(localPath + localFileName, function (err) {
          if (err) {
            reject(err);
          } else {
            resolve(data)
          }
        });
      });
    });
  })
}
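A sketch of the loop over a local directory (the directory name and the choice to keep each file's own name as the key are assumptions):

const files = fs.readdirSync('./images/');
// Upload every file in the directory in parallel
await Promise.all(
  files.map(name => uploadFiles('./images/', name, '/thumbnails/', name))
);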
I want to:
1- choose an image from my filesystem and upload it to the server/local storage
2- get its URL back using a Node.js service. I managed to do step 1, and now I want to get the image URL instead of the success message in res.end.
Here is my code:
app.post("/api/Upload", function(req, res) {
  upload(req, res, function(err) {
    if (err) {
      return res.end("Something went wrong!");
    }
    return res.end("File uploaded successfully!");
  });
});
I'm using multer to upload the image.
You can do something like this using AWS S3; it returns the URL of the uploaded image:
const AWS = require('aws-sdk')

AWS.config.update({
  accessKeyId: <AWS_ACCESS_KEY>,
  secretAccessKey: <AWS_SECRET>
})

const uploadImage = file => {
  // Strip the data-URI prefix and decode the base64 payload
  const replaceFile = file.data_uri.replace(/^data:image\/\w+;base64,/, '')
  const buf = Buffer.from(replaceFile, 'base64') // new Buffer(...) is deprecated
  const s3 = new AWS.S3()
  // Use the promise form so the caller can actually receive the URL;
  // returning from inside the upload callback would go nowhere
  return s3.upload({
    Bucket: <YOUR_BUCKET>,
    Key: <NAME_TO_SAVE>,
    Body: buf,
    ACL: 'public-read'
  }).promise().then(data => data.Location) // data.Location is the URL
}
Also, you can check this Express generator, which has a route to upload images to AWS S3: https://www.npmjs.com/package/speedbe
I am assuming that you are saving the image on the server file system, and not in a storage solution like AWS S3 or Google Cloud Storage, where you get the URL after upload.
Since you are storing it on the filesystem, you can rename the file with a unique identifier, like a uuid or something else.
Then you can make a GET route that takes that ID as a query or path parameter, read the file having that ID as its name, and send it back.
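A minimal sketch of that idea (the uploads directory, the 'file' field name, and the route shapes are assumptions, not from the original answer):

const express = require('express')
const multer = require('multer')
const { randomUUID } = require('crypto')
const path = require('path')

const app = express()

// Store each upload under a uuid so the client can fetch it back by ID
const storage = multer.diskStorage({
  destination: 'uploads/',
  filename: (req, file, cb) => cb(null, randomUUID() + path.extname(file.originalname))
})
const upload = multer({ storage }).single('file')

app.post('/api/Upload', (req, res) => {
  upload(req, res, err => {
    if (err) return res.status(500).end('Something went wrong!')
    // Return the URL the client can use to fetch the image back
    res.json({ url: `/api/images/${req.file.filename}` })
  })
})

app.get('/api/images/:id', (req, res) => {
  res.sendFile(path.resolve('uploads', req.params.id))
})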