I'm working on a serverless web app (Node.js) and have run into a problem that has been blocking me for two days.
Scenario:
- User comes to the site and uploads an image directly to our S3 bucket.
- An event is fired and added to the AWS SQS queue, ready for Lambda to pick up.
- Lambda picks up the image key.
- Lambda then connects to Cloudinary (via the SDK).
- Lambda then tries to upload via the SDK call cloudinary.v2.uploader.upload('s3://bucket-name/file-name', (error, result) => { console.log(error, result) });
Problem:
Nothing gets uploaded to Cloudinary, no errors appear in CloudWatch, and the console.log in the callback prints nothing.
The S3 bucket is whitelisted with Cloudinary, and I've created a file in my S3 bucket at .wellknown/cloudinary/cloud_name
index.js
import imageTransform from "./imageTransform";

exports.imageFromQueue = async event => {
  event.Records.forEach(async record => {
    const body = JSON.parse(record.body).Records[0];
    const { eventName, s3 } = body;
    if (eventName === "ObjectCreated:Post") {
      const { key } = s3.object;
      console.log("here", key); // <-- this gets printed out
      const data = await imageTransform(key);
    }
  });
};
imageTransform.js
const cloudinary = require("cloudinary").v2;

cloudinary.config({
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
  api_key: process.env.CLOUDINARY_API_KEY,
  api_secret: process.env.CLOUDINARY_API_SECRET
});

const ImageTransform = async imagePath => {
  try {
    const data = await cloudinary.uploader.upload(
      `s3://${process.env.IMAGE_BUCKET}/${imagePath}`
    );
    console.log(data);
    return data;
  } catch (error) {
    console.log("cloudinary - major error", error);
    return { majorError: error };
  }
};

module.exports = ImageTransform;
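One thing worth checking (an assumption on my part, not something the logs above confirm): Array.prototype.forEach does not await async callbacks, so the Lambda handler can return before any upload settles, which would match the silent no-error behaviour described. A minimal sketch of awaiting every record instead, where processQueue and the injected transform function are hypothetical names of mine:

```javascript
// Sketch: map the SQS records to promises and await them all, so the
// handler cannot return before the transforms finish.
const processQueue = async (event, transform) => {
  const results = await Promise.all(
    event.Records.map(async record => {
      const { eventName, s3 } = JSON.parse(record.body).Records[0];
      if (eventName !== "ObjectCreated:Post") return null; // skip other events
      return transform(s3.object.key); // e.g. the ImageTransform above
    })
  );
  return results.filter(Boolean); // drop skipped records
};
```

With this shape, a rejected upload also surfaces as a handler error instead of vanishing.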
Related
I've created an API to upload images to an Amazon S3 bucket with Node.js, multer, and multer-s3.
It works fine in development, returning a response with an image URL that is downloadable and accessible, but when I host my Node app on AWS Lambda (API Gateway) the API returns the same response with the image URL, only this time when I open the downloaded image it shows INVALID FILE FORMATTED.
Here is my code:
const uploadImage = async (req, res) => {
  let myFile = req.file.originalname.split(".")
  const fileType = myFile[myFile.length - 1]
  const params = {
    Bucket: 'test-bucket2601',
    Key: `${Date.now()}.${fileType}`,
    Body: req.file.buffer
  }
  s3.upload(params, (error, data) => {
    if (error) {
      return res.status(500).send(error)
    }
    res.json({ data })
  })
}
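The Key-building lines above can be pulled out as a pure helper (a sketch; makeS3Key and the injectable now parameter are names of mine, not part of the original):

```javascript
// Sketch of the Key logic above as a pure, testable helper: take the
// extension after the last dot and prefix a timestamp so concurrent
// uploads don't collide. `now` is injectable so tests are deterministic.
const makeS3Key = (originalname, now = Date.now()) => {
  const parts = originalname.split(".");
  const ext = parts[parts.length - 1];
  return `${now}.${ext}`;
};
```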
Route and middleware:
routes.route('/upload').post(upload, uploadImage);
The first argument to post is the upload middleware.
Middleware code:
const s3 = new aws.S3({
  credentials: {
    accessKeyId: awsKeys?.accessKeyId,
    secretAccessKey: awsKeys?.secretAccessKey
  }
});

// memoryStorage takes no options; the destination callback only applies to diskStorage
const storage = multer.memoryStorage()

const upload = multer({ storage }).single('imageUrl')
I just started using aws-sdk in my app to upload files to S3, and I'm debating whether to use aws-sdk v2 or v3.
V2 is the whole package, which is super bloated considering I only need the S3 service, not the myriad of other options. However, the documentation is very cryptic and I'm having a really hard time getting the equivalent getSignedUrl function to work in v3.
In v2 I have this code to sign the URL and it works fine. I am using Express on the server.
import aws from 'aws-sdk';

const signS3URL = (req, res, next) => {
  const s3 = new aws.S3({ region: 'us-east-2' });
  const { fileName, fileType } = req.query;
  const s3Params = {
    Bucket: process.env.S3_BUCKET,
    Key: fileName,
    ContentType: fileType,
    Expires: 60,
  };
  s3.getSignedUrl('putObject', s3Params, (err, data) => {
    if (err) {
      return next(err);
    }
    res.json(data);
  });
}
Now I've been reading the documentation and examples trying to get the v3 equivalent to work, but I can't find any working example of how to use it. Here is how I have set it up so far:
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

export const signS3URL = async (req, res, next) => {
  console.log('Sign')
  const { fileName, fileType } = req.query;
  const s3Params = {
    Bucket: process.env.S3_BUCKET,
    Key: fileName,
    ContentType: fileType,
    Expires: 60,
    // ACL: 'public-read'
  };
  const s3 = new S3Client()
  s3.config.region = 'us-east-2'
  const command = new PutObjectCommand(s3Params)
  console.log(command)
  await getSignedUrl(s3, command).then(signature => {
    console.log(signature)
    res.json(signature)
  }).catch(e => next(e))
}
There are some errors in this code, and the first I can identify is the creation of the command variable using the PutObjectCommand function provided by the SDK. The documentation does not make clear to me what I need to pass as the "input": https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/interfaces/putobjectcommandinput.html
Does anyone with experience using aws-sdk v3 know how to do this?
Also a side question: where can I find the API reference for v2? All I can find is the SDK docs saying "v3 now available", and I can't seem to find the v2 reference.
Thanks for your time.
The following code will give you a signed URL in a JSON body under the key signedUrl:
const signS3URL = async (req, res, next) => {
const { fileName, fileType } = req.query;
const s3Params = {
Bucket: process.env.S3_BUCKET,
Key: fileName,
ContentType: fileType,
// ACL: 'bucket-owner-full-control'
};
const s3 = new S3Client({ region: 'us-east-2' })
const command = new PutObjectCommand(s3Params);
try {
const signedUrl = await getSignedUrl(s3, command, { expiresIn: 60 });
console.log(signedUrl);
res.json({ signedUrl })
} catch (err) {
console.error(err);
next(err);
}
}
Keep the ACL as bucket-owner-full-control if you want the AWS account that owns the bucket to be able to access the files.
You can get to the API Reference for both JS SDK versions from here.
In reference to the AWS docs and @GSSwain's answer (I cannot comment, new account), this link shows multiple getSignedUrl examples.
Below is an upload example copied from the AWS docs:
// Import the required AWS SDK clients and commands for Node.js
import {
  CreateBucketCommand,
  DeleteObjectCommand,
  PutObjectCommand,
  DeleteBucketCommand
} from "@aws-sdk/client-s3";
import { s3Client } from "./libs/s3Client.js"; // Helper function that creates an Amazon S3 service client module.
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import fetch from "node-fetch";
// Set parameters
// Create a random name for the Amazon Simple Storage Service (Amazon S3) bucket and key
export const bucketParams = {
Bucket: `test-bucket-${Math.ceil(Math.random() * 10 ** 10)}`,
Key: `test-object-${Math.ceil(Math.random() * 10 ** 10)}`,
Body: "BODY"
};
export const run = async () => {
try {
// Create an S3 bucket.
console.log(`Creating bucket ${bucketParams.Bucket}`);
await s3Client.send(new CreateBucketCommand({ Bucket: bucketParams.Bucket }));
console.log(`Waiting for "${bucketParams.Bucket}" bucket creation...`);
} catch (err) {
console.log("Error creating bucket", err);
}
try {
// Create a command to put the object in the S3 bucket.
const command = new PutObjectCommand(bucketParams);
// Create the presigned URL.
const signedUrl = await getSignedUrl(s3Client, command, {
expiresIn: 3600,
});
console.log(
`\nPutting "${bucketParams.Key}" using signedUrl with body "${bucketParams.Body}" in v3`
);
console.log(signedUrl);
const response = await fetch(signedUrl, {method: 'PUT', body: bucketParams.Body});
console.log(
`\nResponse returned by signed URL: ${await response.text()}\n`
);
} catch (err) {
console.log("Error creating presigned URL", err);
}
try {
// Delete the object.
console.log(`\nDeleting object "${bucketParams.Key}" from bucket`);
await s3Client.send(
new DeleteObjectCommand({ Bucket: bucketParams.Bucket, Key: bucketParams.Key })
);
} catch (err) {
console.log("Error deleting object", err);
}
try {
// Delete the S3 bucket.
console.log(`\nDeleting bucket ${bucketParams.Bucket}`);
await s3Client.send(
new DeleteBucketCommand({ Bucket: bucketParams.Bucket })
);
} catch (err) {
console.log("Error deleting bucket", err);
}
};
run();
I am trying to upload an image through a React app to an S3 bucket, then receive the URL back and show the image on the screen.
I am able to upload the image (sort of) and get the URL back from the S3 server, but when I download it I am unable to open it: the format is unsupported, and I can't use the img tag to show it on the webpage. I guess it has something to do with the conversion to base64, but I can't figure out why it is not working.
The frontend(React) is:
const uploadImageToBucket = async (image) => {
  console.log("fff", image)
  let image_location
  try {
    const response = await axios.post("http://localhost:5000/user/blogmanage/uploadimage", image)
    image_location = response.data.body;
    console.log("img loc", image_location)
    return image_location;
  } catch (error) {
    console.log("upload error", error)
  }
}
The backend(nodejs) is
router.post("/blogmanage/uploadimage", async (req,res)=>{
const s3 = new AWS.S3({
accessKeyId: process.env["AWS_ACCESS_KEY_ID"],
secretAccessKey: process.env["AWS_SECRET_KEY"],
region: process.env['AWS_REGION']
});
const BUCKET_NAME = "mrandmrseatmedia";
var base64data = new Buffer.from( 'binary',req.body);
const params = {
Bucket: BUCKET_NAME,
Key: "test/test2.jpg",
Body: base64data
}
s3.upload(params, function (err,data){
if (err){
console.log(err)
res.status(404).json({msg:err});
}
else{
const image_location = `${data.Location}`;
console.log(`File uploaded successfully. ${data.Location}`);
res.status(200).json({body:image_location});
}
})
});
Thanks!
After a lot of testing, retesting, and rewriting using this repo as an example:
https://github.com/Jerga99/bwm-ng/blob/master/server/services/image-upload.js
it works.
The use of base64 is wrong in this case; it corrupts the file in some way. The multer library fixes it.
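For completeness, if one did want to accept a base64 data URI directly instead of switching to multipart uploads, the decode has to pass the data first and the encoding second to Buffer.from; the snippet in the question had the arguments reversed. A sketch, where dataUriToBuffer is a name of mine:

```javascript
// Sketch: split a data URI like "data:image/jpeg;base64,/9j/..." into its
// content type and raw bytes. Buffer.from(payload, "base64") takes the
// payload first and the encoding second.
const dataUriToBuffer = (dataUri) => {
  const match = dataUri.match(/^data:([\w/+.-]+);base64,(.+)$/);
  if (!match) throw new Error("Not a base64 data URI");
  return { contentType: match[1], body: Buffer.from(match[2], "base64") };
};
```

The resulting body can be passed straight to s3.upload as Body, with contentType as ContentType.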
I'm trying to upload files from a MERN application I'm working on. I'm almost done with the Node.js back-end part.
The application will allow users to upload images (jpg, jpeg, png, gifs, etc.) to an Amazon AWS S3 bucket that I created.
Well, let's put it this way. I created a helper:
const aws = require('aws-sdk');
const fs = require('fs');
// Enter copied or downloaded access ID and secret key here
const ID = process.env.AWS_ACCESS_KEY_ID;
const SECRET = process.env.AWS_SECRET_ACCESS_KEY;
// The name of the bucket that you have created
const BUCKET_NAME = process.env.AWS_BUCKET_NAME;
const s3 = new aws.S3({
accessKeyId: ID,
secretAccessKey: SECRET
});
const uploadFile = async images => {
// Read content from the file
const fileContent = fs.readFileSync(images);
// Setting up S3 upload parameters
const params = {
Bucket: BUCKET_NAME,
// Key: 'cat.jpg', // File name you want to save as in S3
Body: fileContent
};
// Uploading files to the bucket
s3.upload(params, function(err, data) {
if (err) {
throw err;
}
console.log(`File uploaded successfully. ${data.Location}`);
});
};
module.exports = uploadFile;
That helper takes three of my environment variables: the bucket name, the key ID, and the secret key.
When adding files from the form (which will eventually be added on the front end), the user will be able to send more than one file.
Right now my post route looks exactly like this:
req.body.user = req.user.id;
req.body.images = req.body.images.split(',').map(image => image.trim());
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
That works great, but it takes req.body.images as a string with each image separated by a comma. What would be the right approach to upload the many files selected from the Windows directory pop-up to AWS S3? I tried this, but it did not work :/
// Add user to req,body
req.body.user = req.user.id;
uploadFile(req.body.images);
const post = await Post.create(req.body);
res.status(201).json({ success: true, data: post });
Thanks, and hopefully you guys can help me out with this one. Right now I'm testing it with Postman, but later on the files will be sent via a form.
Well, you could just call uploadFile multiple times, once for each file:
try {
  const promises = []
  for (const img of images) {
    promises.push(uploadFile(img))
  }
  await Promise.all(promises)
  // rest of logic
} catch (err) {
  // handle err
}
On a side note, you should wrap s3.upload in a promise:
const AWS = require('aws-sdk')
const s3 = new AWS.S3({
accessKeyId: process.env.AWS_ACCESS_KEY,
secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
})
module.exports = ({ params }) => {
return new Promise((resolve, reject) => {
s3.upload(params, function (s3Err, data) {
if (s3Err) return reject(s3Err)
console.log(`File uploaded successfully at ${data.Location}`)
return resolve(data)
})
})
}
Bonus: if you wish to avoid having your backend handle uploads, you can use S3 signed URLs and let the client browser handle the upload, saving your server resources.
One more thing: your Post object should only contain the URLs of the media, not the media itself.
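Putting those two points together, here is a sketch of the route-side logic; collectUploadUrls is a hypothetical helper of mine, and it assumes uploadFile resolves to the S3 response object whose Location field is the object URL:

```javascript
// Sketch: upload every image first, then keep only the resulting URLs,
// which is what would be stored on the Post document instead of the media.
const collectUploadUrls = async (images, uploadFile) => {
  const results = await Promise.all(images.map(img => uploadFile(img)));
  return results.map(r => r.Location);
};
```

In the route, req.body.images would then be replaced with the returned URL array before calling Post.create.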
// Setting up S3 upload parameters
const params = {
Bucket: bucket, // bucket name
Key: fileName, // File name you want to save as in S3
Body: Buffer.from(imageStr, 'binary'), //image must be in buffer
ACL: 'public-read', // allow file to be read by anyone
ContentType: 'image/png', // image header for browser to be able to render image
CacheControl: 'max-age=31536000, public' // caching header for browser
};
// Uploading files to the bucket
try {
const result = await s3.upload(params).promise();
return result.Location;
} catch (err) {
console.log('upload error', err);
throw err;
}
I want to upload a user's profile picture, sent from a web app or mobile app in Base64 form.
In the POST request they need to send a JSON body that looks something like this:
{
"name":"profile-pic-123.jpg",
"file":"data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAkGBxQTEhIUEhIUFBUV…K9rk8hCAEkjFMUYiEAI+nHIpsQh0AkisDYRTOiCAbWVtgCtI6IlkHh7LDTQXLH0EIQBj//2Q==" // the base64 image
}
Now on the server side, using Node and Express, I used an npm module called azure-storage, which offers a nice way of uploading files to Azure Blob Storage via its web service.
But there's something I cannot understand here. Here's part of the code from my controller. I successfully created all the necessary connections, keys, and whatnot to create a working blobService:
controllers.upload = function(req, res, next){
  // ...
  // generated some SAS token up here
  // etc.
  // ...
  var uploadOptions = {
    container: 'mycontainer',
    blob: req.body.name, // I'm not sure about this
    path: req.body.file // I'm not sure about this either
  }
  sharedBlobService.createBlockBlobFromLocalFile(uploadOptions.container, uploadOptions.blob, uploadOptions.path, function(error, result, response) {
    if (error) {
      return res.send(error);
    }
    console.log("result", result);
    console.log("response", response);
  });
}
I'm getting this error:
{
"errno": 34,
"code": "ENOENT",
"path": "iVBORw0KGgoAAAANSUhEUgAAABgAAAAYCAIAAAB..."
}
If you use the JavaScript SDK v12 you can use this sample code. It's just that simple. I have this implemented in a function and it works great when all I need is to trigger it via an HTTP event.
index.js
const file = await require('./file')();

uploadOptions = {
  container: 'mycontainer',
  blob: req.body.name,
  text: req.body.file
}

const fileUploader = await file(uploadOptions.text, uploadOptions.blob, uploadOptions.container);
You can use a separate module for your logic and call it from the index.js above:
file.js
const { BlobServiceClient } = require("@azure/storage-blob");
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
const Promise = require('bluebird');

module.exports = Promise.method(async function() {
  return async function (data, fileName, container) {
    const containerClient = await blobServiceClient.getContainerClient(container);
    const blockBlobClient = containerClient.getBlockBlobClient(fileName);
    const matches = data.match(/^data:([A-Za-z-+\/]+);base64,(.+)$/);
    const buffer = Buffer.from(matches[2], 'base64'); // Buffer.from, not the deprecated new Buffer()
    return await blockBlobClient.upload(buffer, buffer.byteLength);
  };
});
In this case you should not use createBlockBlobFromLocalFile. Instead, use createBlockBlobFromText, because you are not uploading a local file but content from the request body.
Here is the code:
var uploadOptions = {
container: 'mycontainer',
blob: req.body.name,
text: req.body.file
}
sharedBlobService.createBlockBlobFromText(uploadOptions.container,
  uploadOptions.blob,
  uploadOptions.text,
  {
    contentType: 'image/jpeg',
    contentEncoding: 'base64'
  },
  function(error, result, response) {
    if (error) {
      return res.send(error);
    }
    console.log("result", result);
    console.log("response", response);
  });
The blob is just the file name, which is "profile-pic-123.jpg" in this case, and path is the local path to your file. Since you are not storing the file locally on the server side, path is meaningless in this case.
If you need more information about Storage, see this and this.