Upload image to S3 bucket - React Native and Node.js

Within my app a user can select a profile image, and I would like that image to be uploaded to an S3 bucket when the user saves their profile data.
I pass the image data (along with JSON consisting of name, email, and telephone, for example) from my app to an Express server and upload it there.
At present I can pass the image data (which seems to be just the URL) to an S3 bucket and it saves.
I don't think I'm actually saving the image itself, though: when I download it from S3 manually and try to open it on my Mac, it says the file may be damaged and I cannot see the image.
I feel daft for asking, but how do I actually upload the image itself? Thanks
React Native Side
const handleFormSubmit = formData => {
  const jsonData = JSON.stringify({
    ...formData,
  });
  // Handle profile image
  if (imageProps && imageProps.uri) {
    const data = new FormData();
    data.append('formBody', jsonData);
    data.append('image', {
      uri:
        Platform.OS === 'android'
          ? imageProps.uri
          : imageProps.uri.replace('file://', ''),
      type: imageProps.type,
      name: imageProps.fileName,
    });
    sendRequest(data);
  } else {
    sendRequest(jsonData);
  }
};

const sendRequest = data => {
  let responseData;
  fetch('http://localhost:8080/users/api/update_user_profile', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Accept: 'application/json',
    },
    body: data,
  })
    .then(response => {
      responseData = response;
      return response.json();
    })
    .then(jsonData => {
      console.log(jsonData);
    })
    .catch(error => {
      console.log(error);
    });
};
Server Side
const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
});

// Setting up S3 upload parameters
const params = {
  Bucket: 'bucket-folder',
  ACL: 'public-read',
  Key: req.files.image.name,
  Body: req.files.image.path,
};

const stored = await s3.upload(params).promise();

You can use Multer for uploading files to S3.
const multer = require('multer');
const AWS = require('aws-sdk');
const uniqid = require('uniqid');

const storage = multer.memoryStorage();
const upload = multer({ storage });

// Posts new file to Amazon and saves to db
router.post('/:id', upload.single('attachment'), async (req, res) => {
  const unique = uniqid.time();
  const { file } = req;
  const { filePath } = req.body;
  const { id } = req.params;
  const s3FileURL = process.env.AWS_UPLOADED_FILE_URL;
  const region = process.env.AWS_REGION;
  const secretAccessKey = process.env.AWS_SECRET_ACCESS_KEY;
  const accessKeyId = process.env.AWS_ACCESS_KEY_ID;
  const Bucket = process.env.AWS_BUCKET_NAME + '/' + filePath;
  const Key = `${id}/${unique}-${file.originalname}`;
  const Body = file.buffer;
  const ContentType = file.mimetype;
  const ACL = 'public-read';

  const s3bucket = new AWS.S3({
    accessKeyId,
    secretAccessKey,
    region,
  });

  const params = {
    Bucket,
    Key,
    Body,
    ContentType,
    ACL,
  };

  s3bucket.upload(params, async (err, data) => {
    if (err) {
      res.status(500).json({ error: true, Message: err });
    } else {
      console.log(params);
      const newFileUploaded = {
        description: req.body.description,
        fileLink: `${s3FileURL}${filePath}/${id}/${unique}-${file.originalname}`,
        s3_key: params.Key,
      };
      try {
        const response = await postFile({
          name: req.body.name,
          attachment: newFileUploaded,
          alt: req.body.alt,
          user: req.body.user,
          relatedID: req.body.relatedID,
        });
        res.status(200).json({
          message: response.message,
          success: response.success,
          result: response.result,
        });
      } catch (e) {
        res.status(500).json({
          message: 'File uploaded but DB could not save the request (upload by ID)',
          success: false,
          result: [],
        });
      }
    }
  });
});
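One more thing worth checking on the React Native side of the original question: `sendRequest` sets `'Content-Type': 'application/json'` even when the body is `FormData`, which prevents fetch from generating the multipart boundary, so server middleware like Multer never sees the file. A sketch of a header helper under that assumption (`buildHeaders` is hypothetical, not part of the question's code):

```javascript
// Hypothetical helper: only set a JSON Content-Type when the body really
// is JSON; for FormData, fetch must generate the multipart boundary itself.
function buildHeaders(body) {
  if (typeof FormData !== 'undefined' && body instanceof FormData) {
    return { Accept: 'application/json' };
  }
  return { Accept: 'application/json', 'Content-Type': 'application/json' };
}
```

Then `fetch(url, { method: 'POST', headers: buildHeaders(data), body: data })` works for both branches of `handleFormSubmit`.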

Related

I would like to send multiple images to Amazon S3. This is my code so far, sending just one image.

I'm using TypeScript and Node.js, and I also save the results to a PostgreSQL database.
router.ts
router.post(
  "/image",
  isAuthenticated,
  upload.single("file"),
  async (req, res) => {
    const { file } = req;
    const product_id = req.query.product_id as string;
    const uploadImagesService = new UploadImagesService();
    await uploadImagesService.execute(file);
    const createImage = await prismaClient.uploadImage.create({
      data: {
        url: `https://upload-joias.s3.amazonaws.com/${file.filename}`,
        id: file.filename,
        product_id: product_id,
      },
    });
    return res.send(createImage);
  }
);
service.ts
import S3Storage from "../../utils/S3Storage";

class UploadImagesService {
  async execute(file: Express.Multer.File): Promise<void> {
    const s3Storage = new S3Storage();
    await s3Storage.saveFile(file.filename);
  }
}

export { UploadImagesService };
S3Storage.ts
async saveFile(filename: string): Promise<void> {
  const originalPath = path.resolve(uploadConfig.diretory, filename);
  const contentType = mime.getType(originalPath);
  if (!contentType) {
    throw new Error("File not found");
  }
  const fileContent = await fs.promises.readFile(originalPath);
  // await the upload so the local file isn't deleted before it finishes
  await this.client
    .putObject({
      Bucket: "upload-joias",
      Key: filename,
      ACL: "public-read",
      Body: fileContent,
      ContentType: contentType,
    })
    .promise();
  await fs.promises.unlink(originalPath);
}
I'm having a hard time dealing with this; I'm new to Node.js and TypeScript. I'm grateful for any help.
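One way to extend the single-file route to several images, offered as a sketch since I can't run it against your setup: Multer's `upload.array` plus `Promise.all`, reusing `UploadImagesService` and `prismaClient` from the question. `buildImageRecord` and `registerMultiImageRoute` are hypothetical helpers pulled out so the record mapping can be tested on its own:

```javascript
// Hypothetical helper: map one uploaded file to the record shape the
// question stores with prismaClient (bucket URL taken from the question).
function buildImageRecord(file, product_id) {
  return {
    url: `https://upload-joias.s3.amazonaws.com/${file.filename}`,
    id: file.filename,
    product_id,
  };
}

// Sketch of the multi-file route: upload.array instead of upload.single,
// then one saveFile + one DB insert per file. router, isAuthenticated,
// upload, prismaClient and UploadImagesService are assumed from the question.
function registerMultiImageRoute(router, isAuthenticated, upload, prismaClient, UploadImagesService) {
  router.post(
    "/images",
    isAuthenticated,
    upload.array("files", 10),
    async (req, res) => {
      const product_id = req.query.product_id;
      const service = new UploadImagesService();
      const created = await Promise.all(
        req.files.map(async (file) => {
          await service.execute(file);
          return prismaClient.uploadImage.create({ data: buildImageRecord(file, product_id) });
        })
      );
      return res.send(created);
    }
  );
}
```

On the client, append each file under the same `files` field name so `req.files` arrives as an array.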

MERN Stack - Download file from Amazon S3 using @aws-sdk/client-s3 & @aws-sdk/s3-request-presigner

I am trying to download an image from Amazon S3 using the @aws-sdk/client-s3 package. The image will download, but I can't open it: I get an error saying it is in an unrecognizable format.
React Component Download Function
const downloadImg = (e) => {
  const href = e.target.getAttribute('img-data');
  var img = href.split('/').pop();
  const options = {
    method: 'GET',
    headers: { "Authorization": `Bearer ${token}` },
  };
  fetch(`/download-img/${img}`, options)
    .then(response => response.blob())
    .then(blob => {
      const url = window.URL.createObjectURL(blob);
      const a = document.createElement('a');
      a.href = url;
      a.download = img;
      document.body.appendChild(a);
      a.click();
      a.remove();
    });
};
Node/Express Route
// @desc Download Screenshot
// @route GET /download-img
// @access Private
app.get('/download-img/:id', authenticateToken, async (req, res) => {
  const imgUrl = req.params.id;
  try {
    const getObjectParams = {
      Bucket: awsBucketName,
      Key: imgUrl,
    };
    const command = new GetObjectCommand(getObjectParams);
    const file = await getSignedUrl(s3, command, { expiresIn: 60 });
    const img = axios.get(file)
      .then(function (response) {
        res.send(response.data);
      })
      .catch(function (error) {
        // handle error
        console.log(error);
      });
  } catch (err) {
    console.log(err);
  }
});
Response.data Output
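A likely cause, offered as a guess since I can't see the full setup: `axios.get` of the presigned URL without `responseType: 'arraybuffer'` decodes the binary image as UTF-8 text, corrupting it before `res.send` ever runs. A sketch that streams the object directly instead; the S3 client and `GetObjectCommand` are injected, and `streamToBuffer` / `makeDownloadHandler` are hypothetical names:

```javascript
// Collect a readable stream into a Buffer without any text decoding.
function streamToBuffer(stream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', (c) => chunks.push(c));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(chunks)));
  });
}

// Hypothetical handler factory: returns an Express-style handler that
// pipes the GetObject Body straight to the response, byte for byte.
function makeDownloadHandler(s3, GetObjectCommand, bucket) {
  return async (req, res) => {
    const command = new GetObjectCommand({ Bucket: bucket, Key: req.params.id });
    const data = await s3.send(command);
    res.setHeader('Content-Type', data.ContentType || 'application/octet-stream');
    data.Body.pipe(res); // no presigned-URL round trip, no UTF-8 mangling
  };
}
```

Alternatively, keep `getSignedUrl` and simply `res.redirect(file)` so the browser fetches the bytes itself.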

Download image from URL, save to Firebase Store, and serve image URL

Having a hard time finding some code to do this, and I'm having trouble getting it working myself. I want to create a Firebase Function which calls an image URL (this URL has a token associated with it, from Mapbox). Then I want to write the image to Firebase Storage so I can reference it from there later to serve it. So far I've figured out how to fetch the image, but saving it to storage seems to be too much for me! Help!
exports.getPolylineFromSessionId = functions.https.onRequest(async (req, res) => {
  const sessionId = req.query.id;
  if (sessionId) {
    const sessionInfo = await db
      .collection('sessions')
      .doc(sessionId)
      .get();
    const session = sessionInfo.data();
    const results = encodeURIComponent(polyline.encode(convertGpsTrack(session.gpsTrack)));
    const url =
      'https://api.mapbox.com/styles/v1/mapbox/satellite-v9/static/path-1+fff-0.5(' +
      results +
      ')/auto/700x200?access_token=' +
      mapboxToken;
    const bucket = admin
      .storage()
      .bucket('gs://mybucket.appspot.com')
      .file('thumbnails/' + sessionId + '.jpg');
    const res = await fetch(url, {
      method: 'GET',
      headers: {
        'Content-Type': 'image/jpeg',
      },
    });
    const blob = await res.blob();
    bucket.save(blob, {
      metadata: {
        contentType: 'image/jpeg',
      },
    });
    res.status(200).send(session);
  } else {
    res.status(400).send('sessionId required');
  }
});
I was able to figure it out finally! Below is the code I used!
var file = admin
  .storage()
  .bucket('gs://mybucket.appspot.com')
  .file('thumbnails/' + sessionId + '.jpg');

request({ url: url, encoding: null }, function (err, response, buffer) {
  var stream = file.createWriteStream({
    metadata: {
      contentType: response.headers['content-type'],
    },
  });
  stream.end(buffer);
});
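The accepted snippet leans on the deprecated `request` package. For what it's worth, a sketch of the same idea using the global `fetch` available in Node 18+ runtimes, where `file` is the same Cloud Storage File object and `saveImageToBucket` is a hypothetical wrapper:

```javascript
// Hypothetical wrapper: fetch the image, convert to a Buffer (not a blob,
// which file.save does not accept), then write it with its content type.
async function saveImageToBucket(file, url) {
  const response = await fetch(url);
  const buffer = Buffer.from(await response.arrayBuffer());
  await file.save(buffer, {
    metadata: { contentType: response.headers.get('content-type') },
  });
  return buffer;
}
```

The key point in both versions is the same: Storage wants a Buffer (or stream), not the `blob` the original question passed.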

Amazon Rekognition Image: error InvalidImageFormatException: Request has invalid image format

I am trying to compare faces by calling AWS Rekognition from a Node.js application. When comparing two images on an S3 bucket, everything went fine, but when I tried to upload a local image from the client (React Native/Expo app) to compare with another image stored in this bucket, I got the error InvalidImageFormatException: Request has invalid image format.
This image is a 250px-square JPEG and was sent as a valid base64 string (atob tested). Apparently, it meets the requirements presented here: https://docs.aws.amazon.com/rekognition/latest/dg/limits.html.
Below, some code snippets:
Capturing the image:
const takeImgHandler = async () => {
  const img = await ImagePicker.launchCameraAsync(getImgProps);
  editImg(img);
};
Editing the image:
const editImg = async img => {
  ...
  const actions = [
    { resize: { width: 250, height: 250 } },
  ];
  const saveOptions = {
    base64: true,
  };
  const edited = await ImageManipulator.manipulateAsync(img.uri, actions, saveOptions);
  setState({ ...state, img: edited });
};
Setting the detectFaces call to my server:
// sourceImg is appState.img.base64
const compareImagesHandler = async sourceImg => {
  const targetImage = {
    S3Object: {
      Bucket: 'my-bucket-name',
      Name: 'image-name.jpg',
    },
  };
  const sourceImage = {
    Bytes: sourceImg,
  };
  const comparison = await ajax({ method: 'POST', url: `url-to-server-route`, data: { sourceImage, targetImage } });
  console.log('comparison: >>>>>> ', comparison);
  return comparison;
};
The server controller runs this function:
const awsConfig = () => {
  const config = new AWS.Config({
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    region: process.env.AWS_DEFAULT_REGION,
  });
  AWS.config.update(config);
};

const compareImages = async (SourceImage, TargetImage, cb) => {
  const client = new AWS.Rekognition();
  // Logging the base64 string to validate it, externally, just to make
  // sure that it's valid!
  console.log('sourceImage.Bytes: >>>>>> ', SourceImage.Bytes);
  const params = {
    SourceImage,
    TargetImage,
    SimilarityThreshold: 50,
  };
  client.compareFaces(params, (err, response) => {
    if (err) {
      console.log('err: >>>>>> ', err);
      return cb({ err });
    }
    if (!response.FaceMatches.length) {
      return cb({ err: 'Face not recognized' });
    }
    response.FaceMatches.forEach(data => {
      const position = data.Face.BoundingBox;
      const similarity = data.Similarity;
      console.log(`The face at: ${position.Left}, ${position.Top} matches with ${similarity} % confidence`);
      return cb({ success: data.Similarity });
    });
  });
};
Solved!
Two tweaks are needed. First, encode the sourceImg file using encodeURIComponent:
const sourceImage = encodeURIComponent(sourceImg);
On the server, I should create a Buffer, instead of sending the base64 string:
const imageBuffer = Buffer.from(decodeURIComponent(SourceImage), 'base64');
So, the body sent to AWS should be:
const params = {
  SourceImage: {
    Bytes: imageBuffer,
  },
  TargetImage,
  SimilarityThreshold: 50,
};
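The two tweaks can be seen end to end in a tiny round trip, sketched here with hypothetical helper names (`encodeSourceImage` on the client, `toImageBytes` on the server):

```javascript
// Client side: URL-encode the base64 string so '+' and '/' survive transport.
function encodeSourceImage(base64) {
  return encodeURIComponent(base64);
}

// Server side: decode, then convert to the Buffer Rekognition expects in Bytes.
function toImageBytes(encoded) {
  return Buffer.from(decodeURIComponent(encoded), 'base64');
}
```

This is why the raw base64 string failed: Rekognition's `Bytes` field wants the image bytes themselves, and base64 characters mangled in transit make the payload an invalid image format.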

PDF not uploading to Amazon S3 with Node.js and React

I am working on uploading PDF files to S3 using Node.js and React, and I am running into an issue where some PDFs are uploaded and some are not.
This endpoint gets a signed URL from AWS:
  '/api/v1/upload/pdf',
  requireAuth,
  roleAuthorization(['admin']),
  (req, res) => {
    const date = new Date();
    const year = date.getFullYear();
    const key = `${date.toLocaleString('default', {
      month: 'long',
    })}-${short.uuid('0123456789').slice(0, 2)}-${year}.pdf`;
    s3.getSignedUrl(
      'putObject',
      {
        Bucket: 'bucket-name',
        ContentType: 'application/pdf',
        Key: key,
      },
      (err, url) => res.send({ key, url })
    );
  }
);
And this function does the upload from React and saves the link to the file in the database:
const createIssue = (dispatch) => async (issue, town) => {
  try {
    const token = await localStorage.getItem('token');
    const uploadConfig = await axios.get('/api/v1/upload/pdf', {
      headers: {
        Authorization: `Bearer ${token}`,
      },
    });
    const upload = await axios.put(uploadConfig.data.url, issue, {
      headers: {
        'Content-Type': issue.type,
      },
    });
    const response = await api.post(`/api/v1/town/${town._id}/issue`, {
      issue: uploadConfig.data.key,
    });
    dispatch({ type: 'create_issue', payload: response.data });
  } catch (err) {
    dispatch({
      type: 'add_error',
      payload: err.message,
    });
  }
};
This works for some PDF files but not all: for some, the request remains pending and the file never uploads.
Any help welcome, thanks.
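One hedged guess at why only some PDFs hang: the URL is signed with `ContentType: 'application/pdf'`, but the PUT sends `issue.type`, which browsers sometimes report as `''` or `application/octet-stream`; any mismatch invalidates the signature. Pinning the header to the signed value is easy to try (`uploadPdf` is a hypothetical wrapper with the HTTP client injected; also note the `getSignedUrl` callback above ignores `err`, which is worth logging):

```javascript
// Hypothetical wrapper: always send the exact Content-Type the URL was
// signed with, regardless of what the browser thinks the file's type is.
async function uploadPdf(httpPut, signedUrl, fileBlob) {
  return httpPut(signedUrl, fileBlob, {
    headers: { 'Content-Type': 'application/pdf' },
  });
}
```

In `createIssue` this would replace the `axios.put(..., { headers: { 'Content-Type': issue.type } })` call with `uploadPdf(axios.put, uploadConfig.data.url, issue)`.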
