Trying to request an image from the Google CDN and upload it to S3.
Using the https://github.com/request/request library with Node / Express.
A little confused about how to handle the payload coming back from the Google CDN.
The image comes back in the body field and is encoded; I'm not sure how it is encoded.
Given a URL to a Google CDN:
const fileURL = 'https://lh4.googleusercontent.com/EWE1234rFL006WfQKuAVrsYMOiKnM6iztPtLgXM5U…3i26LoPHQwPTQME7ne3XoMriKVjUo3hrhwWw1211223';
request(fileURL, (err, res, body) => {
  // Not sure how to handle the response here??

  // Trying base64
  fs.writeFileSync(`.tmp/file1.png`, body, {encoding: 'base64'});

  // Trying binary
  fs.writeFileSync(`.tmp/file.png`, body, {encoding: 'binary'});
});
body comes back as:
�PNG↵↵IHDRv&vл� IDATx�}���<z�f���];��o]��A�N�.po�/�/R���..............
1). Request an image from the googleusercontent Google CDN (the image was originally pasted into a Google Doc).
2). Create an image file and write it to disk on the server.
Neither of the fs.writeFileSync calls seems to produce a readable image file.
Any advice on handling this would be awesome.
Pass the response as the body to your S3 upload.
var request = require('request'),
    fs = require('fs'),
    aws = require('aws-sdk'),
    s3 = new aws.S3(),
    url = 'https://lh4.googleusercontent.com/-2XOcvsAH-kc/VHvmCm1aOoI/AAAAAAABtzg/SDdN1Vg5FFs/s346/14%2B-%2B1';

// Store in a file
request(url).pipe(fs.createWriteStream('file.gif'));

request(url, {encoding: 'binary'}, function(error, response, body) {
  // Another way to store in a file
  fs.writeFile('file.gif', body, 'binary', function(err) {});

  // Upload to S3
  s3.upload({
    Body: body,
    Bucket: 'BucketName',
    Key: 'file.gif',
  }, function(err, data) {});
});
Related
I'm having issues getting the full image back from Amazon S3 after sending a base64 string (about 2.43 MB when converted to an image).
If I compress this image via https://compressnow.com/ and then upload it, this works fine and I get the full image.
Is it possible for me to compress the base64 string before sending it to Amazon S3?
Here is the logic to upload to Amazon S3:
await bucket
  .upload({
    Bucket: "test",
    Key: "test",
    Body: "test",
    ContentEncoding: 'base64',
    Metadata: { MimeType: "png" },
  })
Similar issue here: Node base64 upload to AWS S3 bucket makes image broken
The ContentEncoding parameter specifies the Content-Encoding header that S3 should send along with the HTTP response; it does not describe the encoding of the object data you pass to the AWS SDK. According to the documentation, the Body parameter is simply the "Object data". In other words, you should probably just drop the ContentEncoding parameter unless you have a specific need for it, and pass along raw bytes:
const fs = require('fs');
var AWS = require('aws-sdk');
var s3 = new AWS.S3({apiVersion: '2006-03-01'});

// Read the contents of a local file
const buf = fs.readFileSync('source_image.jpg');

// Or, if the contents are base64 encoded, decode them into a buffer of raw data:
// const buf = Buffer.from(fs.readFileSync('source_image.b64', 'utf-8'), 'base64')
var params = {
Bucket: '-example-bucket-',
Key: "path/to/example.jpg",
ContentType: `image/jpeg`,
ACL: 'public-read',
Body: buf,
ContentLength: buf.length,
};
s3.putObject(params, function(err, data){
if (err) {
console.log(err);
console.log('Error uploading data: ', data);
} else {
console.log('successfully uploaded the image!');
}
});
I've been trying to asynchronously send a blob image to a REST API using the request module and the Azure Storage module. I don't want to download the blob to a local file and then create a readable stream from the local file, because that's not performant. This is what I have attempted, but it throws the error "Unexpected end of MIME multipart stream. MIME multipart message is not complete." From the request docs, sending a file in the form data requires passing a Readable Stream. It seems the readable stream from the Azure Storage client isn't compatible with the request module's format. Any ideas how to get this to work?
const request = require('request');
const storage = require('azure-storage');
const blobService = storage.createBlobService(process.env.AzureWebJobsStorage);
let stream = blobService.createReadStream(
containerName,
blobName,
function(err, res) {
});
let formData = {
rootMessageId: messageId,
file: stream
};
request.post({
  url: 'https://host-name/Api/comment',
  headers: {'Authorization': `Token ${authToken}`},
  formData: formData
}, (err, res, body) => {
  console.log(res);
});
I tried to use your code to upload an image blob to my own local URL http://localhost/upload, and I found that some properties are missing from the file property of your formData.
Here is my code, which works.
const request = require('request');
const storage = require('azure-storage');
var accountName = '<your storage account name>';
var accountKey = '<your storage account key>';
var blobService = storage.createBlobService(accountName, accountKey);
let stream = blobService.createReadStream(containerName, blobName, function(err, res){
formdata.file.options.contentType = res.contentSettings.contentType;
console.log(formdata);
});
var formdata = {
rootMessageId: messageId,
file: { // your version was missing some of these properties
value: stream,
options: {
filename: function(blobName) {
var elems = blobName.split('/');
return elems[elems.length-1];
}(blobName),
knownLength: stream // `knownLength` is a required property of `file`; omitting it causes a server error
},
}
}
request.post({
  url: 'https://host-name/Api/comment', // I used my URL `http://localhost/upload` here
  headers: {'Authorization': `Token ${authToken}`}, // I used an empty {} as the header here
  formData: formdata
}, (err, res, body) => {
  console.log(res);
});
Thinking about the code above: it has to pipe a download stream into an upload stream, so all the data also has to flow through your web app machine. In my experience, if you can change the code of your REST application server, you could instead generate a SAS URL for the blob, post that URL to your REST API, and have the REST server download the blob itself.
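For reference, a minimal sketch of generating a read-only SAS URL with the azure-storage SDK (untested; the 30-minute expiry and the credential variables are assumptions for illustration):
const storage = require('azure-storage');

const blobService = storage.createBlobService(accountName, accountKey);

// Shared access policy allowing reads for a limited time window.
const startDate = new Date();
const expiryDate = new Date(startDate);
expiryDate.setMinutes(startDate.getMinutes() + 30); // assumed 30-minute lifetime

const sharedAccessPolicy = {
  AccessPolicy: {
    Permissions: storage.BlobUtilities.SharedAccessPermissions.READ,
    Start: startDate,
    Expiry: expiryDate
  }
};

// Generate the SAS token and combine it with the blob's URL.
const sasToken = blobService.generateSharedAccessSignature(containerName, blobName, sharedAccessPolicy);
const sasUrl = blobService.getUrl(containerName, blobName, sasToken);

// sasUrl can now be posted to the REST API, which downloads the blob directly.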
I need to send a Blob to the server in order to generate an image from it there.
I am using axios on the ReactJS client and sending the data with this code.
/**
* Returns PDF document.
*
*/
getPDF = (blob) =>
{
let formatData = new FormData();
formatData.append('data', blob);
return axios({
method: 'post',
url: 'http://172.18.0.2:8001/export/pdf',
headers: { 'content-type': 'multipart/form-data' },
data: {
blob: formatData
}
}).then(response => {
return {
status: response.status,
data: response.data
}
})
}
I tried to console.log this blob value on the client, and the data is there as expected.
But on the server, the request body is empty.
/**
* Exports data to PDF format route.
*/
app.post('/export/pdf', function (request, response) {
console.log(request.body.blob);
response.send('ok');
});
If I remove the headers, the body is still empty when sending the blob; but if I remove the blob and send some plain string instead, the server receives the data.
Whenever the blob is sent, the server gets an empty body.
Node.js does not natively handle multipart/form-data, so you have to use an external module, e.g. multer.
Code example (not tested):
var multer = require('multer');

var upload = multer({ dest: __dirname + '/public/uploads/' });
var type = upload.single('upl');
/**
* Exports data to PDF format route.
*/
app.post('/export/pdf', type, function (request, response) {
// Get the blob file data
console.log(request.file);
response.send('ok');
});
You can read about multer here.
I hope this works for you.
Are you using body-parser?
body-parser doesn't handle multipart bodies, which is what FormData is submitted as.
Instead, use a module like multer
let multer = require('multer');
let upload = multer();
app.post('/export/pdf', upload.fields([]), (req, res) => {
let formData = req.body;
console.log('Data', formData);
res.status(200).send('ok');
});
I had two problems that I had to solve for this: 1) Firebase Functions has a bug that doesn't allow multer. 2) You may be getting a blob back from response.blob(), and that doesn't seem to produce a properly formatted blob for Firebase Functions either.
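In case it helps, here is a minimal sketch of the commonly suggested busboy workaround for parsing multipart bodies inside a Firebase Cloud Function; it assumes the busboy 0.x constructor API and Cloud Functions' req.rawBody, and the handler and field names are placeholders, not from the original post:
const Busboy = require('busboy');

exports.upload = (req, res) => {
  const busboy = new Busboy({ headers: req.headers });
  const chunks = [];

  busboy.on('file', (fieldname, file, filename) => {
    // Collect the uploaded file's bytes as they stream in.
    file.on('data', (data) => chunks.push(data));
  });

  busboy.on('finish', () => {
    const fileBuffer = Buffer.concat(chunks);
    // fileBuffer now holds the raw file; write it to /tmp or upload it onward.
    res.status(200).send('ok');
  });

  // Cloud Functions has already consumed the request stream,
  // but it exposes the raw bytes on req.rawBody.
  busboy.end(req.rawBody);
};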
While uploading a file to the Device Farm pre-signed S3 URL, I am getting the error code ECONNRESET.
This is my code:
var AWS = require('aws-sdk');
var fs = require('fs');
var req = require('request');
var devicefarm = new AWS.DeviceFarm();
AWS.config.loadFromPath('C:/Users/Abin.mathew/AWSdata/config.json');
var apkPath= "D:/DTS/APKs/TPLegacyPlugin-googleplaystore-debug-rc_16.2.15.apk";
var stats = fs.statSync(apkPath);
var url= "https://prod-us-west-2-uploads.s3-us-west-2.amazonaws.com/arn%3Aaws%3Adevicefarm%3Aus-west-2%3A594587224081%3Aproject%3Ade07f584-7c64-4748-aebd-ec965ab107cf/uploads/arn%3Aaws%3Adevicefarm%3Aus-west-2%3A594587224081%3Aupload%3Ade07f584-7c64-4748-aebd-ec965ab107cf/5dd627eb-4eb2-4f2d-a300-0fde0720bde4/MyAppiumPythonUpload?AWSAccessKeyId";
fs.createReadStream(apkPath).pipe(req({
method: 'PUT',
url: url,
headers: {
'Content-Length': stats['size']
}
}, function (err, res, body) {
console.log(body);
console.log(res);
console.log(err);
}));
Your URL is incorrect. It represents the Appium test package but you are trying to upload an APK. Are you reusing a URL from a previous operation? Pre-signed URLs also expire after a period of time so they should not be reused.
To make this work:
Call CreateUpload and get the pre-signed URL from the result.
Post the correct file to the URL.
We have published a blog post that describes the procedure to follow. The code samples use the CLI, but translating them to Node.js should be trivial; a rough sketch of the two steps follows.
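For illustration, an untested Node.js sketch of those two steps using the AWS SDK's createUpload; the project ARN is a placeholder and the upload type is assumed to be ANDROID_APP for an APK:
var AWS = require('aws-sdk');
var fs = require('fs');
var request = require('request');

AWS.config.loadFromPath('C:/Users/Abin.mathew/AWSdata/config.json');
var devicefarm = new AWS.DeviceFarm();

var apkPath = 'D:/DTS/APKs/TPLegacyPlugin-googleplaystore-debug-rc_16.2.15.apk';

// 1. Create an upload of the right type for an APK; Device Farm returns a fresh pre-signed URL.
devicefarm.createUpload({
  projectArn: '<your project ARN>',  // placeholder
  name: 'TPLegacyPlugin-googleplaystore-debug-rc_16.2.15.apk',
  type: 'ANDROID_APP'                // an APK, not APPIUM_PYTHON_TEST_PACKAGE
}, function (err, data) {
  if (err) return console.log(err);

  // 2. PUT the APK to the returned URL before it expires.
  fs.createReadStream(apkPath).pipe(request({
    method: 'PUT',
    url: data.upload.url,
    headers: { 'Content-Length': fs.statSync(apkPath).size }
  }, function (err, res, body) {
    console.log(err || res.statusCode);
  }));
});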
I'm trying to upload a file to S3 using request.js, but the file data seems to be incorrect after I upload it. Should I be using a data property of the response object instead?
var flickrPhotoUrl = 'http://c1.staticflickr.com/3/2207/2305523141_1f98981685_z.jpg?zz=1';
request.get(flickrPhotoUrl, function(error, response, body){
if (body) {
s3.upload({
Body: body,
Bucket: 'my-uploads',
Key: 'photo.jpg',
}, function (err) {
done(err);
});
}
else
{
done('no response');
}
});
When I grab the file from S3 after the upload, it's not a recognizable image and it seems to be about twice as big.
request by default converts your binary image data to a UTF-8 string, which is why the file ends up larger than the actual size. Try passing encoding: null to keep the body as a Buffer:
request.get({encoding:null, uri: flickrPhotoUrl}, function(error, response, body){
...
})
Update:
I think you can also pass a readable stream as the Body parameter. This is faster than the above for large files.
var stream = request.get(flickrPhotoUrl);

s3.upload({
  Body: stream,
  Bucket: 'my-uploads',
  Key: 'photo.jpg',
});