node.js pulling file from S3 to local - node.js

Currently I am trying to load a file from S3 into a temp file. I am using Node.js running on Lambda.
Here is the code I am trying to use:
var options = {
  Bucket: 'my_bucket',
  Key: dstKey
};
var input = './temp.pdf';
var file = require('fs').createWriteStream(input);
s3.getObject(options).createReadStream().pipe(file);
I have also tried using __dirname in the input path, with the same results.
I keep getting a file write error:
2016-06-14T19:31:02.859Z 5ea0381c-3266-11e6-a2e3-312b48485905 Error: EACCES: permission denied, open './temp.pdf' at Error (native)

Related

Error when attempting to PUT file to URL with Node request

I am trying to PUT a file to an S3 pre-signed URL. Doing so with bash/cURL works fine, but with Node I get the following error:
Error: write EPIPE
at WriteWrap.onWriteComplete [as oncomplete] (node:internal/stream_base_commons:94:16) {
errno: -32,
code: 'EPIPE',
syscall: 'write'
}
Here is the code:
const fs = require('fs');
const request = require('request');

const stream = fs.createReadStream('/tmp/file');
const r = request.put('https://s3.eu-west-2.amazonaws.com/bucketname/path?X-Amz-Content-Sha256=....&...');
stream.pipe(r).on('error', function (err) {
  console.log(err);
});
EPIPE means that the write failed because the other end closed the connection. It looks like some additional settings are required in order to work with Amazon S3. I know that cURL has native support for multipart/form-data. You can use this library to create readable multipart/form-data streams:
https://nicedoc.io/form-data/form-data
Or you can use third-party libraries to send the data:
https://www.npmjs.com/package/s3-upload-stream
https://www.npmjs.com/package/streaming-s3

How to read remote image into memory?

I need to read an image file into memory in a Firebase Cloud Function so I can convert it to base64, but I get this error:
Error: ENOENT: no such file or directory
// Read the file into memory.
const fs = require('fs');
const imageFile = fs.readFileSync('https://upload.wikimedia.org/wikipedia/commons/f/f7/Lower_Manhattan_skyline_-_June_2017.jpg');
// Convert the image data to a Buffer and base64 encode it.
const encoded = Buffer.from(imageFile).toString('base64');
How can I read a remote image file into memory?

Getting mkdir error while using uploading image using multer

While uploading a file and creating a path, I am getting a folder-creation error:
Error: EACCES: permission denied, mkdir '/opt/bitnami/apps/NodeJS-Login/uploads'
at Object.fs.mkdirSync (fs.js:885:18)
at Function.sync (/opt/bitnami/apps/NodeJS-Login/node_modules/mkdirp/index.js:71:13)
at new DiskStorage (/opt/bitnami/apps/NodeJS-Login/node_modules/multer/storage/disk.js:21:12)
at module.exports (/opt/bitnami/apps/NodeJS-Login/node_modules/multer/storage/disk.js:65:10)
at new Multer (/opt/bitnami/apps/NodeJS-Login/node_modules/multer/index.js:15:20)
I am using Bitnami on AWS to host my MEAN app.
In my main server.js file I have added this:
app.use(multer({
  dest: './uploads/',
  rename: function (fieldname, filename) {
    return filename;
  },
}));
In the schema model:
companyLogo: {
  data: Buffer,
  type: String
}
And in the controller for the route:
admin.companyLogo = fs.readFileSync(req.files.comLogo.path);
admin.companyLogo.type = 'image/png';
What should I do to make the image upload work? Also, do I have to pass other key/values in form-data instead of raw?
/opt is write-protected by default, so here are the possible fixes:
1) Change the permissions on /opt and allow the user to write in this folder (not recommended)
OR
2) Run server.js as the superuser; this gives you full rights over the directory and will allow you to do anything (not recommended)
OR
3) Change the path to somewhere the user already has write access (recommended)

How to get filepath of uploaded file to createReadStream

I am using Node.js readStream.pipe(writeStream). How can I get the full path of the file I am uploading and pass it to createReadStream? I get only the filename. When the file is in the Node.js root directory there is no error, but when I upload from anywhere else I get:
events.js:183
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open 'C:\Users\Esen\Documents\GitHub\gbmsturbo\nike.png'
I know that this happens because nike.png that I am uploading is not in "C:\Users\Esen\Documents\GitHub\gbmsturbo\".
It is inside this folder:
"C:\Users\Esen\Documents\GitHub\filestoupload"
I tried
function throwErrorDontCrash() {
  process.on('uncaughtException', function (err) {
    console.error(err.message)
  })
}
and this prevents the Node.js crash and uploads the file, but with empty content (0 bytes):
router.post('/upload', (req, res) => {
  var filePath = req.body.file
  var ext = path.extname(filePath)
  var filename = path.basename(filePath, ext)
  var newVerFilePath = "public/uploads/" + filename + ext
  const readStream = fs.createReadStream(filePath)
  throwErrorDontCrash()
  const writeStream = fs.createWriteStream(newVerFilePath)
  readStream.pipe(writeStream)
  function throwErrorDontCrash() {
    process.on('uncaughtException', function (err) {
      //console.error(err.message)
    })
  }
})
and here is my form:
<form class="center" action="/upload" method="post">
  <input id="file" type="file" name="file" required encrypt="multipart/form-data"/>
  <input type="submit" value="UPLOAD">
</form>
I want filePath to include the directory path of the file, wherever it is uploaded from, when the user clicks the Choose or Browse button.
Currently, filePath gets only the filename, such as nike.png, and my expectation is to get "C:/Users/Esen/Documents/GitHub/filestoupload/nike.png".
It looks like you are writing an Express app on Node.js. Your issue is here:
const readStream = fs.createReadStream(filePath);
I think you believe this is how you "read" the file the user is uploading, but that's actually not possible: the fs module doesn't exist in the browser. The user navigating your website is uploading the image from their own computer via a form, which means the image comes in from the HTTP request (the req object), not from the file system.
(This can be confusing because in your case you probably are running this Express app locally on your machine, but it's easier to imagine it in production: the Express app runs on a big server somewhere, which is a different computer from your user's computer, where the file being uploaded lives.)
Check out the related S.O. question How to upload, display and save images using node.js and express. Also, see the tutorial Upload Files or Images to Server using Nodejs.

Node BigQuery package -- ENOENT error on Ubuntu

I am trying to use the BigQuery package found here:
https://www.npmjs.org/package/bigquery
Setup: Ubuntu 14, latest Node, nginx, plus the bigquery package and its dependencies.
I believe I've set it up correctly, including the PEM files, but I get an error from gauth when it tries to load the key files:
[2014-05-04 02:14:57.008] [ERROR] gauth - { [Error: ENOENT, open './key.mydomain.com.p12.pem']
errno: 34,
code: 'ENOENT',
path: './key.mydomain.com.p12.pem' }
Error: ENOENT, open './key.mydomain.com.p12.pem'
I am running just a simple test script that looks like this (I've X'd out my project ID):
var http = require('http')
  , bq = require('bigquery')
  , fs = require('fs')
  , prjId = 'xxxxxxxxxx'; //you need to modify this

bq.init({
  scope: 'https://www.googleapis.com/auth/bigquery',
  client_secret: './client_secrets.json',
  privatekey_pem: './private.mydomain.com.p12.pem',
  key_pem: './key.mydomain.com.p12.pem'
});

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.write('Testing BigQuery... \n');
  bq.job.query(prjId, 'select count(*) from publicdata:samples.wikipedia', function (e, r, d) {
    if (e) console.log(e);
    console.log(JSON.stringify(d));
  });
  res.end('Done. \n');
}).listen(3000, "127.0.0.1");
console.log('Server running at http://127.0.0.1:3000/');
I've tried referencing the files differently, using __dirname and also no slashes.
Any thoughts? I'm looking at the Google code in the dependencies too, but just not figuring this one out.
Errno 34 means 'no such file or directory'. Are you sure the file key.mydomain.com.p12.pem exists in the same directory as your index file?
Ah, figured it out: the P12 file I used to generate the private and public keys was mismatched with my client_secrets.
So, if anyone else gets this issue: the ENOENT can be caused by having a client_secrets.json and a set of service-account keys that were not created together.