Upload files to AWS S3 using Node.js

I am working on a project (based on the MEAN stack) which shows different images to different logged-in users. I am using Amazon S3 to store those images.
Currently I have created a separate route for the admin panel, where the admin can sign in and upload the images to Amazon S3 for different users. (Also, is this the correct flow for the application?)
I have the below line of code in my JS file:
AWS.config.update({ accessKeyId: xxxxxx, secretAccessKey: xxxxxx });
I have read that this should only be done for development purposes and that I should not keep my accessKeyId and secretAccessKey in the code like this.
What should be done for production?

For production you need to store these keys as environment variables; the AWS SDK will pick them up from the environment itself.
The AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variables should be set.
For example, on a local machine you can test this from the terminal with
export AWS_ACCESS_KEY_ID=XXXXX and export AWS_SECRET_ACCESS_KEY=XXXXX respectively.
For production you will need to do the same, except you will do it via the Node process manager you are using. Here is an example of the pm2 process manager doing it, in an ecosystem config file:
module.exports = {
  apps: [
    {
      name: "myapp",
      script: "./app.js",
      watch: true,
      instance_var: "INSTANCE_ID",
      env: {
        "PORT": 3000,
        "NODE_ENV": "development",
        "AWS_ACCESS_KEY_ID": "XXXXX",
        "AWS_SECRET_ACCESS_KEY": "XXXXX"
      }
    }
  ]
}
http://pm2.keymetrics.io/docs/usage/environment/#specific-environment-variables
The flow is similar even if you are using a different process manager.
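With the keys provided this way, the upload code itself never needs to mention them. Below is a minimal sketch of an S3 upload using the aws-sdk v2 package from the question; the bucket name, region and file paths are placeholders, not values from the original post:

// Assumes AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are already set in the
// environment (for example by pm2, as above); no keys appear in the source code.
const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({ region: 'eu-west-1' }); // placeholder region

function uploadImage(localPath, key) {
  const params = {
    Bucket: 'my-image-bucket',            // placeholder bucket name
    Key: key,                             // e.g. 'users/123/avatar.png'
    Body: fs.createReadStream(localPath)
  };
  // s3.upload handles multipart uploads for larger files automatically
  return s3.upload(params).promise();
}

uploadImage('./avatar.png', 'users/123/avatar.png')
  .then(data => console.log('Uploaded to', data.Location))
  .catch(err => console.error('Upload failed', err));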

Related

How to read .env file variables in Nodejs within Azure function

I have the following code which works fine locally, but when I deploy to an Azure Function it fails to read the contents of the .env file at runtime; when debugging, each of the config items is "undefined". The .env file is deployed to Azure with the correct entries, and the function executes correctly when I hard-code the config variables to test. I assume I need to do something differently to get this to work on Azure?
const sql = require('mssql')
require('dotenv').config();

const dbConfig = {
  server: process.env.databaseServer,
  database: process.env.databaseName,
  user: process.env.databaseUser,
  password: process.env.databasePassword,
  port: 1433,
  options: {
    encrypt: true,
    enableArithAbort: true
  }
};
An Azure Function is a packaged service: its process.env is populated with the settings of the Azure Functions environment and, by default, it will not load your .env file.
It is recommended to define all of your .env content in the Azure Function's application settings and read the values from process.env instead.
See the related documentation for details.
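For local development, one common pattern is to fall back to dotenv only when the settings are not already present, so the same code runs unchanged against the application settings in Azure. A rough sketch under that assumption, reusing the setting names from the question:

const sql = require('mssql');

// Only fall back to a local .env file when the settings are not already
// provided by the hosting environment (as Azure application settings are).
if (!process.env.databaseServer) {
  require('dotenv').config();
}

const dbConfig = {
  server: process.env.databaseServer,
  database: process.env.databaseName,
  user: process.env.databaseUser,
  password: process.env.databasePassword,
  port: 1433,
  options: {
    encrypt: true,
    enableArithAbort: true
  }
};

module.exports = dbConfig;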

I get "ConfigError: Missing region in config" when I use AWS SES to send an email

I'm using Node and Next.js.
I wrote a function that sends a welcome email when someone subscribes to a newsletter. The error appears when I call the function.
In terms of what I tried to fix this credentials problem:
I used the "aws configure" command that the aws cli provides.
I used a .env file with the credentials.
I set the environment variables on my PC via the set command (this did set them, but I still have the problem).
Currently I'm using this on top of my function (I'm trying to make it work with the .env file):
var AWS = require("aws-sdk")
AWS.config.update({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.SECRET_ACCESS_KEY,
  region: process.env.AWS_REGION
})
Something that might be useful: when I use console.log to log the environment variables, the result is "undefined". I tested with another environment variable (NODE_ENV) and that one does come out correctly.
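One way to narrow this down is to load the .env file explicitly before anything reads process.env, and only then construct the SES client. A minimal sketch, assuming the aws-sdk v2 package from the question; the sender address and the fallback region are placeholders, not values from the original post:

// Load the .env file explicitly before anything reads process.env.
require('dotenv').config();

const AWS = require('aws-sdk');

// Log whether the variables actually arrived, so a missing or ignored .env file is obvious.
console.log('AWS_REGION loaded?', Boolean(process.env.AWS_REGION));

AWS.config.update({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: process.env.AWS_REGION || 'us-east-1' // assumed fallback region
});

const ses = new AWS.SES();

function sendWelcomeEmail(toAddress) {
  const params = {
    Source: 'newsletter@example.com', // placeholder verified sender address
    Destination: { ToAddresses: [toAddress] },
    Message: {
      Subject: { Data: 'Welcome!' },
      Body: { Text: { Data: 'Thanks for subscribing.' } }
    }
  };
  return ses.sendEmail(params).promise();
}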

AWS S3 Node.js SDK method generatePutUrl returns different results for localhost and deployment on Heroku

I am trying to manage direct file upload to S3 according to the Heroku recommendations:
first, generate a presigned URL on the server;
then use this URL in the client to upload the image directly from the browser to the S3 bucket.
I finally got it working locally,
but when I tried to deploy the server on Heroku it started to fail with no readable error, just a generic error and a strange message when I try to print it.
What looks strange to me is that the presigned URLs are completely different when I make the call from localhost and from Heroku.
The response for localhost looks like this:
https://mybucket.s3.eu-west-1.amazonaws.com/5e3ec346d0b5af34ef9dfadf_avatar.png?AWSAccessKeyId=<AWSKeyIdHere>&Content-Encoding=base64&Content-Type=image%2Fpng&Expires=1581172437&Signature=xDJcRBiA%2FmQF1qKhBZrnhFXWdaM%3D
and the response for the Heroku deployment looks like this:
https://mybucket.s3.u-west-1.amazonaws.com/5e3ee2bd1513b60017d85c6c_avatar.png?Content-Type=image%2Fpng&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=<credentials-key-here>%2F20200208%2Fu-west-1%2Fs3%2Faws4_request&X-Amz-Date=20200208T163315Z&X-Amz-Expires=900&X-Amz-Signature=<someSignature>&X-Amz-SignedHeaders=content-encoding%3Bhost
The server code is almost the same as in the examples:
const Bucket = process.env.BUCKET_NAME
const region = process.env.BUCKET_REGION

AWS.config = new AWS.Config({
  accessKeyId: process.env.S3_KEY,
  secretAccessKey: process.env.S3_SECRET,
  region,
  logger: console
})

const s3 = new AWS.S3()

async function generatePutUrl(inputParams = {}) {
  const params = { Bucket, ...inputParams }
  const { Key } = inputParams
  const putUrl = await s3.getSignedUrl('putObject', params)
  const getUrl = generateGetUrlLocaly(Key)
  return { putUrl, getUrl }
}
The only difference I can imagine is SSL: I run the local server via HTTP and Heroku goes over HTTPS by default,
but I don't understand how that could matter here.
I would appreciate any meaningful advice on how to debug and fix this.
Thank you.
It looks like your bucket region is incorrect. Shouldn't it be eu-west-1 instead of u-west-1?
Please update the BUCKET_REGION environment variable in your Heroku app settings from
u-west-1
to
eu-west-1
and restart the dynos. That may solve your problem.
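If you prefer the command line, the same change can be made with the Heroku CLI; a quick sketch (the app name is a placeholder):

# Check what the dynos currently see
heroku config:get BUCKET_REGION --app my-app

# Fix the region typo and restart
heroku config:set BUCKET_REGION=eu-west-1 --app my-app
heroku restart --app my-app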

Invoke Local Lambda using AWS-SDK in NodeJS

I have Lambda set up locally using Docker and SAM. I can hit an endpoint and run the Lambda method locally, but if I want to test the below code, I have to actually deploy the Lambda, since I'm not sure how to get the aws-sdk to work in a local environment.
const payload = JSON.stringify({
  "bucket": process.env.AWS_S3_ENV_BUCKET,
  "region": process.env.AWS_REGION,
  "folder": 'somePath/',
  "files": ['somefile.jpg', 'anotherfile.jpg'],
  "zipFileName": 'zipZippedFile.zip'
})

const params = {
  FunctionName: 'zippidyDoDah',
  Payload: payload
}

global.Lambda.invoke(params, function (error, data) {
  console.log('error: ', error)
  console.log('data: ', data)
})
Does anyone have any insight on this?
If you install the AWS Command Line Interface and run aws configure, you can enter the access key and secret key of the user you want this code to run as. These credentials are stored in ~/.aws/credentials. You should be able to inject the AWS CLI and these credentials into your Docker container and, assuming they are under your [default] profile, they should get picked up automatically by your process. You should read about AWS CLI profiles too.
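If the credentials end up in the container under a non-default profile, they can also be selected explicitly in code. A small sketch, assuming the aws-sdk v2 package; the profile name and fallback region are examples only, while FunctionName is the one from the question:

const AWS = require('aws-sdk');

// Read a named profile from the shared credentials file (~/.aws/credentials).
// 'local-testing' is an example profile name, not something from the question.
const credentials = new AWS.SharedIniFileCredentials({ profile: 'local-testing' });

const lambda = new AWS.Lambda({
  credentials,
  region: process.env.AWS_REGION || 'us-east-1' // assumed fallback region
});

lambda.invoke(
  { FunctionName: 'zippidyDoDah', Payload: JSON.stringify({ ping: true }) },
  (error, data) => {
    console.log('error: ', error);
    console.log('data: ', data);
  }
);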

Can't set AWS credentials in Node.js

I'm working on a cloud project using NodeJS.
I have to run EC2 instances, so I have done an npm install aws-sdk.
I believe we have to add our credentials now before we run the application?
I could not find the .aws folder, so I created a folder and added the credentials in a credentials.txt file:
C:\Users\jessig\aws
I keep getting this error:
{ [TimeoutError: Missing credentials in config]
  message: 'Missing credentials in config',
  code: 'CredentialsError',
I tried setting the access key and secret key in environment variables but still get the same error.
I'm not sure why I can't find the \.aws\credentials (Windows) folder.
Can anyone please help?
As Frederick mentioned, hardcoding credentials is not an AWS recommended standard, and it is not something you would want to do in a production environment. However, for testing and learning purposes, it can be the simplest way.
Since your request was specific to AWS EC2, here is a small example that should get you started.
To get a list of all the methods available to you for Node.js, refer to the AWS documentation.
var AWS = require('aws-sdk');

AWS.config = new AWS.Config();
AWS.config.accessKeyId = "accessKey";
AWS.config.secretAccessKey = "secretKey";
AWS.config.region = "us-east-1";

var ec2 = new AWS.EC2();

var params = {
  InstanceIds: [ /* required */
    'i-4387dgkms3',
    /* more items */
  ],
  Force: true
};

ec2.stopInstances(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
I used the following programmatic way, combined with the popular npm config module (which allows different config files for development vs production, etc.):
const config = require('config');
const AWS = require('aws-sdk');

const accessKeyId = config.get('AWS.accessKeyId');
const secretAccessKey = config.get('AWS.secretAccessKey');
const region = config.get('AWS.region');

AWS.config.update({
  accessKeyId,
  secretAccessKey,
  region
});
And the JSON config file, e.g. development.json, would look like:
{
  "AWS": {
    "accessKeyId": "TODO",
    "secretAccessKey": "TODO",
    "region": "TODO"
  }
}
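For completeness (this reflects how the config module generally behaves, not something stated above): which file is loaded is driven by NODE_ENV, so the same code reads config/development.json locally and config/production.json in production, for example:

NODE_ENV=production node app.js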
There are multiple ways to configure the SDK to work with Node.js.
There are a few ways to load credentials. Here they are, in order of
recommendation:
Loaded from IAM roles for Amazon EC2 (if running on EC2),
Loaded from the shared credentials file (~/.aws/credentials),
Loaded from environment variables,
Loaded from a JSON file on disk,
Hardcoded in your application
Hardcoding, however, is not recommended.
If you want to use a shared credentials file, on Windows it would be
C:\Users\jessig\.aws\credentials
(note the . before aws). Your file should look something like:
[default]
aws_access_key_id = your_access_key
aws_secret_access_key = your_secret_key
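Once that file is in place, the SDK finds the [default] profile on its own and the code only needs a region. A minimal sketch, assuming the aws-sdk v2 package (the region is a placeholder):

var AWS = require('aws-sdk');

// No keys here: the SDK reads them from C:\Users\jessig\.aws\credentials ([default] profile).
AWS.config.update({ region: 'us-east-1' }); // placeholder region

var ec2 = new AWS.EC2();

ec2.describeInstances({}, function (err, data) {
  if (err) console.log(err, err.stack);
  else console.log(JSON.stringify(data.Reservations, null, 2));
});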
Adding accessKeyId and secretAccessKey directly in the AWS config is deprecated as of today. As the AWS docs for the SDK for JavaScript (Node.js) state:
The SDK automatically detects AWS credentials set as variables in your environment and uses them for SDK requests. This eliminates the need to manage credentials in your application. The environment variables that you set to provide your credentials are:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_SESSION_TOKEN (Optional)
https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/loading-node-credentials-environment.html
You may want to use the dotenv package to load those environment variables.
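Since that link is for v3 of the SDK, here is a rough sketch of the same idea with the modular v3 packages; the client resolves the credentials from the environment (loaded by dotenv here) without any explicit key handling. The fallback region is an assumption:

require('dotenv').config(); // loads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from .env

const { S3Client, ListBucketsCommand } = require('@aws-sdk/client-s3');

// No credentials are passed here: the v3 client picks them up from the environment.
const s3 = new S3Client({ region: process.env.AWS_REGION || 'us-east-1' }); // assumed fallback region

s3.send(new ListBucketsCommand({}))
  .then(result => console.log(result.Buckets.map(b => b.Name)))
  .catch(err => console.error(err));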
The AWS credentials can be set as environment variables in the running container.
You would either add the following two environment variables directly:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
or set these environment variables programmatically within Node, as in:
var AWS = require('aws-sdk')
AWS.config = new AWS.Config();
process.env.AWS_ACCESS_KEY_ID = "AKIA************L55A"
process.env.AWS_SECRET_ACCESS_KEY = "Ef*******+C5LrtOroSj**********yNE"
AWS.config.region = "us-east-2"
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/loading-node-credentials-environment.html
