How to read .env file variables in Node.js within an Azure Function

I have the following code, which works fine locally, but when I deploy it to an Azure Function it fails to read the contents of the .env file at runtime; when debugging, each of the config items is "undefined". The .env file is deployed to Azure with the correct entries, and the function executes correctly when I hard-code the config values to test. I assume I need to do something differently to get this to work on Azure?
const sql = require('mssql');
require('dotenv').config();

const dbConfig = {
    server: process.env.databaseServer,
    database: process.env.databaseName,
    user: process.env.databaseUser,
    password: process.env.databasePassword,
    port: 1433,
    options: {
        encrypt: true,
        enableArithAbort: true
    }
};

An Azure Function runs as a packaged service: its process.env is preloaded with the properties of the Azure Functions environment and, by default, it will not load your .env file.
The recommended approach is to define all of your .env content in the Azure Function's application settings instead; the runtime exposes those settings through process.env.
The related documentation is here.

Related

Configure Environment Variables for Production Node.js

I am trying to deploy my test Express.js app on Heroku, using GitHub for resources and mLab for my database. In development the app has no problems when I pass the mLab connection string, but in production it does... How should my production environment look?
Here is my config.js:
const env = require('dotenv').config();

module.exports = {
    development: {
        port: process.env.PORT || 3000,
        dbPath: process.env.DB_CONNECTION,
    },
    production: {
        port: process.env.PORT || 3000,
        dbPath: process.env.DB_CONNECTION_MLAB,
    }
};
Your .env file probably isn't (and shouldn't be) used in production. It should be ignored in your Git repository.
This means your production database configuration needs to come from somewhere else. If you're using the official mLab addon you can access your connection string via the MONGODB_URI environment variable, which the addon sets automatically.
If you're not using the official addon you should set the appropriate environment variable yourself, e.g. via
heroku config:set MONGODB_URI=...
In either case, make sure the name of the environment variable in your code matches what's set in the environment. Generally there is no need for separate development and production variables since they are set in different environments. I recommend using MONGODB_URI everywhere.
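Following that advice, the two-environment config above can collapse into one shape that works everywhere (a sketch: MONGODB_URI is the variable the mLab addon sets, and dotenv would still populate process.env in development):

```javascript
// Sketch: one variable name (MONGODB_URI) in every environment. Locally,
// dotenv fills it from .env; on Heroku, the mLab addon (or
// `heroku config:set MONGODB_URI=...`) provides it.
function config() {
    return {
        port: Number(process.env.PORT) || 3000,
        dbPath: process.env.MONGODB_URI, // same name in every environment
    };
}

module.exports = config;
```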

Setting up dotenv in firebase functions

I am trying to move the tiny Node/Express app I made into Firebase Functions.
The app uses dotenv variables. Earlier I thought that if I just deployed it with dotenv as a dependency it would work, but that didn't happen.
So I went to the Firebase environment configuration article to understand how I can set the equivalent of .env,
which says to set things by doing something like this:
firebase functions:config:set someservice.key="THE API KEY" someservice.id="THE CLIENT ID"
But I have so many environment variables that setting them this way seems a cumbersome task.
So let's say this is my environment file:
# App port Address
PORT = 8080
# Google Secret
GOOGLE_CALLBACK_URL = http://localhost:8080/auth/google/callback
GOOGLE_CLIENT_ID = 4048108-bssbfjohpu69vl6jhpgs1ne0.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET = lTQHpjzY57oQpO
# Client Address
CLIENT_ADDRESS = http://localhost:3000/
# Meetup Secret
MEETUP_CALLBACK_URL = http://localhost:8080/auth/meetup/callback
MEETUP_CLIENT_ID = ef6i9f7m6k0jp33m9olgt
MEETUP_CLIENT_SECRET = sk3t5lnss2sdl1kgnt
# EventBrite Secret
EVENTBRITE_CALLBACK_URL = http://localhost:8080/auth/eventbrite/callback
EVENTBRITE_CLIENT_ID = UU2FXKGYHJRNHLN
EVENTBRITE_CLIENT_SECRET = NA55QG52FAOF6GDMLKSJBKYOPIGQU4R46HHEU4
How can I best set things up so that when I run firebase serve --only functions,hosting it doesn't throw any errors such as:
OAuth2Strategy requires a clientID option
As of Feb 16, 2022 Firebase now supports .env, .env.prod, .env.dev, .env.local files natively!
https://firebase.google.com/docs/functions/config-env
Set your variables in the corresponding environment, and then run firebase use dev or firebase use prod before you deploy.
Your variables can be accessed via process.env.VARIABLE_NAME
UPDATED 2019-06-04
I'm very sorry. This solution is wrong.
I found the correct way.
https://stackoverflow.com/a/45064266/1872674
You should put a .runtimeconfig.json into the functions directory. Your dotenv variables move into .runtimeconfig.json in JSON format.
This is my solution.
const functions = require('firebase-functions');

const functionConfig = () => {
    if (process.env.RUN_LOCALLY) {
        const fs = require('fs');
        return JSON.parse(fs.readFileSync('.env.json', 'utf8'));
    } else {
        return functions.config();
    }
};
functionConfig() is then called from your Firebase Function:
exports.helloWorld = functions.https.onRequest((request, response) => {
    response.send("someservice id is: " + functionConfig().someservice.id);
});
.env.json is like:
{
    "someservice": {
        "key": "THE API KEY",
        "id": "THE CLIENT ID"
    }
}
Finally, run the serve command with the RUN_LOCALLY variable set:
RUN_LOCALLY=1 firebase serve
When you deploy the functions, don't forget to update the environment configuration in Firebase using the .env.json.
The Firebase CLI currently doesn't allow you to set process environment variables on deployment. This may change in the future. The configuration vars it supports today (the ones you linked to) are not actually process environment variables; they are stored somewhere else that is not the process environment.
If you absolutely need to be able to set process environment variables, you will have to deploy your function with gcloud, which means that you also won't be able to use the firebase-functions module to define your function. Start with the Google Cloud Functions documentation to learn about deployment from a Cloud perspective.
If you want to use the Firebase tools, I'd recommend that you find a different way to configure your function that doesn't involve process environment variables.
If you want your functions to use process.env variables, you can set them by going to the Google Cloud console and opening Cloud Functions. You will be able to find the deployed Firebase functions there. You can select each function one by one and set its environment variables there.
What I did was create an env.json file inside the functions folder:
// env.json
{
    "send_email_config": {
        "FOW_ADMIN_EMAIL": "admin.example#gmail.com",
        "FOW_ADMIN_EMAIL_PASSWORD": "adminPassExample",
        "FOW_ADMIN_RECEIVER_EMAIL": "adminReceiver.example#gmail.com"
    }
}
Then I created a file called env.js with a function that returns the value of functions.config(), which acts as a kind of env module:
// env.js
// TO UPDATE functions.config().env, RUN THIS INSIDE THE functions FOLDER:
// firebase functions:config:set env="$(cat env.json)"
const functions = require('firebase-functions');

const getEnvConfig = () => {
    /*
     * I return functions.config().env because I set the env.json values into the env
     * property by running: firebase functions:config:set env="$(cat env.json)"
     */
    return functions.config().env;
};

exports.getEnvConfig = getEnvConfig;
exports.PROCESS_ENV = getEnvConfig();
Then I just import PROCESS_ENV to access the values I set in the env.json file, for example:
const { PROCESS_ENV } = require('./utils/env');

exports.mailCredentials = {
    main: {
        email: PROCESS_ENV.send_email_config.FOW_ADMIN_EMAIL,
        password: PROCESS_ENV.send_email_config.FOW_ADMIN_EMAIL_PASSWORD
    },
    receiver: {
        email: PROCESS_ENV.send_email_config.FOW_ADMIN_RECEIVER_EMAIL
    }
};
IMPORTANT!!!!
For this to work you have to deploy functions.config().env with the values of the env.json file.
To deploy it, just run the following command inside the functions folder: firebase functions:config:set env="$(cat env.json)"
Also, don't forget to add env.json to your .gitignore.
If the functions folder lives inside your React project, add */env.json to the React project's .gitignore file.
Your .env file needs to be in the functions folder.
Here is the documentation about this:
https://firebase.google.com/docs/functions/config-env
You can access the variables like this. Given this .env:
PLANET=Earth
AUDIENCE=Humans
and a Firebase function:
// Responds with "Hello Earth and Humans"
exports.hello = functions.https.onRequest((request, response) => {
    response.send(`Hello ${process.env.PLANET} and ${process.env.AUDIENCE}`);
});
For your local environment you can have a .env.local file, and the contents of .env.local take precedence over .env when you are using the Emulator Suite.

Upload files to AWS S3 using Node.js

I am working on a project (based on the MEAN stack) which shows different images according to the different logged-in users. I am using Amazon S3 for storing those images.
Currently I have created a separate route for the admin panel where the admin can sign in and upload the images to Amazon S3 for different users. (Also, is this the correct flow for the application?)
I have the below line of code in my JS file:
AWS.config.update({ accessKeyId: xxxxxx, secretAccessKey: xxxxxx });
I have read that this should only be for development purposes and that I should not keep my accessKeyId and secretAccessKey in the code like this.
I want to know what should be done for production.
For production you need to store these keys in the environment; the AWS SDK will pick them up from the environment by itself.
The AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY keys should be set.
For example, on a local machine you can test this with the terminal commands
export AWS_ACCESS_KEY_ID=XXXXX and export AWS_SECRET_ACCESS_KEY=XXXXX respectively.
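Because the SDK falls back silently through its credential chain when variables are missing, a small startup check (a sketch, not part of the AWS SDK) can make a misconfigured environment fail loudly:

```javascript
// Sketch: verify the variables the AWS SDK's default credential chain
// expects are present before the app starts serving requests.
function checkAwsEnv(env = process.env) {
    const required = ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'];
    const missing = required.filter((name) => !env[name]);
    if (missing.length > 0) {
        throw new Error('Missing AWS credentials in environment: ' + missing.join(', '));
    }
    return true;
}

module.exports = checkAwsEnv;
```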
For production you will need to do the same, except via the Node process manager you are using. Here is an example of the pm2 process manager doing it:
module.exports = {
    apps: [
        {
            name: "myapp",
            script: "./app.js",
            watch: true,
            instance_var: 'INSTANCE_ID',
            env: {
                "PORT": 3000,
                "NODE_ENV": "development",
                "AWS_ACCESS_KEY_ID": "XXXXX",
                "AWS_SECRET_ACCESS_KEY": "XXXXX"
            }
        }
    ]
};
http://pm2.keymetrics.io/docs/usage/environment/#specific-environment-variables
The flow is similar even if you are using a different process manager.

Strongloop: setup Storage Component for Amazon S3

I'm new to Node.js and Loopback. I have been using Deployd so far and I'm trying to migrate to Loopback. The S3 bucket module on Deployd was working great.
So...:
I'm on this website https://github.com/strongloop/loopback-component-storage
I run, in my project folder,
npm install loopback-component-storage
I then need to create a datasource.
To set up the new datasource, I tried
slc loopback:datasource
It doesn't provide an option to create a datasource that is a storage, so I rule that option out, I guess.
I see there is this piece of code on the github (link above):
var ds = loopback.createDataSource({
    connector: require('loopback-component-storage'),
    provider: 'filesystem',
    root: path.join(__dirname, 'storage')
});

var container = ds.createModel('container');
app.model(container);
I guess this is the right way to create a datasource, but where do I place this code and how do I execute it?
How do I adapt this code to work with Amazon?
{ provider: 'amazon', key: '...', keyId: '...' }
I suppose key is my secret key and keyId my access key id, but can you confirm?
I'm just having trouble getting started... thanks for your help in advance
Where to put the code: https://github.com/strongloop/loopback-component-storage/blob/master/example/app.js
tl;dr, just put it in app.js (1.x structure) or server/server.js (2.x structure)
This example I linked to is using the old LoopBack 1.x structure. I will be updating that example in the coming weeks to use the new LoopBack 2.x structure.
Amazon provider example: http://docs.strongloop.com/display/LB/Storage+service
You can add a datasource manually in server/datasources.json too. This way, you should be able to create a container model using the storage data source.
To do this by code as you illustrated, you can either modify server/server.js or drop a JS file into server/boot with an exported function as:
module.exports = function(app) {
    // your code
};
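For the server/datasources.json route mentioned above, an Amazon-backed storage entry might look like this (a sketch: the datasource name is arbitrary, and, matching the question's snippet, key is the secret access key and keyId the access key ID):

```json
{
    "storage": {
        "name": "storage",
        "connector": "loopback-component-storage",
        "provider": "amazon",
        "key": "YOUR_SECRET_ACCESS_KEY",
        "keyId": "YOUR_ACCESS_KEY_ID"
    }
}
```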
Thanks @Raymond, I took the second option.
I created file server/boot/xyz.js and put this in there:
module.exports = function(server) {
    var path = require('path');
    var ds = server.loopback.createDataSource({
        connector: require('loopback-component-storage'),
        provider: 'filesystem',
        root: path.join(__dirname, '../../storage')
    });

    var container = ds.createModel('container');
    server.model(container);
};
I cannot see the model in the explorer but I can call the service with:
http://localhost:3000/api/containers
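The likely reason the model doesn't appear in the explorer is that models created at runtime with ds.createModel aren't registered in server/model-config.json. Declaring a container model there instead can make it show up (a sketch, assuming a storage datasource named "storage" is defined in server/datasources.json):

```json
{
    "container": {
        "dataSource": "storage",
        "public": true
    }
}
```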

Building applications for different environments with Node.js

Historically on most large projects I have worked on we have a build script of some sort which is run by a developer to setup their environment, such as setting up iis, database migrations, templating configuration files etc.
As each environment is different, I would generally have a set of configuration files for each environment, such as dev.config, qa.config or release-candidate.config, then when the build process runs it would use the <environment>.config file to find out what its settings should be. This way my dev environment can build fine and the build server can do its stuff fine too in an automated fashion.
Anyway, in .NET we have configuration files such as web.config which can be templated. Is there any notion of this within the Node.js world? I want to have a file which contains all the configuration values required for the application to run, without putting them in the application code.
You can do this very simply, actually. Just place all your configuration in a .js file that exports an object, like:
module.exports = {
    mongodb: {
        connection: 'http://mongodb.com/mydb:12323/user/password',
        options: {
            keepConnection: true
        }
    },
    app: {
        deploymentKey: '1234'
    },
    // etc
};
So, you will have 2-3 files, depending on the number of environments you have.
/config/development.js
/config/production.js
/config/test.js
/config/staging.js
The environment type is typically exposed by the NODE_ENV variable. Then you have a really simple module to load the configuration:
var util = require('util');

// Loads /config/<env>.js; assumes this loader lives in the /config
// directory alongside development.js, production.js, etc.
var env = process.env.NODE_ENV || 'development';
var config = util.format('/%s.js', env);

module.exports = require(__dirname + config);
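The selection rule the loader applies can also be exercised without any files on disk (a sketch for illustration; configs here stands in for the per-environment modules):

```javascript
// Sketch: pick the configuration object for the current environment,
// falling back to development -- the same rule the loader above applies.
function pickConfig(configs, env) {
    return configs[env || 'development'];
}

module.exports = pickConfig;
```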
Check out some real code here.
This is why I created the properties module.
Check the environment example.