Is it secure to store private values in a .env file? - node.js

I'm trying to build a node.js server with the express framework, and I want to store a private key for admin APIs on my server. I'm currently using a .env file to store those values, and in my routes I read them with process.env.ADMIN_KEY.
Question
Is this a secure way to handle private data, or is there a better approach?
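Roughly, the setup looks like this (a simplified sketch; the /admin route and header name are just for illustration, only process.env.ADMIN_KEY is from my actual code):

```js
// app.js - simplified sketch. Assumes `npm install express dotenv`
// and a .env file containing a line like: ADMIN_KEY=some-long-random-value
require('dotenv').config(); // load .env into process.env

const express = require('express');
const app = express();

// Illustrative admin route protected by the key from .env
app.get('/admin', (req, res) => {
  if (req.get('x-admin-key') !== process.env.ADMIN_KEY) {
    return res.status(403).send('Forbidden');
  }
  res.send('Welcome, admin');
});

app.listen(3000);
```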

It is more secure to store your secrets in a .env file than in the source code itself. But you can do one better. Here are the ways I've seen secrets managed, from least to most secure:
Hard-code the secrets in the code.
Pros: None. Don't do this.
Cons: Your developers will see your production secrets as part of their regular work. Your secrets will be checked into source control. Both are security risks. Also, you have to modify the code to use it in different environments, like dev, test, and production.
Put secrets in environment variables, loaded from a .env file.
Pros: Developers won't see your production secrets. You can use different secrets in dev, test, and production, without having to modify the code.
Cons: Malicious code can read your secrets. The bulk of your application's code is probably open-source libraries. Bad code may creep in without you knowing it.
Put secrets in a dedicated secret manager, like Vault by HashiCorp or Secret Manager by Google Cloud.
Pros: It's harder for malicious code to read your secrets. You get auditing of who accessed secrets when. You can assign fine-grained roles for who updates secrets and who can read them. You can update and version your secrets.
Cons: It's additional technology that you have to learn. It may be an additional piece of software that you need to set up and manage, unless it's included in the cloud platform you're using.
So the choice is really between items 2 and 3 above. Which one you pick will depend on how sensitive your secrets are and how much extra work it would be to use a dedicated secret manager. For example, if your project is running on Google Cloud Platform, the Secret Manager is just one API call away. It may be just as easy on the other major cloud platforms, but I don't have first-hand experience with them.
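For illustration, on Google Cloud that one API call looks roughly like this in Node.js. This is a sketch only: it assumes the @google-cloud/secret-manager package and a secret you have already created, and the project and secret names are placeholders.

```js
// Sketch: reading a secret from Google Cloud Secret Manager in Node.js.
const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');

async function getAdminKey() {
  const client = new SecretManagerServiceClient();
  // "my-project" and "admin-key" are placeholders for your own resources.
  const [version] = await client.accessSecretVersion({
    name: 'projects/my-project/secrets/admin-key/versions/latest',
  });
  return version.payload.data.toString('utf8');
}
```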

The simple answer is YES, a .env file is commonly used to store keys and secrets. It is not pushed to your repo (GitHub, Bitbucket, or wherever you host your code) as long as you add it to .gitignore (see the example below), so it is not exposed that way.
Here are the tutorial links for correct usage:
managing-environment-variables-in-node-js-with-dotenv
how-secure-is-your-environment-file-in-node-js
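To be explicit, .env only stays out of your repository if you tell Git to ignore it, for example:

```
# .gitignore
.env
```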

Secrets stored in environment variables are at risk of being exposed (for non-private node apps): for example, a library you use might print the whole environment to the log when an error occurs. So it can be safer to store them in a file outside of source control and import it where needed.
https://movingfast.io/articles/environment-variables-considered-harmful/
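A minimal sketch of that approach, keeping the secrets in a file outside source control and reading it where needed; the path and JSON format here are illustrative assumptions, not something mandated by the linked article:

```js
// secrets.js - load secrets from a JSON file kept outside source control.
// The path /etc/myapp/secrets.json is illustrative; any git-ignored or
// out-of-repo location works.
const fs = require('fs');

const secrets = JSON.parse(fs.readFileSync('/etc/myapp/secrets.json', 'utf8'));

module.exports = secrets; // e.g. secrets.adminKey wherever it is needed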

Yes, it is. You can add an extra layer of security by encrypting the values. Also avoid checking your .env file into a public repo.

You can and should store secrets, credentials, or private data in a .env file; it acts as a secure environment-config section for your project, useful for storing API keys and app credentials. Only invited collaborators are able to see the contents of your .env file.

Related

Sharing my read-only Azure App Configuration Connection String in a public repo

I'm developing an application and I want it to be open-source.
In production, the application uses the Azure Key Vault Service only to store the database connection string. That connection string is stored in an environment variable on the production server.
Locally, I'm using an InMemory database from EntityFramework, so no sensitive data is accessible.
In production, the application also uses the Azure App Configuration Service. Besides letting me update the configuration of an already running application, it also lets me centralize my application's configuration data.
Locally, I'm using the Azure App Configuration Service too. The READ-ONLY connection string is stored in my User Secrets.
And that's the point I'm struggling with. Is it considered bad practice to share the READ-ONLY App Configuration connection string on GitHub or somewhere else public, even if I don't store any sensitive data there?
The Key Vault Service is specifically designed to store sensitive data safely, so in theory the App Configuration Service doesn't hold any sensitive data.
But I can't find any relevant documentation on this topic, and the fact that every tutorial I can find stores the connection string in the user secrets worries me. How can I share my configuration safely so that my project can be open-source?
From a security perspective, you are violating the principle of least privilege by giving the public read access that it doesn't need.
This could raise several risks:
You or someone else maintaining the App Configuration might "forget" about the public read access and put sensitive data there
An attacker might exploit a security bug in App Configuration itself and escalate read-only permission to read-write, which would not happen if they didn't have read-only access in the first place
You might think that probability of that happening is marginal (which is probably the case), but it is there and in security we always stay on the safe side - that's why we have the principle mentioned and it is indeed generally considered bad practice to violate it.
Finally, we always need to choose between usability and security, so in the end you might willfully agree to slightly less security if this makes your life easier and potential trouble from the risks does not scare you.
If you would rather not expose the connection string, you could think about:
abstracting configuration fetching in a similar way to what you did for secrets, so that the production app uses App Configuration while for local development you can use the InMemory database (a sketch follows this list)
replacing the connection string with a Terraform script so that you or any other developer can spin up and populate a dedicated App Configuration instance for local development purposes
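The first suggestion could look roughly like the sketch below. The question's stack is .NET, but this is sketched in Node.js for consistency with the rest of this page; the setting name, env var name, and fallback values are hypothetical.

```js
// config.js - hypothetical abstraction: Azure App Configuration in production,
// plain in-memory defaults for local development, so no connection string
// ever has to be committed or shared publicly.
const { AppConfigurationClient } = require('@azure/app-configuration');

async function loadConfig() {
  const connectionString = process.env.APP_CONFIG_CONNECTION_STRING;
  if (connectionString) {
    // Production: the connection string lives only in a server-side env var.
    const client = new AppConfigurationClient(connectionString);
    const pageSize = await client.getConfigurationSetting({ key: 'PageSize' });
    return { pageSize: Number(pageSize.value) };
  }
  // Local development: non-sensitive defaults, nothing fetched remotely.
  return { pageSize: 20 };
}

module.exports = { loadConfig };
```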

Google Cloud Secrets - Reusing a secret

I am using Google Cloud Secrets in a NodeJS Project. I am moving away from using preset environment variables and trying to find out the best practice to store and reuse secrets.
The 3 main routes I've found to use secrets are:
Fetching all secrets on startup and set them as ENV variables for later use
Fetching all secrets on startup and set as constant variables
Each time a secret is required, fetch it from Cloud Secrets
Google's own best practice documentation mentions 2 conflicting things:
Use ENV variables to set secrets at startup (source)
Don't use ENV variables as they can be accessed in debug endpoints and traversal attacks among other things (source)
My questions are:
Should I store secrets as variables to be re-used or should I fetch them each time?
Does this have an impact on quotas?
The best practice is to load the secret once (at startup, or the first time it is accessed) to optimize performance and avoid API call latency. And yes, the secret access quota is consumed on each access.
If a debugging tool is connected to the environment, both regular variables and environment-variable data can be compromised; the threat is roughly the same. Be sure to secure the environment itself properly.
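A hedged sketch of "load once and reuse" with the Node.js client: one accessSecretVersion call (and one quota hit) per process instead of one per request. The secret name is a placeholder, and the simple cache below is not guarded against concurrent first calls.

```js
// secret-cache.js - fetch the secret once and reuse it for the process lifetime.
const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');

let cachedKey; // populated on first use

async function getApiKey() {
  if (cachedKey === undefined) {
    const client = new SecretManagerServiceClient();
    const [version] = await client.accessSecretVersion({
      name: 'projects/my-project/secrets/api-key/versions/latest', // placeholder
    });
    cachedKey = version.payload.data.toString('utf8');
  }
  return cachedKey;
}

module.exports = { getApiKey };
```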

What is the recommended way to store environment variables in Azure Functions for different environments?

Currently, I'm storing all key/value pairs in Application Settings, but I'm not happy with this approach. What is the recommended way to store settings for dev, test, stage, and prod? I need to make sure that prod settings are not visible to developers. Is there a way to create 4 different JSON files and define access permissions on them? Or do I need to create 4 different Function apps (or subscriptions)?
Azure App Configuration is a relatively new service that sounds like it could help in terms of managing the config values centrally with more control than individual instance App Settings.
Beyond that, you could perhaps build segregation by limiting devs to pushing code only and not accessing the hosting environment (Azure portal, etc). The layer in between would be something like Azure DevOps or Github Actions that has access to Azure, while devs are limited to pushing code that triggers deployment.
Also worth reminding ourselves that devs ultimately have a lot of access by virtue of writing the code. If they want to get at runtime data, they can, somehow. If you consider the devs untrusted, you may have bigger problems. If it's just a matter of preventing mistakes, a solid devops process is the key.

What's the best way to store API secrets and encryption keys?

I am using environment variables to store API secrets and data encryption keys. I wonder whether environment variables are the most secure way to store such data. If a hacker gets into my server, can he access environment variables?
It depends on the platform, and it is probably somewhat opinionated, but in general I think environment variables are a good way to store secrets in many scenarios.
If for example your application is vulnerable to SQL injection, local file inclusion or some other application level vulnerability, any secret stored in a database or in a file could be easily compromised. The same attack is probably not possible if environment variables are used, local file inclusion for example can't be used to retrieve environment variables.
Also, using environment variables helps with version control: it helps you avoid checking secrets into your VCS. It may also let you manage secrets better across environments, only allowing the relevant people to learn those secrets in production.
However, in case of a full compromise of your server, the attacker can also inspect environment variables of course. But if your server is compromised to that level, you lost anyway.
Better ways to store secrets could probably be listed, but they are specific to the environment and technology stack you are using. For example, in Azure, Key Vault can sometimes be a better fit; on Amazon, a similar facility is the Key Management Service (KMS); and so on.
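As an illustration only, reading a secret from Azure Key Vault in Node.js looks roughly like this. It assumes the @azure/keyvault-secrets and @azure/identity packages and a vault and secret you have already created; the vault URL and secret name are placeholders.

```js
// Sketch: reading a secret from Azure Key Vault in Node.js.
const { SecretClient } = require('@azure/keyvault-secrets');
const { DefaultAzureCredential } = require('@azure/identity');

async function getDbPassword() {
  // Vault URL and secret name are placeholders for your own resources.
  const client = new SecretClient(
    'https://my-vault.vault.azure.net',
    new DefaultAzureCredential()
  );
  const secret = await client.getSecret('db-password');
  return secret.value;
}
```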

How to share dynamically generated secrets between Docker containers

I have linked together a couple of Docker containers that use each other's API endpoints. These API endpoints are protected by secrets that are generated on container startup. I'm looking for a safe way to share these secrets between those services without doing anything static (e.g. hardcoding). The services are created and linked together using docker-compose, and it is possible to override a secret using an environment variable; however, that is discouraged for production.
What is in my case the safest way to distribute these secrets?
Things I have considered:
Using a central data container that stores these secrets in a file. The clients can then link to this container and look up the secret in the file.
The huge downside of this approach is that it forces the containers to run on the same node.
Generating a docker-compose file with these random secrets hardcoded into it before deploying the containers.
The downside of this approach is that you can no longer simply use the docker-compose file directly; you are limited to a bash script that generates something as mission-critical as these secrets. This also would not satisfy my sidenote that the solution should adapt dynamically to secret changes.
Sidenote
Ultimately, I would prefer it if the solution could also adapt dynamically to secret changes. For example, when a container fails, it will restart automatically, thus also generating a new secret.
