Access environment variables from yaml in js (Git secret) - node.js

We are using the Zoom API in our React app, and we want to use GitHub and Heroku secrets to push and deploy our web application.
In our YAML file, we define the GitHub as well as the Heroku keys:
jobs:
  build:
    runs-on: ubuntu-latest
    env:
      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # auto-generated
      HEROKU_KEY: ${{ secrets.HEROKU_API_KEY }}
      HEROKU_EMAIL: ${{ secrets.HEROKU_EMAIL }}
      HEROKU_APP_NAME: ${{ secrets.HEROKU_APP_NAME }}
      REACT_APP_ZOOM_KEY: ${{ secrets.ZOOM_API_KEY }}
      REACT_APP_ZOOM_SECRET: ${{ secrets.ZOOM_API_SECRET }}
Now we want to access the Zoom Key and Secret in our config file to generate the signature (used for generating a Zoom meeting).
We wanted to access them with process.env, like this:
sdkKey: isProduction() ? process.env.ZOOM_Key : process.env.REACT_APP_ZOOM_KEY,
sdkSecret: isProduction() ? process.env.ZOOM_Secret : process.env.REACT_APP_ZOOM_SECRET,
If I do this, I get an error saying that my Zoom key and secret are required and need to be strings.
I already tried to use JSON.stringify(process.env.REACT_APP_ZOOM_KEY) but I get the same error.
Maybe also important to mention: if I hardcode the key and secret directly in the config file, it works, so I assume the error lies in accessing the environment variables from the YAML file.
We would really appreciate it if someone could help us :)
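An editorial aside on the failure mode described above: Create React App only inlines variables prefixed with REACT_APP_ at build time, so a plain process.env.ZOOM_Key will be undefined in the browser bundle, which matches the "required and must be a string" error. A small guard makes this explicit; requireEnv is a hypothetical helper, not part of the original code:

```javascript
// Hypothetical helper: fail fast with a clear message when a required
// environment variable is missing, instead of passing `undefined` into
// the Zoom SDK config.
function requireEnv(name) {
  const value = process.env[name];
  if (typeof value !== "string" || value.length === 0) {
    throw new Error(`${name} is required and must be a non-empty string`);
  }
  return value;
}

// Stand-in value for demonstration; in CI it comes from the workflow's env block.
process.env.REACT_APP_ZOOM_KEY = "demo-key";
console.log(requireEnv("REACT_APP_ZOOM_KEY"));
```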

Related

Using secrets with azure/docker-login

I have a GitHub Action that uses azure/docker-login@v1 for building and pushing images to the Azure image registry, and it works.
Now, I want to pass GITHUB_TOKEN using Docker's secret flag, but it only accepts a file, and I don't know how to create a file using this action.
Is it possible?
For example, with docker/build-push-action I can do this below:
- name: Build docker image
  uses: docker/build-push-action@v2
  with:
    context: .
    secrets: |
      "github_token=${{ secrets.GITHUB_TOKEN }}"
How can I secure my image using azure/docker-login?
As the readme.md of the azure/docker-login action suggests:
Use this GitHub Action to log in to a private container registry such as Azure Container registry. Once login is done, the next set of actions in the workflow can perform tasks such as building, tagging and pushing containers.
You can set up your workflow so that it logs in using azure/docker-login and builds and pushes the image using docker/build-push-action, like this:
- uses: azure/docker-login@v1
  with:
    login-server: contoso.azurecr.io
    username: ${{ secrets.ACR_USERNAME }}
    password: ${{ secrets.ACR_PASSWORD }}
- uses: docker/build-push-action@v2
  with:
    push: true
    context: .
    secrets: |
      "github_token=${{ secrets.GITHUB_TOKEN }}"

How to loop through user-defined variables in a YAML pipeline?

I am trying to loop through user-defined variables in an Azure DevOps YAML pipeline.
The variables have been created through the UI.
Below is the YAML pipeline code that I'm using:
trigger:
- dev
- main

pr:
- dev

pool:
  vmImage: ubuntu-latest

stages:
- stage:
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
When running the above pipeline only system and build variables are listed (e.g. system, system.hostType, build.queuedBy, etc.).
Any help to loop through user-defined variables would be much appreciated.
Unfortunately, no luck fetching the variables defined in UI. However, if your variables are non-secrets, you can bring them over into the YAML, and they will show up in the loop.
- stage:
  variables:
    myyamlvar: 1000 # this will show up in the loop
  jobs:
  - job: TestVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: ${{ var.key }}
Alternatively, instead of using a compile time expression, you can list variables using a runtime construct, for example:
- job: TestRuntimeVars
  steps:
  - script: |
      for var in $(compgen -e); do
        echo $var ${!var};
      done
This will list all variables including ones defined in the UI.
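If your user-defined variables share a naming prefix, the same trick can be narrowed at runtime. A sketch, where `MY_` is an assumed prefix rather than anything from the pipeline above:

```shell
# List only exported variables starting with MY_, using bash's compgen
# builtin with a prefix filter and indirect expansion ${!var}.
export MY_FIRST=1
export MY_SECOND=two
for var in $(compgen -e MY_); do
  echo "$var=${!var}"
done
```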
From the Microsoft docs link you provided, it specifies that:
"Unlike a normal variable, they are not automatically decrypted into environment variables for scripts. You need to explicitly map secret variables."
However, one workaround could be to run an Azure CLI task and get the pipeline variables using az pipelines variable list.
Assuming your intention is to get the actual values, that may not suffice. Having said that, you should consider a variable group even if you're not using it in other pipelines, since a group can be linked to an Azure Key Vault, which maps its secrets as variables. You can store your sensitive values in a Key Vault and link it to the variable group, which can then be used like regular variables in your pipeline.
Or you can access KeyVault secrets right from the AzureKeyVault pipeline task.
To expand on the answer above: it is a bit roundabout, but you can use the Azure DevOps CLI. This may be overkill, but it does the job.
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- bash: az --version
  displayName: 'Show Azure CLI version'

- bash: az devops configure --defaults organization=$(System.TeamFoundationCollectionUri) project=$(System.TeamProject) --use-git-aliases true
  displayName: 'Set default Azure DevOps organization and project'

- bash: |
    az pipelines variable list --pipeline-id $(System.DefinitionId)
  displayName: 'Show pipeline variables'
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
This approach was taken from:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#list-variables
If the agent is self-hosted, you may need to install the Azure DevOps CLI.

Github Actions and git clone issue

I'm having some problems using git clone from within a GitHub Actions workflow; I get the following no matter what I try.
The code that fails in my main.yml:
jobs:
  terraform:
    name: 'Terraform with Github Actions!'
    runs-on: ubuntu-latest
    steps:
    - name: 'Login to Azure'
      uses: azure/login@v1
      with:
        creds: ${{ secrets.AZURE_CREDENTIALS }}
    - name: 'Checkout'
      uses: actions/checkout@master
    - name: 'Preparing blueprint-environment'
      run: |
        snip
        git clone git@github.com:ourcompany/whateverrepo.git
Error message:
git@github.com: Permission denied (publickey).
I've seen many posts on adding SSH keys, but that's done locally, not in an Ubuntu runner started by GitHub Actions. What am I missing here? I can't generate SSH keys and add the private key on the fly to the GitHub repo settings, so how can I fix this?
If you need to check out two repositories, I would recommend using checkout again with a relative path. See the documentation on checking out multiple repos side by side. You may need to use a repo-scoped Personal Access Token (PAT):
- name: 'Checkout'
  uses: actions/checkout@v2
- name: 'Preparing blueprint-environment'
  uses: actions/checkout@v2
  with:
    token: ${{ secrets.PAT }}
    repository: ourcompany/whateverrepo
    path: whateverrepo
If it's really necessary, Deploy keys can be used to clone a repository via SSH.
Create a new SSH key pair for your repository. Do not set a passphrase.
Copy the contents of the public key (.pub file) to a new repository deploy key and check the box to "Allow write access."
Add a secret to the repository containing the entire contents of the private key.
As shown in the example below, configure actions/checkout to use the deploy key you have created.
steps:
- uses: actions/checkout@v2
  with:
    ssh-key: ${{ secrets.SSH_PRIVATE_KEY }}
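The key pair from step 1 can be generated locally; a sketch, where deploy_key is just an example filename:

```shell
# Generate a passphrase-less ed25519 key pair for use as a deploy key.
# The .pub file goes into the repo's deploy keys; the private key's
# contents go into the SSH_PRIVATE_KEY secret.
ssh-keygen -t ed25519 -N "" -f ./deploy_key -C "deploy-key ourcompany/whateverrepo" -q
ls ./deploy_key ./deploy_key.pub
```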

How do AWS credentials work in GitHub Actions?

In my unit tests, I'm using aws-sdk to test SES, which needs some credentials, and we are facing a problem accessing the secrets with GitHub Actions.
At the beginning, I was trying to set the values in ~/.aws/credentials using the run command from GitHub workflows:
# .github/workflows/nodejs.yml
steps:
  ...
  - name: Unit Test
    run: |
      mkdir -p ~/.aws
      touch ~/.aws/credentials
      echo "[default]
      aws_access_key_id = ${{ secrets.AWS_ACCESS_KEY_ID }}
      aws_secret_access_key = ${{ secrets.AWS_SECRET_KEY_ID }}
      region = ${AWS_DEFAULT_REGION}
      [github]
      role_arn = arn:aws:iam::{accountID}:role/{role}
      source_profile = default" > ~/.aws/credentials
      npm test
    env:
      AWS_DEFAULT_REGION: us-east-1
      CI: true
Originally my test file:
// ses.test.js
const AWS = require("aws-sdk")
const credentials = new AWS.SharedIniFileCredentials({ profile: "github" })
AWS.config.update({ credentials })
...
I tried another way to get credentials in my tests, which also doesn't work:
const AWS = require("aws-sdk")
const credentials = new AWS.ChainableTemporaryCredentials({
  params: { RoleArn: "arn:aws:iam::{accountID}:role/{role}" },
  masterCredentials: new AWS.EnvironmentCredentials("AWS")
})
AWS.config.update({ credentials })
Finally, I tried to create a custom Action (using the Actions JS libraries @actions/core, @actions/io, @actions/exec) to get the AWS env values and set them in ~/.aws/credentials, but it also doesn't work as expected.
One way that worked was exposing AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY directly (without using GitHub Actions secrets, which is not ideal for security purposes).
Does someone have any ideas how AWS credentials could work in GitHub Actions with secrets?
Thanks a lot for your attention.
Luckily, the aws-sdk should automatically detect credentials set as environment variables and use them for requests.
To get access to secrets in your action, you need to set them in the repo. Then you can expose them to the step as an env var.
For more details see GitHub Encrypted secrets
On GitHub, navigate to the main page of the repository
Under your repository name, click the ⚙ Settings tab
In the left sidebar, click Secrets
Type a name for your secret in the "Name" input box
Type the value for your secret
Click Add secret
In your case you will want to add secrets for both AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
Now that those are set you can pass those values into the action via the workflow yaml:
steps:
  ...
  - name: Unit Test
    uses: ...
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    run: ...
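As a rough illustration of what happens under the hood (a simplified sketch, not the actual aws-sdk source), the SDK's environment-variable provider looks for exactly those two variables, plus an optional session token:

```javascript
// Simplified sketch of environment-variable credential resolution,
// mirroring what aws-sdk's EnvironmentCredentials("AWS") provider checks.
function credentialsFromEnv(env) {
  const accessKeyId = env.AWS_ACCESS_KEY_ID;
  const secretAccessKey = env.AWS_SECRET_ACCESS_KEY;
  if (!accessKeyId || !secretAccessKey) {
    throw new Error("Missing AWS credentials in environment");
  }
  // AWS_SESSION_TOKEN is optional (only set for temporary credentials).
  return { accessKeyId, secretAccessKey, sessionToken: env.AWS_SESSION_TOKEN };
}

const creds = credentialsFromEnv({
  AWS_ACCESS_KEY_ID: "AKIAEXAMPLEKEY",
  AWS_SECRET_ACCESS_KEY: "example-secret",
});
console.log(creds.accessKeyId);
```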
Avoid using long-term, hard-coded credentials.
The configure-aws-credentials action provides a mechanism to configure AWS credential and region environment variables for use in other GitHub Actions. The environment variables will be detected by both the AWS SDKs and the AWS CLI to determine the credentials and region to use for AWS API calls.
I recommend configuring configure-aws-credentials to use OpenID Connect (OIDC). This allows your GitHub Actions workflows to access resources in AWS, without needing to store the AWS credentials as long-lived GitHub secrets. The GitHub Configuring OpenID Connect in AWS post walks through setting this up.
To give you a practical example, I set up a pipeline to upload dummy data to an S3 bucket. First, set up an OpenID Connect provider and a role for GitHub to federate into in your AWS account. The examples in configure-aws-credentials are written in CloudFormation, but I've translated them to the Python Cloud Development Kit (CDK) below. Make sure to change the role condition to match your repository.
github_oidc_provider = iam.OpenIdConnectProvider(
    self,
    "GithubOIDC",
    url="https://token.actions.githubusercontent.com",
    thumbprints=["a031c46782e6e6c662c2c87c76da9aa62ccabd8e"],
    client_ids=[
        "sts.amazonaws.com"
    ]
)

github_actions_role = iam.Role(
    self,
    "DeployToBucketRole",
    max_session_duration=cdk.Duration.seconds(3600),
    role_name="github-actions-role",
    description="Github actions deployment role to S3",
    assumed_by=iam.FederatedPrincipal(
        federated=github_oidc_provider.open_id_connect_provider_arn,
        conditions={
            "StringLike": {
                # <GITHUB USERNAME>/<YOUR REPO NAME>
                "token.actions.githubusercontent.com:sub": "repo:arbitraryrw/cdk-github-actions-demo:*"
            }
        },
        assume_role_action="sts:AssumeRoleWithWebIdentity"
    )
)
bucket = s3.Bucket(
    self,
    "example_bucket",
    bucket_name="cdk-github-actions-demo",
    encryption=s3.BucketEncryption.S3_MANAGED,
    enforce_ssl=True,
    block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
    removal_policy=cdk.RemovalPolicy.DESTROY,
    auto_delete_objects=True
)

# Give the role permissions to read / write to the bucket
bucket.grant_read_write(github_actions_role)
You can then reference this in your pipeline and run AWS CLI / SDK commands using these credentials. Notice that the snippet references Github Encrypted Secrets, I recommend leveraging this functionality:
name: Example CDK Pipeline

on:
  push:
    branches: [ main ]

jobs:
  build:
    name: Emulate build step
    runs-on: ubuntu-latest
    steps:
      - name: Checking out repository
        uses: actions/checkout@v2
      - name: "Upload artifacts"
        uses: actions/upload-artifact@v2
        with:
          name: build-artifacts
          path: ${{ github.workspace }}/resources
  deploy:
    needs: build
    name: Deploy build artifacts to S3
    runs-on: ubuntu-latest
    # These permissions are needed to interact with GitHub's OIDC Token endpoint.
    permissions:
      id-token: write
      contents: read
    steps:
      - name: "Download build artifacts"
        uses: actions/download-artifact@v2
        with:
          name: build-artifacts
          path: ${{ github.workspace }}/resources
      - name: Configure AWS credentials from Test account
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-region: us-east-1
          role-to-assume: ${{ secrets.AWS_ROLE_FOR_GITHUB }}
          role-session-name: GitHubActions
      - run: aws sts get-caller-identity
      - name: Copy files to the test website with the AWS CLI
        run: |
          aws s3 sync ./resources s3://${{ secrets.BUCKET_NAME }}
For a full example on how to set this up using the CDK you can take a look at the cdk-github-actions-demo repo I set up.
Take a look at: https://github.com/aws-actions/configure-aws-credentials
It allows you to configure AWS credential and region environment variables for use in other GitHub Actions. The environment variables will be detected by both the AWS SDKs and the AWS CLI to determine the credentials and region to use for AWS API calls.
I was hitting my head against the wall on the same thing for a while.
In my case, the setting profile = default was the issue.
I was able to remove that from my script and keep only the env block. If I had both, it would throw an error.
env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: 'us-east-1'
If running aws from the command line is acceptable for you, you can set the following ENV vars and just use aws commands without needing to run aws configure:
env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: us-east-1
  AWS_DEFAULT_OUTPUT: json

How to import private data with GitHub actions?

I'm working on a Node project involving several API keys. I stored the API keys in a configuration file config.js. Then I added config.js to .gitignore so that the API keys aren't revealed in the public repository. But when I try to npm run build with GitHub actions, there's an import error because config.js isn't in the repository.
Can I "simulate" config.js somehow on GitHub? Or should I setup an action to download config.js from elsewhere? Is there a better approach?
I'm using GitHub's boilerplate nodejs.yml:
name: Node CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [8.x, 10.x, 12.x]
steps:
- uses: actions/checkout#v1
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node#v1
with:
node-version: ${{ matrix.node-version }}
- name: npm install, build, and test
run: |
npm install
npm run build --if-present
env:
CI: true
I'm fairly new to CI/CD. Thanks in advance!
UPDATE: I solved this problem using the accepted answer below. I stored config.js in a secret variable config on GitHub. Then I added a step in the workflow that creates config.js before it's needed:
...
- name: create config.js
  run: echo '${{ secrets.config }}' > path/to/config.js
- name: npm install, build, and test
...
You could declare your key as a secret in GitHub Actions under the name you want (for instance, 'my_secret_key').
See also "Creating and using secrets (encrypted variables)"
Said key can then be exposed to your build and referenced in your config.js as a variable (e.g. my_secret_key).
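Concretely, once the workflow exposes the secret as an environment variable, config.js can read it via process.env. A minimal sketch, where "my_secret_key" is a placeholder name:

```javascript
// config.js sketch: read the API key from an environment variable instead
// of hard-coding it. In CI, the workflow's env block would set
// my_secret_key from the GitHub secret.
const config = {
  apiKey: process.env.my_secret_key || "", // empty string when unset
};

console.log(config.apiKey === "" ? "no key set" : "key loaded");
```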
