Bitbucket Pipelines: use env vars in a Node.js script to deploy to S3

Right now I have a bitbucket pipeline that works well with a single step like so:
( options: docker: true )
- docker build --rm -f Dockerfile-deploy ./ -t deploy --build-arg AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID --build-arg AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
This sets the keys in the Docker container, which then deploys to Elastic Beanstalk using a bash script and the AWS CLI, so I don't actually try to expose the env vars myself, but eb deploy sure does, and it works.
When trying to run a pipeline with the image: node:latest and the steps
- npm i
- npm run build (Babel transpile)
- npm run deploy (node script to send to S3)
In that final step I need the Node script to have access to the env vars I've added in the Bitbucket repository's Pipelines settings, but instead I get the literal variable name rather than its value:
// NodeJS Config File
module.exports = {
  AWS_S3_BUCKET: process.env.AWS_S3_BUCKET || undefined,
  AWS_ACCESS_KEY: process.env.AWS_ACCESS_KEY || undefined,
  AWS_ACCESS_SECRET: process.env.AWS_ACCESS_SECRET || undefined,
}
-
// NodeJS deploy file... parts
const aws = {
  params: {
    Bucket: config.AWS_S3_BUCKET
  },
  accessKeyId: config.AWS_ACCESS_KEY,
  secretAccessKey: config.AWS_ACCESS_SECRET,
  distributionId: config.CLOUDFRONT_DISTRIBUTION_ID,
  region: "us-east-1"
}

console.log('-----START AWS-----')
console.log(aws)
console.log('------END AWS------')
Bitbucket Pipelines then echoes this for the console.logs:
-----START AWS-----
{ params: { Bucket: '$AWS_S3_BUCKET' },
  accessKeyId: '$AWS_ACCESS_KEY',
  secretAccessKey: '$AWS_ACCESS_SECRET',
  distributionId: '$CLOUDFRONT_DISTRIBUTION_ID',
  region: 'us-east-1' }
------END AWS------
Any thoughts?

Well, my problem was that I copy-pasted the variables from AWS with a trailing space... which is a character, but certainly not part of the expected secret or key string. Oops.
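For anyone who hits the same thing, a defensive tweak to the config file would guard against stray whitespace (just a sketch of the config shown above; it assumes the values arrive as plain strings):

// NodeJS Config File -- sketch with a guard that trims pasted whitespace
const clean = (value) => (typeof value === 'string' ? value.trim() : undefined)

module.exports = {
  AWS_S3_BUCKET: clean(process.env.AWS_S3_BUCKET),
  AWS_ACCESS_KEY: clean(process.env.AWS_ACCESS_KEY),
  AWS_ACCESS_SECRET: clean(process.env.AWS_ACCESS_SECRET),
  CLOUDFRONT_DISTRIBUTION_ID: clean(process.env.CLOUDFRONT_DISTRIBUTION_ID),
}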

Related

GitLab CI/CD test stage fails when fetching an external resource with error: dh key too small

I'm trying to work with Gitlab CI/CD but the test stage fails with the following error:
write EPROTO 140044051654592:error:141A318A:SSL routines:tls_process_ske_dhe:dh key too small:../deps/openssl/openssl/ssl/statem/statem_clnt.c:2171:
gitlab-ci.yml
image: node:16.15.1
stages:
  - test
test-job:
  stage: test
  script:
    - npm run test
Note that this is an integration test that calls an external resource with axios, and I have tried setting rejectUnauthorized: false and minVersion: "TLSv1" as suggested here and here:
const https = require('https')
const axios = require('axios')

const axiosOptions = {
  httpsAgent: new https.Agent({
    rejectUnauthorized: false,
    minVersion: "TLSv1",
  })
};
const axiosInstance = axios.create(axiosOptions);
const response = await axiosInstance.get('https://www.some-domain.com/some-article.html');
This is not a problem with the test itself, as it runs fine on my PC; I suppose it is something with the TLS setup of the GitLab runner.
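One workaround I am considering, though I have not verified it on the runner (a sketch only): lowering OpenSSL's cipher security level for this agent, on the assumption that the bundled OpenSSL in the node:16 image is rejecting the server's small DH key:

const https = require('https')
const axios = require('axios')

// Sketch: relax the cipher security level for this one agent so the handshake
// tolerates the remote server's small DH key. The exact level needed depends on
// how small the server's key actually is (it may even need @SECLEVEL=0).
const axiosInstance = axios.create({
  httpsAgent: new https.Agent({
    ciphers: 'DEFAULT:@SECLEVEL=1',
  }),
});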
Thanks

GitHub Action: write to a repo in Node with @actions/core or @actions/github?

Learning GitHub Actions, I'm finally able to call an action from a secondary repo. Example:
org/action-playground
.github/workflows/test.yml
name: Test Write Action
on:
  push:
    branches: [main]
jobs:
  test_node_works:
    runs-on: ubuntu-latest
    name: Test if Node works
    strategy:
      matrix:
        node-version: [12.x]
    steps:
      - uses: actions/checkout@v2
        with:
          repository: org/write-file-action
          ref: main
          token: ${{ secrets.ACTION_TOKEN }} # stored in GitHub secrets created from profile settings
          args: 'TESTING'
      - name: action step
        uses: ./ # Uses an action in the root directory
        id: foo
        with:
          who-to-greet: 'Darth Vader'
      - name: output time
        run: |
          echo "The details are ${{ steps.foo.outputs.repo }}"
          echo "The time was ${{ steps.foo.outputs.time }}"
          echo "time: ${{ steps.foo.outputs.time }}" >> ./foo.md
        shell: bash
and the action is a success.
org/write-file-action
action.yml:
## https://docs.github.com/en/actions/creating-actions/metadata-syntax-for-github-actions
name: 'Write File Action'
description: 'workflow testing'
inputs:
  who-to-greet: # id of input
    description: 'Who to greet'
    required: true
    default: './'
outputs:
  time: # id of output
    description: 'The time we greeted you'
  repo:
    description: 'user and repo'
runs:
  using: 'node12'
  main: 'dist/index.js'
branding:
  color: 'green'
  icon: 'truck' ## https://docs.github.com/en/actions/creating-actions/metadata-syntax-for-github-actions#brandingicon
index.js that is built to dist/index.js
const fs = require('fs')
const core = require('@actions/core')
const github = require('@actions/github')

try {
  // `who-to-greet` input defined in action metadata file
  const nameToGreet = core.getInput('who-to-greet')
  console.log(`Hello ${nameToGreet}!`)
  const time = new Date().toTimeString()
  core.setOutput('time', time)
  const repo = github.context.payload.repository.full_name
  console.log(`full name: ${repo}!`)
  core.setOutput('repo', repo)
  // Get the JSON webhook payload for the event that triggered the workflow
  const payload = JSON.stringify(github.context.payload, undefined, 2)
  console.log(`The event payload: ${payload}`)
  fs.writeFileSync('payload.json', payload) // Writes to the runner workspace only; doesn't write to the repo
} catch (error) {
  core.setFailed(error.message)
}
package.json:
{
  "name": "wite-file-action",
  "version": "1.0.0",
  "description": "workflow testing",
  "main": "dist/index.js",
  "scripts": {
    "build": "ncc build ./index.js"
  },
  "dependencies": {
    "@actions/core": "^1.4.0",
    "@actions/github": "^5.0.0"
  },
  "devDependencies": {
    "@vercel/ncc": "^0.28.6",
    "prettier": "^2.3.2"
  }
}
but with the current workflow nothing is created in action-playground. The only way I've been able to write to the repo is from a module using the API via github-api, with something like:
const GitHub = require('github-api')

const gh = new GitHub({
  token: config.app.git_token,
}, githubUrl)

const repo = gh.getRepo(config.app.repoOwner, config.app.repoName)
const branch = config.app.repoBranch
const path = 'README.md'
const content = '#Foo Bar\nthis is foo bar'
const message = 'add foo bar to the readme'
const options = {}

repo.writeFile(
  branch,
  path,
  content,
  message,
  options
).then((r) => {
  console.log(r)
})
and passing in the repo, org, or user from github.context.payload. My end goal is to eventually check whether README.md exists, overwrite it if it does, and write a badge to it dynamically:
`![${github.context.payload.workflow}](https://github.com/${github.context.payload.user}/${github.context.payload.repo}/actions/workflows/${github.context.payload.workflow}.yml/badge.svg?branch=main)`
A second goal is to create a markdown file (like foo.md or payload.json), but I can't run an echo command from the action to write to the repo, which I get is Bash and not Node.
Is there a way, without using the API, to write from Node to the repo that is calling the action? Or is this only available in Bash when using run:
- name: output
  shell: bash
  run: |
    echo "time: ${{ steps.foo.outputs.time }}" >> ./time.md
If so, how do I do it?
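One idea I have not verified: if the workflow checks out the target repo with actions/checkout, the Node action could presumably write the file into that checkout and then shell out to git to commit and push it, for example via @actions/exec (a sketch only; it assumes @actions/exec is added as a dependency and the workflow token has push rights):

// Sketch (unverified): write the file into the checked-out workspace,
// then commit and push it with git via @actions/exec.
const fs = require('fs')
const exec = require('@actions/exec')

async function commitFile(filePath, contents, message) {
  fs.writeFileSync(filePath, contents)
  await exec.exec('git', ['config', 'user.name', 'github-actions'])
  await exec.exec('git', ['config', 'user.email', 'github-actions@github.com'])
  await exec.exec('git', ['add', filePath])
  await exec.exec('git', ['commit', '-m', message])
  await exec.exec('git', ['push'])
}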
Research:
Passing variable argument to .ps1 script not working from Github Actions
How to pass variable between two successive GitHub Actions jobs?
GitHub Action: Pass Environment Variable to into Action using PowerShell
How to create outputs on GitHub actions from bash scripts?
Self-updating GitHub Profile README with JavaScript
Workflow syntax for GitHub Actions

Parameterisation of a Groovy pipeline for use with Jenkins

I have a Groovy pipeline which I've inherited from a project that I forked.
I wish to pass in Jenkins Choice Parameters as a parameterised build. At present I only want to expose the environment in which to run (though I will want to parameterise further at a later stage), so that a user can choose it from the Jenkins dropdown and run the job on demand.
I used the snippet generator to help.
Can someone please help with the syntax? I am using Node with a package.json to run a script, and I want to pass in either dev or uat:
properties([[$class: 'BuildConfigProjectProperty', name: '', namespace: '', resourceVersion: '', uid: ''], parameters([choice(choices: 'e2e\nuat', description: 'environment ', name: 'env')])])

node('de-no1') {
    try {
        stage('DEV: Initialise') {
            git url: 'https://myrepo.org/mycompany/create-new-user.git', branch: 'master', credentialsId: CREDENTIALS_ID
        }
        stage('DEV: Install Dependencies') {
            sh 'npm install'
        }
        stage('${params.env}: Generate new users') {
            sh 'npm run generate:${params.env}'
            archiveArtifacts artifacts: '{params.env}-userids.txt', fingerprint: true
        }
This currently fails with:
npm ERR! missing script: generate:{params.env}
I assume you want to replace ${params.env} with a value when you call npm?
If that's the case, you need to use double quotes (") so Groovy knows you will be doing string templating, i.e.:
sh "npm run generate:${params.env}"

Serverless variable from external file nested property

I have a serverless.yml and a config file.
config file
app: {
  port: 3000,
  db: {
    connectionString: 'xxxxx'
  },
  lambdaDeploy: {
    stage: "DEV",
    region: "es-west-1"
  }
}
I'm trying to use these variables in the yml like below:
yml
provider:
  name: aws
  runtime: nodejs6.10
  stage: ${file(./appconfiguration.json).app.stage}
  region: ${file(./appconfiguration.json).app.region}
But it's just reading and taking the defaults instead.
Please advise.
Thanks
The syntax used here is not correct.
stage: ${file(./appconfiguration.json).app.stage}
Use a colon instead:
stage: ${file(./appconfiguration.json):app.stage}
More details here: https://www.serverless.com/framework/docs/providers/aws/guide/variables/#reference-variables-in-other-files
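Also note that in the config file shown above, stage and region sit under app.lambdaDeploy rather than directly under app, so the references would presumably need the full path:
stage: ${file(./appconfiguration.json):app.lambdaDeploy.stage}
region: ${file(./appconfiguration.json):app.lambdaDeploy.region}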

LambdaFunction - Value of property Variables must be an object with String (or simple type) properties

I am using Serverless to deploy my Lambda-based application. It was deploying just fine, and then it stopped for some reason. I pared down the entire package to the serverless.yml below and one function in the handler, but I keep getting this error:
Serverless Error ---------------------------------------
An error occurred: TestLambdaFunction - Value of property Variables must be an object with String (or simple type) properties.
Stack Trace --------------------------------------------
Here is the serverless.yml
# serverless.yml
service: some-api

provider:
  name: aws
  runtime: nodejs6.10
  stage: prod
  region: us-east-1
  iamRoleStatements:
    $ref: ./user-policy.json
  environment:
    config:
      region: us-east-1

plugins:
  - serverless-local-dev-server
  - serverless-dynamodb-local
  - serverless-step-functions

package:
  exclude:
    - .gitignore
    - package.json
    - README.md
    - .git
    - ./**.test.js

functions:
  test:
    handler: handler.test
    events:
      - http: GET test

resources:
  Outputs:
    NewOutput:
      Description: Description for the output
      Value: Some output value
Test Lambda Function in Package
// handler.test
module.exports.test = (event, context, callback) => {
  callback(null, {
    statusCode: 200,
    body: JSON.stringify({
      message: 'sadfasd',
      input: event
    })
  })
}
Turns out, this issue does not have any relationship to the Lambda function. Here is the issue that caused the error.
This does NOT work:
environment:
  config:
    region: us-east-1
This DOES work:
environment:
  region: us-east-1
Simply put, I don't think you can have more than one level of nesting in your YAML environment variables.
Even if you run sls print as a sanity check, this issue will not show up. It only appears on sls deploy.
You have been warned, and hopefully saved!
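As a follow-on, once the environment block is flattened as above, the handler reads the value straight from process.env; if you still want the config grouping, fold it into the variable name (a sketch; CONFIG_REGION is just an illustrative name):
// Sketch: reading the flattened environment variable inside the handler.
// Assumes environment: { region: us-east-1 } as in the working example above;
// a grouped name would simply become something like CONFIG_REGION.
const region = process.env.region
console.log(`Deploying against region: ${region}`)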
Another thing that might cause this kind of error is invalid YAML syntax.
It's easy to get confused by this.
Valid syntax for environment variables:
environment:
  key: value
Invalid syntax for environment variables:
environment:
  - key: value
Notice the little dash in the invalid example above?
In YAML syntax, - denotes a list item, so that block is interpreted as an array rather than an object.
That's why the error says "Value of property Variables must be an object with String (or simple type) properties."
This can easily be fixed by removing the - in front of the keys.
