Want a Jenkins pipeline script to create a Docker container with a test database, test against it, then destroy the container - node.js

I've created a git repo for application (A) that contains a Dockerfile and docker-compose.yml that stands up a postgres database and creates and populates some tables. I use this as a support app for testing purposes during development as a disposable database.
I'd like to use this docker app in a Jenkins pipeline for testing my main application (B), which is a NodeJS app that reads and writes to the database. Application B is also in git and I want to use a Jenkins pipeline to run its tests (written in Mocha). So my overall pipeline logic would be something like this:
Triggering Event:
Code for application B is pushed to some branch (feature or master) to git.
Pipeline:
git checkout code for Application B (implicit)
git checkout code for Application A (explicitly)
cd to Application A directory:
docker-compose up -d // start postgres container
cd to Application B directory:
npm install
npm run test (kicks off my Mocha tests that expect postgres db with localhost:5432 url)
cd to Application A directory
docker-compose down // destroy postgres container
// if tests pass, deploy application B
I'm trying to figure out the best way to structure this. I'm really checking out code from two repos: the one I want to test and build, and another repo that contains a "support" application for testing, essentially mocking my real database.
Would I use a scripted or declarative pipeline?
The pipeline operates in a workspace directory for application B that is implicitly checked out when the pipeline is triggered. Do I just checkout the code for Application A within this workspace and run docker commands on it?
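For illustration, the steps above could be expressed as a declarative pipeline. A minimal Jenkinsfile sketch, in which the repo URL, branch, and the app-a checkout directory are placeholders:

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout support app') {
            steps {
                // Application B is checked out implicitly; fetch A into a subdirectory
                dir('app-a') {
                    git url: 'https://example.com/your-org/app-a.git', branch: 'master'
                }
            }
        }
        stage('Start test database') {
            steps {
                dir('app-a') {
                    sh 'docker-compose up -d'
                }
            }
        }
        stage('Test') {
            steps {
                sh 'npm install'
                sh 'npm run test'
            }
        }
    }
    post {
        always {
            // Tear the database down whether or not the tests passed
            dir('app-a') {
                sh 'docker-compose down'
            }
        }
    }
}
```

The `post { always { ... } }` block is the usual place for the teardown, since it runs even when the test stage fails.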

Related

How would you set up a default Dockerfile per project with an Azure pipeline to handle them?

As of now I have a simple solution structure (WebAPI projects, which are going to be microservices) with default generated Dockerfiles for each project in the solution, like:
Solution X
| Project A
| | Dockerfile
| Project B
| | Dockerfile
| Project C
| | Dockerfile
| Project D
| | Dockerfile
| azure-pipeline.yml
From the development and debugging point of view everything works (through "Docker" as the launcher), but after creating the first pipeline for "Project A" with the Azure wizard, my build always fails at a COPY instruction during the build step:
COPY ["Project A/ProjectA.csproj", "Project A/"]
With the error from the pipeline run as:
COPY failed: stat /var/lib/docker/tmp/docker-builder196561826/Project A/ProjectA.csproj: no such file or directory
##[error]COPY failed: stat /var/lib/docker/tmp/docker-builder196561826/Project A/ProjectA.csproj: no such file or directory
I'm not an expert in Docker or Azure, but I guess I'm setting up this solution the wrong way to accomplish such a thing.
What would be a better setup or fix?
no such file or directory
This is a very common error people encounter after migrating a Docker project from Visual Studio to Azure DevOps, even when the Docker build succeeds locally.
It is caused by the different working logic of Visual Studio (local) and Azure DevOps. Locally, Docker runs at the repo/solution level. But in Azure DevOps CI, Docker runs in the directory where the Dockerfile lives, which is the project level. The relative paths that work fine locally therefore no longer resolve in Azure DevOps.
I guess you may not want to make changes to your Dockerfile. In that case you just need to specify the build context in the Docker task:
Specify $(Build.Repository.LocalPath) as the Build context argument of the Docker 2.* task.
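With the Docker@2 task, that looks roughly like the following sketch; the repository name, Dockerfile path, and tag are placeholders:

```yaml
- task: Docker@2
  inputs:
    command: build
    repository: projecta                           # placeholder image name
    dockerfile: 'Project A/Dockerfile'
    buildContext: '$(Build.Repository.LocalPath)'  # build from the repo root
    tags: latest
```

With the context set to the repo root, the `COPY ["Project A/ProjectA.csproj", ...]` instruction resolves the same way it does locally.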
Check my previous answer.

Is it possible to do a post-build step, or have multiple build steps, with webpack?

I have a webpack frontend application with a config file (config/default.json). I read it with config-webpack. The config file should change between environments (staging/production/etc.).
I have a CI pipeline that builds my JS code, minifies it, and packages it in an nginx Docker image to be published to a Docker registry (immutable | agnostic of my config file).
My Docker container must be initialized with an env key holding the execution value:
ENVIRONMENT=staging|production|stress
Inside the container I have a connection to a secrets store and retrieve config data for the execution ENVIRONMENT.
Doing the whole build on container start is expensive.
I need to post-process my build to attach config/default.json to the webpack bundle, without rebuilding my whole project.
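To make the desired flow concrete, here is a hypothetical entrypoint sketch that renders config/default.json at container start instead of rebuilding the bundle; the URL scheme and the secrets lookup are placeholders, not part of the original setup:

```shell
#!/bin/sh
# Hypothetical entrypoint sketch: generate config/default.json when the
# container starts, keyed on ENVIRONMENT, so the image stays immutable.
set -eu

# Default to staging so the sketch runs standalone; in the real container
# the variable is set with `docker run -e ENVIRONMENT=...`.
ENVIRONMENT="${ENVIRONMENT:-staging}"

mkdir -p config

# In the real container this value would come from the secrets store;
# a derived placeholder keeps the sketch self-contained.
API_URL="https://api.${ENVIRONMENT}.example.com"

cat > config/default.json <<EOF
{
  "environment": "${ENVIRONMENT}",
  "apiUrl": "${API_URL}"
}
EOF

# ...then hand off to the real process, e.g.: exec nginx -g 'daemon off;'
```

This only works if the bundle reads the config at runtime (e.g. fetches it) rather than having it inlined at build time, which is what the question is really about.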

How to create a MySQL database in a GitLab CI/CD YAML file?

I am new to GitLab CI/CD. I have created a simple YAML script to build and test my PHP application. I am using the shared runner option available in GitLab CI.
I have specified the database name "MYSQL_DATABASE" and it doesn't seem to have any effect.
How do I specify that? Is there any other way to create a database in the YAML file? When I put create database in the script, the build fails with
"/bin/bash: line 78: create: command not found".
It is hard to help without knowing more about your configuration. As user JGC already stated, the main error cause seems to be that you are trying to run create database as bash command.
If you want to create a MySQL database directly on a Linux command-line, you can use
mysql -uroot -ppassword -e "CREATE DATABASE `database-name`;"
However, with GitLab CI you should try to use one of the solutions described at https://docs.gitlab.com/ee/ci/services/mysql.html
With the Docker executor (e.g. with the SaaS version via gitlab.com) you can just use the following in your .gitlab-ci.yml:
services:
  - mysql:latest

variables:
  MYSQL_DATABASE: database-name
  MYSQL_ROOT_PASSWORD: mysql_strong_password
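The job can then reach the service container at host mysql. A minimal sketch, where the job name is illustrative and the job image is assumed to contain a MySQL client:

```yaml
test:
  script:
    - mysql --host=mysql --user=root --password="$MYSQL_ROOT_PASSWORD" -e "SHOW DATABASES;"
```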

GitLab CI/CD pull code from repository before building ASP.NET Core

I have GitLab running on computer A, development environment (Visual studio Pro) on computer B and Windows Server on computer C.
I set up GitLab-Runner on computer C (Windows server). I also set up .gitlab-ci.yml file to perform build and run tests for ASP.NET Core application on every commit.
I don't know how to get the code onto computer C (Windows Server) so I can build it (dotnet msbuild /p:Configuration=Release "%SOLUTION%"). It bothers me that not a single example .gitlab-ci.yml I found on the net pulls code from GitLab before building the application. Why?
Is this correct way to set-up CI/CD:
User creates a pull request (a new branch is created)
User writes code
User commits code to the branch from computer B.
GitLab runner is started on computer C.
It needs to pull code from the current branch (CI_COMMIT_REF_NAME)
Build, test, deploy ...
Should I use a common git command to get the code, or is this something the GitLab runner already does? Where is the code?
Why does no one pull code from GitLab in .gitlab-ci.yml?
Edited:
I get the error
'"git"' is not recognized as an internal or external command
The solution in my case was to restart GitLab-Runner. Source.
@MilanVidakovic explained that the source is automatically downloaded (which I didn't know).
I just have one remaining problem of how to get correct path to my .sln file.
Here is my complete .gitlab-ci.yml file:
variables:
  SOLUTION: missing_path_to_solution #TODO

before_script:
  - dotnet restore

stages:
  - build

build:
  stage: build
  script:
    - echo "Building %CI_COMMIT_REF_NAME% branch."
    - dotnet msbuild /p:Configuration=Release "%SOLUTION%"
  except:
    - tags
I need to set the correct value for SOLUTION. My directory (where GitLab-Runner is located) currently holds these folders/files:
- config.toml
- gitlab-runner.exe
- builds/
  - 7cab42e4/
    - 0/
      - web/ # I think this is the project group in GitLab
        - test/ # I think this is the project name in GitLab
          - .sln
          - AND ALL OTHER PROJECT FILES # based on a first look
- testm.tmp
So, what are 7cab42e4 and 0? Or better, how do I get the correct path to my project structure? Is there a predefined variable?
Edited2:
The answer is CI_PROJECT_DIR.
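A sketch of how that variable could feed the SOLUTION value above; the subfolder path and solution file name are placeholders:

```yaml
variables:
  SOLUTION: "$CI_PROJECT_DIR/web/test/YourSolution.sln"  # path and name are placeholders
```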
I'm not sure I follow completely.
On every commit, the GitLab runner fetches your repository to C:\gitlab-runner\builds.. on the local machine (computer C) and builds/deploys or does whatever you've provided as an action for the stage.
Also, I don't see the need for building the source code again. If you're using computer C for both the runner and tests/acceptance, just let the runner do the building and add an artifacts item in your .gitlab-ci.yml. The path defined in artifacts will retain your executables on computer C, which you can then use for whatever purpose.
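Such an artifacts item might look roughly like this; the path pattern is an assumption about where the Release binaries land:

```yaml
build:
  stage: build
  script:
    - dotnet msbuild /p:Configuration=Release "%SOLUTION%"
  artifacts:
    paths:
      - "**/bin/Release/"
```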
Hope it helps.
Edit after comment:
When you push to the repository, GitLab CI/CD automatically checks your root folder for a .gitlab-ci.yml file. If it's there, the runner takes over, parses the file and starts executing jobs/stages.
As long as the file itself is valid and contains proper jobs and stages, the runner fetches the latest commit (automatically) and does whatever the script item tells it to do.
To verify that everything works correctly, go to your GitLab -> CI / CD -> Pipelines and check what's going on.
Maybe it would be best if you posted your .yml file; there could be a number of reasons your runner is not picking up the code. For instance, maybe your .yml tags do not match what the runner is set up to pick up, etc.

How to read TAP report from another server in Jenkins?

I've set up Jenkins to run unit tests on NodeJS and deploy to another server if the test coverage matches my condition.
I use AWS with 2 instances to host Jenkins and the apps. Below are the steps I follow:
Set up Jenkins on instance 1.
Launch Jenkins and configure the build step.
At the build step, I ssh to instance 2.
cd to the src folder on instance 2 and git pull my repository.
Run the unit tests using Istanbul and export the results to test.TAP; test.TAP now lives on instance 2.
Back in Jenkins on instance 1, I configure "Publish TAP result" in Post-build Actions.
<-- My concern here is: how can I get the test.tap file from instance 2 so Jenkins can read the report and display it?
Please help me.
Thank you.
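One common approach (a sketch, not the asker's confirmed setup) is to copy the report back into the Jenkins workspace at the end of the build step, so the TAP plugin can find it locally; the host name and paths are placeholders:

```shell
# In the Jenkins build step, after running the tests on instance 2:
scp user@instance-2:/home/user/app/test.tap "$WORKSPACE/test.tap"
# Then point "Publish TAP result" at test.tap in the workspace.
```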
