How to release built artifacts back-and-forth from one repo to another on GitLab?

I have a requirement to generate, archive, and reuse artifacts between two different repositories:
Repository A: compile Angular code and create an XLF file
Repository B: use the XLF file generated above and create a new XLF file
Repository A: use the newly generated XLF file to create the final output file
These activities should be done using gitlab-ci.yml, and I am not sure how to handle this with GitLab CI.
We can push the artifact from Repo A to Repo B. However, CI on Repo A should wait until Repo B pushes a new artifact back to Repo A to complete the process.

Ideally, you would not push a generated artifact to another Git source repository.
But a GitLab pipeline can retrieve an artifact produced by another pipeline from its URL.
To avoid the back and forth, I would rather have three jobs instead of two:
the first generates the XLF file
the second curls/fetches that file and uses it to generate the new XLF file
the third curls/fetches that file and completes the process
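A minimal sketch of what the middle job's fetch could look like, assuming an illustrative host, project ID, job name, and token variable (none of these appear in the original answer):

fetch-and-transform:
  stage: transform
  script:
    # download the XLF artifact produced by the first job's pipeline (URL parts are placeholders)
    - 'curl --header "PRIVATE-TOKEN: ${API_TOKEN}" --output messages.xlf "https://gitlab.example.com/api/v4/projects/<project-id>/jobs/artifacts/main/raw/messages.xlf?job=generate-xlf"'
    - ./transform.sh messages.xlf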

How to release built artifacts back-and-forth from one repo to another on GitLab?
Repository A:
Compile Angular code and create a XLF file
Send a hook to repository B that it just compiled:
just use trigger: (https://docs.gitlab.com/ee/ci/yaml/#trigger), which works like a charm and is even nicely visible in the GUI,
or the API (https://docs.gitlab.com/ee/ci/triggers/),
and pass variables: PARENT_PIPELINE_ID: $CI_PIPELINE_ID to repository B so it can download artifacts from that specific pipeline.
Repository B:
Use the 'XLF File' generated above
use needs: (https://docs.gitlab.com/ee/ci/yaml/#artifact-downloads-to-child-pipelines) to download artifacts,
or the API: have a personal access token from repository A (https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html) added to the environment variables, and use the API to download artifacts (https://docs.gitlab.com/ee/api/job_artifacts.html).
create a new XLF file
use trigger: or the API to trigger repository A,
but this time trigger a different .gitlab-ci.yml file, e.g. trigger: project: repositoryA, file: second_stage.gitlab-ci.yml (https://docs.gitlab.com/ee/ci/yaml/#trigger-child-pipeline-with-files-from-another-project),
or pass a variable like variables: SECOND_STAGE: "true" and use it to differentiate.
Repository A:
run pipeline from the file second_stage.gitlab-ci.yml
download artifacts from repository B (via needs: or the API)
use the newly generated XLF file to create the final output file
Overall, what you need is the rules: and needs: documentation. On older GitLab versions, this was done with the API.
CI on Repo A should wait until Repo B pushes a new artifact to Repo A to complete the process
Don't wait. Let the API trigger it.
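A minimal sketch of the first hand-off, combining a trigger: job in Repo A with a cross-project needs: in Repo B. The project paths, job names, and branch are illustrative, and cross-project needs: is only available on GitLab tiers that support it:

# Repo A .gitlab-ci.yml
stages: [build, handoff]

build-xlf:
  stage: build
  script:
    - ./compile.sh   # produces messages.xlf
  artifacts:
    paths:
      - messages.xlf

trigger-repo-b:
  stage: handoff
  variables:
    PARENT_PIPELINE_ID: $CI_PIPELINE_ID   # passed to the downstream pipeline
  trigger:
    project: group/repo-b
    branch: main

# Repo B .gitlab-ci.yml
build-new-xlf:
  needs:
    - project: group/repo-a
      job: build-xlf
      ref: main
      artifacts: true
  script:
    - ./transform.sh messages.xlf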

I tried the following approach and it worked fine, or at least I was able to proceed.
For some reason, variables: combined with curl did not work as expected, but I did not analyze the root cause.
Repo A - Pipeline
trigger-repob:   # trigger Project B of Repo B
  stage: repob
  trigger:
    project: repob-namespace/projectb
    branch: devops

test_job:
  image: $CI_REGISTRY/$CI_PROJECT_PATH/base-image:latest
  stage: test_pipeline
  when: delayed
  start_in: 2 minutes
  needs:   # use artifacts from Repo B/Project B
    - project: repob-namespace/projectb
      job: buildprojectb
      ref: devops
      artifacts: true
  script:
    - do something here
Repo B Pipeline
buildprojectb:
  image: php:7.4.11
  stage: build
  script:
    - do something here
  artifacts:
    paths:
      - outputs/*.xlf

Related

How to run a script from repo A in the pipeline of repo B in GitLab

I have two repositories in GitLab, repositories A and B let's say.
Repo A contains:
read_ci.yml
read_ci.sh
read_ci.yml contains:
stages:
  - initialise

create checksum from pipeline:
  stage: initialise
  script:
    - chmod +x read_ci.sh
    - source ./read_ci.sh
Repo B contains:
gitlab-ci.yml
gitlab-ci.yml contains:
include:
  - project: 'Project/project_name'
    file:
      - '.gitlab-ci.yml'
    ref: main
Obviously, this doesn't do what I intend.
What I want to achieve is to run the project A script in the project B pipeline.
The reason is that I want project A to be called from multiple different pipelines and run there.
I am looking for a GitLab alternative to this Azure Pipelines question: "Run script from resource repo".
Submodules would absolutely work, as Davide mentions, though it's kind of like using a sledgehammer to hang a picture. If all you want is a single script from the other repository, just download it into your container: use the v4 API with your CI_JOB_TOKEN to download the file, then simply run it with sh, as sketched below. If you have many files in your secondary repository and want access to them all, then use submodules as Davide mentions, and make sure your CI job retrieves them by setting the submodule strategy like this:
variables:
  GIT_SUBMODULE_STRATEGY: normal
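For the single-file case, a minimal sketch using the v4 repository files API. The host, project ID, and ref are placeholders, and the job token must be allowed to access the other project:

fetch-script:
  script:
    # download read_ci.sh from project A's repository, then run it
    - 'curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --output read_ci.sh "https://gitlab.example.com/api/v4/projects/<project-a-id>/repository/files/read_ci.sh/raw?ref=main"'
    - sh read_ci.sh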
If you want to run the project A script in the project B pipeline, you can add repository A as a git submodule in B:
git submodule add -b <branch-A> <git-repository-A> <target-dir>
You also need to add the variable GIT_SUBMODULE_STRATEGY: recursive to the CI job.
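A minimal sketch of the consuming job in repo B, assuming the submodule was added under repo-a (a hypothetical target dir):

variables:
  GIT_SUBMODULE_STRATEGY: recursive

run-shared-script:
  script:
    # the submodule checkout makes repo A's files available in the job workspace
    - sh repo-a/read_ci.sh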

"Already up to date" GitLab pipeline

I am trying out a GitLab pipeline. I made some changes and pushed the code to the master branch.
The pipeline says "already up to date", but I have changes in the code.
I tried to pull in three phases, but the issue remains.
.gitlab-ci.yml
before_script:
  - echo "Before script"

building:
  stage: build
  script:
    - git pull origin master

testing:
  stage: test
  script:
    - git pull origin master

deploying:
  stage: deploy
  script:
    - git pull origin master
If the gitlab-ci workflow starts by cloning your repository, no amount of git pull will change the fact that you already have the full history and, at the time of the workflow, it is "already up to date".
In other words, a git pull is not needed in your gitlab-ci.yml file.
If your pipeline is running on the same repo that you changed, there is no need for git pull. However, if your pipeline on repo A triggers another pipeline on another repository (repo B), then to access files from repo A you have to pull repo A in repo B's pipeline, as sketched below.
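A minimal sketch of that cross-repository case, cloning repo A inside a repo B job. The host and project path are placeholders, and the job token must be permitted to access repo A:

fetch-repo-a:
  script:
    # clone repo A using the job token of the current (repo B) pipeline
    - git clone "https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.example.com/group/repo-a.git"
    - ls repo-a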

How to copy files from a project's artifacts to another project using CI/CD

I have a CI/CD pipeline that builds an artifact and saves it in the public folder.
I need to copy some files from these artifacts to another project.
What script can I use, or what is the best way to do it?
Assuming you want to copy a file into another GitLab project:
once an artifact is built in a job in one stage (e.g. build), it is available to all jobs of the next stage (e.g. deploy).
build-artifact:
  stage: build
  script:
    - echo "build artifact"
  artifacts:
    name: "public"
    paths:
      - "public/"

deploy_artifact:
  stage: deploy
  script:
    - cd public/   # here are your artifact files
    - ls -l
    - cd ..
    # now implement your copy strategy,
    # for example into a git repo:
    - git clone https://myusername:mypassword@myrepo.gitlab.com/mygroup/another_project.git
    - cd another_project
    - cp ../public/source_file ./target_file
    - git add target_file
    - git commit -m "added generated file"
    - git push
If your destination is not a git repository, you can simply replace this with an scp directly to a server, or any other copy strategy, directly in the script.
Another way to do it is to get the last artifact of project A, inside the ci of Project B.
Please see https://docs.gitlab.com/ee/ci/pipelines/job_artifacts.html#retrieve-job-artifacts-for-other-projects
When you trigger downstream pipelines, just use needs: on the job that pushes the artifacts, and they will be passed to the downstream pipeline.
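A minimal sketch of the API-based retrieval mentioned above, run inside project B's CI. The project ID, job name, and token variable are all illustrative:

copy-from-project-a:
  stage: build
  script:
    # download the zipped artifacts of project A's latest successful build-artifact job on main
    - 'curl --location --header "PRIVATE-TOKEN: ${PROJECT_A_TOKEN}" --output artifacts.zip "https://gitlab.example.com/api/v4/projects/<project-a-id>/jobs/artifacts/main/download?job=build-artifact"'
    - unzip artifacts.zip   # the public/ folder from project A is now available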

Azure DevOps Common Branch Policy Build Pipeline for all Repositories

We are setting up policies for our org. One need we have is a build with Sonarcloud annotation at every pull request. Is there a way to create a common CI build pipeline which runs at every PR, checks out the respective repo, detects the type of project (or reads a manifest file or so), does the code analysis and build, and annotates the PR?
In Azure DevOps Repos you can create a common branch policy for all the master branches. I tried to use a standalone YAML pipeline, but it never started when I created the PR. Can someone help me onto the right track? Do I need to create a resource in the YAML? Is there any variable I can use from the PR to detect the repo and the branches?
Just for everybody to understand, you can create common branch policies and individuals.
Thanks a lot
You will need to add all the repositories in the resources section in the yaml because of the known issue reported in this thread.
resources:
  repositories:
    - repository: MyRepo
      type: git
      name: MyRepo
However, if you use a classic UI pipeline in the build validation instead of the YAML pipeline, you do not need to add all the repositories in the resources section. But you need to stop the pipeline from syncing the source (go to the pipeline edit page --> click Get sources --> check Don't sync sources).
You can use the predefined variables to get the information about the pull request. See below:
$(Build.Repository.Name)
$(System.PullRequest.PullRequestId)
Then you can run git commands in a script task to check out the pull request branch. See the example below:
resources:
  repositories:
    - repository: MyRepo
      type: git
      name: MyRepo

pool:
  vmImage: windows-latest

steps:
  - checkout: none
  - powershell: |
      git clone "https://$(System.AccessToken)@dev.azure.com/OrganizationName/$(System.TeamProject)/_git/$(Build.Repository.Name)"
      cd "$(Build.Repository.Name)"
      git fetch origin pull/$(System.PullRequest.PullRequestId)/merge:pr-$(System.PullRequest.PullRequestId)
      # check out the PR merge branch
      git checkout pr-$(System.PullRequest.PullRequestId)
Note: you need to grant the build service account read permission on the target repository.

Is there a way to upload GitLab CI artifacts to an Openshift container?

I have a GitLab CI pipeline which builds a few artifacts. For example:
train:job:
  stage: train
  script: python script.py
  artifacts:
    paths:
      - artifact.csv
    expire_in: 1 week
Now I deploy the repository to OpenShift using the following step in my GitLab pipeline. This pulls my GitLab repo into OpenShift, but it does not include the artifacts generated above.
deploy:app:
  stage: deploy
  image: ayufan/openshift-cli
  before_script:
    - oc login $OPENSHIFT_DOMAIN --token=$OPENSHIFT_TOKEN
  script:
    - oc start-build my_app
How can I let OpenShift use this repository, plus the artifacts created in my pipeline?
In general OpenShift build pipelines rely on the s2i build process to build applications.
The best practice for reusing artifacts between s2i builds would either be through using incremental builds or chaining multiple BuildConfig definitions (the output image of one BuildConfig being fed as source image into another BuildConfig) together via the spec.source.images or spec.source.git configuration in the BuildConfig definition.
In your case, since you are using a GitLab pipeline to generate your artifacts instead of the OpenShift build process, you really only need to combine your artifacts with your source code and the runtime container image.
To do this you might create a builder container image that pulls those artifacts down from an external source during the assemble phase (via curl, wget, etc) of the s2i workflow. You could then configure your BuildConfig to point at your source repository. At build time the BuildConfig will pull down your source code and the assemble script will pull down your artifacts.
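As a rough illustration of that idea, a custom .s2i/bin/assemble script in the source repo could fetch the artifact before delegating to the builder image's default assemble. The host, project ID, token variable, and artifact path below are all assumptions:

#!/bin/bash
set -e
# .s2i/bin/assemble: pull the CI artifact, then run the image's default assemble
curl --fail --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  --output artifact.csv \
  "https://gitlab.example.com/api/v4/projects/<project-id>/jobs/artifacts/main/raw/artifact.csv?job=train:job"
exec /usr/libexec/s2i/assemble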