Display an MOTD in a GitLab pipeline

How do you display runtime information in GitLab pipelines?
I would like to include an MOTD at the beginning of each job to display information about the tool versions currently in use. This could help users troubleshoot their projects.
You could print the information in a before_script, but that breaks down when jobs use different container images.
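One option, sketched below, is a default before_script that prints whatever version information you care about. This is only an illustration: node and java are placeholder tools, and each command is guarded with || true so a job whose image lacks one of them does not fail.

default:
  before_script:
    - echo "=== MOTD ==="
    - head -n 1 /etc/os-release || true
    - node --version || true
    - java -version 2>&1 || true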

How Do I Query GitLab (issues, milestones, etc.) from within a GitLab Pipeline

Problem Summary
I am creating my first GitLab pipeline and setting it up to do a release. I would like to add a change log to the published release information (found under Deployments -> Releases in the GitLab UI).
From the pipeline, I would like to query the project for something like "closed issues since the date of the last release" and add those tickets to the published release.
Planned Approach
I am successfully using the "release-cli" tool from GitLab to create the release. I followed the example "Create release metadata in custom script" on this page.
I found that the description tag in the release element can be given a .md file, which seemed perfect for posting my change log. I made a test .md file, added it as the release description, and it looks great. It shows up right under "Evidence Collection", so I planned to use "Change Log" as the header in my .md file. In the example above, GitLab is clearly using an .md file in their description element for extra information like this.
Where I'm Stuck
I just can't figure out how to query the GitLab project from the pipeline (for issues or anything else).
I have been looking at GitLab's cli project, and it looks like it has all the tools I need, but I cannot find any examples of using it in a pipeline, nor can I confirm that it ships in a Docker image I can use the way I use the release-cli tool.
I am stuck. The lack of information makes me wonder if I'm going down a dead-end path. Has anybody done something similar who can give me a basic example? Or an alternative?
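One possible answer is to skip a dedicated CLI entirely and call the GitLab REST API with curl. The sketch below is only an outline of that idea: API_TOKEN is assumed to be a CI/CD variable holding a project access token with API scope, and SINCE is a placeholder for the date of the previous release; neither is a predefined GitLab variable.

stages:
  - build
  - release

prepare_changelog:
  stage: build
  image: alpine:latest
  variables:
    SINCE: "2024-01-01T00:00:00Z"   # placeholder: date of the previous release
  script:
    - apk add --no-cache curl jq
    # List issues closed since the previous release, formatted as Markdown bullets.
    - >
      curl --silent --header "PRIVATE-TOKEN: $API_TOKEN"
      "$CI_API_V4_URL/projects/$CI_PROJECT_ID/issues?state=closed&updated_after=$SINCE&per_page=100"
      | jq -r '"# Change Log", (.[] | "- \(.title) (#\(.iid))")'
      > changelog.md
  artifacts:
    paths:
      - changelog.md

release_job:
  stage: release
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  needs:
    - job: prepare_changelog
      artifacts: true
  rules:
    - if: $CI_COMMIT_TAG
  script:
    - echo "Creating release for $CI_COMMIT_TAG"
  release:
    tag_name: $CI_COMMIT_TAG
    description: changelog.md

CI_API_V4_URL, CI_PROJECT_ID, and CI_COMMIT_TAG are predefined GitLab CI variables, and the release description accepts a path to a file produced by an earlier job, which is how the generated changelog lands in the published release.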

Search inside GitLab job logs

Many times I have an image hash which I know was published by one of the pipelines in my GitLab project, and I would like to find out which pipeline created it.
Is there an easy way to grep the logs of all (or some) jobs/pipelines in GitLab?
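There is no built-in log search, but the REST API exposes job logs, so a local script can fetch and grep them. A rough sketch, where GITLAB_URL, PROJECT_ID, TOKEN, and HASH are all placeholders and only the first 100 jobs are scanned (real use would paginate):

#!/bin/sh
GITLAB_URL="https://gitlab.example.com"   # placeholder instance URL
PROJECT_ID=12345                          # placeholder project ID
TOKEN="glpat-..."                         # placeholder access token
HASH="sha256:abc123"                      # the image hash to look for

# List recent jobs, then grep each job's trace for the hash.
for job_id in $(curl -s --header "PRIVATE-TOKEN: $TOKEN" \
    "$GITLAB_URL/api/v4/projects/$PROJECT_ID/jobs?per_page=100" | jq '.[].id'); do
  if curl -s --header "PRIVATE-TOKEN: $TOKEN" \
      "$GITLAB_URL/api/v4/projects/$PROJECT_ID/jobs/$job_id/trace" | grep -q "$HASH"; then
    echo "Found in job $job_id"
  fi
done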

Host multiple test reports with GitLab Pages

We have a test suite in our GitLab pipeline which produces Allure test reports. To make these results available after the tests, we currently publish the results as artifacts and run an allure serve instance which makes them available over a subdomain based on the branch name.
We would like to host the test results with GitLab Pages for each branch. However, we can only ever host one version of a page through GitLab Pages at a time. This is a problem, since we want to host the test results for each branch, not only for the branch that ran last. It seems like this is currently not possible without hacks.
I also found this three-year-old GitLab issue about the topic, which indicates this is coming in some future version of GitLab.
Is there a better way to do this? Or is our best bet currently to wait until this becomes available in GitLab?
If you want to show a test report summary for each pipeline run, you can use GitLab's "Unit test reports" feature. It is slightly different from GitLab Pages, but it's easier to use, because you don't have to configure and host web pages yourself.
You only need to specify the paths to the XML files of the test results, something like this:
java:
  stage: test
  script:
    - gradle test
  artifacts:
    when: always
    reports:
      junit: build/test-results/test/**/TEST-*.xml
It will show the summary on GitLab's web page for the pipeline results.
Source and guidance:
https://docs.gitlab.com/ee/ci/testing/unit_test_reports.html
https://docs.gitlab.com/ee/ci/testing/unit_test_report_examples.html
https://docs.gitlab.com/ee/ci/pipelines/job_artifacts.html#browsing-artifacts
From the docs on browsing artifacts: "...and one HTML file that you can view directly online when GitLab Pages is enabled. Select artifacts in internal and private projects can only be previewed when GitLab Pages access control is enabled."
This sounds to me like you just need GitLab Pages enabled, and then you can browse HTML files from the job artifacts.
(Correct me if I'm wrong.)
From https://docs.gitlab.com/ee/user/project/pages/pages_access_control.html:
"You can enable Pages access control on your project if your administrator has enabled the access control feature on your GitLab instance. When enabled, only members of your project (at least Guest) can access your website"

Self-hosted Azure agent - how to configure pipelines to share the same build folder

We have a self-hosted build agent on an on-prem server.
We have a large codebase, and in the past we followed this mechanism with TFS 2013 build agents:
Daily check-ins were built to c:\work\tfs\ (taking about 5 minutes).
Each night a batch file would run the same build to those folders, using the same sources (they were already 'latest' from the CI build), build the installers, copy files to a network location, and send an email to the team detailing the build successes/failures (taking about 40 minutes).
The key thing there is that the nightly build had no need to get the latest sources, and the disk space required wouldn't grow much - just by the installer sizes.
To replicate this with Azure Devops, I created two pipelines.
One pipeline that does the CI using MSBuild tasks in the classic editor - works great.
Another pipeline in the classic editor that runs our existing PowerShell script, scheduled at 9pm - works great.
However, even though my agent doesn't support parallel builds, what's happening is that:
The CI pipeline's folder is c:\work\1\
The Nightly build folder is c:\work\2\
This doubles the amount of disk space we need (10 GB to 20 GB).
They are the same code files, just built differently.
I have struggled to find a way to tell the agent "please use the same sources folder for all pipelines".
What setting controls this? Otherwise we have to pay our service provider for extra storage.
Or do I need to convert my classic pipelines to YAML and somehow conditionally branch the build so it knows it's being run on a schedule and does something different?
Or maybe stop using a pipeline for the scheduled build, and use the Windows Task Scheduler as before?
(I did try looking for the same question - I'm sure I can't be the only one.)
There is a "workingDirectory" directive available for running scripts in a pipeline, as sketched below. This link has the details: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/command-line?view=azure-devops&tabs=yaml
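For illustration only (the c:\work\shared path is hypothetical, and note this only changes where the script runs, not where sources are checked out):

steps:
- task: CmdLine@2
  inputs:
    script: |
      echo Building from a fixed folder
      msbuild MySolution.sln /t:Build
    workingDirectory: 'c:\work\shared'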
The numbers '1', '2', ... '6' in the work folders c:\work\1\, c:\work\2\, ... c:\work\6\ on your build agent each stand for a particular pipeline.
Agent.BuildDirectory
The local path on the agent where all folders for a given build pipeline are created. This variable has the same value as Pipeline.Workspace. For example: /home/vsts/work/1
If you have two pipelines, there will also be two corresponding work folders. This is expected behavior: pipelines cannot be configured to share the same build folder. It is by design.
If you need to use less disk space to save cost, then I'm afraid that dropping the pipeline for the scheduled build and using the Windows Task Scheduler as before is the better way.

Configure GitLab to build the source code on another machine

We have two servers in our organisation:
1) A server with GitLab
2) A build server
I would like automated builds to happen on the second machine (the build server) for the source code hosted on the GitLab server.
How can I achieve this using GitLab?
Thanks,
siva
If you are moving from a "pull" continuous integration system (e.g. using a kind of crontab that regularly checks whether the source code on the versioning system has changed and starts the configure/build/test/deploy stages if it has), then know that GitLab has a much better way of doing this.
GitLab's approach is a "push" system: every time code is pushed (to any branch) to the Git repository, the script defined in your .gitlab-ci.yml is read to see whether continuous integration jobs have to be launched. Jobs are sent to your configured GitLab runners; runners are installed on your build server(s) and pick up jobs as they arrive.
What each job should do is also described in the .gitlab-ci.yml, as in the sketch below.
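A minimal .gitlab-ci.yml might look like this (make all is a placeholder for whatever your build actually runs); any runner registered on the build server with gitlab-runner register will pick the job up:

build:
  stage: build
  script:
    - make all   # placeholder build command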
Here is a list of documentation to start learning about GitLab CI:
The official documentation can be helpful.
A general introduction to GitLab CI using Docker can be found in this blog article (the first slides are great). If your build server or your intended build is on Linux, I would recommend using the "docker executor" (i.e. jobs are executed inside Docker containers on your build server). It is easy and quick to set up.
Hope this helps you get started...
