I'd like to know if it is possible to change the CI/CD variables assigned in the repo settings via a request. I know I can already pass env variables in a curl request that triggers a pipeline.
But now I have another situation: an automatic build runs and a Docker image is pushed to a private registry whenever the master or dev branch receives a merge. To do this, I use the CI/CD variables that are already set.
And often I will create a new registry, which is done automatically in a bash script. After that, what I'd like is to set up the new access keys in the GitLab repo via a request or some other automatic way.
If someone could help me with some idea, thanks in advance.
If I am not mistaken, this is what you are looking for:
Project-level Variables API
https://docs.gitlab.com/ee/api/project_level_variables.html
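For example, the registry-creation script could create or update a project variable with a single curl call. A rough sketch; the token, project ID and the REGISTRY_PASSWORD name are placeholders you'd replace with your own values:

  # create a new project-level variable
  curl --request POST --header "PRIVATE-TOKEN: <your_access_token>" \
       "https://gitlab.example.com/api/v4/projects/<project_id>/variables" \
       --form "key=REGISTRY_PASSWORD" --form "value=<new secret>"

  # update it later, e.g. after rotating the registry credentials
  curl --request PUT --header "PRIVATE-TOKEN: <your_access_token>" \
       "https://gitlab.example.com/api/v4/projects/<project_id>/variables/REGISTRY_PASSWORD" \
       --form "value=<rotated secret>"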
Related
I would like some help on how I could create environment variables to be used in README.md within a locally hosted GitLab instance.
I need two variables to be replaced in README.md when someone accesses the web interface, variables that define the name of the repo and the name of the branch.
Any idea is welcome.
Thanks!
I don't think you can use these variables inside a GitLab readme. There is a feature request for this but it isn't implemented yet.
A way around this is to use the predefined variables that are present in GitLab. However, these variables are accessible to GitLab's CI pipelines, and not to any readme files. But perhaps you can find a solution in this answer. It suggests that you keep a placeholder in your readme file, then have a job run that switches out the readme's placeholder with the required value using the sed command. Since the job will have access to both the readme file and the variables, this should work.
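A rough sketch of such a job, assuming the readme carries {{REPO_NAME}} and {{BRANCH_NAME}} placeholders (those names are made up here; CI_PROJECT_NAME and CI_COMMIT_BRANCH are predefined GitLab variables):

  update-readme:
    script:
      # replace both placeholders in one sed pass
      - sed -i "s/{{REPO_NAME}}/$CI_PROJECT_NAME/g; s/{{BRANCH_NAME}}/$CI_COMMIT_BRANCH/g" README.md

Keep in mind the substituted file only exists inside the job, so it still has to be committed back or published somewhere for people browsing the web interface to see it.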
I'm using Gitlab CI/CD to deploy my .netcore application to AWS Beanstalk.
How can I update the appSettings.json values in my .netcore application when deploying to different environments using variables defined in my CI/CD pipeline settings?
Azure DevOps has a JSON variable substitution feature which I really liked.
GitHub Actions can also hook into this feature
How can I achieve this with Gitlab CI/CD?
I want to use this method of deployment because
I won't have to store sensitive production config values in my repository. Values will be supplied by masked variables set up in the CI/CD pipeline settings.
I don't have to rebuild my artefacts every time I deploy to a new environment.
If this can't be done in GitLab, what's the recommended best practice?
thanks
I do something similar with GitLab, and the solution was to build a shell step that replaces some strings with variable values before the deploy job starts.
something like this
script:
- sed -i "s/STRING_TO_REPLACE/$GITLAB_VARIABLE/g" file.json
Note the double quotes around the sed expression so the variable actually gets expanded, and pay attention to escaping JSON quotes correctly to make this work.
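For example, if appSettings.json carries a placeholder (the __DB_CONNECTION__ placeholder and the DB_CONNECTION variable are just examples), the deploy job can rewrite it before handing the files to the Beanstalk deploy step; using | as the sed delimiter avoids clashes with slashes inside the value:

  deploy:
    script:
      # swap the placeholder for the masked CI/CD variable
      - sed -i "s|__DB_CONNECTION__|$DB_CONNECTION|g" appSettings.json
      # ...then continue with your usual Beanstalk deployment commands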
I use the same workaround. I filed an issue about this months ago:
https://gitlab.com/gitlab-org/gitlab-foss/-/issues/78486
and I recently talked about this at the gitlab forum:
https://forum.gitlab.com/t/use-variables-per-branch-in-gitlab/43685
Unfortunately there does not seem to be a nice/clean solution at the moment.
So I use this workaround:
1: use environment scopes for variables so the same variable can have different values (for test/prod environments)
2: define the environments and branches on the jobs in gitlab-ci.yml (see the sketch after this list)
3: create a bunch of sed one-liners to do the search/replace.
Call it ugly, call it low-level, but I searched for a nice/clean/GitLab-native solution and did not find one.
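A minimal sketch of steps 1 and 2 (the DEPLOY_URL variable, deploy.sh and the job/branch names are only examples): DEPLOY_URL is defined twice in the CI/CD settings, once scoped to the test environment and once to production, and each job declares the matching environment so it receives the right value:

  deploy-test:
    environment:
      name: test
    rules:
      - if: '$CI_COMMIT_BRANCH == "dev"'
    script:
      - ./deploy.sh "$DEPLOY_URL"

  deploy-prod:
    environment:
      name: production
    rules:
      - if: '$CI_COMMIT_BRANCH == "master"'
    script:
      - ./deploy.sh "$DEPLOY_URL"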
I have set up a Git project + CI (using GitLab Runner) on GitLab v12.3.5. I have a question about issues and pipelines. Let's say I create an issue and assign it to myself. This creates a branch/merge request. Then I open up the Web IDE to modify some files in an attempt to fix the issue. Now I want to see whether my changes will fix the issue. In order to run the pipeline, is it necessary to commit the changes into the branch, or is there some other way?
The scenario I have is that it may take me 20 times to fix the files to make the pipeline 'clean'. In that case, I would have to keep committing on each change to see the results. What is the preferred way to accomplish this? Is it possible to run the pipeline by just staging the changes to see if they work?
I am setting up the .gitlab-ci.yml file, hence it is taking a lot of trials to get it working properly.
You should create a branch and push to that. Only pushed changes will trigger pipeline runs. After you're done, you can squash and merge the branch so that the repo's history will be clean.
Usually though, you won't have to do this because you'll have automated tests set up to check whether your code works. You should also try testing the Linux commands (or whichever commands you're running in your GitLab CI scripts) locally first. If you're worried about whether your .gitlab-ci.yml syntax is correct, you can navigate to the file in your repository and check there (there's a button at the top which lints it).
I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run ’$ grunt build’ locally which built the project and created files in a ‘dist’ folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the ‘dist’ folder in the repository are not updated. If I define an artifact, I will get the changed files in the download. However, I would prefer the files in the ‘dist’ folder in the repository to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo from inside a pipeline is a good idea. Version control wouldn't be as clear, and since many people trigger pipelines automatically when the repo is pushed, that could trigger a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, in exactly the version you require and with the desired /dist inside, so that you can easily run it in multiple places, scale it, manage it, etc.
If you changed to Docker you wouldn't actually have to do a thing in order to have the dist persistent, just push the image to the registry after the build is done.
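A rough sketch of such a build-and-push job, assuming the project has a Dockerfile that runs grunt build and copies dist into the image (the tag scheme is just an example; the CI_REGISTRY_* values are predefined variables for GitLab's built-in container registry):

  build-image:
    image: docker:latest
    services:
      - docker:dind
    script:
      # log in to the project's container registry using the job credentials
      - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
      # build the image (grunt build runs inside the Dockerfile) and push it
      - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
      - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"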
But to actually answer your question:
There has been a feature request open for a very long time about exactly the problem you are asking about: here. Currently there is no safe and professional way to do it, as GitLab members state. You can, however, push changes back, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in your script section inside gitlab-ci file.
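A slightly fuller sketch of that approach (PROJECT_ACCESS_TOKEN is an assumption: a project access token with write_repository scope stored as a masked CI/CD variable; the [skip ci] marker keeps the push from starting another pipeline):

  update-dist:
    script:
      - grunt build
      - git config user.email "ci@example.com"
      - git config user.name "GitLab CI"
      - git add dist
      # don't fail the job when dist is unchanged
      - git commit -m "Rebuild dist [skip ci]" || echo "nothing to commit"
      # push back to the branch the pipeline ran for, authenticating with the token
      - git push "https://oauth2:${PROJECT_ACCESS_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" "HEAD:$CI_COMMIT_REF_NAME"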
There are more hack'y methods presented there, but be sure to acknowledge the risks that come with them (pipelines become more error-prone and, if configured in the wrong way, they might for example publish confidential information or trigger an infinite pipeline loop, to name a few).
I hope you found this useful.
I can't find out how to access variables in a build script provided by the gitlab-ci.yml file.
I have tried to declare variables in two ways:
Private Variables in the Web-Interface of GitLab CI
Variable overrides/appending in config.toml
I try to access them in my gitlab-ci.yml file's commands like this:
msbuild ci.msbuild [...] /p:Configuration=Release;NuGetOutputDir="$PACKAGE_SOURCE"
where $PACKAGE_SOURCE is the desired variable (PACKAGE_SOURCE), but it does not work (it does not seem to be replaced). Executing the same command manually works just as expected (replacing the variable name with its content).
Is there some other syntax required i am not aware of?
I have tried:
$PACKAGE_SOURCE
$(PACKAGE_SOURCE)
${PACKAGE_SOURCE}
PS: Verifying the runner raises no issues, if this matters.
I presume you are using Windows for your runner? I was having the same issue myself and couldn't even get the following to work:
script:
- echo $MySecret
However, the GitLab documentation has an entry about the syntax of environment variables in job scripts:
To access environment variables, use the syntax for your Runner’s shell
Which makes sense, as most of the examples given are for bash runners. For my Windows (cmd) runner, it uses %variable%.
I changed my script to the following, which worked for me. (Confirmed by watching the build output.)
script:
- echo %MySecret%
If you're using PowerShell for your runner, the syntax would be $env:MySecret.
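In other words, depending on the runner's shell you would use one of these lines (only the one matching your shell belongs in your script; MySecret is just the example variable from above):

  script:
    - echo $MySecret       # bash / sh executor
    - echo %MySecret%      # Windows cmd executor
    - echo $env:MySecret   # Windows PowerShell executor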
In addition to what was said in the answer marked as correct above, you should also check whether your CI variables in the GitLab settings are set as "protected". If so, you may not be able to use them in a branch that's not protected.
"You can protect a project, group or instance CI/CD variable so it is only passed to pipelines running on protected branches or protected tags." See https://docs.gitlab.com/ee/ci/variables/index.html#protect-a-cicd-variable
Be aware that in certain cases some of the variables GitLab CI/CD offers are not always available.
In my case I wanted to use ${CI_COMMIT_BRANCH}, but if you read the docs:
https://docs.gitlab.com/ee/ci/variables/predefined_variables.html
The commit branch name. Available in branch pipelines, including pipelines for the default branch. Not available in merge request pipelines or tag pipelines.
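So a job that must also run in merge request pipelines may need a fallback, for example to CI_MERGE_REQUEST_SOURCE_BRANCH_NAME, which is set in those pipelines instead. A small sketch (the job name is made up):

  print-branch:
    script:
      # use the commit branch if set, otherwise the MR source branch
      - BRANCH="${CI_COMMIT_BRANCH:-$CI_MERGE_REQUEST_SOURCE_BRANCH_NAME}"
      - echo "Running for branch $BRANCH"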