Push code to a different project in GitLab

I am wondering what the best way is to push code from a CI/CD job in GitLab. For example:
project1: where the GitLab CI job runs
project2: where you want to push your code
Both projects are in GitLab. The idea is that project1 compiles and builds some artifacts in CI, which are then pushed to project2.
One quick solution would be to use a PERSONAL_ACCESS_TOKEN:
git push https://${GITLAB_USER_LOGIN}:${PERSONAL_ACCESS_TOKEN}@<your-gitlab-server>/<your-project-2-name>.git HEAD:master
To configure the personal token:
Your user profile - Settings - Access Tokens - "Create personal access token"
Project 1 - Settings - CI/CD - Secret variables: add the token as a variable named PERSONAL_ACCESS_TOKEN (used in the job sketch below)
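For illustration, a minimal job using that variable could look like this (just a sketch: the job name, stage and target URL are placeholders):
push-to-project2:
  stage: deploy
  script:
    # Build/collect the artifacts for project2 first, then push.
    # PERSONAL_ACCESS_TOKEN comes from the CI/CD variable configured above.
    - git push https://${GITLAB_USER_LOGIN}:${PERSONAL_ACCESS_TOKEN}@<your-gitlab-server>/<your-project-2-name>.git HEAD:master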
But this solution has some major problems:
All members of the project will be able to see the PERSONAL_ACCESS_TOKEN
If you enable the CI_DEBUG_TRACE flag, the personal token will be in the output of your job
I guess it can work as a temporary solution, but I am wondering what the best solution is.
Thanks!

Project mirroring seems to be the feature you are looking for:
Go to your project page
Settings->Repository Settings
There you can configure pushing to another repository; you can mirror only protected branches or everything, as you choose.
I hope it'll help you.
If not, please provide more info.
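If you prefer to script this instead of clicking through the UI, the remote mirrors API can create the same push mirror. A rough sketch (server, project ID and token are placeholders; parameters may vary by GitLab version):
curl --request POST \
  --header "PRIVATE-TOKEN: <your-token>" \
  --data "url=https://<user>:<token>@<your-gitlab-server>/<your-project-2-name>.git" \
  --data "enabled=true" \
  --data "only_protected_branches=false" \
  "https://<your-gitlab-server>/api/v4/projects/<project-1-id>/remote_mirrors"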

Related

Secrets Detection in Gitlab CI/CD

I'm having some trouble understanding how to activate Secrets Detection in GitLab CI/CD.
I created a new NodeJS Express project from a template, then activated Auto DevOps from Settings > CI/CD and checked the checkbox "Default to Auto DevOps pipeline" under the Auto DevOps menu. After that I opened the app.js file in the project folder and inserted a variable whose value looks like a secret key. Here's the piece of code where I inserted the line:
...
var app = express();
var key = "api-12321321321321321";
// view engine setup
app.set('views', path.join(__dirname, 'views'));
...
After committing the changes, I expected the pipeline to fail because of the leaked secret, but the secret detection job passed.
Can anyone tell me how to make so that the pipeline reports the error?
This behavior confused me as well when I first tried it out.
However, it seems that GitLab did this on purpose. The official documentation explains that the scan runs on every tier, but the detailed results are only surfaced in the GitLab UI on Ultimate.
So with a free or premium account, you can run this scanner, but you won't see any results unless you download the JSON report.
Also, there is NO mention that the job will fail. This is just our expectation.
That said, there are reasons why this makes sense, especially for a paid version of GitLab. In an MR you want to know whether there are any vulnerabilities in your code, but not every report is accurate: you may get what is known as a false positive, or the reported issue is very generic and your software is not actually affected.
However, if you are on a free/premium account, this feature is almost useless, as nobody will go to the job and manually inspect it.
The only workaround would be to override the secret_detection job, parse gl-secret-detection-report.json, check whether any secrets were reported, and pass or fail the job accordingly, as sketched below.
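For example (a rough sketch, not an official recipe: the check-secrets job name and the alpine image are arbitrary choices, and it assumes the report file contains a top-level vulnerabilities array):
include:
  - template: Security/Secret-Detection.gitlab-ci.yml

# Re-declare the template's job so the raw report is also kept as a normal artifact.
secret_detection:
  artifacts:
    when: always
    paths:
      - gl-secret-detection-report.json
    reports:
      secret_detection: gl-secret-detection-report.json

# Hypothetical follow-up job: fail the pipeline if the report lists anything.
check-secrets:
  stage: .post
  image: alpine:3.19
  needs: ["secret_detection"]
  script:
    - apk add --no-cache jq
    - COUNT=$(jq '.vulnerabilities | length' gl-secret-detection-report.json)
    - echo "Secrets reported: $COUNT"
    - test "$COUNT" -eq 0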
This should be easier with GitLab 13.12 (May 2021, Ultimate only):
Configuration tool for Secret Detection
Following in the footsteps of the GitLab SAST configuration tool we are adding support for Secret Detection on the Security Configuration page.
We believe that security is a team effort and this configuration experience makes it easier for non-CI experts to get started with GitLab Secret Detection.
The tool helps a user create a merge request to enable Secret Detection scanning while leveraging best configuration practices like using the GitLab-managed SAST.gitlab-ci.yml template.
The Configuration tool can create a new .gitlab-ci.yml file if one does not exist or update existing simple GitLab CI files, allowing the tool to be used with projects that already have GitLab CI setup.
See Documentation and Epic.
See GitLab 14.5 (November 2021, for all tiers)
Additional Secret Detection pattern support
We’ve updated the GitLab Secret Detection scanner to detect 47 new ‘well-identifiable’ secret patterns for widely used applications. This brings GitLab Secret Detection up to over 90 detectable patterns.
If you are a SaaS application vendor and your app generates secret tokens with well-identifiable patterns, and you’d like GitLab to be able to detect them, please add your regex pattern and a few invalid sample tokens in a comment on this issue and we’ll get them added to GitLab Secret Detection.
See Documentation and Issue.
GitLab has a full post about setting this up in a pipeline: https://docs.gitlab.com/ee/user/application_security/secret_detection/
EDIT:
The given instructions are a bit unclear: you need to add the include keyword at the root level of your configuration.
Example
stages:
  - build
  - test

image: node:latest

build:
  stage: build
  script:
    - echo "Building"
    - npm install typescript
    - yarn run build

test:
  stage: test
  script:
    - echo "Testing"

include:
  - template: Security/Secret-Detection.gitlab-ci.yml
The secret detection will run in the test stage.

How to run GitLab CI pipeline on every push to GitHub?

I created a GitLab "CI/CD for external repo" and linked my GitHub.
I then set up mirroring to Pull from that GitHub.
I would have expected that when I push to my GitHub repo, GitLab would show the latest code and automatically start running my GitLab CI pipeline right away, but I noticed this only happens after about a one-hour delay, so I've been pressing the "Update Now" button every time.
This is extremely inconvenient, so am I missing a step to have it simply get the latest code and run the pipeline on every push to GitHub?
When mirroring a GitHub repository using GitLab's "CI/CD for external repo" feature, you must use an account with admin access on the GitHub repository, so that GitLab can use your credentials to set up a webhook which notifies GitLab when there are changes to be pulled.
The webhook URL will look like https://gitlab.com/api/v4/projects/12345678/mirror/pull. I don't think you can create this manually as it needs to be set up with a secret, so you probably need to remove the project from GitLab and reconnect it.
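As a stopgap until the webhook is in place, you may also be able to trigger the sync from a script rather than pressing "Update Now", since the pull-mirror endpoint above accepts an authenticated POST. A sketch (12345678 is the example project ID from the URL above):
curl --request POST \
  --header "PRIVATE-TOKEN: <your-gitlab-token>" \
  "https://gitlab.com/api/v4/projects/12345678/mirror/pull"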

How to merge a Git branch using a different identity?

We are using Git for a website project where the develop branch will be the source of the test server, and the master branch will serve as the source for the live, production site. The reason is to keep the git-related steps (switching branches, pushing and pulling) to a minimum for the intended user population. It should be possible for these (not extremely technical) users to run a script that merges develop into master, after being alerted that this would be pushed to live. master cannot be modified by normal users; only one special user can do the merge.
This is where I'm not sure how to integrate this identity change into my code below:
https://gist.github.com/jfix/9fb7d9e2510d112e83ee49af0fb9e27f
I'm using the simple-git npm library. But more generally, I'm not sure whether what I want to do is actually possible as I can't seem to find information about this anywhere.
My intention would be of course to use a Github personal token instead of a password.
Git itself doesn't do anything about user or permission management. So the short answer is: don't try to do anything sneaky. Rather, use GitHub's user accounts the way they were intended.
What I suggest is to give this special user their own Github account, with their own copy of the repo. Let's say the main repo is at https://github.com/yourteam/repo, and the special repo is at https://github.com/special/repo.
The script will pull changes from the team repo's develop branch, merge them into its own master branch, and push to https://github.com/special/repo.
Then, it will push its changes to the team's master branch. This step can optionally be a forced push, since no one else is supposed to mess with master, anyway. (In case someone does, using a forced push here means they have to fix their local repo to match the team repo later on, rather than having the script fail until someone fixes the team repo.)
At the same time, your CI software will notice that master has changed at https://github.com/special/repo, and will publish as you normally would. This is the linchpin: the CI doesn't pay attention to the team repo, so although your team has permission to change it, those changes don't make it into production.
This special user will need commit access to the team repo, in addition to its own GitHub repo. The easiest way is probably to use an SSH key, and run the git push command from the script, rather than trying to use the GitHub API.
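A rough sketch of the git steps such a script would run (the repo URLs are the examples above; whether the final push is forced is the trade-off discussed earlier):
# Work in the special user's copy, with the team repo as a second remote.
git clone git@github.com:special/repo.git
cd repo
git remote add team git@github.com:yourteam/repo.git

# Merge the team's develop into the special user's master.
git fetch team
git checkout master
git merge --no-ff team/develop

# Publish: the special repo (watched by CI) first, then the team repo.
git push origin master
git push --force team master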

How to update repository with built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run ’$ grunt build’ locally which built the project and created files in a ‘dist’ folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the ‘dist’ folder in the repository are not updated. If I define an artifact, I get the changed files in the download. However, I would prefer the files in the ‘dist’ folder in the repository to be updated, so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing to your repo from inside a pipeline is a good idea. Version history wouldn't be as clear, and since many people trigger pipelines automatically on push, that could set off a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it lets you save the freshly built project into a registry and reuse it whenever needed, at exactly the version you require and with the desired /dist inside, so you can easily run it in multiple places, scale it, manage it, etc.
If you switched to Docker you wouldn't actually have to do a thing to keep dist persistent; just push the image to the registry after the build is done.
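For instance, a build job along these lines would publish such an image (a sketch, assuming a Dockerfile in the repo that produces /dist; it relies on GitLab's predefined registry variables):
build-image:
  stage: build
  image: docker:24
  services:
    - docker:24-dind
  script:
    # Log in to the project's container registry with predefined CI variables.
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # The Dockerfile is expected to run the grunt build and keep /dist in the image.
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"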
But to actually answer your question:
There is a feature request that has been open for a very long time about exactly this problem: here. Currently there is no safe and professional way to do it, as GitLab members state. Although you can push back changes, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your gitlab-ci file (a minimal sketch follows below).
There are more hack'y methods presented there, but be sure to acknowledge the risks that come with them: pipelines become more error prone, and if configured the wrong way they might, for example, publish confidential information or trigger an infinite loop of pipelines.
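If you do go the push-back route anyway, a minimal sketch could look like this (the PROJECT_ACCESS_TOKEN variable, branch and bot identity are placeholders; the ci.skip push option and the [skip ci] commit message are there to avoid re-triggering the pipeline):
update-dist:
  stage: deploy
  script:
    - grunt build
    - git config user.email "ci-bot@example.com"
    - git config user.name "CI bot"
    - git add dist
    - git commit -m "Update dist [skip ci]" || echo "Nothing to commit"
    - git push -o ci.skip "https://ci-bot:${PROJECT_ACCESS_TOKEN}@gitlab.com/group/project.git" HEAD:my-branch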
I hope you found this useful.

Bitbucket Pipelines access other node repository

I have enabled Bitbucket Pipelines in one of my node.js repositories to have it run the build on every commit. My repository depends on another node.js repository. For development I've linked the one to the other using npm link.
I've tried adding a git clone of that repository to the commands in the bitbucket-pipelines.yml file, but the build gets stuck on that command. I guess it's because git is asking for authentication at that point.
Is there a way to allow the container to access other repositories in the same team? Or is there a better way altogether on how to solve this? I'd also be fine with switching to another CI tool if Bitbucket Pipelines aren't capable of this – the only requirement is that it's free for teams < 5 people.
Btw. I'd like to avoid paying for npm private packages if possible.
Thanks!
You can set up access to the other repo with an SSH key, as described in the official docs: https://confluence.atlassian.com/bitbucket/access-remote-hosts-via-ssh-847452940.html
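Once the key is in place (added under the repository's Pipelines SSH key settings and registered as an access key on the other repo), the clone in bitbucket-pipelines.yml can go over SSH. A rough sketch (the image, repo names and npm wiring are illustrative stand-ins for your npm link setup):
image: node:18

pipelines:
  default:
    - step:
        script:
          # Pipelines injects the configured SSH key, so an SSH clone works here.
          - git clone git@bitbucket.org:yourteam/other-repo.git
          # Use the local checkout in place of the npm-linked package.
          - npm install ./other-repo
          - npm install
          - npm test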
