Assume there is a public project, publicProject, and you have your own private project, privateProject.
Is there a way to automatically trigger a CI/CD pipeline within privateProject every time publicProject receives a new push? Note that although publicProject is public, there is no way to modify the pipeline within publicProject, so no trigger would be possible there.
Is there any chance this could be made possible?
If no trigger is possible, you might need, as a workaround, to schedule your own pipeline (see "Configuring pipeline schedules") in order to:
pull the public repository
check if new commits have been made
The last fetched commit could be cached (see "Caching in GitLab CI/CD") in a file which persists between subsequent jobs.
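For illustration, a minimal sketch of what that scheduled job's script could run; the public project URL, the <privateProject-id> placeholder, and the TRIGGER_TOKEN variable are assumptions to adapt:

# Compare the public repository's HEAD with the SHA cached from the previous run.
CURRENT_SHA=$(git ls-remote https://gitlab.com/some-group/publicProject.git HEAD | cut -f1)
if [ "$CURRENT_SHA" != "$(cat last_sha.txt 2>/dev/null)" ]; then
  echo "$CURRENT_SHA" > last_sha.txt   # last_sha.txt persists via the job cache
  # TRIGGER_TOKEN: hypothetical pipeline trigger token defined in privateProject
  curl --request POST \
       --form "token=$TRIGGER_TOKEN" --form "ref=main" \
       "https://gitlab.com/api/v4/projects/<privateProject-id>/trigger/pipeline"
fi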
Related
Requirement: in my self-hosted GitLab instance there are multiple projects, maintained by different users, which all use one particular tag of an image from the container registry of my project. That tag is now outdated; I have created a new tag for the image and would like to notify all the users to use the new tag.
Is there any webhook available in GitLab which can be enabled for every pull of image:tag, to send notifications (email, Slack) to the authors of the CI/CD pipelines?
Or maybe configure the pipeline to detect the image and tag being used and, if it is the one in question, send notifications?
P.S. The GitLab instance is using the Docker container registry.
An approach that involves custom scripting. Less elegant than VonC's suggestion ;-)
… detect the image and tag being used and, if it is the one in question, send notifications?
You could try tailing the registry logs while pulling the old tag manually.
Searching for the image and tag name in that log slice should help you determine how the usernames of the associated events can be parsed out, probably with jq.
A custom script could then regularly repeat that parsing and, for example, send an email to the users who trigger those events.
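A rough sketch, assuming the registry writes JSON log lines; the log path and the field names (vars.name, vars.reference, auth_user_name) are assumptions to verify against your own log slice first:

# List the distinct users who pulled the outdated tag.
# jq -R with fromjson? skips any line that is not valid JSON.
sudo jq -R -r 'fromjson? | select(."vars.name" == "mygroup/myimage"
               and ."vars.reference" == "old-tag") | .auth_user_name' \
     /var/log/gitlab/registry/current | sort -u

A cron job could repeat this and pipe the result into your notification channel of choice (email, Slack, ...).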
"Webhook" ("custom HTTP callbacks") means a local listener to GitLab events.
Considering you are managing your GitLab instance, a better option would be to create a pipeline for external pull requests (since GitLab 12.3, Aug. 2019)
on-pull-requests:
  script: echo 'this should run on pull requests'
  only:
    - external_pull_requests
This pipeline can check whether a Dockerfile is being merged and, if so, whether that Dockerfile uses the wrong tag.
If it does, the job can fail and thereby deny said pull request.
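The check itself could be a few lines of shell in that job's script: section; the image path and tag below are placeholders:

# Fail the job (and thus the pull request's pipeline) if any Dockerfile
# in the repository still references the outdated tag.
if grep -rq --include=Dockerfile 'registry.example.com/mygroup/myimage:old-tag' .; then
  echo "A Dockerfile still uses the outdated image tag" >&2
  exit 1
fi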
I’m working on automating a repository creation process. I want to have a main branch, along with a set of branches created automatically from it. I’m using python-gitlab, but of course I can use HTTP requests if needed.
My issue is that I do not want developers to be able to branch from main (i.e. they should only work on the unprotected branches, or other branches created from these). They should still be able to create merge requests from the default branches into main. Is there a way to enforce this?
Code for clarity, not really required to understand:
p_branch = project.protectedbranches.create({
    'name': 'main',
    'merge_access_level': gitlab.MAINTAINER_ACCESS,
    'push_access_level': gitlab.MAINTAINER_ACCESS
})

for branch_name in branch_names:
    branch = project.branches.create({'branch': branch_name, 'ref': 'main'})
Later edit:
It would also mean this option would be disabled when trying to edit a file on the protected branch (instead of automatically creating this ugly branch name).
You can add protected branches with the Protected Branches API.
curl --request POST --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/projects/5/protected_branches?name=*-stable"
This curl command adds all branches that match the pattern *-stable as protected branches. You can also change which roles can merge, push, or unprotect branches, etc., using the additional attributes listed in the docs.
If you use the defaults, only members with the Maintainer role or higher (Owners) can push to protected branches, merge them, unprotect them, and so on.
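For example, to forbid direct pushes to main entirely while keeping merges at the Maintainer level (in the API, access level 0 means "No access" and 40 means Maintainer):

curl --request POST --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/projects/5/protected_branches?name=main&push_access_level=0&merge_access_level=40"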
Note: I'm not sure if there's support for the Protected Branches API in the python library (I don't use it myself), so you might need to write out the requests manually.
I have a monorepo structure and created a pipeline that only runs for the specific subfolders which have changed.
Now if I change something in Context1 and the pipeline does not succeed, and then change something in Context2 and the pipeline succeeds, a created merge request would say it is mergeable, because the last pipeline succeeded.
How can I trigger a new pipeline on a merge request that runs for all subfolders? (Not asking how to create such a pipeline, only how to trigger one.)
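If your GitLab version exposes it, one candidate is the merge request pipelines endpoint, which starts a fresh pipeline for the merge request's current head; the project ID and MR IID below are placeholders:

curl --request POST --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/projects/<project-id>/merge_requests/<mr-iid>/pipelines"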
Is there any option to execute a script after a branch command such as git branch new123? If a user creates a new branch, I would like to automatically check out the new branch and create a folder named after it ('new123'), with two other folders inside it, such as 'int' and 'ext'.
You would need:
either a GitLab webhook, that is, a listener on a server you control, which would receive JSON push events from GitLab and, on detecting a new branch, would perform the operations you describe in a local clone of the repository;
or a GitLab CI job which can run on every push and whose job would be to check if that push created a new branch, before doing the same operations in its workspace (so on the GitLab side, rather than on a private server as a webhook does); see the sketch after this list.
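For the CI variant, a minimal sketch of the job's script: GitLab sets CI_COMMIT_BEFORE_SHA to all zeros on the first push of a branch, which serves as the "new branch" test, and PROJECT_TOKEN is a hypothetical token with write access to the repository:

# Detect the first push of a new branch and scaffold the folder structure.
if [ "$CI_COMMIT_BEFORE_SHA" = "0000000000000000000000000000000000000000" ]; then
  mkdir -p "$CI_COMMIT_BRANCH/int" "$CI_COMMIT_BRANCH/ext"
  touch "$CI_COMMIT_BRANCH/int/.gitkeep" "$CI_COMMIT_BRANCH/ext/.gitkeep"  # lets Git track the empty folders
  git add "$CI_COMMIT_BRANCH"
  # [skip ci] avoids re-triggering this job with the scaffold commit
  git -c user.name=ci -c user.email=ci@example.com commit -m "Scaffold folders for $CI_COMMIT_BRANCH [skip ci]"
  # PROJECT_TOKEN: hypothetical project access token with write_repository scope
  git push "https://ci:${PROJECT_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" "HEAD:$CI_COMMIT_BRANCH"
fi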
I am trying to use repository mirroring to streamline my workflow.
I have a repository that uses pull mirroring from the upstream.
I am trying to trigger a mirror update via the API.
My questions:
Can I find out whether the mirror update I triggered has finished?
Alternatively, can I find out if the mirror (or a particular branch on a mirror) is up-to-date with the upstream?
To give you more context, I want to:
refresh the mirror
create a branch on the mirror
create a merge request to the upstream
I refresh the mirror to ensure that the merge request can be merged without conflicts. My pipeline is the only source of merge requests to the upstream. I am afraid of a race condition between the "refresh mirror" request and the "create branch" request.
There are two fields on the "get single project" endpoint GET /projects/:id:
import_status (with "finished" as a marker of success)
import_error (with null as a marker of success)
I wasn't able to find the timestamp of the last mirror update, though.
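Based on those two fields, a polling sketch; the project ID is a placeholder, and note that POST /projects/:id/mirror/pull, which starts the update, is a paid-tier feature:

# Start the pull-mirror update, then poll import_status until it settles.
curl --request POST --header "PRIVATE-TOKEN: <your_access_token>" \
     "https://gitlab.example.com/api/v4/projects/<project-id>/mirror/pull"
while :; do
  STATUS=$(curl --silent --header "PRIVATE-TOKEN: <your_access_token>" \
                "https://gitlab.example.com/api/v4/projects/<project-id>" | jq -r '.import_status')
  [ "$STATUS" = "finished" ] && break
  [ "$STATUS" = "failed" ] && { echo "mirror update failed" >&2; exit 1; }
  sleep 5
done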