Requirement - In my self-hosted GitLab instance there are multiple projects, maintained by different users, that all use one particular tag of an image from my project's container registry. That tag is now outdated; I have created a new tag for the image and would like to notify all the users to switch to the new tag.
Is there any webhook available in GitLab that can be triggered for every pull of image:tag, so that notifications (email, Slack) are sent to the authors of the CI/CD pipelines?
Or maybe configure the pipeline to detect the image and tag being used and, if it is the one in question, send notifications?
P.S. - The GitLab instance uses the Docker container registry.
Here is an approach that involves custom scripting; less elegant than VonC's suggestion ;-)
… detect the image and tag being used and if it is the one in question then send notifications?
You could try tailing the registry logs while pulling the old tag manually.
Searching for the image and tag name in that log slice should show how the usernames of the associated events can be parsed out, probably with jq.
A custom script could then repeat that parsing on a schedule and, for example, send an email to the users who trigger those events; a sketch of the idea follows.
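A minimal sketch of that idea in Python, assuming a JSON-per-line registry log; the log path, image reference, and field names are all assumptions you would adjust after inspecting your own log slice:

import json

LOG_FILE = "/var/log/gitlab/registry/current"        # assumed Omnibus registry log path
OLD_IMAGE = "mygroup/myproject/myimage:old-tag"      # hypothetical image:tag to watch for

users = set()
with open(LOG_FILE) as fh:
    for line in fh:
        try:
            event = json.loads(line)
        except ValueError:
            continue  # skip lines that are not JSON
        # Assumption: a pull shows up as a GET request mentioning the image and tag,
        # with the requesting user recorded somewhere in the event.
        if OLD_IMAGE in line and event.get("http.request.method") == "GET":
            user = event.get("auth.user.name")
            if user:
                users.add(user)

print("Users still pulling the old tag:", ", ".join(sorted(users)))

From here, the collected usernames could be mapped to email addresses (for example via the GitLab users API) and fed into whatever notification channel you prefer.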
"Webhook" ("custom HTTP callbacks") means a local listener to GitLab events.
Considering you are managing your own GitLab instance, a better option would be to create a pipeline for external pull requests (available since GitLab 12.3, Aug. 2019):
on-pull-requests:
  script: echo 'this should run on pull requests'
  only:
    - external_pull_requests
This pipeline can check whether a Dockerfile is being merged, and whether that Dockerfile uses the wrong tag.
If it does, it can deny said pull request; see the sketch below.
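For illustration, a minimal sketch of such a check in Python, run as the job's script; the deprecated image reference is a hypothetical placeholder, and a non-zero exit is what makes the job (and therefore the pull request) fail:

import pathlib
import sys

DEPRECATED_REF = "registry.example.com/mygroup/myimage:old-tag"  # placeholder

# Look for the deprecated tag in any Dockerfile in the repository.
offending = [
    str(path)
    for path in pathlib.Path(".").rglob("Dockerfile*")
    if DEPRECATED_REF in path.read_text()
]

if offending:
    print(f"Deprecated tag {DEPRECATED_REF} found in: {', '.join(offending)}")
    sys.exit(1)  # fail the job, which blocks the pull request

print("No deprecated tag found.")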
In a GitLab issue, you can associate a branch with an issue, and in the issue there will be the line
#whoever created the branch branchname to address this issue.
Is there a way of getting that branch using the issues API? I'm trying to set up an automation script that will merge all branches associated with issues that have a certain label into the prod branch, then push the result as a development branch so I can deploy it to a dev environment. I don't want to use merge requests, as those will be used when the dev work is complete and ready to be merged for deployment to production.
Unfortunately, there currently is no official API to fetch an issue's related branches.
Some possible ways you can work around this:
Use the notes API
When a user uses the issue interface to create the branch, you will see a system message, like you mention. This message will be present in the notes API for the issue.
Example using the python-gitlab library:
import re

import gitlab

# Placeholder connection details; adjust the URL, token, and IDs for your instance.
gl = gitlab.Gitlab('https://gitlab.example.com', private_token='YOUR_TOKEN')
project = gl.projects.get(PROJECT_ID)

# Matches the system note GitLab adds when a branch is created from the issue UI.
branch_note_pattern = r'^created branch \[`(.*)`\].*to address this issue'

issue = project.issues.get(ISSUE_NUMBER)
all_notes = list(issue.notes.list(as_list=False))
system_notes = [note for note in all_notes if note.system]

related_branches = []
for note in system_notes:
    match = re.match(branch_note_pattern, note.body)
    if match:
        branch = match.groups()[0]
        related_branches.append(branch)

print('BRANCHES RELATED TO ISSUE', ISSUE_NUMBER)
for branch_name in related_branches:
    print(branch_name)
However, it is possible to have a related branch without that note appearing, because the related-branches feature is just based on a naming convention. If someone manually creates a branch named like <issue_number>-some-branch-name, it will show up as a related branch, but there will be no system message in the API.
So, if you rely on the notes API, you may miss related branches that were created manually.
Use the unofficial frontend API
The issues controller only returns related branches so that the frontend can render them as HTML.
If you request /<:project_url>/-/issues/<:issue_number>/related_branches?format=json you will get a JSON response containing the HTML for the frontend to insert in the issue view. You can parse this HTML to get the related branches.
This will reliably fetch the same related branches you see in the UI, but it is more work to implement and it is fragile, because this endpoint is not guaranteed to be stable between GitLab versions.
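A rough sketch of that approach with requests; the URL, project path, issue number, and token are placeholders, and since the HTML structure is undocumented, the parsing below (taking the text of anchor tags) is a guess to verify against your GitLab version:

import re

import requests

GITLAB_URL = "https://gitlab.example.com"   # placeholder
PROJECT_PATH = "mygroup/myproject"          # placeholder
ISSUE_NUMBER = 42                           # placeholder

resp = requests.get(
    f"{GITLAB_URL}/{PROJECT_PATH}/-/issues/{ISSUE_NUMBER}/related_branches",
    params={"format": "json"},
    headers={"PRIVATE-TOKEN": "YOUR_TOKEN"},  # placeholder token
)
resp.raise_for_status()
html = resp.json().get("html", "")

# Assumption: each related branch is rendered as the text of an <a> element.
branches = [name.strip() for name in re.findall(r"<a[^>]*>([^<]+)</a>", html)]
print(branches)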
In GitLab I am able to see who has forked my repository by clicking the following link:
Does anyone know if there is a similar capability within Azure DevOps?
I was able to find the following API call; however, it doesn't seem to work:
https://learn.microsoft.com/en-us/rest/api/azure/devops/git/forks/list?view=azure-devops-rest-6.0
I pass it a repo and various container IDs where I know there are forks of that repo, and it returns empty results.
Even when I find a fork in Azure DevOps by manually traversing the UI, I cannot find a place to view where that repo was forked from.
Any help on the above would be greatly appreciated.
Even when I find a fork in Azure DevOps by manually traversing the UI, I cannot find a place to view where that repo was forked from.
You could use the REST API Forks - Get Fork Sync Requests to retrieve all requested fork sync operations on this repository:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryNameOrId}/forkSyncRequests?api-version=6.0-preview.1
From the test result, we can see the ID of the repository that the current repo was forked from.
We can then use Repositories - List to look up the name of that repository.
Update:
In GitLab I am able to see who has forked my repository by
clicking the link. Does anyone know if there is a similar capability
within Azure DevOps?
If you want to know who has forked your repo, you could use the REST API
Forks - List:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryNameOrId}/forks/{collectionId}?api-version=6.0-preview.1
We can use the organization ID for the collectionId. The following REST API call returns the organization ID:
POST https://dev.azure.com/{organization1}/_apis/Contribution/HierarchyQuery?api-version=5.0-preview.1
Body:
{
    "contributionIds": ["ms.vss-features.my-organizations-data-provider"],
    "dataProviderContext": {
        "properties": {}
    }
}
The test result shows the organization ID, which can then be used as the collectionId above.
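Putting the two calls together, here is a rough Python sketch with requests; the organization, project, repository, and personal access token are placeholders, and the response field paths are assumptions to verify against the actual JSON you get back:

import requests

ORG = "myorg"            # placeholder organization
PROJECT = "myproject"    # placeholder project
REPO = "myrepo"          # placeholder repository name or ID
AUTH = ("", "YOUR_PERSONAL_ACCESS_TOKEN")  # basic auth: empty user + PAT

# 1. Get the organization (collection) ID via the HierarchyQuery data provider.
hierarchy = requests.post(
    f"https://dev.azure.com/{ORG}/_apis/Contribution/HierarchyQuery?api-version=5.0-preview.1",
    json={
        "contributionIds": ["ms.vss-features.my-organizations-data-provider"],
        "dataProviderContext": {"properties": {}},
    },
    auth=AUTH,
).json()
# Assumed response layout; inspect the JSON to confirm the path.
orgs = hierarchy["dataProviders"]["ms.vss-features.my-organizations-data-provider"]["organizations"]
collection_id = next(o["id"] for o in orgs if o["name"] == ORG)

# 2. List the forks of the repository within that collection.
forks = requests.get(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/git/repositories/{REPO}/forks/{collection_id}?api-version=6.0-preview.1",
    auth=AUTH,
).json()
for fork in forks.get("value", []):
    print(fork["name"], "in project", fork["project"]["name"])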
If you create a pull request for a repository that has been forked, its "into" options will include a select box listing the forks of that repository.
Our GitLab pipeline generates some performance graphs, which I would like to be sent to every team member via e-mail.
So far, they are marked as artifacts so GitLab keeps them. Is there any way within GitLab to achieve this? Or should I do that within the job script?
There is currently no way to send artifacts via email from the GitLab interface. You will indeed have to send them from your job scripts.
GitLab can send an email after a pipeline is finished (see Settings > Integrations > Pipeline emails), but it doesn't attach artifacts.
Another way to share them would be to publish them to GitLab Pages from your job script (docs here: https://docs.gitlab.com/ee/user/project/pages/index.html), but that wouldn't send an email.
It seems that a few years down the road nothing has changed yet (or I do not know about it).
send_email:
  stage: notify
  when: on_failure
  script:
    - >
      curl -s --user "api:$MAILGUN_API_KEY"
      "https://api.mailgun.net/v3/$MAILGUN_DOMAIN/messages"
      -F from='Gitlab <gitlab@example.com>'
      -F to=$GITLAB_USER_EMAIL
      -F subject='Test results + report'
      -F text='Testing some Mailgun awesomeness!'
      -F attachment='@reports/report.html'
There are a few things you need to get this to work:
- generate an artifact in another job (the file you want to upload; mine is reports/report.html)
- define the variables MAILGUN_API_KEY and MAILGUN_DOMAIN
I needed something similar, so the snippet above is from my own pipeline.
I have also documented everything in a blog post: https://medium.com/@vdespa/send-gitlab-ci-reports-artifacts-via-e-mail-86bc96e66511
I hope this helps a bit.
We have a GitLab (8.14) instance running for collaboration within the company.
I am working on a Python script to collect information about merge requests being raised by developers across projects. I can very easily isolate the merge requests using git log:
git log --merges
However, I haven't been able to locate the correct command or option to retrieve all the discussion/comments taking place in the Merge Request.
Solution 1: use Gitlab Log System
Have you thought of using the GitLab log system instead of a Git command?
It contains information about all performed requests. You can also see all SQL queries that have been performed and how much time they took.
Please take a look here https://docs.gitlab.com/ee/administration/logs.html
So in your Python script for collecting information, you can use queries like this:
SELECT <things> FROM "merge_requests" WHERE <condition>
Solution 2: use Gitlab API
Another way is to directly request Gitlab API to get a list of all notes for a single merge request.
Notes are comments on snippets, issues or merge requests.
like this:
GET /projects/:id/merge_requests/:merge_request_id/notes
The complete API reference for merge request notes is available at https://docs.gitlab.com/ee/api/notes.html.
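For illustration, a small sketch of that call in Python with requests; the URL, project ID, merge request ID, and token are placeholders, and note that recent GitLab versions expose this under /api/v4 with the merge request iid, while an 8.x instance would use the older /api/v3 path:

import requests

GITLAB_URL = "https://gitlab.example.com"   # placeholder
PROJECT_ID = 123                            # placeholder project ID
MR_ID = 45                                  # placeholder merge request ID (iid on v4)
headers = {"PRIVATE-TOKEN": "YOUR_TOKEN"}   # placeholder token

resp = requests.get(
    f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/merge_requests/{MR_ID}/notes",
    headers=headers,
)
resp.raise_for_status()

# Each note carries the comment body and its author.
for note in resp.json():
    print(note["author"]["username"], "-", note["body"])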
Does this help you?
Is it possible for GitHub to trigger a new test deployment when a pull request is submitted? I'd like for it to create a new folder on the server (Azure preferred) so that a test URL (e.g. http://testserver.com/PR602/) is generated that we can refer to in the pull request.
This would allow anyone to test a pull request without having to clone the repo, check out the branch, and build it locally.
In my initial research I found that Travis CI can deploy all branches, but I'm not clear how this would be triggered. Do I have to write a custom app that's triggered by pull request web hooks? I'm hoping someone has discovered a simpler method.
Do I have to write a custom app that's triggered by pull request web hooks?
Yes, or you can find someone who happens to have written the exact webhook handler you need.
Writing a webhook handler isn't terribly much work. If you don't want to integrate it with your current app, you can use a micro-framework like Flask to do this in only a few lines of code.
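For illustration, a minimal sketch of such a handler with Flask, assuming a GitHub pull_request webhook pointed at /webhook; the deployment step is a hypothetical placeholder for your own build-and-publish logic:

from flask import Flask, request

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def handle_pull_request():
    # GitHub sends the event name in this header and a JSON payload in the body.
    if request.headers.get("X-GitHub-Event") != "pull_request":
        return "ignored", 200
    payload = request.get_json()
    if payload.get("action") in ("opened", "synchronize"):
        pr_number = payload["number"]
        branch = payload["pull_request"]["head"]["ref"]
        deploy_preview(pr_number, branch)
    return "ok", 200

def deploy_preview(pr_number, branch):
    # Hypothetical placeholder: build the branch and publish it to a per-PR folder.
    print(f"Would deploy branch {branch} to http://testserver.com/PR{pr_number}/")

if __name__ == "__main__":
    app.run(port=8000)

In practice you would also want to verify the webhook secret (the X-Hub-Signature-256 header) before acting on the payload.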
Coming back to this in 2022, there is now also the option of GitHub Actions, which is a first-party CI service. Actions provides a framework for defining what to do when certain triggers happen, and there is an extensive marketplace of drop-in components, so you may be able to trigger other systems without writing any custom code or running a webserver to listen for webhooks.