How to remove old archived jobs in a Chef-installed GitLab?

I am trying to clean up my GitLab server, which is running v12.x. I wrote a Python script that queries the GitLab API; when I send an erase request I get a 201 response code. I followed the official docs (https://docs.gitlab.com/ee/api/jobs.html), but the jobs remain in the web UI. I also tried deleting the artifacts from the server, and I get a 204 back as the response code.
I erase each job with a simple POST command:
curl --request POST --header "PRIVATE-TOKEN: <token>" "https://gitlab.corp.com/api/v4/projects/1/jobs/1/erase"
How can one verify that the jobs are actually deleted?
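One way I try to check is to re-fetch the job right after erasing it; a minimal sketch of what I mean, using the requests library (host, token, and IDs below are placeholders):

import requests

# Placeholders: substitute the real host, token, project ID, and job ID.
GITLAB = "https://gitlab.corp.com/api/v4"
HEADERS = {"PRIVATE-TOKEN": "<token>"}
PROJECT_ID = 1
JOB_ID = 1

# Erase the job; this removes the job trace and artifacts.
r = requests.post(f"{GITLAB}/projects/{PROJECT_ID}/jobs/{JOB_ID}/erase",
                  headers=HEADERS)
print(r.status_code)  # expect 201

# Re-fetch the job: the record itself still exists after an erase, but
# its JSON should now have "erased_at" set and no artifacts listed.
job = requests.get(f"{GITLAB}/projects/{PROJECT_ID}/jobs/{JOB_ID}",
                   headers=HEADERS).json()
print(job.get("erased_at"), job.get("artifacts_file"))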
In the admin settings I set up jobs to be archived after 1 month, with artifact deletion as well. But in the admin portal I still have 10,000+ jobs.
The result of running the script: after 4 hours the API stops accepting the token, and the user account can't do any Git operations for 24 hours, after which it returns to normal. By that I mean you can't view any code in the web UI, and git commands don't work either.
Has anyone experienced this issue?

The long and the short of it is that the API will not do it. I had to update the PostgreSQL database directly to clean them out.
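For illustration only, here is a sketch of what that direct cleanup can look like with psycopg2. Jobs live in the ci_builds table, but the schema varies by GitLab version and jobs have dependent rows (artifacts, metadata), so back up the database and test on a copy first. The connection details below are placeholders for an Omnibus install:

import psycopg2

# Placeholders: Omnibus GitLab's default database is gitlabhq_production,
# reachable over the local socket in /var/opt/gitlab/postgresql.
conn = psycopg2.connect(dbname="gitlabhq_production",
                        user="gitlab-psql",
                        host="/var/opt/gitlab/postgresql")
with conn, conn.cursor() as cur:
    # Count the candidates before touching anything.
    cur.execute("""SELECT count(*) FROM ci_builds
                   WHERE created_at < now() - interval '1 month'""")
    print("jobs older than 1 month:", cur.fetchone()[0])
    # Deleting from ci_builds directly bypasses GitLab's own cleanup
    # logic, so run the DELETE at your own risk (commented out here).
    # cur.execute("""DELETE FROM ci_builds
    #                WHERE created_at < now() - interval '1 month'""")
conn.close()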

Related

How to display more than 20 pipelines of history in GitLab?

I use the GitLab API to list all pipelines in the CI history, using the following query:
curl --insecure -s -H "PRIVATE-TOKEN: ${TOKEN}" "https://git.do.x5.ru/api/v4/projects/${PROJECT_ID}/pipelines/"
But unfortunately it shows only the 20 latest pipelines. Is there a way to get, for example, the 50 latest pipelines?
From the GitLab documentation:
By default, GET requests return 20 results at a time because the API
results are paginated
Check the pagination documentation to learn how to increase the number of items listed per page or how to iterate over pages.
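For example, the per_page parameter (up to 100) and the page parameter can be combined to fetch more, or all, pipelines. A quick sketch with the requests library (host, token, and project ID are placeholders):

import requests

GITLAB = "https://git.do.x5.ru/api/v4"   # placeholder host
HEADERS = {"PRIVATE-TOKEN": "<token>"}
PROJECT_ID = 42                          # placeholder project ID

# Ask for the 50 latest pipelines in one page...
r = requests.get(f"{GITLAB}/projects/{PROJECT_ID}/pipelines",
                 headers=HEADERS, params={"per_page": 50})
print(len(r.json()))

# ...or iterate page by page to collect the whole history.
pipelines, page = [], 1
while True:
    batch = requests.get(f"{GITLAB}/projects/{PROJECT_ID}/pipelines",
                         headers=HEADERS,
                         params={"per_page": 100, "page": page}).json()
    if not batch:
        break
    pipelines.extend(batch)
    page += 1
print("total pipelines:", len(pipelines))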

How do I download a raw file from my IBM Cloud Git Repos and Issue Tracking repository?

I'm trying to download a raw file from one of my IBM Cloud Git Repos and Issue Tracking repositories. I had a script that was able to fetch raw file contents using the following curl command:
curl -H "Private-Token: $PERSONAL_ACCESS_TOKEN" https://git.ng.bluemix.net/:owner/:repo/raw/:branch/:filename
but it recently started failing with a 302 response that is redirecting to a UI login page.
Is there a supported way to download raw file contents from an IBM Cloud Git Repos and Issue Tracking repository?
The curl request above is attempting to use a personal access token to authenticate to a UI endpoint. There was a security fix in GitLab 11.3.11 that limited the scope of personal access tokens to API calls only. That would explain why personal access tokens are no longer working on that request.
The supported method of downloading raw file contents would be to call the repository files API.
For example, to fetch myFolder/myFile.txt from the master branch of myRepo, owned by myUser, you can make a curl call like this:
curl -H "Private-Token: $PERSONAL_ACCESS_TOKEN" https://git.ng.bluemix.net/api/v4/projects/myUser%2FmyRepo/repository/files/myFolder%2FmyFile.txt/raw?ref=master

Is there a command or a "git log" option to retrieve comments/discussion logged for a merge request?

We have GitLab (8.14) running for collaboration within the company.
I am working on a Python script to collect information about merge requests raised by developers across projects. I can very easily isolate the merge requests using 'git log':
git log --merges
However, I haven't been able to locate the correct command or option to retrieve all the discussion/comments taking place in the Merge Request.
Solution 1: use the GitLab log system
Have you thought of using the GitLab log system instead of a Git command?
It contains information about all performed requests. You can also see every
SQL query that was executed and how much time it took.
Take a look here: https://docs.gitlab.com/ee/administration/logs.html
So in your Python collection script, you can use queries like this:
SELECT <things> FROM "merge_requests" WHERE <condition>
Solution 2: use the GitLab API
Another way is to query the GitLab API directly to get a list of all notes for a single merge request.
Notes are comments on snippets, issues or merge requests.
like this:
GET /projects/:id/merge_requests/:merge_request_id/notes
The complete API reference for merge request notes is available at https://docs.gitlab.com/ee/api/notes.html.
Does this help you?
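As a starting point, a minimal sketch of calling that endpoint from Python with the requests library (host, token, and IDs are placeholders; note that on current GitLab versions the path takes the merge request iid rather than the global ID):

import requests

GITLAB = "https://gitlab.example.com/api/v4"  # placeholder host
HEADERS = {"PRIVATE-TOKEN": "<token>"}
PROJECT_ID = 1   # placeholder project ID
MR_IID = 7       # placeholder merge request iid

r = requests.get(
    f"{GITLAB}/projects/{PROJECT_ID}/merge_requests/{MR_IID}/notes",
    headers=HEADERS)
for note in r.json():
    print(note["author"]["username"], ":", note["body"])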

Dashboard with builds, commits and files from GitLab

How do I build a dashboard with information from my GitLab repo? I don't know how to get information about builds, commits, and files, and create a dashboard from it. Any idea? Thank you very much.
You can use GitLab's API to obtain the information. You will need a user account that can access the particular project. Get this user's "Private Token" from the /profile/account page, and then you can make requests for which you will get a JSON response.
Retrieving latest commits
curl -H "PRIVATE-TOKEN: [TOKEN]" \
"https://[HOST]/api/v3/projects/[PROJECT ID]/repository/commits"
Retrieving latest builds
curl -H "PRIVATE-TOKEN: [TOKEN]" \
"https://[HOST]/api/v3/projects/[PROJECT ID]/repository/builds"
These are examples using curl. Depending on the programming language you are going to use, you will have to make a GET request while setting an HTTP header (that is what -H stands for in my examples) named PRIVATE-TOKEN.
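For instance, gathering the raw data for such a dashboard in Python could look like this sketch (requests library; host, token, and project ID are placeholders, and the v3 paths match the examples above):

import requests

HOST = "https://gitlab.example.com"   # placeholder
HEADERS = {"PRIVATE-TOKEN": "<token>"}
PROJECT_ID = 1                        # placeholder

def fetch(path):
    # GET a v3 API path for the project and return the parsed JSON.
    r = requests.get(f"{HOST}/api/v3/projects/{PROJECT_ID}/{path}",
                     headers=HEADERS)
    r.raise_for_status()
    return r.json()

commits = fetch("repository/commits")
builds = fetch("builds")
files = fetch("repository/tree")   # top-level files and directories

for c in commits[:5]:
    print(c["short_id"], c["title"])
print("recent builds:", [(b["id"], b["status"]) for b in builds[:5]])
print("files:", [f["name"] for f in files])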

Standard way of setting up a webserver deployment using webhooks

I am working on code for a webserver.
I am trying to use webhooks to do the following tasks, after each push to the repository:
update the code on the webserver.
restart the server to make my changes take effect.
I know how to make the revision control run the webhook.
Regardless of the specifics of which revision control system I am using, I would like to know the standard way to create a listener for the POST call from the webhook on Linux.
I am not completely clueless - I know how to make an HTTP server in Python and I can make it run the appropriate bash commands, but that seems so cumbersome. Is there a more straightforward way?
Set up a script to receive the POST request (a PHP script would be enough).
Save the request into a database and mark it as "not yet finished".
Run a crontab job that checks the database for "not yet finished" tasks and does whatever you want with the information you saved.
This is definitely not the best solution, but it works; a bare-bones Python variant is sketched below.
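For comparison, the Python standard library alone is enough for a minimal listener. A sketch (the deploy commands, paths, and service name are placeholders, and there is no signature verification, so don't expose this as-is):

import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

class HookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Drain the request body so the connection closes cleanly.
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)
        # Placeholder deploy steps: pull the new code, restart the app.
        subprocess.run(["git", "-C", "/srv/app", "pull"], check=False)
        subprocess.run(["systemctl", "restart", "myapp"], check=False)
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), HookHandler).serve_forever()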
You could use IronWorker (http://www.iron.io) to SSH in and perform your tasks on every commit. To kick off the IronWorker task you can use its webhook support. Here's a blog post that shows how to use IronWorker's webhook functionality; the post already covers half of what you want (it starts a task based on a GitHub commit): http://blog.iron.io/2012/04/one-webhook-to-rule-them-all-one-url.html
