I'm migrating from Jenkins to GitLab CI (self-hosted) and trying to set up a workflow for all projects. One project is a Golang tool whose binary is required in other projects.
I've set up a working pipeline for the Golang project which builds a release with an attached binary for download. How can I reuse that binary in other projects' pipelines? Is there a way to download the binary with the GitLab REST API, or can I reuse the artifact of the release job in the other pipelines? I've searched the GitLab documentation and haven't found an elegant solution for this yet :-/
I've found a solution now. It's possible to use curl to download the build artifact for reuse:
curl -L --header "PRIVATE-TOKEN: <TOKEN>" "https://<HOSTNAME>/api/v4/projects/<PROJECT_ID>/jobs/artifacts/master/raw/<ARTIFACT_PATH>?job=build" --output <BINARY_NAME>
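In another project's pipeline, that download can be scripted. A minimal sketch, assuming a CI variable `API_TOKEN` holds a token with API read access; the hostname, project ID, artifact path, and job name below are placeholders:

```shell
#!/bin/sh
# Sketch of a download step for another project's pipeline.
# All concrete values are placeholders for illustration.
build_artifact_url() {
  # compose the job-artifact download URL used by the curl command above
  host=$1; project_id=$2; ref=$3; artifact_path=$4; job=$5
  printf 'https://%s/api/v4/projects/%s/jobs/artifacts/%s/raw/%s?job=%s\n' \
    "$host" "$project_id" "$ref" "$artifact_path" "$job"
}
url=$(build_artifact_url gitlab.example.com 42 master bin/mytool build)
echo "$url"
# In the real job you would then run:
# curl -L --header "PRIVATE-TOKEN: $API_TOKEN" "$url" --output mytool
```

The composed URL matches the pattern of the curl command above, so the helper can be dropped into any job's script section.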
Is there any task or similar which will send the release pipeline data to the git repo instead of sending it to an environment?
You need to use the Invoke HTTP REST API task to call the GitHub API and create a file there.
No, there is not.
The only way to do this is to run git commands in a release to push files/data.
For how to do this, you can refer to the links below:
Executing git commands inside a build job in Visual Studio Team Services (was VSO)
Run Git commands in a script
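As a sketch of the git-commands route: the remote URL, file names, and commit message below are placeholders, and a throwaway local bare repository stands in for the hosted one so the snippet is self-contained.

```shell
#!/bin/sh
# Sketch: push a file produced by a release step back into the repository.
# In a real pipeline the remote would be the hosted repo (with credentials);
# here a local bare repo stands in for it.
set -e
tmp=$(mktemp -d)
git init --bare -q "$tmp/origin.git"        # stands in for the hosted repository
git clone -q "$tmp/origin.git" "$tmp/work"
cd "$tmp/work"
git config user.email pipeline@example.com
git config user.name "Release Pipeline"
echo "release data produced by the pipeline" > release-notes.txt
git add release-notes.txt
git commit -q -m "Publish release data"
git push -q origin HEAD:master              # push the file into the repo
```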
I am not sure how I can maintain a development version in GitLab (i.e. a section where beta testers can download binaries). So far, I came up with the following approach (I have not tested it yet):
Create or update a tag "dev" and manually add the binaries via the following steps:
1) git push origin :refs/tags/dev
2) git tag -fa dev
3) git push origin master --tags
4) manually update binaries
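Steps 1-3 above can be combined into one script. A sketch using a throwaway local repository as the remote, so it can be run as-is (the actual binaries from step 4 would still be attached separately):

```shell
#!/bin/sh
# Sketch: refresh a floating "dev" tag (steps 1-3 above).
# A local bare repo stands in for the real remote.
set -e
tmp=$(mktemp -d)
git init --bare -q "$tmp/origin.git"
git clone -q "$tmp/origin.git" "$tmp/work"
cd "$tmp/work"
git checkout -q -b master
git config user.email dev@example.com
git config user.name Dev
echo build > binary-placeholder
git add binary-placeholder
git commit -q -m "release build"
git push -q origin master
# steps 1-3: delete the old remote tag (ignore if absent), retag, push
git push -q origin :refs/tags/dev 2>/dev/null || true
git tag -fa dev -m "development snapshot"
git push -q origin master --tags
```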
Is this the way to go? Before implementing this approach I would like to check whether there is an official/recommended approach.
Thank you in advance for your help
I have been attempting to use hooks to automatically pull the updated repo from GitHub to my remote web server. This is the code in the post-receive hook:
cd /home/[my username]/document_root/[github repo name]/
git pull origin master
I expect this to run when there is a new commit from my development machine that syncs with the one on GitHub, but it doesn't update the files inside that repo directory. Any help is appreciated; I am new to using Git and GitHub in general.
EDIT: I tried to follow this tutorial but it still doesn't work.
http://ryanflorence.com/deploying-websites-with-a-tiny-git-hook/
You can't push a git post-receive hook to GitHub. Instead, you can use GitHub's Webhooks.
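Separately, if you keep the hook route on your own server: a frequent reason a post-receive hook like the one above does nothing is that hooks run with GIT_DIR pointing at the server repository, so git commands aimed at another directory misbehave. A sketch of a working setup, with throwaway local paths standing in for the real server paths:

```shell
#!/bin/sh
# Sketch: a bare repo whose post-receive hook checks the pushed branch
# out into a deploy directory. Paths are throwaway stand-ins.
set -e
tmp=$(mktemp -d)
git init --bare -q "$tmp/site.git"   # stands in for the repo on the web server
mkdir "$tmp/deploy"                  # stands in for the document root
# the hook: unset GIT_DIR, then check out into the deploy directory
cat > "$tmp/site.git/hooks/post-receive" <<EOF
#!/bin/sh
unset GIT_DIR   # hooks inherit GIT_DIR; it breaks git commands aimed elsewhere
git --git-dir="$tmp/site.git" --work-tree="$tmp/deploy" checkout -f master
EOF
chmod +x "$tmp/site.git/hooks/post-receive"
# simulate the development machine pushing a commit
git clone -q "$tmp/site.git" "$tmp/dev"
cd "$tmp/dev"
git config user.email dev@example.com
git config user.name Dev
echo "<h1>hello</h1>" > index.html
git add index.html
git commit -q -m "deploy"
git push -q origin HEAD:master       # triggers the hook; deploy dir is updated
```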
You can customize a schedule with crontab.
For example, run crontab -e and add an entry (note a crontab line needs five time fields; this one runs at minute 5 of every hour):
5 * * * * /path/to/sync_git.sh
Then it will sync your repository.
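The sync_git.sh script is not shown above; a plausible version would fetch and hard-reset the checkout to the remote branch. A self-contained sketch, where throwaway local repositories stand in for GitHub and the web server (the real script would just cd to the deploy path and run the last three commands):

```shell
#!/bin/sh
# Sketch of what a cron-driven sync_git.sh could do.
# Local throwaway repos stand in for GitHub and the web server checkout.
set -e
tmp=$(mktemp -d)
git init --bare -q "$tmp/origin.git"        # stands in for the GitHub repo
git clone -q "$tmp/origin.git" "$tmp/dev"
cd "$tmp/dev"
git checkout -q -b master
git config user.email dev@example.com
git config user.name Dev
echo one > index.html
git add index.html && git commit -q -m "one"
git push -q origin master
git clone -q "$tmp/origin.git" "$tmp/web"   # the web server's checkout
echo two >> index.html
git commit -qam "two"
git push -q origin master                   # new commit lands upstream
# what sync_git.sh does on the web server:
cd "$tmp/web"
git fetch -q origin
git reset -q --hard origin/master           # mirror the remote, drop local edits
```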
I just installed a fresh new GitLab 6.8 on a brand-new high-performance server.
Before considering forgetting my repositories' history (comments, issues, etc.), does anyone know of a way to export a repository's data from one GitLab server to another GitLab server?
I just failed to find anything in the documentation about exporting/migrating the whole project data (not just the git repository and its wiki).
For GitLab versions >= 8.9 (released in June 2016) you can use the built-in export and import project feature.
Please note that for existing installations of GitLab, this option has to be enabled in the application settings (URL: /admin/application_settings) under 'Import sources'. You have to be a GitLab admin user to enable and use the import functionality.
Here is the complete documentation for the feature: https://gitlab.com/help/user/project/settings/import_export.md
I have actually done this recently: we were upgrading our instance of GitLab and needed to export repositories from the old installation and import them into the new one.
First, create a bundle of the checked-out repository. For example, say you checked out a repository we will call myrepository.
To check out the repository, use git clone (let's assume your repository is under the root account and the IP address is 192.168.1.1):
git clone http://192.168.1.1/root/myrepository.git (or match your environment)
Now this step is somewhat important: you need to change into the working directory that contains the .git folder of your checked-out repository.
cd myrepository
Next, you create a bundle file:
git bundle create myrepository.bundle --all
Import the bundle file into the new instance of GitLab.
Create a new 'myrepository' in the GitLab web interface.
Clone the empty repository; let's say this new GitLab has the IP address 192.168.1.2:
git clone http://192.168.1.2/root/myrepository.git (or match your environment)
You will get warnings that you cloned an empty repository. This is normal.
Change into the working directory of your checked out repository and do a git pull:
cd myrepository
git pull /path/to/myrepository.bundle
This will pull the repository into your clone. Next you can do a git add, git commit and git push.
This should work assuming you have the GitLab server settings set up correctly. You may run into issues such as needing to add a client_max_body_size parameter to your nginx.conf file, or needing a 'git config --global http.postBuffer' setting to push large files.
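The whole bundle round-trip above can be rehearsed locally before touching the servers. A sketch where two throwaway directories stand in for the old and new GitLab servers:

```shell
#!/bin/sh
# Sketch of the bundle round-trip, with local throwaway repos
# standing in for the old and new GitLab servers.
set -e
tmp=$(mktemp -d)
# stand-in for the repository on the old server
git init -q "$tmp/myrepository"
cd "$tmp/myrepository"
git checkout -q -b master
git config user.email dev@example.com
git config user.name Dev
echo v1 > tool.go
git add tool.go
git commit -q -m "initial"
git bundle create "$tmp/myrepository.bundle" --all   # bundle every ref
# stand-in for the empty 'myrepository' created on the new server
git init --bare -q "$tmp/new-server.git"
git clone -q "$tmp/new-server.git" "$tmp/restore"    # warns: empty repository
cd "$tmp/restore"
git config user.email dev@example.com
git config user.name Dev
git pull -q "$tmp/myrepository.bundle" master        # pull history out of the bundle
git push -q origin HEAD:master                       # push it to the new server
```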
Another way to do this is to make patch files of each commit and then apply them:
This involves running 'git format-patch -C 0badil..68elid -o patch_directory_path', referencing the range of all your commits and writing them to an output directory; this should give you one patch file per commit. Next, git clone the new empty repository, change into the working directory of the clone, and apply the patches to the new repository using 'git am patch_directory_path'.
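A self-contained sketch of that patch-based route, with throwaway repositories standing in for the two servers, and --root used to cover every commit instead of an explicit range:

```shell
#!/bin/sh
# Sketch: migrate history as one patch file per commit.
# Throwaway local repos stand in for the old and new servers.
set -e
tmp=$(mktemp -d)
git init -q "$tmp/old"
cd "$tmp/old"
git checkout -q -b master
git config user.email dev@example.com
git config user.name Dev
echo a > a.txt && git add a.txt && git commit -q -m "add a"
echo b > b.txt && git add b.txt && git commit -q -m "add b"
git format-patch --root master -o "$tmp/patches"   # one .patch per commit
# replay the patches onto a clone of the empty repo on the new server
git init --bare -q "$tmp/new.git"
git clone -q "$tmp/new.git" "$tmp/replay"
cd "$tmp/replay"
git checkout -q -b master
git config user.email dev@example.com
git config user.name Dev
git am -q "$tmp/patches"/*.patch
git push -q origin master
```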
For GitLab versions < 8.9, without the built-in export/import feature, I recommend a great tool by Marcus Chmelar, gitlab-migrator. I used it successfully many times with older GitLab versions, so it should work for you too. Just be aware of its limitations.
For the repos themselves, you can use git bundle: that will generate one file which is easy to copy around.
(as I described in "Backup a Local Git Repository")
But another way is simply to git clone --mirror your repos from the first server on a local workstation, and git push --mirror to the new server.
This is what GitHub details in its help page "Duplicating a repository".
In both cases, you first need to declare those repos on the new GitLab server, so that they are initialized and ready to receive commits.
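A sketch of the mirror route, with throwaway local repositories standing in for the two servers:

```shell
#!/bin/sh
# Sketch: git clone --mirror from the old server, git push --mirror to the new.
# Throwaway local repos stand in for both servers.
set -e
tmp=$(mktemp -d)
# stand-in for a repo on the old server, with a branch and a tag
git init -q "$tmp/src"
cd "$tmp/src"
git checkout -q -b master
git config user.email dev@example.com
git config user.name Dev
echo x > x.txt && git add x.txt && git commit -q -m "history"
git tag v1.0
# stand-in for the freshly declared, empty repo on the new server
git init --bare -q "$tmp/dst.git"
# mirror-clone from the old server, then mirror-push to the new one
git clone -q --mirror "$tmp/src" "$tmp/mirror.git"
git -C "$tmp/mirror.git" push -q --mirror "$tmp/dst.git"
```

A mirror push transfers all refs, so branches and tags arrive in one step.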
But for the rest... not easily. There is a feature request pending:
(Update August 2016, 2 years later: GitLab 8.9 has that feature implemented)
(for GitLab version older than 8.9, see and upvote Greg Dubicki's answer)
I agree that issues are the main thing to make exportable first.
They are stored in the database. Storing them in git is not an option. Maybe export them as a formatted file (SQL, YAML or something else).
This blog post illustrates the export of a mysql database.
Use mysqldump to create a dump of the old database, then create a new database on the new server and import it.
On old:
mysqldump gitlab | gzip > gitlab.sql.gz
On new:
gunzip < gitlab.sql.gz | mysql gitlab
Run the db migrate command to make sure the schema is updated to the latest version.
sudo -u gitlab -H bundle exec rake db:migrate RAILS_ENV=production