I've been using GitLab on a private server for development. Unfortunately, requiring a dual-core, 2 GB RAM VPS purely for the purpose of holding Git repos for a couple of people is not cost-effective. I would like to migrate to the free GitLab-hosted accounts.
Is there a way to transfer a repo and its issues to GitLab's hosted servers?
It depends on your GitLab version. An import/export feature was added in 8.9. If you are on a lower version, you can update to the current version and export your data afterwards.
The following items will be exported:
Project and wiki repositories
Project uploads
Project configuration including web hooks and services
Issues with comments, merge requests with diffs and comments, labels, milestones, snippets, and other project entities
The following items will NOT be exported:
Build traces and artifacts
LFS objects
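If you have more than a couple of projects to move, the export can also be triggered through GitLab's project export/import API rather than the UI. A rough sketch for reasonably recent versions (hostnames, the project ID, and tokens are placeholders):

    # Ask the old instance to prepare an export archive of project 42
    curl --request POST --header "PRIVATE-TOKEN: <old-instance-token>" \
         "https://gitlab.example.com/api/v4/projects/42/export"

    # Once the export has finished, download the archive
    curl --header "PRIVATE-TOKEN: <old-instance-token>" --output myproject.tar.gz \
         "https://gitlab.example.com/api/v4/projects/42/export/download"

    # Upload the archive to gitlab.com (the target path is a placeholder)
    curl --request POST --header "PRIVATE-TOKEN: <gitlab.com-token>" \
         --form "path=myproject" --form "file=@myproject.tar.gz" \
         "https://gitlab.com/api/v4/projects/import"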
My company has recently moved to a self-hosted GitLab instance, and now I'm trying to wrap my head around how we could use its CI/CD features for our use cases. For instance:
We have a PHP-based front-end project which consists of several PHP, CSS and JS files that, as of now, are being copied to our Apache2 folder in the same structure as we use for development.
Is it possible to configure GitLab to do this for us, say, by implementing the following workflow:
we make a DEPLOY branch in our repository
when we’re done with changes in other branches, we merge those branches into DEPLOY
GitLab CI/CD detects the new activity and automatically puts the latest version of the files from this branch into our Apache2 directory (connected as a remote folder)
Any advice and guidance is appreciated. I'm currently lost in the many manuals that describe Docker-based deployment and the like, which we don't yet use in our project.
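To make it concrete, all I would want the CI job to do once DEPLOY gets new commits is roughly this (the server name and paths are made up; our Apache host is reachable over SSH):

    # copy the checked-out files from the CI workspace to the Apache document root,
    # keeping the same structure we use for development
    rsync -az --delete --exclude='.git' ./ deploy@webserver.example.com:/var/www/html/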
The company I work for uses ClearCase, even though it was EOL'd years ago on the platforms we run it on. It is ancient and fragile tech, but one thing it does have is MultiSite support, which allows for the synchronization of air-gapped repos. For security reasons, we use secure USB sticks to copy packets and take them to the other side, then apply them with scripts.
Developers and DevOps people want to make a business case to migrate to GitLab, but I cannot find any mention of a feature in GitLab that would let me do this easily. There's something about bundles, but the info I have found is years old, and it doesn't seem like many people are using it.
Does GitLab not support this? Simple synchronization of one repo to another over an air gap using some sort of secure media? If so, it's no wonder so many teams are still using ClearCase.
While not exactly easy, air-gapped updates of a Git repository are possible through the git bundle command.
It produces one file (with all the history, or only the latest commits for an incremental update) that you can:
copy and distribute easily (it is just one file after all)
clone or pull from(!)
This is not tied to GitLab, and can be applied to any Git repository.
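A sketch of such a sneakernet workflow could look like this (the branch name main and the tag last-sync are placeholders):

    # On the connected side: pack the whole repository (all refs and history) into one file
    git bundle create repo-full.bundle --all
    git tag last-sync main                  # remember how far this bundle goes

    # ...copy repo-full.bundle onto the secure media and carry it across...

    # On the air-gapped side: clone from the bundle as if it were a remote
    git clone repo-full.bundle repo

    # Later, on the connected side: bundle only what happened since the last sync
    git bundle create repo-incr.bundle last-sync..main
    git tag -f last-sync main               # move the marker for the next round

    # On the air-gapped side: check the increment applies cleanly, then pull it in
    cd repo
    git bundle verify ../repo-incr.bundle
    git pull ../repo-incr.bundle main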
From there, I have written before about migrating from ClearCase to Git, and I usually:
do not import the full history, only major labels or UCM baselines
split VOBs per project, each project becoming one Git repository
revisit what was versioned in the VOBs: some large files/binaries might need to be .gitignore'd in the new Git repository.
You would not "migrate views": they are just workspaces (be they static or dynamic). A simple clone of a repository is enough to recreate such a workspace (a static one here).
I've just created a brand-new instance of GitLab v14.
I need to import all my projects (more than 100) from the old GitLab 11 into the new GitLab 14.
Do you have any suggestions or experience?
GitLab 14.x cannot import projects exported by GitLab 11.x (see the version compatibility requirements). However, you don't necessarily need to export your projects to import them into GitLab (as I'll suggest below).
Ideally, you should follow the upgrade path and then migrate between the two instances as a whole.
But if you must do this by creating a new GitLab instance without being on the same version, there might be a few ways to do this.
Group import
If you upgrade your original GitLab instance to at least version 13.7, you can use the group migration tool.
Project imports
If you can't upgrade your original instance, the easier way to import your projects is to have both instances running and import projects by URL. You don't need to export each project first.
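For example, with both instances reachable, the import by URL can be driven through the projects API as well as the UI. A rough sketch (hostnames, tokens, and project paths are placeholders); note that this pulls the Git repository itself by URL:

    # Create the project on the new instance and let it fetch the repository
    # from the old instance over HTTPS
    curl --request POST --header "PRIVATE-TOKEN: <token-for-new-instance>" \
         --data "name=myproject" \
         --data-urlencode "import_url=https://oauth2:<token-for-old-instance>@gitlab11.example.com/mygroup/myproject.git" \
         "https://gitlab14.example.com/api/v4/projects"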
Import automation
If you have GitLab Premium or Ultimate, you can use import automation.
Regarding projects, GitLab 15.0 (May 2022) allows for a more complete import:
Migration support for project release milestones
We continue to add support for more release metadata in GitLab migrations.
In GitLab 15.0, we've added support for project release milestones.
This metadata will help you migrate more of the release data without needing to manually copy over missing release attributes.
See Documentation and Issue.
That will apply to your next migration (from your current instance to a new GitLab 15 one).
GitLab 15.8 (January 2023) further improves the process:
Migrating GitLab projects by direct transfer Beta
Now, you can migrate group and project resources together when using direct transfer.
You can use direct transfer to migrate between GitLab instances or within the same GitLab instance.
Migrating projects when migrating groups using direct transfer is a major improvement over migrating groups and projects using file exports because:
You don't need to manually export each project to a file and then import all those export files to a new location. All projects within a top-level group are now migrated automatically, making your work more efficient.
When migrating from self-managed GitLab to GitLab.com using file exports, user associations (such as comment author) are changed to the user who is importing the projects. Migration using direct transfer maps users and their contributions correctly, provided a few conditions are met.
This feature is available on GitLab.com. You can migrate from a self-managed GitLab to GitLab.com right now!
To enable it on GitLab self-managed instances, see the linked documentation.
Learn more about migrating GitLab projects by direct transfer Beta and what’s coming next in our recent blog post.
See Documentation and Epic.
I have a DigitalOcean VPS with Ubuntu and a few Laravel projects. For each project's initial setup, I do a git clone to create a folder with my application files from my online repository.
I do all development work on my local machine, where I have two branches (master and develop). What I do is merge develop into my local master, then push master to my online repository.
Now, back on my production server, when I want to bring all those changes into production, I do a git pull from origin. So far this has resulted in git telling me to stash my changes. Why is this?
What would be the best approach to pull changes into the production server? Keep in mind that my production server has no working directory per se; all I do on my VPS is either clone or pull upgrades into production.
You can take a look at CI/CD (continuous integration / continuous delivery) systems. GitLab, for example, offers a free-to-use plan for small teams.
You can create a pipeline with a manual deploy step (you have to press a button after the code is merged to the master branch) and use whatever tool you like to deploy your code (scp, rsync, ftp, sftp etc.).
And the biggest benefit is that you can have multiple intermediate steps (even for the working branches) where you can run unit tests, which would prevent you from deploying failing builds (whenever you merge non-working code).
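For example, the deploy job can be marked as manual in .gitlab-ci.yml so it only runs when you press the button on the pipeline page, and its script can be as simple as something like this (the server, user, paths, and the DEPLOY_SSH_KEY variable name are assumptions for the sketch):

    # load the deploy key that was stored as a CI/CD variable
    eval "$(ssh-agent -s)"
    echo "$DEPLOY_SSH_KEY" | ssh-add -

    # copy the application to the server, excluding files that live only there
    rsync -az --delete --exclude='.git' --exclude='.env' --exclude='storage/' \
          ./ deploy@example.com:/var/www/my-app/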
For the first problem, do a git status on production to see which files git sees as changed or added, and consider adding them to your .gitignore file (which itself should be part of your repo). Laravel generally has good defaults for these, but you might have added things or deviated from them in the process of upgrading Laravel.
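For instance, on the production server (a sketch; which files show up depends on your project):

    # see which files git considers modified or untracked
    git status --short

    # anything generated at runtime (caches, logs, compiled assets) belongs in
    # .gitignore; if such a file was committed by mistake, stop tracking it
    git rm --cached path/to/generated-file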
For the deployment, the best practice is to have something that is consistent, reproducible, loggable, and revertable. For this, I would recommend choosing a deployment utility. These usually do pretty much the same thing:
You define deployment parameters in code, which you can commit as a part of your repo (not passwords, of course, but things like the server name, deploy path, and deploy tasks).
You initiate a deploy directly from your local computer.
The script/utility SSH's into your target server and pulls the latest code from the remote git repo (authorized via SSH key forwarded into the server) into a 'release' folder.
The script does any additional tasks you define (composer install, npm run prod, systemctl restart php-fpm, soft-linking shared files like .env, etc.)
The script soft-links the document root to your new 'release' folder, which results in an essentially zero-downtime deployment. If any of the previous steps fail, or you find a bug in the latest release, you just soft-link to the previous release folder and your site still works.
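In shell terms, the core of what these utilities do on the server is roughly the following (the repo URL, paths, and build steps are placeholders for whatever your project needs):

    # create a fresh, timestamped release directory and put the latest code in it
    RELEASE=/var/www/app/releases/$(date +%Y%m%d%H%M%S)
    git clone --depth 1 git@gitlab.example.com:me/app.git "$RELEASE"
    cd "$RELEASE"

    # project-specific build steps
    composer install --no-dev --optimize-autoloader
    npm ci && npm run prod

    # shared files live outside the releases so they survive deployments
    ln -sfn /var/www/app/shared/.env "$RELEASE/.env"

    # switch the document root symlink to the new release (near zero downtime);
    # rolling back is just pointing "current" at the previous release again
    ln -sfn "$RELEASE" /var/www/app/current
    sudo systemctl reload php-fpm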
Here are some solutions you can check out that all do this sort of thing:
Laravel Envoyer: A 1st-party (paid) service that allows you to deploy via a web-based GUI.
Laravel Envoy: A 1st-party (free) package that allows you to connect to your prod server and script deployment tasks. It's very bare-bones in that you have to write all of the commands yourself, but some may prefer that.
Capistrano: A tried-and-tested, popular (free) Ruby-based deployment utility.
Deployer: The (free) PHP equivalent of Capistrano. It's easier to use, has a lot of built-in tasks (including a Laravel one), and doesn't require Ruby.
Using these utilities is not necessarily exclusive of doing CI/CD if you want to go that route. You can use these tools to define the CD step in your pipeline while still doing other steps beforehand.
I started exploring GitLab for version control management and I ran into a question at the very first step. Whenever I create a project, it creates a new repository. I have a few web applications which are independent of each other. In that case, do I need to use a different repository for every project?
What I am looking for is what is what and when to use which, but I'm not able to find out what a repository is and what a project is, either on the GitLab website or through other sources.
I also came across the term submodule. When can it be used? Can I create one global project and have all the web applications as different submodules?
Can anyone please help me understand the difference between those three and when to use each, based on their intended usage? Also, please point me to a good learning site where I can get information on doing basic version control operations in GitLab.
Thanks.
GitLab manages projects: a project has many features in addition to the Git repo it includes:
issues: powerful, but lightweight issue tracking system.
merge requests: you can review and discuss code before it is merged into a branch of your code.
wiki: separate system for documentation, built right into GitLab
snippets: Snippets are little bits of code or text.
So for each repo you create, you get additional features in its associated project.
And you can manage users associated to that project.
See GitLab documentation for more.
The Git repo and Git submodule are pure Git notions.
In your case, a submodule might not be needed, unless you want a convenient way to remember the exact versions of the different webapp repos, recorded in one parent repo.
But if that is the case, then yes, you can create one global project and have all the web applications as different submodules.
Each of those submodules would have their own GitLab project (and Git repo).
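If you do go the submodule route, a minimal sketch would be (URLs and names are placeholders):

    # parent repo that records which commit of each webapp it points to
    git init webapps && cd webapps
    git submodule add git@gitlab.example.com:mygroup/webapp-a.git webapp-a
    git submodule add git@gitlab.example.com:mygroup/webapp-b.git webapp-b
    git commit -m "Track webapp-a and webapp-b at their current commits"

    # getting everything on another machine later
    git clone --recurse-submodules git@gitlab.example.com:mygroup/webapps.git

    # pulling in newer versions of the webapps and recording them in the parent
    git submodule update --remote
    git commit -am "Bump submodules to their latest commits"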