Newbie GIT Help - Making a second Repository - gitlab

I currently work on solutions / projects within a single Git repository, in Visual Studio. The commits I make are to a local folder on the Visual Studio server, and then I use the command 'git push origin master' (after having changed directory to my local folder / repository) to push commits to a GitLab instance in my company's corporate space. The purpose of this is less about using branches and software development (I am the only person who does any work on this), and more about having a way to roll back changes and keep a master copy off the server.
I now want a fresh copy of this GIT repository, so I can use that as a new baseline for an application migration. I will still continue to work on the existing repository too.
What is the best way to make a copy of the existing repository that I can treat as a totally separate thing, without accidentally messing up my existing config on the server? Should I do the clone from GitLab? Or clone locally and then push that up to the new space in my GitLab? Honestly, I'm a bit confused at this point about the proper model for this stuff.

Sounds like you'd like to fork the project: keep the existing repo and start a new, separate repo based on the old one.
Browse to your project in GitLab
On the main repo screen, click "Fork" in the top right
Select the namespace (a new one, or the same organisation) you'd like to fork into
The project is now forked!
From here, you can clone the fork to your local machine in a new folder. These are now separate projects, and code updates can be added, committed and pushed to the separate repos.
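For example, cloning the fork and confirming it points at the new project might look roughly like this (the URL and folder name below are placeholders, not your real paths):
git clone git@gitlab.example.com:your-group/your-project-fork.git migration-baseline
cd migration-baseline
git remote -v                 # should list only the fork as origin, not the original repo
# work, commit, then push to the fork exactly as before:
git push origin master
Because the fork has its own origin URL, nothing you push from this folder can touch the original repository or its config.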

Related

how to add clone git to my own created repo in gitlab

How do I take a project from one repo (where I've been given developer rights by some other person, who is the master) to my own created repo (where I'm the master)? I'm able to use both CMD and the WebStorm IDE for accessing Git.
The action you want to perform is called a 'fork' in the context of source code management (e.g. Git). Quoting https://docs.gitlab.com/ee/gitlab-basics/fork-project.html:
A fork is a copy of an original repository that you put in another namespace where you can experiment and apply changes that you can later decide whether or not to share, without affecting the original project.
To fork a project in GitLab, you just have to click on the 'fork' button on the project's main page.
You can read detailed steps on https://docs.gitlab.com/ee/user/project/repository/forking_workflow.html#creating-a-fork
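Once the fork exists, a typical next step looks roughly like this (the URLs below are placeholders for your GitLab instance and namespaces):
git clone git@gitlab.example.com:your-user/project.git
cd project
# optional: keep a read-only link to the original repo so you can pull its updates later
git remote add upstream git@gitlab.example.com:other-user/project.git
git fetch upstream
You then push your own work to origin (your repo), while upstream still points at the other person's repo.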

How to merge a Git branch using a different identity?

We are using Git for a website project where the develop branch will be the source of the test server, and the master branch will serve as the source for the live, production site. The reason is to keep the Git-related steps (switching branches, pushing and pulling) to a minimum for the intended user population. It should be possible for these (not extremely technical) users to run a script that merges develop into master, after being alerted that this will be pushed to live. master cannot be modified by normal users; only one special user can do the merge.
This is where I'm not sure how to integrate this identity change into my code below:
https://gist.github.com/jfix/9fb7d9e2510d112e83ee49af0fb9e27f
I'm using the simple-git npm library. But more generally, I'm not sure whether what I want to do is actually possible as I can't seem to find information about this anywhere.
My intention would, of course, be to use a GitHub personal access token instead of a password.
Git itself doesn't do anything about user or permission management. So the short answer is: don't try to do anything sneaky. Rather, use GitHub's user accounts the way they were intended.
What I suggest is to give this special user their own GitHub account, with their own copy of the repo. Let's say the main repo is at https://github.com/yourteam/repo, and the special repo is at https://github.com/special/repo.
The script will pull changes from the team repo's develop branch, merge them into its own master branch, and push to https://github.com/special/repo.
Then, it will push its changes to the team's master branch. This step can optionally be a forced push, since no one else is supposed to mess with master, anyway. (In case someone does, using a forced push here means they have to fix their local repo to match the team repo later on, rather than having the script fail until someone fixes the team repo.)
At the same time, your CI software will notice that master has changed at https://github.com/special/repo, and will publish as you normally would. This is the linchpin: the CI doesn't pay attention to the team repo, so although your team has permission to change it, those changes don't make it into production.
This special user will need commit access to the team repo, in addition to its own GitHub repo. The easiest way is probably to use an SSH key, and run the git push command from the script, rather than trying to use the GitHub API.
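As a rough sketch (in plain git commands rather than simple-git), the merge could run under the special identity like this; the repo URLs, branch names and the "Release Bot" name/email are placeholders, and the SSH form assumes the key-based access suggested above:
# run as the special user, inside a clone of the special repo (https://github.com/special/repo)
git fetch git@github.com:yourteam/repo.git develop
git checkout master
git -c user.name="Release Bot" -c user.email="release-bot@example.com" \
    merge --no-ff FETCH_HEAD -m "Merge develop into master"
git push origin master                                # the special repo, which CI watches
git push git@github.com:yourteam/repo.git master      # the team repo (optionally --force)
Because user.name and user.email are supplied for the merge (via -c here, or configured once in the special clone), the merge commit always carries the special identity regardless of who runs the script.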

How to structure repo for Netlify subdomain support?

Netlify subdomains work based on branches of a repo. If I have a domain, say xyz.com, and repo Repo-A, the master branch will deploy to xyz.com and the dashboard branch will deploy to dashboard.xyz.com. However, the dashboard and master branches are very different except for a few visual elements.
I’m trying to figure out a clean way to structure the repo
Repo-A (master branch):
  src/app
  package.json
  webpack.config.js
Repo-A (dashboard branch):
  src/app
  package.json
  webpack.config.js
The problem with this approach is I’d have to change my webpack, package and src files extensively.
I believe switching back and forth between branches will generate a lot of junk in the dist/ folder too.
What’s the best repo structure to make this work? Are there tools to make life simpler for this use case?
Another approach:
Create a Release repo that has release branches like master and dashboard.
Commits to master of Repo A push the build to the master branch of the Release repo.
Commits to master of Repo B push the build to the dashboard branch of the Release repo.
Is this a cleaner approach compared to the first one? Any suggestions?
This feature seems to be more for staging/development/production (master) branches when you are using them to track changes for review and doing pull requests to each sub-domain branch through the workflow. I don't use this feature, because it is easy to track the workflow by creating branch deploys anyway. Where I think this would really come in handy is tracking different versions of my site at their own subdomains.
When using a sub-domain for a totally different project, you should consider moving it to its own repository and managing the project as its own site at the sub-domain, then entering a CNAME record for the sub-domain into DNS to point at my-dashboard-site-name.netlify.com.
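If you go that route, one way to split the dashboard branch out into its own repository could look like this (all names and URLs below are placeholders):
# create a new clone containing only the dashboard branch's history
git clone --single-branch --branch dashboard https://git.example.com/you/repo-a.git dashboard-site
cd dashboard-site
# point it at a freshly created, empty repository and push the branch as its master
git remote set-url origin https://git.example.com/you/dashboard-site.git
git push -u origin dashboard:master
Netlify can then build the new repository as its own site, with the CNAME pointing at it.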
Mono-repo
You could keep them in the same mono-repo if you don't want to give them their own repos; you would still separate each site's deploy. This is a little more complex than separate repositories, but tools like Lerna are there if you want to maintain it that way. It does make for a nice way to maintain projects that re-use the same libraries without publishing them to a package manager, by keeping them in the same mono-repo.
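A very rough sketch of starting down the mono-repo path with Lerna (the package names are made up for illustration):
# initialise a Lerna workspace (creates lerna.json at the repo root)
npx lerna init
mkdir -p packages/main-site packages/dashboard
# each package keeps its own package.json and webpack.config.js,
# and each Netlify site builds from its own package directory
Each site then deploys independently even though the code lives in one repository.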

How to update repository with built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run 'grunt build' locally, which built the project and created files in a 'dist' folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the 'dist' folder in the repository are not updated. If I define an artifact, I get the changed files in the download. However, I would prefer the files in the 'dist' folder in the repository to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo inside a pipeline is a good idea. Version control wouldn't be as clear, and some people have pipelines triggered automatically when their repo is pushed to, which would trigger a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, with exactly the version you require and with the desired /dist inside, so that you can easily run it in multiple places, scale it, manage it, etc.
If you changed to Docker you wouldn't actually have to do a thing in order to have the dist persistent, just push the image to the registry after the build is done.
But to actually answer your question:
There has been a feature request open for a very long time for the same problem you asked about: here. Currently there is no safe and professional way to do it, as GitLab members state. You can, however, push back changes as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your .gitlab-ci.yml file.
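For illustration, the script section could run something along these lines; the CI_PUSH_TOKEN variable and the bot identity are assumptions you would have to create yourself (e.g. a masked CI/CD variable holding an access token):
grunt build
git add dist/
# "[skip ci]" in the message stops this commit from triggering yet another pipeline
git -c user.name="ci-bot" -c user.email="ci-bot@example.com" commit -m "Rebuild dist [skip ci]"
git push "https://oauth2:${CI_PUSH_TOKEN}@gitlab.com/group/project.git" HEAD:my-branch
That keeps the committed dist/ folder current, at the cost of the risks described below.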
There are more hacky methods presented there, but be sure to acknowledge the risks that come with them (pipelines become more error prone, and if configured in the wrong way they might, for example, publish confidential information or trigger an infinite loop of pipelines).
I hope you found this useful.

Syncing website files between local and live servers using GIT?

Say I have two web servers, one local development and one live.
Under SVN I would checkout the website files to my local webserver's public_html directory and also to the live webserver's public_html directory. I would then work on the files directly on the local server and commit any changes to the central repository.
When I'm ready for those changes to go live on the live server, I would SSH in and perform an SVN update.
Essentially I have two working copies, one on live and one locally, though other users may also have working copies on their local machines. But there will only ever be one working copy on the live server. The reason for this is so that we can just perform an SVN update on the live server every time we want changes to be published.
How can a similar workflow be accomplished using Git?
To model your current workflow almost exactly, do:
Set up a git repo.
Clone the repo on the server and locally.
Work locally.
git push to the git repo.
ssh to the server.
git pull.
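Concretely, that might look something like this (host names and paths are placeholders):
# one-time setup, both locally and on the live server
git clone git@git.example.com:site.git
# day-to-day, from the local working copy
git commit -am "Update site"
git push origin master
# publish by updating the live working copy
ssh deploy@live.example.com "cd /var/www/site && git pull"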
Another way to do it would be to set up a "production" branch in git, have a cron job that continually pulls this branch on the server, and then just merge and push to the "production" branch any time you want to publish your changes. Sounds like you need a more concrete branching strategy.
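As a sketch of that variant, the server-side cron entry could be as simple as the line below (path and schedule are placeholders):
# pull the production branch into the live working copy every five minutes
*/5 * * * * cd /var/www/site && git pull --ff-only origin production
Publishing then comes down to merging into production and pushing; the server picks it up on the next cron run.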
See: Git flow branching model && git flow cli tool
Good luck! This is a very solvable problem with git.
You might find this useful: http://joemaller.com/990/a-web-focused-git-workflow/
In your local working copy:
git push ssh://you@yourserver/path/to/your/wc
will push the committed changes in your local version to yourserver.
Having a setup that automatically triggers a pull, like leonbloy and codemac suggested, may seem like a good idea at first, but it tends to be very fragile. I suggest an alternative.
http://toroid.org/ams/git-website-howto
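Roughly, the approach that howto describes is a bare repository on the server plus a post-receive hook that checks each push out into the web root; a minimal sketch (all paths are placeholders) looks like this:
# on the server
git init --bare /home/deploy/site.git
cat > /home/deploy/site.git/hooks/post-receive <<'EOF'
#!/bin/sh
GIT_WORK_TREE=/var/www/site git checkout -f master
EOF
chmod +x /home/deploy/site.git/hooks/post-receive
# locally
git remote add live ssh://deploy@yourserver/home/deploy/site.git
git push live master
The live site is updated only when you push to that remote, which avoids the fragility of pull-based automation.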
