Git repos over 2 machines, Eclipse won't detect new projects - Linux

I have an Ubuntu and a Windows machine. I'm trying to push/pull with both at any given time. The problem is that when I create a project, commit & push, and then pull on the other machine, Eclipse won't detect the project automatically, and it doesn't show up properly even if I import it manually.
The Git repository is the workspace itself. I've tried fiddling with the .gitignore file (which is global to the workspace), for example allowing .path and .project, but all it did was mess up the build paths of OcaIDE (the OCaml IDE plugin).
I'm wondering if there's a way of achieving this, or do I have to resort to using EGit?
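For reference, that whitelisting approach in a workspace-level .gitignore might look something like this (a sketch only; the exact metadata files vary by plugin):

# ignore hidden files by default (Eclipse keeps machine-specific state in them)
.*
# but re-allow the files Eclipse needs in order to recognize a project
!.gitignore
!.project
!.path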

Related

Newbie Git Help - Making a second Repository

I currently work on solutions/projects within a single Git repository, in Visual Studio. The commits I make go to a local folder on the Visual Studio server, and then I use the command 'git push origin master' (after having changed directory to my local folder/repository) to push the commits to a GitLab instance in my company's corporate space. The purpose of this is less about branching and collaborative development (I am the only person who works on this), and more about having a way to roll back changes and keep a master copy off the server.
I now want a fresh copy of this GIT repository, so I can use that as a new baseline for an application migration. I will still continue to work on the existing repository too.
What is the best way to make a copy of the existing repository that I can treat as a totally separate thing, without accidentally messing up my existing config on the server? Should I do the clone from GitLab? Or clone locally and then push that up to the new space in my GitLab? Honestly, I'm a bit confused at this point about the proper model for this stuff.
Sounds like you'd like to fork the project: keep the existing repo and start a new, separate repo based on the old one.
Browse to your project in GitLab
On the main repo screen, click "Fork" in the top right
Select a new organisation, or the same one, as you'd like
The project is now forked!
From here, you can clone the fork to your local machine in a new folder. These are now separate projects, and code updates can be added, committed and pushed to the separate repos.
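Assuming the fork ended up under your own namespace, cloning it into a fresh folder might look like this (the URL and folder name are illustrative):

git clone https://gitlab.example.com/your-user/project.git project-migration
cd project-migration
git remote -v    # 'origin' now points at the fork, not the original repo

Pushes from this clone go to the fork by default, so the existing repository and its config on the server stay untouched.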

Synchronise 2 repositories in git

I have my main project hosted on GitHub. Everything is good and works.
Now I'm trying to create a Solaris port. I made myself an OpenSolaris VM, installed Solaris Studio as the compiler/IDE, and built the project.
Everything works fine.
Now what I'm thinking is that since Solaris Studio is a completely different IDE/compiler from MSVC/Anjuta/Xcode, I should create a different repository (NOT A FORK) and push the Solaris stuff there.
The only problem is - code synchronization.
If I make a change in my main repository and push it to the remote, I want my second repository to be updated as well with the changes to the *.cpp/*.h files.
Does some kind of hook exist to do that?
Or maybe I'm better off creating a fork? But then changes to the build system will be overwritten.
Please advise.
This is the current structure for the main project:
Project folder
├── main app folder (*.cpp, *.h, *.sln, Makefile.am/Makefile.in, xcodeproj folder)
└── library 1 folder (*.cpp, *.h, *.sln, Makefile.am/Makefile.in, xcodeproj folder)
"Or maybe I'm better off creating a fork? But then changes to the build system will be overwritten."
I wouldn't even bother with a fork.
I would just make sure the build system is isolated in its own folder, which allows you to have two build-configuration folders in one repository:
one for a default environment
one dedicated to a Solaris environment
That way, you can go on contributing to the code, from either one of those environments, without having to deal with any synchronization issue.
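Concretely, the layout might look something like this (folder names are illustrative):

Project folder
├── src (*.cpp, *.h shared by every platform)
├── build-default (*.sln, Makefile.am/Makefile.in, xcodeproj folder)
└── build-solaris (Solaris Studio project files)

Since the Solaris build files live in their own folder, pulling code changes never overwrites them, and the shared sources stay in sync with no extra machinery.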

Dual-booting Linux & Windows, best way to respect git workflow?

I have a PC with both a Linux and a Windows installation. I use a cloud service, so I have access to my important files in both places, and they are synced.
Say I created a Git repository and pushed it to GitHub on Windows. Now I suddenly feel the need to switch to my Linux installation to do some stuff. My cloud service does not sync the .git folder, since it's hidden by default on Windows. (Would it lead to problems between OSes if I synced it?) Therefore, even though I have the same project (with exactly the same files) as on Windows, Linux does not automatically recognize the VCS settings of the current project.
I found a somewhat dirty workaround; on Linux I:
Initialize an empty repo: git init
Add the remote: git remote add Project_name https://github.com/Psychotechnopath/Project_name.git
Fetch the contents of the remote: git fetch --all
Reset HEAD onto the remote master branch: git reset --hard Project_name/master
Is this the best way to do it (e.g. respecting the git workflow), or are there more elegant ways?
I have mostly successfully done this on the same filesystem; in my case I mounted the NTFS filesystem from Linux. A few things you have to be careful of:
don't create filenames that are the same on a case-insensitive file system but different on a case-sensitive one
pay special attention to your line endings; you may need to do some work in .gitattributes here (see the sketch after this list)
push often in case you run into a Git bug
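A minimal .gitattributes sketch for the line-ending point (the patterns are illustrative; adjust them to your project):

# let Git normalize line endings to LF inside the repository
* text=auto
# shell scripts must stay LF even when checked out on Windows
*.sh text eol=lf
# Windows batch files need CRLF
*.bat text eol=crlf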
If you don't mind having the world's slowest Linux system, you can also just run Linux under Windows via the Windows Subsystem for Linux, WSL. (Not WSL 2, which runs a lightweight VM of its own.) In that case, you can access your Windows repo from Linux via the /mnt/c/ filesystem.
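For example, assuming the repository lives under your Windows user profile (the path is illustrative):

cd /mnt/c/Users/you/projects/Project_name
git status    # same working tree and same .git folder, no syncing involved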

How to update repository with built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run '$ grunt build' locally, which built the project and created files in a 'dist' folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the 'dist' folder in the repository are not updated. If I define an artifact, I get the changed files in the download. However, I would prefer the files in the 'dist' folder in the repository to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo from inside a pipeline is a good idea. The version history wouldn't be as clear, and some people have pipelines that trigger automatically when the repo is pushed, so that would trigger a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, with exactly the version you require and with the desired /dist inside. That way you can easily run it in multiple places, scale it, manage it, etc.
If you switched to Docker you wouldn't actually have to do a thing to keep dist persistent; just push the image to the registry after the build is done.
But to actually answer your question:
There is a feature request that has been hanging for a very long time for the same problem you asked about: here. Currently there is no safe and professional way to do it, as GitLab members state. However, you can push changes back, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your .gitlab-ci.yml file.
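A minimal sketch of what the relevant job could look like (authentication is omitted; in practice you would embed a project access token in the push URL, and '[skip ci]' in the commit message helps avoid the pipeline loop mentioned above):

build:
  script:
    - grunt build
    - git config user.email "ci@example.com"   # runners have no Git identity by default
    - git config user.name "CI"
    - git add dist
    - git commit -m "Update dist [skip ci]"    # [skip ci] stops this push from re-triggering the pipeline
    - git push https://gitlab.com/group/project.git HEAD:my-branch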
There are more hacky methods presented there, but be sure to understand the risks that come with them (pipelines become more error-prone, and if configured the wrong way they might, for example, publish confidential information or trigger an infinite loop of pipelines).
I hope you found this useful.

A convenient way to update a project on a server

There is a project (Node.js, although that's not important) which is developed on the local machine and is periodically transferred to the server.
In principle, I could simply erase the project folder on the server each time and replace it with a new one - uploaded from the local machine.
The thing is, some folders (specifically node_modules) do not need to be rewritten. So I have to manually create an archive, excluding the unnecessary folders from it, and on the server erase everything except those folders before replacing the rest.
How can I automate this procedure?
(On the local machine: Windows; on the server: Linux.)
You can pull the changes directly from your repo.
I do it like this:
I have different branches for different environments like dev, stage and production.
I commit the changes to the branch and pull that on the server.
This way, you don't need to commit unnecessary stuff (like node_modules, credentials, etc.) to your repo.
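The manual version of that workflow might look like this (the branch names and server path are illustrative):

# on the local Windows machine: push the environment branch
git checkout production
git merge dev
git push origin production

# on the Linux server: pull only that branch
cd /var/www/myapp
git pull origin production
npm install --production    # rebuild node_modules on the server instead of uploading it

Because node_modules is ignored by Git, it never travels over the wire; the server rebuilds it from package.json.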
You can also easily automate this using CI tools.
