Synchronise 2 repositories in git - linux

I have my main project hosted on GitHub. Everything is good and works.
Now I'm trying to create a Solaris port. I made myself an OpenSolaris VM, installed Solaris Studio as the compiler/IDE, and built the project.
Everything works fine.
Now what I'm thinking is that since Solaris Studio is a completely different IDE/compiler from MSVC/Anjuta/Xcode, I should create a different repository (NOT A FORK) and push the Solaris stuff there.
The only problem is - code synchronization.
If I make a change in my main repository and push it to the remote, I want my second repository to be updated as well with the changes to the *.cpp/*.h files.
Does some kind of hook exist to do that?
Or maybe I'm better off with creating a fork? But then changes to the build system will be overwritten.
Please advise.
This is the current structure for the main project:
Project folder
|-- main app folder (*.cpp, *.h, *.sln, Makefile.am/Makefile.in, xcodeproj folder)
`-- library 1 folder (*.cpp, *.h, *.sln, Makefile.am/Makefile.in, xcodeproj folder)

> Or maybe I'm better off with creating a fork? But then changes to the build system will be overwritten.
I wouldn't even bother with a fork.
I would just make sure the build system is isolated in its own folder, which allows you to keep two build configuration folders in one repository:
one for a default environment
one dedicated to a Solaris environment
That way, you can go on contributing to the code, from either one of those environments, without having to deal with any synchronization issue.
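For illustration, one possible layout along those lines (the folder names here are my own suggestion, not prescribed):
Project folder
|-- src/            shared *.cpp and *.h files, used by every platform
|-- build-default/  *.sln, Makefile.am/Makefile.in, xcodeproj folder
`-- build-solaris/  Solaris Studio project files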

Related

Share and manage rc files (or config files) between multiple projects

I'm working on different projects based on Node, and one thing I always have to do is create the configuration files in all the projects, since they all share a lot of configuration. For example, in all projects I use commitlint, lint-staged, husky, eslint, nodemon, and TypeScript, among other settings.
How could I share all these settings across projects so that, if I update any of them, they get updated in all projects?
The first thing that occurs to me is to create an npm package with all the configurations and several scripts that copy/update these configuration files into the root directory of the project where the user is, something like
> myscript update
> myscript init
Another option would be to use the configurations programmatically, that is, instead of using an .rc file, use a .js file. But this would force me to manage the dependencies in each project and create an .rc file that uses the configuration from the .js file in the configuration package.
Another option is to create a GitHub repository as a template, but if I update this repository, the projects I have created using this template are not updated, right?
What do you think is the best way to do it?
Even though git submodules seem to be discouraged lately, I think it's the most reasonable choice (assuming all of your projects are git-based): https://git-scm.com/book/en/v2/Git-Tools-Submodules
In your case, you'd have a 'common' repository with the configuration, e.g. called configuration, and two projects: projectA and projectB. Both of them would have the submodule added:
git submodule add <your_git_repo_server>/configuration
Please note, however, that a submodule refers to a particular commit (not a branch or tag: a commit). You always have to take great care to synchronize your dependencies correctly.
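For example, to pull the latest commit of the shared configuration into one of the projects and record the new submodule pointer (repository names taken from the example above):
cd projectA
git submodule update --remote configuration   # fetch the submodule's latest upstream commit
git add configuration                         # stage the new submodule pointer
git commit -m "Update shared configuration"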

Monitor changes done by small program

I need to install small programs I do not fully trust.
Therefore I would like to monitor all files for changes, to see whether such a script places files it is not supposed to or edits others.
As I want to monitor all folders and files, I thought about using something similar to rsync, but is there an alternative that only watches for changes?
Does this approach guarantee that I catch everything the software changes? Or are there some kind of "registry entries" / configuration changes that I could miss?
Thanks a lot!
I would suggest you use some kind of sandbox (probably the most straightforward way nowadays is to use Docker).
You could use Git to track all the changes that are made inside the sandbox/container:
Initialize a git repo in the root dir
Add all files and commit as the base version
Execute the install script you do not trust
Using git status is going to show you all the changes that were made during installation.
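A minimal sketch of that workflow, assuming the untrusted installer is a script called install.sh (a hypothetical name) and the sandbox root is the directory you want to watch:
cd /                       # or whichever directory tree you want to watch
git init
git config user.email "audit@example.com"   # an identity is needed for the commit
git config user.name "audit"
git add -A
git commit -m "baseline before installation"
./install.sh               # run the untrusted installer
git status                 # lists new, modified and deleted files
git diff                   # shows the exact edits made to tracked files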

How to update repository with built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run '$ grunt build' locally, which builds the project and creates files in a 'dist' folder in the root of the project
commit the changes
changes are pulled onto the production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the 'dist' folder in the repository are not updated. If I define an artifact, I get the changed files in the download. However, I would prefer the files in the 'dist' folder to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo from inside a pipeline is a good idea. Version control wouldn't be as clear, and some people have pipelines triggered automatically when their repo is pushed, which would set off a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, with exactly the version you require and with the desired /dist inside. That way you can easily run it in multiple places, scale it, manage it, etc.
If you changed to Docker you wouldn't actually have to do a thing to keep dist persistent; just push the image to the registry after the build is done.
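A sketch of that idea (the registry address and image name are assumptions):
docker build -t registry.example.com/group/project:1.0 .   # the image carries the built /dist
docker push registry.example.com/group/project:1.0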
But to actually answer your question:
There is a feature request that has been open for a very long time for the same problem you asked about: here. Currently there is no safe and professional way to do it, as GitLab members state. You can, however, push changes back, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your gitlab-ci file.
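For illustration, a minimal .gitlab-ci.yml job along those lines; the job name, the branch, and the ACCESS_TOKEN variable (a write-enabled token stored in the project's CI/CD settings) are assumptions, the image must provide Node/Grunt, and '[skip ci]' in the commit message keeps the pushed commit from triggering yet another pipeline:
build:
  script:
    - grunt build
    - git config user.email "ci@example.com"    # any identity, so the CI runner can commit
    - git config user.name "GitLab CI"
    - git add dist
    - git commit -m "Update dist [skip ci]" || echo "dist unchanged"
    - git push "https://oauth2:${ACCESS_TOKEN}@gitlab.com/group/project.git" HEAD:my-branch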
There are more hacky methods presented there, but be sure to acknowledge the risks that come with them (pipelines become more error-prone and, if configured wrongly, they might for example publish confidential information or trigger an infinite loop of pipelines).
I hope you found this useful.

Git repos over 2 machines, Eclipse won't detect new projects

I have an Ubuntu and a Windows machine. I'm trying to push/pull with both at any given time. The problem is that when I create a project, commit & push, and then pull on the other machine, Eclipse won't detect the project automatically, nor even when I import it manually.
The git repository is the workspace itself. I've tried fiddling with the .gitignore file, which is global to the workspace, such as allowing .path and .project, but all it did was mess up the build paths of OcaIDE (OCaml).
I'm wondering if there's a way of achieving this, or do I have to resort to using EGit?
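For reference, a sketch of the whitelisting approach the question describes, as a workspace-level .gitignore; the broad ignore rule is an assumption, while .path and .project are the file names given in the question:
.metadata/      # Eclipse's per-workspace state; sharing it between machines causes clashes
.*              # hypothetical broad rule that was hiding the metadata files
!.project       # re-include the project descriptors so Eclipse can detect the projects
!.path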

Using hg repository as web site

This is somewhat related to my security question here. Is it a bad idea to use an hg / mercurial repository for a live website? If so, why?
Furthermore, we have dev, test and production installations of our website, like dev.example.com, test.example.com and www.example.com. If it's a bad idea to use a repository for a live/production website, would it be OK to use an hg repository for the dev and test sites?
I'm also concerned about ease of deployment. We have technical and less technical co-workers who will be working with the site. The technical people (software engineers) won't have any problem working with the command line or TortoiseHG. I'm more concerned about the less technical people (web designers). They won't be comfortable working on the command line, and may even find TortoiseHG daunting. These co-workers mostly upload .css files and images to the server. I'd like for these files (at least the .css files) to be under version control, but I want this to be as transparent as possible for the non technical team members.
What's the best way to achieve this?
Edit:
Our 'site' is actually a multi-site CMS setup with a main repository and several subrepositories. Mock-up of the repository structure:
/root [main repository containing core files and subrepositories]
/modules [modules subrepository]
/sites/global [subrepository for global .css and .php files]
/sites/site1 [site1 subrepository]
...
/sites/siteN [siteN subrepository]
Software engineers would work in the root, modules and sites/global repositories. Less technical people (web designers) would work only in the site1 ... siteN subrepositories.
Yes, it is a bad idea.
Do not have your repository as your website. It means that things that are checked in but not working will immediately be available. And it means that accidental check-ins (it happens) will be reflected live as well (e.g. documents that don't belong there, etc.).
I have actually addressed this concept (source control as deployment) with a tool I've written; a few other companies are addressing the topic now as well, so you'll see it more. Mine is for SVN (at the moment) so it's not particularly relevant; I mention it only to show that I've considered this before. That was with a working copy rather than a repository, but in that scenario the answer is the same: better to have a non-versioned 'free' area as the website directory, and automate (via user action) the copying of the versioned data into that directory.
Many folks keep their sites in repositories, and so long as you don't have people live-editing the live-site you're fine. Have a staging/dev area where your non-revision control folks make their changes and then have someone more RCS-friendly do the commit-pull-merge-push cycle periodically.
So long as it's the conscious action of a judging human doing the staging-area -> production-repo push you're fine. You can even put a hook into the production clone that automatically does a 'hg update' of the working directory within that production clone, so that 'push' is all it takes to deploy.
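A common form of that hook, placed in the production clone's .hg/hgrc:
[hooks]
changegroup = hg update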
That said, I think you're underestimating either your web team or TortoiseHg; they can get this.
Me personally (I'm a team of 1), I quite like the idea of using source control as a live website, more so with hg than with svn.
The way I see it, you can load an entire site (add/remove files) with a single command,
much easier than FTP/SSH this, delete that, etc.
If you are using Apache (and probably IIS as well) you can make a simple .htaccess file that will block all .hg files (or .svn if you are using svn).
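For example, a rule along these lines keeps the repository internals from ever being served:
# answer 404 for any path under .hg (or .svn/.git)
RedirectMatch 404 /\.(hg|svn|git)(/|$)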
My preferred structure is:
development site on the local machine, running directly out of a repository (no security is really required here; do what you like, commit as required)
staging/test machine: a separate box or VM running a recent copy of the live database
(I have a script to push committed changes to the staging server and run tests)
live machine
(open an SSH connection, push changes to the live server, test again; it can all be scripted reasonably easily, Google for examples)
Because of the push/pull nature of hg, you can commit changes and test without the danger of pushing a broken build to the live website. Like you say in your comments, only specific people should have permission to push a version to the live site (if it fails, you should easily be able to revert to the previous version via source control).
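A sketch of that flow (the host and path are assumptions, and <good-rev> is a placeholder for a known-good revision):
hg push ssh://deploy@live.example.com//var/www/site            # publish reviewed changesets
ssh deploy@live.example.com 'cd /var/www/site && hg update'    # refresh the live working copy
ssh deploy@live.example.com 'cd /var/www/site && hg update --rev <good-rev>'   # roll back if needed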
Why not have a repo also be an active web server (for a dev or test/QA environment, anyway)?
Here's what I am trying to implement:
Developers have local test environments in which they can build and test their code
Developers make a clone of the dev environment on their local dev machine
Developers commit as often as they want to their local repo
When a chunk of work is done and tested, the developer pushes the working changesets to the dev repo
Changes would be merged and tested on Dev, then pushed to Test/QA, and so on.
BTW, we're using Mercurial. I believe this model would only work using a distributed source code management tool.
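In Mercurial terms, the day-to-day loop would look something like this (the dev repo URL is an assumption):
hg clone https://hg.example.com/project-dev project    # one-time: clone the dev repo locally
cd project
hg commit -m "work in progress"                        # commit locally as often as you like
hg pull -u                                             # pull teammates' changesets and update
hg push                                                # publish tested changesets to the dev repo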
