Gitolite cloning from the slave end - gitolite

In my project I have two Gitolite instances, installed on two servers, say at location A and location B.
The repository at location A is the master and the repo at B is the slave.
My requirement is that my project team members at location B should be able to clone directly from the slave Gitolite instance (at location B).
Is this possible? If so, how?
(I'm concerned here with a single repository, say TEST, present at both locations.)

You need to make sure that:
the access rules for that Git repo are duplicated between the gitolite-admin/conf/gitolite.conf files of the two Gitolite servers,
the users have their public SSH keys registered in the gitolite-admin/keydir folders of both servers.
Those users should then be able to clone that single repo from B.
You might want to consider hooks or mirroring in order to synchronize that repo between the two servers, though.
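As a rough sketch, the duplicated pieces could look like this on both servers (alice and bob are hypothetical user names; TEST is the repo from the question):

    # gitolite-admin/conf/gitolite.conf -- kept identical on server A and server B
    repo TEST
        RW+     =   alice bob       # hypothetical users; list everyone who needs access

    # gitolite-admin/keydir/ on both servers must then contain the same key files:
    #   keydir/alice.pub
    #   keydir/bob.pub

Once both gitolite-admin repos carry this, a user at location B can run something like git clone git@serverB:TEST (host name assumed) against the slave directly.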

Related

Synchronizing two git clones without push access

There's a server which hosts a git repo. I have two computers which I code from and have created a git clone on both. I frequently pull to get updates and new files. However, I cannot push to this repo (I can only pull).
I would like the files to be in sync across the two devices so I can pick up where I left off on the other. How can I accomplish this (without creating another repo)?
You cannot 'synchronize' two repositories if you do not have push rights.
If one of your computers can access the other over ssh then you can push or pull directly between them.
If you don't have direct network access then you could in principle use Git's features for sending patches over email, but this will likely be more inconvenient than just setting up a third repository.
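For the direct-SSH case, a rough sketch of what that could look like (the host name laptop2 and the repository path are assumptions):

    # On computer 1: register the other machine as a remote
    git remote add laptop2 ssh://user@laptop2/home/user/project

    # Pull its work into your current branch
    git fetch laptop2
    git merge laptop2/master

    # Or push your work there; pushing to the branch that is checked out on the
    # other machine is refused by default, so push to a side branch and merge it
    # there afterwards
    git push laptop2 master:from-laptop1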

Jenkins - Artifact handling

I have a Jenkins setup consisting of one master and two slaves. I have Jenkins jobs (which run only on the slaves) that create binaries on every commit. Currently, Jenkins archives these artifacts into some place within the Jenkins master. When I wish to download the binaries using a bash shell script, I use wget url_link_to_particular_artifact. I wish to change this. I want to copy all the generated artifacts into one common location on the master node, so the URL would remain the same and only the last part would change with respect to the generated binary name. I label my binaries with tags so it is easy to retrieve them later on. Now, is there a plugin which will copy artifacts onto the master node, but to a location that I can provide? The master and slave nodes are all Red Hat Linux machines.
I have already gone through the Artifactory Plugin and I do not wish to use it. I want something really simple to implement. Is there really a need for a web server to be running at the location on the master where I wish to copy the artifacts to? Can I transfer the artifacts from slave to master over SSH? If yes, how?
EDIT:
I have made some progress and I am sort of stuck now: assuming there is a web server running on the Jenkins master node, is it possible for the slave nodes to send the artifacts to it, and for the web server to write them into the file system at that location on the master?
This, of course, is possible, but let me explain to you why this is a bad idea.
Jenkins is not your artifact repository. You can indeed store your artifacts in Jenkins, but it was not designed to do so. If you do that for most of your jobs, you will run into problems with disk space, etc., or even race conditions with names.
Not to mention that you don't want to have hundreds or thousands of files in one directory.
A better approach would be to use an artifact repository, such as Nexus, to store your artifacts. You can manage and retrieve them easily through different channels.
Keep in mind that it would be nice to keep your Jenkins stateless and version-control your configuration for easy restoration.
If you still want to store your artifacts in one web location, I'd suggest setting up an nginx server, proxying /jenkins calls to Jenkins and /artifacts to your artifacts directory.
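A minimal nginx sketch of that last idea; the server name, port, artifact path, and the assumption that Jenkins runs under a /jenkins prefix are all illustrative rather than taken from the question:

    # /etc/nginx/conf.d/jenkins-artifacts.conf (illustrative values only)
    server {
        listen 80;
        server_name build.example.com;        # assumed host name

        # Forward /jenkins to the Jenkins master, assumed to listen on port 8080
        # with the /jenkins context path
        location /jenkins/ {
            proxy_pass http://127.0.0.1:8080/jenkins/;
            proxy_set_header Host $host;
        }

        # Serve the artifact directory as plain static files
        location /artifacts/ {
            alias /var/artifacts/;            # assumed artifact location
            autoindex on;
        }
    }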

How to maintain two Gitolite repositories?

As a newbie to Gitolite, I would like to know how, as a gitolite-admin, to handle or maintain a setup of two Gitolite instances on two servers at different locations.
It's like this:
On VM 'A' at location X we have one Gitolite instance, and
on VM 'B' at location Y we have another Gitolite instance.
The repository is the same on both instances, say "projectrepo.git", but a different set of users will be committing actively at each of the locations. An additional requirement is that both repos should stay in sync.
Please advise.

Restricting remote communication in Git

I have Git installed on our company's Linux server. All the developers work on the same server. Recently we had to move to Git, which is hosted on some other server. Before we create a Git repository, we create SSH keys, then start ssh-agent, and finally add the private key using ssh-add.
My problem is that I created a Git repository on the Linux machine, set up my keys and everything, and did a push to the remote Git server. But if some other developer also has his key added, he can also perform a git push on my local repository.
Is there any way I can restrict pushes by other developers on the same Linux machine?
If you want to prevent others from pushing to your personal development machine, set up a firewall. If you want to prevent people from pushing to the remote server, remove their keys, or add per-IP firewall rules (so that they can still use SSH). At least that's what I'd do, since it looks like Git itself doesn't offer any access control facilities and leaves this to the OS/networking layer.
In any case, my opinion is that rather than setting up security facilities, you should trust your coworkers not to screw things up. After all, it's not some public repository; it's a company, where screw-ups (intentional or not) should be dealt with accordingly.
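For the per-IP firewall idea mentioned above, a minimal iptables sketch; the trusted address is made up and the rules would need adapting to the actual network:

    # Allow SSH (and therefore git-over-ssh) only from one trusted address,
    # drop it for everyone else; 192.168.1.10 is an assumed example address
    iptables -A INPUT -p tcp --dport 22 -s 192.168.1.10 -j ACCEPT
    iptables -A INPUT -p tcp --dport 22 -j DROP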

Using Git with multiple servers

I have a set of servers with code bases on them; let's call them p1, p2, p3. I have a development server, d1, which I use to store the code. Each p server is different, with a different code base.
I'm trying to figure out how to manage the Git repos correctly so that each of the "p" servers keeps the "d1" server up to date.
Here's what I did (sketched below):
Created a Git repo on p1 and made an initial commit.
Created a --bare clone of the repo and scp'd it to the d1 server.
Repeated this for all servers.
My d1 server now has a /git/ folder with subfolders p1, p2, p3. Each of these has the normal contents of a bare repo: HEAD, branches, config, description, hooks, info, objects, refs.
I can clone these repos to another machine or folder and I get to see the actual files, which is exactly what I wanted.
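Roughly, the steps above amount to something like this (the local path and the /git target directory on d1 are assumptions):

    # On p1: turn the existing code base into a repo
    cd /srv/codebase && git init && git add -A && git commit -m "initial commit"

    # Make a bare copy and ship it to d1
    git clone --bare /srv/codebase /tmp/p1.git
    scp -r /tmp/p1.git user@d1:/git/p1.git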
OK, so here is my problem.
How do I keep the p1 repo up to date when someone clones the d1 copy and commits to it?
Do I need to run git fetch on p1, or should I have people change p1 and then git push to d1?
You can implement mirroring with Gitolite to keep a central server with all the latest code from the others.
From http://gitolite.com/gitolite/mirroring.html:
Mirroring is simple: you have one "master" server and one or more "slave" servers. The slaves get updates only from the master; to the rest of the world they are at best read-only.
In the following pictures, each box (A, B, C, ...) is a repo. The master server for a repo is colored red, slaves are green. The user pushes to the repo on the master server (red), and the master server -- once the user's push succeeds -- then does a git push --mirror to the slaves. The arrows show this mirror push.
The first picture shows what gitolite mirroring used to be like a long time ago (before v2.1, actually). There is exactly one master server; all the rest are slaves. Each slave mirrors all the repos that the master carries, no more and no less.
This is simple to understand and manage, and might actually be fine for many small sites. The mirrors are more "hot standby" than anything else.
But when you have 4000+ developers on 500 repos using 25 servers in 9 cities, that single server tends to become a wee bit stressed. Especially when you realise that many projects have highly localised development teams. For example, if most developers for a project are in city X, with perhaps a few in city Y, then having the master server in city Z is... suboptimal :-)
And so, for about 3 years now, gitolite could do this:
You can easily see the differences in this scenario, but here's a more complete description of what gitolite can do:
Different masters and sets of slaves for different repos.
This lets you do things like:
Use the server closest to most of its developers as the master for that repo.
Mirror a repo to only some of the servers.
Have repos that are purely local to a server (not mirrored at all).
Push to a slave on demand or via cron (helps deal with bandwidth or connectivity constraints).
All this is possible whether or not the gitolite-admin repo is mirrored -- that is, all servers have the exact same gitolite-admin repo or not.
Pushes to a slave can be transparently forwarded to the real master. Your developers need not worry about where a repo's master is -- they just write to their local mirror for all repos, even if their local mirror is only a slave for some.
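Concretely, per-repo mirroring is declared in gitolite.conf; here is a hedged sketch that maps the question's servers onto it (the host names p1, p2 and d1 are assumed to match the HOSTNAME each server sets in its ~/.gitolite.rc, and @devs is a hypothetical user group):

    # conf/gitolite.conf, kept identical on every participating server
    repo p1
        RW+                     =   @devs
        option mirror.master    =   p1      # writes happen on server p1
        option mirror.slaves    =   d1      # d1 receives the mirror push

    repo p2
        RW+                     =   @devs
        option mirror.master    =   p2
        option mirror.slaves    =   d1

With something like this, a push accepted on p1 is mirrored to d1 as soon as it succeeds, which keeps the central copy up to date without anyone running git fetch by hand.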
