Restricting remote communication in Git - Linux

We have Git installed on our company's Linux server, and all the developers work on that same server. Recently we had to move to Git, which is hosted on some other server. Before creating a Git repository we create SSH keys, start ssh-agent, and finally add the private key using ssh-add.
My problem is this: I created a Git repository on the Linux machine, set up my keys and everything, and also pushed to the remote Git server. But if some other developer also has his key added, he can also perform a git push on my local repository.
Is there any way I can restrict pushes by other developers on the same Linux machine?

If you want to prevent others from pushing to your personal development machine, set up a firewall. If you want to prevent people from pushing to the remote server, remove their keys, or add per-IP firewall rules (so that they can still use SSH). At least that's what I'd do, since it looks like Git itself doesn't offer any access control facilities and leaves that to the OS/networking layer.
In any case, my opinion is that rather than setting up security facilities, you should trust your coworkers not to screw things up. After all, it's not some public repository - it's a company, where screw-ups (intentional or not) should be dealt with accordingly.
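For the per-IP firewall approach mentioned above, the rules might look something like this (the address and the choice of iptables are assumptions for illustration; this needs root):

```shell
# Allow SSH (and therefore git-over-ssh) only from one trusted
# workstation; 10.0.0.15 is an example address. All other SSH
# traffic to this box is dropped.
iptables -A INPUT -p tcp --dport 22 -s 10.0.0.15 -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j DROP
```

Anyone whose address is not whitelisted can no longer reach the SSH port at all, which also blocks git push.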


Restricting access to directories for git server?

I'm using my Raspberry Pi as a Git server. Until now I've been using it by myself. Now I would like to give access to a friend. The problem is that the git user can also see everything else on the server. So if he has the SSH keys to push to the server, he can also SSH into the server and do whatever he wants (well, not everything - he won't have the sudo password).
But still... What's the most secure way to deal with something like this?
I don't want any client to run any commands from this user (if possible) and I want to restrict access to a directory where I mount my external HDD.
You should set up software like gitolite, which handles the access permissions. (Check the software packages for your Raspberry Pi; you will probably find a gitolite package.)
Follow the setup instructions. Instead of you modifying .ssh/authorized_keys yourself, the software will do this for you. This makes sure the users cannot do anything other than access the Git repositories.
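Under the hood, that restriction comes from SSH forced commands: gitolite writes entries to authorized_keys roughly like the following (key, comment, and shell path shortened here for illustration - the point is that you never edit this file yourself):

```
command="gitolite-shell friend",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAAA... friend@laptop
```

Whatever command your friend tries to run over SSH, the forced command runs instead, so only Git operations on repositories he is authorized for ever get executed.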
Another alternative is GitLab.

Synchronizing two git clones without push access

There's a server which hosts a git repo. I have two computers which I code from and have created a git clone on both. I frequently pull to get updates and new files. However, I cannot push to this repo (I can only pull).
I would like the files to be in sync across the two devices so I can pick up where I left off on the other. How can I accomplish this (without creating another repo)?
You cannot 'synchronize' two repositories if you do not have push rights.
If one of your computers can access the other over ssh then you can push or pull directly between them.
If you don't have direct network access then you could in principle use Git's features for sending patches over email, but this will likely be more inconvenient than just setting up a third repository.
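The direct-pull option can be sketched like this. Two clones on local disk stand in for the two computers; over a network, the path in the pull would be an ssh:// URL instead (all names here are made up):

```shell
set -e
tmp=$(mktemp -d)

# A read-only upstream, plus one clone per computer
git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.name=S -c user.email=s@example.com \
    commit -q --allow-empty -m "initial"
git clone -q "$tmp/upstream" "$tmp/machine-a"
git clone -q "$tmp/upstream" "$tmp/machine-b"

# Do some work on machine A (an empty commit keeps the demo short)
git -C "$tmp/machine-a" -c user.name=A -c user.email=a@example.com \
    commit -q --allow-empty -m "work from A"
branch=$(git -C "$tmp/machine-a" symbolic-ref --short HEAD)

# Machine B pulls straight from machine A, never touching the upstream;
# over ssh this would be e.g. git pull ssh://user@machine-a/home/user/project
git -C "$tmp/machine-b" pull -q "$tmp/machine-a" "$branch" 2>/dev/null
git -C "$tmp/machine-b" log --oneline
```

Neither clone ever needs push rights on the upstream; they only need to be able to read each other.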

Git slow when cloning to Samba shares

We are deploying a new development platform.
We have a really complicated environment that we cannot reproduce on developers' computers, so people cannot clone the Git repository onto their own machine.
Instead, they clone the repository into a mapped network drive (Samba share) that is the DocumentRoot of a per-developer website on our servers.
Each developer has his own share+DocumentRoot/website, so they cannot impact each other this way.
Developers have Linux or Windows as Operating system.
We are using a 1 Gbit/s connection, and Git is really slow compared to local use.
Our repository size is ~900 MB.
git status on the Samba share takes about 3 minutes to complete, which is unusable.
We tried some SAMBA tuning, but still, it's really slow.
Has someone an idea?
Thank you for your time.
Emmanuel.
I believe git status works by simply looking for changes in your repository. It does this by examining all of the files and checking for ones that have changed. When you execute this against a Samba share, or any other network share, it has to do that inspection over a network connection.
I don't have any intimate knowledge of the git implementation but my imagination is that it essentially boils down to
Examine all files in directory
Repeat for every directory
So instead of a single pass over a local disk, it's effectively doing a network round trip for every single file in the repository, and with a 900 MB repository that's going to be slow even with a fast connection.
Have you considered having the following work flow instead?
Have every developer clone to their local machine
Do work on the local machine
Push changes to their share when they need to deploy / test / debug
This would avoid the use of git on the actual share and eliminate this problem.
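That workflow can be sketched as follows. A bare repository plays the role of the repo on the Samba share, and the clone lives on the developer's local disk (all paths are stand-ins):

```shell
set -e
work=$(mktemp -d)

# Stand-in for the repository on the mapped Samba drive
git init -q --bare "$work/share.git"

# The developer clones onto fast local disk and works there
git clone -q "$work/share.git" "$work/local" 2>/dev/null
cd "$work/local"
echo "<?php phpinfo();" > index.php
git add index.php
git -c user.name=Dev -c user.email=dev@example.com commit -q -m "first page"

# Only the push touches the (slow) share; git status stays local and fast
git push -q origin HEAD 2>/dev/null
```

Day-to-day operations like git status and git diff then only ever read local disk; the network is hit once per push instead of once per file.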

Using GIT to clone from a windows machine to a linux webserver (in house)

OK, I am looking for a way to use GIT to keep a web site up to date between my local machine (git repository) and my web site (git clone of repository).
I have initialized the repository (on a Windows 7 machine) and added all the files to the repo on my local machine. I now need to get the repo to the webserver (a Linux-based machine). I can access the webserver via PuTTY and ssh. How do I go about cloning the repo into the appropriate directory to serve the web site?
I have tried the following from my Linux-based machine: git clone git+ssh://myuser@10.1.0.135/d/webserver/htdocs/repo
I keep receiving: connect to host 10.1.0.135 port 22: connection timed out
Both machines are in house with the webserver being outside of the network on a different IP range (outside of firewall). I came from subversion and can easily svn commit/update to and from the webserver and my machine without issue.
Thanks for any guidance on this!
The best resource I've found for doing this is located here.
The problem I had was that issuing a git clone from the *nix environment using the above suggestions was unable to find the path to the repo properly.
I was able to fix this by starting the git daemon with the --base-path and --export-all params.
So, from the windows box:
git daemon --base-path=C:/source/ --export-all
Then from the *nix box (mac in my case):
git clone git://<local ip>/<project name>
My directory structure on the windows box is:
c:\source\<project name> - this is where the .git folder lives
Here is a walkthrough someone else did. It goes step by step showing how to do what you want.
The IP address 10.1.0.135 is reserved for private networks, which means that it only refers to your local Windows computer when used within your home network. If you're running the git clone command with that address on your server, 10.1.0.135 refers to a completely different computer, which explains why the connection isn't working.
Here's my suggestion: instead of trying to clone the repository on your home computer, first create an empty repository on the server
server$ git init /path/to/repository
and then push changes from your computer to the server's repository
home$ git remote add website ssh://myuser@server/path/to/repository
home$ git push website
You can call the remote something other than "website" if you want.
For slightly more advanced usage, I've written a blog post explaining how to set up staging and production servers and maintain them with git. If you don't want to deal with a staging server, though, I also link to a couple of tutorials about a simple two-repository setup to manage a website with git, which is basically what it sounds like you're looking for.
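One common way to make the server-side repository update the website on each push (roughly the kind of setup those tutorials describe) is a post-receive hook that checks the pushed branch out into the DocumentRoot. A runnable sketch with made-up paths:

```shell
set -e
top=$(mktemp -d)
site="$top/htdocs"          # stand-in for the website's DocumentRoot
mkdir -p "$site"

# Bare repository on the server that the developer pushes to
git init -q --bare "$top/site.git"

# post-receive hook: after each push, check master out into the web root
cat > "$top/site.git/hooks/post-receive" <<EOF
#!/bin/sh
GIT_WORK_TREE="$site" git checkout -f master
EOF
chmod +x "$top/site.git/hooks/post-receive"

# Developer side: clone, commit on master, push
git clone -q "$top/site.git" "$top/dev" 2>/dev/null
cd "$top/dev"
git symbolic-ref HEAD refs/heads/master   # pin the branch name for the demo
echo "hello" > index.html
git add index.html
git -c user.name=Dev -c user.email=dev@example.com commit -q -m "deploy"
git push -q origin master 2>/dev/null
```

After the push, the pushed files appear in the web root; the repository itself stays bare, so there is no checked-out copy to get out of sync.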
Sounds like your windows 7 machine (in particular, port 22) may not be accessible from outside of the firewall. With subversion, the webserver is likely accessible to both machines. Also, the IP for your Windows machine is a non-routable IP, which means your firewall is likely also NAT'ing your internal network.
You could approach this by opening port 22 in the firewall, or setting up port-forwarding in the firewall to point to your Windows machine. But you should probably create the git repo on the server, then clone from that to your Windows machine instead. You could use scp -r to get that initial repo on the server, though someone with more git experience may be able to tell you a better way.
It's a good idea to do this with Git if you need to check it into a version control system anyhow.
I just wanted to mention that you could also look at the rsync utility - e.g. googling "rsync Windows" brings up some nice results.
rsync is specifically made for keeping directory trees in sync across machines. And it does it smartly, not transferring files which are already on the other side, and you can use compression. It has tons of features and is typically used in UNIX production environments. There are also ways to run it on Windows.
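A typical invocation for this use case might look like the following (hostname and paths are made up for illustration):

```shell
# Mirror the local working copy into the web root on the server:
# -a preserves permissions/timestamps, -z compresses in transit,
# --delete removes server files that no longer exist locally.
rsync -az --delete ./htdocs/ myuser@webserver:/var/www/htdocs/
```

Unlike the Git approach, this gives you no history on the server side; it is purely a one-way mirror.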
In any case:
Check your firewall settings on both machines - the relevant ports need to be open. In your case, port 22 is probably blocked.

Is Mercurial Server a must for using Mercurial?

I am trying to pick version control software for our team, but I don't have much experience with it. After searching and googling, it seems Mercurial is worth a try. However, I am a little bit confused about some general information about it. Basically, our team has only 5 people, and we all connect to a server machine which will be used to store the repositories. The server is a Red Hat Linux system. We will probably use a centralized workflow a lot. Because I like the local-commit idea, I still prefer DVCS-style software. Now I am trying to install Mercurial. Here are my questions.
1) Does the server used for the repositories always need to have the "mercurial-server" software installed? Or does it depend on the workflow? In other words, is it true that if no centralized workflow is used, the server can get by with just the Mercurial client?
I am confused about the term "mercurial-server". Or does it just mean that the Mercurial installed on the server is always called the "Mercurial server", regardless of whether the workflow is centralized or not? In addition, because we all work on that server, does that mean only one copy of Mercurial needs to be installed there? We each have our own user directory, such as /home/Cassie, /home/John, ... and /home/Joe.
2) Is SSH a must? Or does it depend on the kind of connection between the users and the server? Since we all work on the server itself, SSH is not required, right?
Thank you very much,
There are two things that can be called a "mercurial server".
One is simply a social convention that "repository X on the shared drive is our common repository". You can safely push and pull to that mounted repository and use it as a common "trunk" for your development.
A second might be particular software that allows mercurial to connect remotely. There are many options for setting this up yourself, as well as options for other remote hosting.
Take a look at the first link for a list of the different connection options. But as a specific answer to #2: No, you don't need to use SSH, but it's often the simplest option if you're in an environment using it anyways.
The term that you probably want to use, rather than "mercurial server", is "remote repository". This term is used to describe the "other repository" (the one you're not executing the command from) for push/pull/clone/incoming/outgoing/others-that-i'm-forgetting commands. The remote repository can be either another repository on the same disk, or something over a network.
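For example, after a clone the remote repository's location is recorded in the clone's .hg/hgrc, and you can add more named remotes there for push/pull/incoming/outgoing to use (hostname and paths below are made up):

```ini
# .hg/hgrc inside a working copy
[paths]
default = ssh://cassie@devserver//home/hg/project
colleague = /home/John/project
```

Running hg pull colleague then pulls from the second path, while plain hg pull uses default.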
Typically you use one shared repository to share the code between the different developers. While you don't technically need it, it has the advantage that synchronization is easier when there is a single spot for the freshest code.
In the simplest case this can be a repository on a plain file share where file locking is possible (NFS or SMB) and where each developer has write access. In this scenario there is no need to have Mercurial installed on the server, but there are drawbacks:
Every developer must have a Mercurial version installed which can handle the repo format on the share (for example, when the repo on the share was created with Mercurial 1.9, a developer with 1.3 can't access it)
Every developer can issue destructive operations on the shared repo, including deletion of the whole repo.
You can't reliably run hooks on such a repo, since the hooks are executed on the developer machines, not on the server
I suggest using the http or ssh method instead. You need to have Mercurial installed on the server for this (I'm not taking the http-static method into account, since you can't push to an http-static path), and you get the following advantages:
the mercurial version on the server does not need to be the same as the clients, since mercurial uses a version-independent wire protocol
you can't perform destructive operations via these protocols (you can only append new revisions to a remote repo, but never remove any of them)
The decision between http and ssh depends on your local network environment. http has the advantage that it passes through many corporate firewalls, but you need to take care of secure authentication when you want to push over http back to the server (or when you don't want everybody to see the content). On the other hand, ssh has the drawback that you might need to lock down the server so that the clients can't run arbitrary programs there (it depends on how trustworthy your clients are).
I second Rudi's answer that you should use http or ssh access to the main repository (we use http at work).
I want to address your question about "mercurial-server".
The basic Mercurial software does offer three server modes:
Using hg serve; this serves a single repository, and I think it's mostly used for quick hacks (when the main server is down and you need to pull some changes from a colleague, for example).
Using hgwebdir.cgi; this is a CGI script that can be used with an HTTP server such as Apache; it can serve multiple repositories.
Using ssh (Secure Shell) access; I don't know much about it, but I believe it is more difficult to set up than the hgwebdir variant.
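For instance, the hg serve quick-hack variant from the list above amounts to this (hostname, port, and paths are examples, not a recommended production setup):

```shell
# On the machine with the repository:
cd /home/cassie/project
hg serve --port 8000        # serves this one repo over HTTP

# On a colleague's machine:
hg clone http://server:8000/ project
```

By default this serves read-only; pushing over it requires extra configuration, which is one reason it's treated as a stopgap rather than a permanent server.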
There is also a separate software package called "mercurial-server". This is provided by a different company; its homepage is http://www.lshift.net/mercurial-server.html. As far as I can tell, this is a management interface for option 3, the mercurial ssh server.
So, no, you don't need to have mercurial-server installed; the mercurial package already provides a server.
