Restricting access to directories for git server? - linux

I'm using my Raspberry Pi as a Git server. Until now I've been using it by myself; now I would like to give a friend access. The problem is that the git user can also see everything else on the server. So if he has the SSH keys to push to the server, he can also SSH into the server and do whatever he wants (well, not everything: he won't have the sudo password).
But still... What's the most secure way to deal with something like this?
I don't want any client to be able to run arbitrary commands as this user (if possible), and I want to restrict access to the directory where I mount my external HDD.

You should set up software like gitolite, which handles the access permissions. (Check the software packages for your Raspberry Pi; you will probably find a gitolite package.)
Follow the setup instructions. Instead of you modifying .ssh/authorized_keys yourself, the software will do this for you. This ensures the users cannot do anything other than access the Git repositories.
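Illustratively, tools like gitolite work by writing forced-command entries into authorized_keys, so a key can only ever reach the Git wrapper and never a login shell. A sketch of such a line (the command path, username, and key are placeholders):
command="/usr/share/gitolite3/gitolite-shell alice",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-ed25519 AAAA... alice@laptop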
Another alternative is GitLab.

Related

How to securely host file on RHEL server and enable download for user

I have programmed an application that users can use to process genome data. This application relies on a 10 GB database file that users have to download in order to run it. At the moment I have stored this file on Google Drive, but the download bandwidth is limited, so if a number of users download the file on a given day, it stops working for the others and they get errors running the application.
My solution would be to host the file on our research server, create a user that only has access rights to this folder and nothing else, and make the file downloadable from the server via scp within the application (which is open source) through that user.
My question now is: is this safe to do, or could people potentially hack into our server? If this method is a security risk, what would be a better way to provide this file?
Thank you in advance!
Aloha
You can set up something like the free Seafile (https://www.seafile.com/en/home/), or ask the admin to set it up for you. It is pretty secure, like a self-hosted Google Drive with 2FA authentication.
Another nice and easy tool is Filebrowser on GitHub (https://github.com/filebrowser/filebrowser).
I would not really advise giving people shell/scp access inside your network.
Hosting anything inside a company network is in general not the wisest idea; there is always a risk involved.
I would set up a Seafile/Filebrowser solution on a cheap rented server outside your network and upload the file there. Or, if you have a small PC left over, set it up in a DMZ, a zone with special access restrictions inside your company network.
You want to use SSH (scp) as a transport and authentication method for file hosting. It is possible to keep this safe with caution; for example, GitHub uses SSH as transport when providing git access over the git+ssh protocol.
Now for the caution part: if you haven't done it before, it's not a trivial task.
The proper way to achieve this would be to set up an isolated SSH server in a chroot environment, and to create an SSH user on this isolated SSH instance only (not a system user added by e.g. useradd). Then you can add only the files that are absolutely necessary to the chroot, and provide SSH access to users.
(Nowadays you might want to consider using Linux filesystem namespaces, if applicable, to replace chroot, but I'm not sure on this.)
As for other options, setting up a simple Nginx server for static file hosting might be a lot easier, provided you have some understanding of HTTP and TLS. There is plenty written about this on the Internet.
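As a rough illustration of that approach, a minimal Nginx server block for serving one large static file could look like this (the hostname, certificate paths, and file location are all assumptions):
server {
    listen 443 ssl;
    server_name data.example.org;                      # hypothetical hostname
    ssl_certificate     /etc/ssl/certs/fullchain.pem;  # assumed certificate paths
    ssl_certificate_key /etc/ssl/private/privkey.pem;

    location /downloads/ {
        root /srv;          # serves e.g. /srv/downloads/genome-db.bin (placeholder name)
        autoindex off;      # don't list directory contents
    }
}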
Either way, if you are going to expose your server to the Internet or an intranet, you need to make sure it is properly firewalled. Consider learning about nftables or firewalld or the like, if you haven't already.
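For example, a minimal nftables ruleset that drops everything except SSH and HTTPS might be sketched like this (the ports are assumptions; adapt to your services):
nft add table inet filter
nft add chain inet filter input '{ type filter hook input priority 0; policy drop; }'
nft add rule inet filter input ct state established,related accept
nft add rule inet filter input iif lo accept
nft add rule inet filter input tcp dport '{ 22, 443 }' accept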
SSH is reasonably safe. Always keep software up-to-date.
Set up an SFTP-only user with a chrooted directory. In /etc/ssh/sshd_config:
Match User MyUser
    ChrootDirectory /var/ssh/chroot
    ForceCommand internal-sftp
    AllowTcpForwarding no
    PermitTunnel no
    X11Forwarding no
This user will not get a shell (because of internal-sftp) and cannot see files outside of /var/ssh/chroot. Note that sshd requires the chroot directory and every component of its path to be owned by root and not writable by any other user, or it will refuse the login.
Use key-based authentication on the client side, in addition to the password.
A good description of the setup process for SSH keys:
https://www.digitalocean.com/community/tutorials/how-to-configure-ssh-key-based-authentication-on-a-linux-server
Your solution is moderately safe.
A better solution is to put it on a server accessible via sftp, behind a password, but also to encrypt the file itself: this way you introduce a second layer of protection.
On a Linux server you should be able to use a tool like gpg to encrypt your file.
Then you share the decryption key with your partners over a secure channel, e.g. end-to-end encrypted messaging software.
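A sketch of that with gpg, using symmetric encryption (the filename is a placeholder):
# encrypt with a passphrase; writes genome-db.bin.gpg
gpg --symmetric --cipher-algo AES256 genome-db.bin
# the partner decrypts after receiving the passphrase over the secure channel
gpg --output genome-db.bin --decrypt genome-db.bin.gpg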

Restricting remote communication in Git

I have Git installed on our company's Linux server. All the developers work on the same server. Recently we had to move to Git, which is hosted on another server. Before we create a Git repository we create SSH keys, then start ssh-agent, and finally add the private key using ssh-add.
My problem: I created a Git repository on the Linux machine, set up my keys and everything, and pushed to the remote Git server. But if some other developer also has his key added, he can also perform a git push on my local repository.
Is there any way I can restrict pushes by other developers on the same Linux machine?
If you want to prevent others from pushing to your personal development machine, set up a firewall. If you want to prevent people from pushing to the remote server, remove their keys, or add per-IP firewall rules (so that they can still use SSH). At least that's what I'd do, since it looks like Git itself doesn't offer any access-control facilities and leaves that to the OS/networking layer.
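As a sketch of the OS-layer approach on a shared machine, plain file permissions already stop other local users from writing to your repository (the path is hypothetical):
# remove group and other write access from the repository
chmod -R g-w,o-w ~/repos/myproject.git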
In any case, my opinion is that rather than setting up security facilities, you should trust your coworkers not to screw things up. After all, it's not some public repository - it's a company, where screw-ups (intentional or not) should be dealt with accordingly.

Using GIT to clone from a windows machine to a linux webserver (in house)

OK, I am looking for a way to use GIT to keep a web site up to date between my local machine (git repository) and my web site (git clone of repository).
I have initialized the repository (on a Windows 7 machine) and added all the files to the repo on my local machine. I now need to get the repo to the webserver (a Linux-based machine). I can access the webserver via PuTTY and ssh. How do I go about cloning the repo into the appropriate directory to serve the web site?
I have tried the following from my Linux-based machine: git clone git+ssh://myuser@10.1.0.135/d/webserver/htdocs/repo
I keep receiving: connect to host 10.1.0.135 port 22: Connection timed out
Both machines are in house, with the webserver outside of the network on a different IP range (outside the firewall). I came from Subversion and could easily svn commit/update to and from the webserver and my machine without issue.
Thanks for any guidance on this!
The best resource I've found for doing this is located here.
The problem I had was that a git clone issued from the *nix environment using the above suggestions could not find the path to the repo properly.
I was able to fix this by starting the git daemon with the --base-path and --export-all params.
So, from the windows box:
git daemon --base-path=C:/source/ --export-all
Then from the *nix box (mac in my case):
git clone git://<local ip>/<project name>
My directory structure on the windows box is:
c:\source\<project name> - this is where the .git folder lives
Here is a walkthrough someone else did. It goes step by step showing how to do what you want.
The IP address 10.1.0.135 is reserved for private networks, which means that it only refers to your local Windows computer when used within your home network. If you're running the git clone command with that address on your server, 10.1.0.135 refers to a completely different computer, which explains why the connection isn't working.
Here's my suggestion: instead of trying to clone the repository on your home computer, first create an empty repository on the server
server$ git init /path/to/repository
and then push changes from your computer to the server's repository
home$ git remote add website ssh://myuser@server/path/to/repository
home$ git push website
You can call the remote something other than "website" if you want.
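One caveat, as a hedged addition to the recipe above: recent versions of Git refuse pushes into a branch that is checked out in a working tree, so you may prefer to initialize the server-side repository as bare (the paths are placeholders):
server$ git init --bare /path/to/repository.git
home$ git remote add website ssh://myuser@server/path/to/repository.git
home$ git push website master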
For slightly more advanced usage, I've written a blog post explaining how to set up staging and production servers and maintain them with git. If you don't want to deal with a staging server, though, I also link to a couple of tutorials about a simple two-repository setup to manage a website with git, which is basically what it sounds like you're looking for.
Sounds like your windows 7 machine (in particular, port 22) may not be accessible from outside of the firewall. With subversion, the webserver is likely accessible to both machines. Also, the IP for your Windows machine is a non-routable IP, which means your firewall is likely also NAT'ing your internal network.
You could approach this by opening port 22 in the firewall, or setting up port-forwarding in the firewall to point to your Windows machine. But you should probably create the git repo on the server, then clone from that to your Windows machine instead. You could use scp -r to get that initial repo on the server, though someone with more git experience may be able to tell you a better way.
Good idea to do this with Git, if you need to check it into a version control system anyhow.
Just wanted to mention that you could also look at the rsync utility - e.g. googling "Rsync Windows" brings up some nice results.
Rsync is made specifically for keeping directory trees in sync across machines, and it does so smartly: it doesn't transfer files that are already on the other side, and you can use compression. It has tons of features and is typically used in UNIX production environments. There are ways to run it on Windows as well.
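For instance, a typical invocation to mirror a local site directory to the server over SSH might look like this (the paths and hostname are assumptions):
# -a preserves permissions/times, -v is verbose, -z compresses in transit;
# --delete removes server-side files that no longer exist locally
rsync -avz --delete ./htdocs/ myuser@webserver:/var/www/htdocs/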
In any case:
Check your firewall settings on both machines - the relevant ports need to be open. In your case, port 22 is probably blocked.

Is Mercurial Server a must for using Mercurial?

I am trying to pick version control software for our team, but I don't have much experience with it. After searching and googling, it seems Mercurial is worth a try. However, I am a little bit confused about some general information about it. Basically, our team has only 5 people and we all connect to a server machine which will be used to store the repositories. The server is a Red Hat Linux system. We will probably use a centralized workflow a lot. Because I like the local-commit idea, I still prefer DVCS-style software. Now I am trying to install Mercurial. Here are my questions.
1) Does the server used for the repositories always need to have the "mercurial-server" software installed? Or does that depend on the workflow? In other words, if no centralized workflow is used, is it enough to install the Mercurial client on the server?
I am confused about the term "mercurial-server". Or does it just mean that whatever Mercurial is installed on the server is called the "Mercurial server", regardless of whether the workflow is centralized? In addition, because we all work on that server, does that mean only one copy of Mercurial needs to be installed there? We each have our own user directory, such as /home/Cassie, /home/John, ... and /home/Joe.
2) Is SSH a must? Or does that depend on how users connect to the server? Since we all work on the server itself, SSH is not required, right?
Thank you very much.
There are two things that can be called a "mercurial server".
One is simply a social convention that "repository X on the shared drive is our common repository". You can safely push and pull to that mounted repository and use it as a common "trunk" for your development.
A second might be particular software that allows Mercurial to be reached remotely. There are many options for setting this up yourself, as well as options for remote hosting.
Take a look at the first link for a list of the different connection options. But as a specific answer to #2: no, you don't need to use SSH, but it's often the simplest option if you're in an environment that uses it anyway.
The term that you probably want to use, rather than "mercurial server", is "remote repository". This term is used to describe the "other repository" (the one you're not executing the command from) for push/pull/clone/incoming/outgoing/others-that-i'm-forgetting commands. The remote repository can be either another repository on the same disk, or something over a network.
Typically you use one shared repository to share the code between the developers. While you don't technically need it, it has the advantage that synchronization is easier when there is a single spot for the freshest code.
In the simplest case this can be a repository on a plain file share where file locking is possible (NFS or SMB) and where each developer has write access. In this scenario there is no need to have Mercurial installed on the server, but there are drawbacks:
Every developer must have a Mercurial version installed that can handle the repository format on the share (for example, when the repo on the share was created with Mercurial 1.9, a developer running 1.3 can't access it)
Every developer can perform destructive operations on the shared repo, including deleting the whole repository
You can't reliably run hooks on such a repo, since the hooks execute on the developer machines, not on the server
I suggest using the http or ssh method. You need to have Mercurial installed on the server for this (I'm not taking the http-static method into account, since you can't push to an http-static path), and you get the following advantages:
the Mercurial version on the server does not need to match the clients', since Mercurial uses a version-independent wire protocol
you can't perform destructive operations via these protocols (you can only append new revisions to a remote repo, never remove any)
The decision between http and ssh depends on your local network environment. http has the advantage that it passes through many corporate firewalls, but you need to take care of secure authentication when you want to push over http back to the server (or don't want everybody to see the content). On the other hand, ssh has the drawback that you may need to lock down the server so that clients can't run arbitrary programs there (depending on how trustworthy your clients are).
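To make the two methods concrete, on the client side they differ only in the URL (the hostname and paths are hypothetical):
hg clone ssh://hguser@server//srv/hg/project    # ssh; note the double slash for an absolute path
hg clone http://server/hg/project               # http, e.g. served by hgwebdir behind Apache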
I second Rudi's answer that you should use http or ssh access to the main repository (we use http at work).
I want to address your question about "mercurial-server".
The basic Mercurial software does offer three server modes:
Using hg serve; this serves a single repository, and I think it's mostly used for quick hacks (when the main server is down and you need to pull some changes from a colleague, for example); see the sketch after this list.
Using hgwebdir.cgi; this is a CGI script that can be used with an HTTP server such as Apache; it can serve multiple repositories.
Using ssh (Secure Shell) access; I don't know much about it, but I believe it is more difficult to set up than the hgwebdir variant.
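The sketch promised above for option 1: hg serve is a single command run inside the repository (the port and path are assumptions):
cd /srv/hg/project
hg serve --port 8000
# colleagues can now pull directly: hg pull http://yourmachine:8000/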
There is also a separate software package called "mercurial-server". This is provided by a different company; its homepage is http://www.lshift.net/mercurial-server.html. As far as I can tell, this is a management interface for option 3, the mercurial ssh server.
So, no, you don't need to have mercurial-server installed; the mercurial package already provides a server.

Secure, Private, Local Gitorious

I want to have a local Gitorious installation that cannot be accessed outside of my local network, and is as secure and private as possible. The repos will be holding code I need kept private and secure in case of hacking or theft.
I'm not an expert with Linux, and certainly not an expert with git/gitorious, so any tips for improving my installation described below would be most helpful!
I have:
Installed Gitorious on a local machine running Ubuntu Server 11.04 64-bit, with an encrypted LVM.
Used this guide for Gitorious installation, if anyone is curious.
Modified Gitorious to support local IPs as hostnames.
In gitorious.yml:
host fields are a local IP (e.g. 192.168.xxx.xxx)
public_mode: false
only_site_admins_can_create_profiles: true
hide_http_clone_urls: true
git-daemon was installed, but is now removed.
No ports forwarded by internet facing router to machine.
Both git://-based and http://-based requests would normally allow open cloning of repos. Removing git-daemon and setting hide_http_clone_urls to true seems to have disabled both; each now returns an error when I attempt to clone.
With an encrypted LVM the machine is secure in case of physical theft. All cloned repos on other machines are kept on encrypted drives as well. I used a custom script on the encrypted LVM that fills the hard drive with porn in case of too many failed attempts.
My current concerns:
Is repo access through git:// and http:// fully disabled?
Are all avenues of repo access secured behind ssh now?
Is there a way to block all requests to the machine that don't originate from within the local network, in case my router gets angry and seeks revenge against me?
Anything more I can do to encrypt or protect the repos in case something goes wrong?
How do I backup gitorious's data? Just backup the MySQL database and repos directory?
Thank you.
If your git-daemon is not running, then there is no git:// access.
hide_http_clone_urls does not disable http; it just hides the link. To protect against unauthorized access, you might want to block all access to git.yourdomain.com in apache/nginx.
You can take a look at my Debian package, which ships many default configurations that are better than the documentation available on the internet:
https://gitorious.org/gitorious-for-debian/gitorious/
The base folder is where all the configuration is stored, such as the apache configs; there are also the shell scripts that create the default users and other things. Just explore the source tree.
Being more specific about the apache config, take a look here: https://gitorious.org/gitorious-for-debian/gitorious/blobs/master/base/debian/etc/apache2/sites-available/gitorious
If, for example, you don't add the git.yourserver.com alias, then no one should be able to git clone over http.
You might also want to watch and support the planned private-repositories feature, which will provide real, safe control over who can see what.
As for the question about ssh: yes, it's safe, and it will only give access to those who have a public key registered in your Gitorious installation.
About the requests question, you could take a look at Apache's allow/deny rules, where you can create something like:
Order deny,allow
Deny from all
Allow from 192.168.0
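Note that Deny/Allow is Apache 2.2 syntax; on Apache 2.4 the equivalent is a single directive:
Require ip 192.168.0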
For backup, you have to back up your repository folder and the MySQL databases.
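A minimal backup sketch along those lines (the database name and repository path depend on your installation and are assumptions here):
# dump the Gitorious database and archive the repositories
mysqldump gitorious_production > gitorious-db-backup.sql
tar czf gitorious-repos-backup.tar.gz /var/www/gitorious/repositories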
