We installed an OpenLDAP 2.4.31 server on Debian, and several machines on the site are using it. Local authentication is not disabled on those machines, though.
One of the machines has some problems, and its developers asked us to disable central authentication for it. Due to policy, we cannot change anything on the machine itself; we can only configure our LDAP server. How can we prevent this one specific machine from using our LDAP server?
You can block that machine's address on the LDAP server so its lookups and binds fail; since local authentication is still enabled, logins should fall back to the local accounts. But make sure the machine doesn't get locked out!
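A minimal sketch of one way to do this with iptables on the LDAP server (192.168.1.50 is a placeholder for the machine's address; 389 and 636 are the standard LDAP and LDAPS ports):

# Reject LDAP and LDAPS traffic from the one problematic client.
iptables -A INPUT -s 192.168.1.50 -p tcp --dport 389 -j REJECT
iptables -A INPUT -s 192.168.1.50 -p tcp --dport 636 -j REJECT

Rejecting rather than dropping makes the client fail fast instead of hanging until its LDAP timeout expires.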
I have programmed an application that users can use to process genome data. This application relies on a 10GB database file that users have to download in order to run the application. At the moment I have stored this file on Google Drive, but the download bandwidth is limited, so if a number of users download the file on a certain day, it stops working for the others and they get errors running the application.
My solution would be to host the file on our research server, create a user that has access rights only to this folder and nothing else, and make the file downloadable from the server via scp within the application (which is open source) through that user.
My question now is: is this safe to do, or could people potentially hack into our server? If this method is a security risk, what would be a better way to provide this file?
Thank you in advance!
Aloha
You can set up something like the free Seafile (https://www.seafile.com/en/home/), or ask your admin to set it up for you. It is fairly secure, like a self-hosted Google Drive with 2FA authentication.
Another nice and easy tool is Filebrowser on GitHub (https://github.com/filebrowser/filebrowser).
I would not really advise giving people shell/scp access inside your network.
And hosting anything inside a company network is in general not the wisest idea; there is always a risk involved.
I would set up a Seafile/Filebrowser solution on a cheap rented server outside your network and upload the file there. Or, if you have a small PC left over, set it up in a DMZ, a zone inside your company network with special access restrictions.
You want to use SSH (scp) as a transport and authentication method for file hosting. It's possible to keep this safe with caution. For example, GitHub uses SSH for transport when providing git access with the git+ssh protocol.
Now for the caution part: if you haven't done this before, it's not a trivial task.
The proper way to achieve this would be to set up an isolated SSH server in a chroot environment, and create an SSH user on that isolated instance only (not a system user added by e.g. useradd). Then you add only the files that are absolutely necessary to the chroot, and provide SSH access to users.
(Nowadays you might want to consider using Linux filesystem namespaces, if applicable, to replace chroot, but I'm not sure about this.)
As for other options, setting up a simple Nginx server for static file hosting might be a lot easier, provided you have some understanding of HTTP and TLS. There are lots of write-ups on the Internet about this.
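For illustration, a minimal sketch of such an Nginx vhost (the server name, certificate paths, and download directory are all placeholders):

server {
    listen 443 ssl;
    server_name downloads.example.org;
    ssl_certificate     /etc/ssl/certs/downloads.pem;
    ssl_certificate_key /etc/ssl/private/downloads.key;
    # Serve only the database file; no directory listings.
    root /srv/downloads;
    autoindex off;
    location / {
        try_files $uri =404;
    }
}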
Either way, if you expose your server to the Internet or an intranet, you need to make sure it is firewalled. Consider learning about nftables or firewalld or the like, if you haven't already.
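A minimal nftables sketch, assuming you only want to expose SSH and HTTPS (adjust the ports to your setup):

# Default-deny inbound; allow established traffic, loopback, SSH and HTTPS.
nft add table inet filter
nft add chain inet filter input '{ type filter hook input priority 0 ; policy drop ; }'
nft add rule inet filter input ct state established,related accept
nft add rule inet filter input iif lo accept
nft add rule inet filter input tcp dport '{ 22, 443 }' accept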
SSH is reasonably safe. Always keep your software up to date.
Set up an SFTP-only user with a chrooted directory. In /etc/ssh/sshd_config:
Match User MyUser
    ChrootDirectory /var/ssh/chroot
    ForceCommand internal-sftp
    AllowTcpForwarding no
    PermitTunnel no
    X11Forwarding no
This user will not get a shell (because of ForceCommand internal-sftp), and cannot see files outside of /var/ssh/chroot. Note that sshd requires /var/ssh/chroot and every directory above it to be owned by root and writable only by root, or logins will fail.
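From the client side, fetching the file then looks something like this (the host and file names are placeholders):

sftp MyUser@files.example.org
sftp> get genome-db.sqlite
sftp> quit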
Use key-based authentication client-side, in addition to the password.
A good description of the setup process for keys:
https://www.digitalocean.com/community/tutorials/how-to-configure-ssh-key-based-authentication-on-a-linux-server
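That process, in short (the host name is a placeholder):

# On the client: generate a key pair, then install the public key on the server.
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519
ssh-copy-id -i ~/.ssh/id_ed25519.pub MyUser@files.example.org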
Your solution is moderately safe.
A better solution is to put it on a server accessible via SFTP, behind a password, but also to encrypt the file: in this way you introduce a double layer of protection.
On a Linux server you should be able to use a tool like gpg to encrypt your file.
Next, you share the decryption key with your partners over a secure channel, e.g. end-to-end encrypted messaging software.
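A minimal sketch with gpg's symmetric mode (the file name is a placeholder):

# Encrypt with a passphrase; share the passphrase out of band.
gpg --symmetric --cipher-algo AES256 genome-db.sqlite
# Users decrypt their download with:
gpg --output genome-db.sqlite --decrypt genome-db.sqlite.gpg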
For context: I'm a student and I must do a project with some other people in my class. My role is to prepare a web server that each of them can use and access from anywhere. I plan to host everything on a dedicated server that I already have, to avoid additional cost, and give each person a subdomain that is mapped to their site with VirtualHosts. They will be able to send files to the server over SFTP (OpenSSH); each person will get an account, chrooted to their virtualhost directory.
My main problem: will this be secure? I mean, if one of the users sets a weak password or does something risky, can someone access the other people's virtualhosts, or even the host machine itself? I have already thought about .htaccess files, and they will be deactivated. Is there another way to break out of an Apache virtualhost?
Things to note: they will have Apache, PHP, and access to a MySQL (or maybe MariaDB, I don't know yet) database. So they may upload some old, insecure code. Some of these users are not very educated about cybersecurity.
The server runs Ubuntu 16.04 LTS.
Thanks for the advice,
If you limit their access to only their own home directory, that's a good start.
A good additional layer of security would be to implement 2FA; check out Duo Mobile, which you can implement for SSH logins (though more details would help, e.g. what options do they have to log in to the server?).
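For illustration, a rough sketch of the Duo PAM hook for SSH, assuming the duo_unix package is installed and configured (check Duo's documentation for the exact, current directives):

# /etc/pam.d/sshd -- add Duo as a required authentication step
auth required pam_duo.so

# /etc/ssh/sshd_config -- let PAM run the challenge-response exchange
UsePAM yes
ChallengeResponseAuthentication yes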
If the users are not very educated in cybersecurity as you mentioned, it will be difficult for them to escape the virtual host they have access to.
Although I need more details, such as whether each virtualhost will have a separate database or they will all talk to a central one. Also, as a paranoid measure, consider where the server is hosted. There are lots of variables beyond what you described, but it is best to keep the server on its own network with nothing critical in the same subnet, just in case.
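One more containment measure worth considering: if PHP runs via mod_php, each person's scripts can be confined to their own directory with open_basedir (the names and paths below are placeholders):

<VirtualHost *:80>
    ServerName alice.example.org
    DocumentRoot /var/www/alice
    # Keep this vhost's PHP from reading files outside its own tree (plus /tmp).
    php_admin_value open_basedir "/var/www/alice:/tmp"
</VirtualHost>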
I have a Linux web server that is looking for a Kerberos realm. I need to give it a keytab file, which I can do. However, what's really getting me is the KDC. I cannot determine the parent KDC, and I don't know which server would be the admin server. Also, I'm not sure how to go about the process with ktpass. Has anyone done this before, and if so, how did you do it?
This has been really frustrating me, as I know the architectural process, but I can't figure it out in a Windows domain with multiple DCs. The Linux portion isn't a problem; I know what to do where, but I have no idea how to pull that information from Windows in a way that Tomcat can read.
Any help would be appreciated. Thanks!
In theory, you can map any machine in a DNS domain to any Kerberos realm by getting every machine involved to use the same krb5.conf file. However, in practice the machine with DNS name web.foo.com is in the realm FOO.COM.
To find the KDCs for a realm, you can generally do DNS queries for the Kerberos SRV records:
dig -t SRV _kerberos._udp.foo.com
dig -t SRV _kerberos._tcp.foo.com
AD supports this; domain controllers register these records automatically.
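Putting it together, a minimal krb5.conf sketch for the example realm above (the DC host name is a placeholder for whatever the SRV query returns; any writable DC can act as admin_server):

[libdefaults]
    default_realm = FOO.COM
    dns_lookup_kdc = true

[realms]
    FOO.COM = {
        kdc = dc1.foo.com
        admin_server = dc1.foo.com
    }

And for the keytab, ktpass is run on a domain controller against an AD service account (svc-web is a placeholder):

ktpass -princ HTTP/web.foo.com@FOO.COM -mapuser FOO\svc-web -ptype KRB5_NT_PRINCIPAL -crypto AES256-SHA1 -pass * -out web.keytab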
I want to have a local Gitorious installation that cannot be accessed outside of my local network, and is as secure and private as possible. The repos will be holding code I need kept private and secure in case of hacking or theft.
I'm not an expert with Linux, and certainly not an expert with git/gitorious, so any tips for improving my installation described below would be most helpful!
I have:
Installed Gitorious on a local machine running Ubuntu Server 11.04 64-bit, with an encrypted LVM.
Used this guide for Gitorious installation, if anyone is curious.
Modified Gitorious to support local IPs as hostnames.
In gitorious.yml:
host fields are a local IP (e.g. 192.168.xxx.xxx)
public_mode: false
only_site_admins_can_create_profiles: true
hide_http_clone_urls: true
git-daemon was installed, but is now removed.
No ports forwarded by internet facing router to machine.
Both git:// based and http:// based requests would normally allow open cloning of repos. Removing git-daemon and setting hide_http_clone_urls to true seems to have disabled both; they both deliver errors now when I attempt to clone.
With an encrypted LVM the machine is secure in case of physical theft. Also, all cloned repos on other machines are kept on encrypted drives as well. I used a custom script on the encrypted LVM that fills the hard drive with porn in case of too many failed attempts.
My current concerns:
Is repo access through git:// and http:// fully disabled?
Are all avenues of repo access secured behind ssh now?
Is there a way to block all requests to the machine that don't originate from within the local network, in case my router gets angry and seeks revenge against me?
Anything more I can do to encrypt or protect the repos in case something goes wrong?
How do I backup gitorious's data? Just backup the MySQL database and repos directory?
Thank you.
If your git-daemon is not running, then there is no git:// access.
hide_http_clone_urls does not disable HTTP; it just does not show the link. To protect the repos from unauthorized access, you may want to block all access to git.yourdomain.com in Apache/Nginx.
You can take a look at my Debian package, which has many default configurations, better than the documentation available on the Internet:
https://gitorious.org/gitorious-for-debian/gitorious/
The base folder is where all the configuration is stored, such as the Apache configs and others; there are also the shell scripts that create the default users and other things. Just explore the source tree.
To be more specific about the Apache config, take a look here: https://gitorious.org/gitorious-for-debian/gitorious/blobs/master/base/debian/etc/apache2/sites-available/gitorious
If, for example, you don't add the git.yourserver.com alias, then no one should be able to git clone over HTTP.
You might also want to watch and support the planned private-repositories feature, which will provide real, safe control over who can see what.
As for the SSH question: yes, it's safe, and it will only give access to those who have a public key registered in your Gitorious installation.
About the request-blocking question, you could take a look at Apache's allow/deny rules, where you can create something like:
Order deny,allow
Deny from all
Allow from 192.168.0
For backup, you have to back up your repository folder and MySQL databases.
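A minimal sketch of such a backup (the database name and repository path are placeholders for whatever your installation uses):

# Dump the Gitorious MySQL database and archive the repositories.
mysqldump --single-transaction gitorious > gitorious-$(date +%F).sql
tar czf repositories-$(date +%F).tar.gz /var/lib/gitorious/repositories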
I've been setting up a Samba share on a Red Hat box, and am able to connect to it from the local machine. From an XP machine, however, I'm only able to successfully connect to the root of the share (e.g. "\\machine"). Connecting to the actual shared folders (e.g. "\\machine\share") generates an error.
The full error message is:
\\machine\share is not accessible. You might not have permission to use this network resource. Contact the administrator of this server to find out if you have access permissions. Incorrect function.
Looking at the properties on the windows side, I see "everyone", "root (Unix Group\root)", and "root (Unix User\root") listed with no permissions.
I'm using share authentication, and the user I've designated for the guest account has read/write access to the shared folder.
Has anyone run into a similar issue before? Thanks in advance for any assistance.
It looks like the Windows machine was caching authentication information, and not updating it as the Samba server's authentication mode was changed. This meant that once I'd failed to connect to the Samba server (due to bad settings on the server side), connections would continue to fail even after the server settings were corrected. Rebooting the XP machine resolved the issue.
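For reference, the cached connections can usually be dropped on the Windows side without a full reboot, from a command prompt:

net use * /delete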