I've followed this procedure to allow SFTP-only access for members of the sftp group:
Restricting SFTP user to home directory
For this to work, the user's home directory has to be owned by root.
How can I enable the user to delete and upload files in this area when connected over SFTP? I've tried keeping the owner of the home folder as root:root and changing the ownership of everything below it to user:sftp, but that doesn't work.
Deleting and creating files in a directory depends on write permission on that directory. So you will have to create subdirectories, owned by the SFTP user, inside his home directory where this user is able to write.
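As a sketch, assuming the chroot setup from the linked procedure (OpenSSH's internal-sftp with ChrootDirectory set to the user's home) and 'sftpuser' as a placeholder account name:
# chown root:root /home/sftpuser
# chmod 755 /home/sftpuser
# mkdir /home/sftpuser/upload
# chown sftpuser:sftp /home/sftpuser/upload
The chroot target itself stays root-owned and read-only to the user, which is what sshd demands, while the upload subdirectory is his to create and delete files in.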
There are some workarounds to this, but all of them have their side effects.
I would like to know how to encrypt the FTP directory, because employees with elevated permissions can see the contents of the FTP directory even though they don't need access to these documents.
Thanks in advance
Encrypting the directory is not possible, but you could encrypt the underlying filesystem. However, anyone logged in with enough permissions to view the contents of that directory is still able to view files in it.
You're better off setting correct permissions on this directory, so that other employees/users have no rights to it.
So for example, if your FTP directory is /home/ftp, ensure it is only accessible by the FTP user (assuming 'ftpusr' is a valid user and 'ftpgrp' is a valid group):
# chown ftpusr:ftpgrp /home/ftp
# chmod 750 /home/ftp
This way only the user 'ftpusr' and all users belonging to the 'ftpgrp' group are able to view the files inside the directory.
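If another employee later needs access, adding their account to the group is enough (a sketch; 'alice' is a hypothetical account name):
# usermod -aG ftpgrp alice
The -a flag appends the group, so the user's existing group memberships are kept.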
The question in brief: how can I set up WebDav so that unauthenticated users can download files via a URL, but cannot list the share's contents or make changes to it?
The long version:
I'm new to WebDav. I'd like to replicate the Dropbox Public-folder functionality: any user with the correct URL can download a file, but no unauthenticated user can list the files in the subdirectory or make any changes to the public folder.
I'd like to be able to send my client a URL to a file for download without exposing the whole contents of the share and, importantly, without requiring the client to have a user id and password.
The WebDav directory should also not be alterable or viewable by anyone who doesn't have a user id and password.
The WebDav directory is isolated on my server and I can alter .htaccess files.
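Assuming the share is served through Apache's mod_dav, one way to express that policy is to require authentication for every method except GET and to switch off directory indexes. A minimal sketch for the server configuration (the directory path and the auth file are placeholders):
<Directory /var/www/webdav>
    Dav On
    Options -Indexes
    AuthType Basic
    AuthName "WebDav"
    AuthUserFile /etc/apache2/.htpasswd
    <LimitExcept GET>
        Require valid-user
    </LimitExcept>
</Directory>
With this, anyone who knows the exact URL of a file can fetch it, while listing (PROPFIND or an autoindex page) and every modifying method (PUT, DELETE, MKCOL, ...) require credentials.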
I have a symlink in my /var/www/ folder that points to a folder in my home directory. I had to set the "other" permissions on that folder to read/execute to get the files to show up when visiting the server in a web browser, but that also grants every other user access to the folder. I want Apache to be able to read this folder while denying read/write/execute to all other users (everyone other than Apache and myself). How do I go about doing this?
I figured it out:
The Apache service runs as the user www-data, so I just added www-data to my own user group and then restarted Apache!
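In command form (a sketch; 'myuser' and the folder path are placeholders, and on some distributions the Apache account is 'apache' rather than 'www-data'):
$ sudo usermod -aG myuser www-data
$ chmod -R g+rX,o-rwx /home/myuser/www
$ sudo systemctl restart apache2
The group (which now includes www-data) keeps read access, with execute on directories for traversal, while "other" loses everything; Apache only picks up the new group membership after the restart. Note that Apache must still be able to traverse /home/myuser itself, and the vhost needs Options FollowSymLinks enabled for the /var/www symlink to resolve.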
I want to set permissions on subfolders in ownCloud.
Example: a user can read and edit all the files in a synchronized folder except some specific subfolders.
I'm working with the desktop client and the web interface. Version: ownCloud 8.0.3 (stable).
As far as I know, a user has access to all of his/her own folders and files, plus any files that other users have shared with him/her. You cannot restrict access to a user's files if they are in that user's own account.
My assumption is that you are an administrator and can create accounts, etc. A workaround might be the following, but it is a workaround and not the solution you've asked for:
If there are files that you'd like more than one user, or only specific users, to be able to view, you can share them using the web interface.
You could create a master user who has access to all files and then share with the other users from the master account.
If anyone knows a different approach, please suggest an edit to my answer and I'll put it in.
I have a dedicated Linux web server with many user accounts on it. The user accounts are all located in /home/[userid] directories. I am able to create Perl scripts that run within each of my users’ accounts that can access files only within their own account, but now I need to create a script that can run “above” the users’ accounts and be able to access a file within any specified user’s account.
Currently, I have a script that uses Net::FTP to retrieve the needed file from each account so I can extract the necessary data from it, but of course, it’s slow to FTP into every account. Since the accounts are merely directories on the server, I’m looking for a way to run a Perl script in a way that it can access each account directory and simply open the required file and return the requested data for the specified account.
How can I accomplish this?
You should log in as a user that has access to all the user directories (e.g. root). For security reasons, it might be safer to use SFTP or some other encrypted connection rather than plain FTP.
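If the script runs locally on the server as such a user, you can skip FTP entirely and read each account's file straight off the filesystem. A minimal Perl sketch, assuming every account keeps the file at the same relative path (the path and the match pattern are placeholders):
#!/usr/bin/perl
# run as root so every /home/<userid> is readable
use strict;
use warnings;

my $relpath = 'data/report.txt';   # hypothetical per-account file

for my $home (glob '/home/*') {
    my $file = "$home/$relpath";
    next unless -f $file;
    open my $fh, '<', $file or do { warn "cannot open $file: $!"; next };
    while (my $line = <$fh>) {
        print "$home: $line" if $line =~ /pattern/;   # extract what you need
    }
    close $fh;
}
This removes the per-account FTP round trip; the only requirement is that the script's user can traverse and read each home directory.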