Is execute permission required for uploading files by scp? - linux

I created a web application directory and put an app in it.
Both have 776 permissions and are owned by apache:apache.
I'd like to allow others to upload modules by scp.
But I don't want others to be able to execute programs, so I didn't give "x" to others. However, it doesn't work: they can't upload files and get "permission denied". The others are in group "xxx".
I thought that if others have read and write permission, they could upload files. What is wrong?

In order to add files to a directory you need both write (+w) and execute (+x) permissions.
See this answer for how file permissions work in Linux:
https://unix.stackexchange.com/questions/21251/execute-vs-read-bit-how-do-directory-permissions-in-linux-work
The execute permission on a directory does not imply files in that directory are made executable.
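A quick demonstration in a scratch directory (the paths are illustrative) shows why the execute (search) bit matters:

```shell
# Write permission alone is not enough to create files in a directory;
# the execute (search) bit is also required to reach entries inside it.
mkdir /tmp/demo
chmod 666 /tmp/demo          # rw- for everyone, execute removed
touch /tmp/demo/file.txt     # fails with "Permission denied" (as non-root)
chmod 777 /tmp/demo          # restore the execute/search bit
touch /tmp/demo/file.txt     # now succeeds
```

So for the scp upload case, "others" need both w and x on the target directory; x on the directory still does not make the uploaded files executable.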

Related

Problems with file hooks and permissions in GitLab

I run my own GitLab server and set up a file hook which is supposed to access some files in my user's directory. The file hook is executed by the git user, so I get a permission denied error.
A certain process foo, which places the files the file hook is supposed to read into my user directory, does not give me the option to add another group to the created files.
Does anyone have an idea how to solve this issue?
Besides:

- using sudo, meaning having a sudoers entry in place authorizing git to copy foo's files
- modifying the ACL (setfacl) to add git as an authorized user to read those files

there is no GitLab-specific solution, only Linux-based ones.
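A minimal sketch of the ACL route, assuming foo's files live under /home/me/data (a hypothetical path) and the hook runs as the git user:

```shell
# Grant the git user read access via POSIX ACLs (assumed path /home/me/data):
setfacl -R -m u:git:rX /home/me/data    # rX: read files, traverse directories
getfacl /home/me/data                   # verify the new ACL entry
```

This avoids changing the files' owner or group, which is exactly the option foo does not give you.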

Permission denied while copying even though the necessary permissions exist

I have a file (file) with permission 500. In Linux, I tried to copy that file (using cp) into a folder (a) whose permission is 600. Even though the folder has write permission, I get a "cannot stat `a/file': Permission denied" error.
Could anyone explain why that is?
Is it because the directory does not have execute permission?
Yes. The execute bit allows the affected user to enter the directory and access the files and directories inside it.
See http://www.hackinglinuxexposed.com/articles/20030424.html and
https://unix.stackexchange.com/questions/21251/why-do-directories-need-the-executable-x-permission-to-be-opened for further info.
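Reproducing the question's setup in a scratch directory shows the effect directly (as a non-root user):

```shell
mkdir a && chmod 600 a    # directory readable/writable, but no execute bit
touch file && chmod 500 file
cp file a/                # fails: cannot stat 'a/file': Permission denied
chmod u+x a               # 700: the owner can now enter the directory
cp file a/                # succeeds
```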

Allowing jenkins to access contents of currently logged in user folder

I am using Jenkins to build my project on a Linux machine. During the build, files are read from a source location and copied to a new destination location. The source and destination locations are input by the user from the Jenkins UI. I want the user to be able to select any folder located within his/her home folder as source or destination, for example /home/jdoe/folder.
Currently, any folder inside /var/lib/jenkins, with jenkins:nogroup user-group, can be selected. However, a folder inside /home/jdoe/folder with the same (jenkins:nogroup) user-group, and with the same permissions as the folders within /var/lib/jenkins, cannot be selected. I get a permission denied error when trying to read or write inside /home/jdoe/folder.
What can I do to enable reading and writing to a folder within the home folder of the currently logged-in user? Can I set up Jenkins in a certain way to be able to do that, or do I have to change group settings for the home folder? Could you suggest a good configuration to make this work?
Would there be any difference when using Jenkins on a Windows platform?
First, make sure the folder grants read-write access to the jenkins user and group:
sudo chmod -R 770 /home/jdoe
Also, as noted in the comment by Daniel, grant execute permission on the /home/jdoe folder:
sudo chmod a+x /home/jdoe
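A fuller sketch of the group-based setup, assuming the Jenkins service account is named jenkins and the user is jdoe (names from the question; the group-membership step is an assumption about your setup):

```shell
sudo usermod -aG jdoe jenkins          # add jenkins to jdoe's group (restart the Jenkins service afterwards)
sudo chmod g+x /home/jdoe              # group may traverse the home directory
sudo chmod -R g+rwX /home/jdoe/folder  # group may read/write; capital X adds execute on directories only
```

Using group permissions like this is less drastic than opening /home/jdoe to everyone.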

Users can't upload files, even with permissions set to them using vsftpd

I have a cloud-hosted Linux solution. I had vsftpd working on it, but after having issues and tinkering with a lot of settings, I now have a problem where users can log in using FTP, connect to the correct home directory, navigate within it, and download files, but they cannot upload files to the server. They get a time-out error, which appears to be a permissions error, but I can't narrow it down any more than that. /var/log/syslog gives nothing away.
The folders belong to the users. The parent www folder is set to 555. Can anyone help with this issue at all?
Cheers,
T
Try setting the permissions to 755; 555 doesn't allow writing for anyone. Are your user and group different?
You may also need to enable logging for the FTP server. The time-out error may hide other causes, not only permission denied.
To have extended logging change the variables in your ftp config file:
dual_log_enable=YES
log_ftp_protocol=YES
xferlog_enable=YES
syslog_enable=NO
and check the log file name there.
You must create a folder inside the user's folder (for example: /var/www/user1/upload),
set its permissions to 777 (for example: chmod 777 /var/www/user1/upload),
then upload files into this folder.
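The suggested layout can be created like this (a sketch using the example path from above; note that 777 is world-writable, so 775 with the right ownership is the safer variant if it fits your setup):

```shell
mkdir -p /var/www/user1/upload
chown user1:user1 /var/www/user1/upload   # user1 is the example FTP account from above
chmod 775 /var/www/user1/upload           # or 777 as suggested, at a security cost
```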

How to allow file uploading outside home directory with SSH?

I'm running a Fedora 8 Core server. SSH is enabled and I can login with Transmit (FTP client) on port 22. When logged in, I can successfully upload files to the users home directory. Outside the home directory I can only browse files, not upload/change anything. How can I allow file uploading to a specific directory outside the users home directory?
An easy method is to grant the user rights to the folder you want them to be able to upload to, then add a symlink (ln -s) from their home folder to the destination.
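For example (hypothetical paths, assuming the upload target is /var/www/uploads and the user is alice):

```shell
sudo chown alice:alice /var/www/uploads      # grant the user rights to the target
ln -s /var/www/uploads /home/alice/uploads   # symlink from the home folder
# Files placed in ~/uploads now land in /var/www/uploads.
```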
You can also just use
scp file user@server:/path
which will let you upload to any directory you have permissions to:
file is the file to copy
user and server should be obvious
/path is any destination path on the server which you have rights to; /home/user/ would likely be your default home folder
You need to make those directories writable by the proper users, or (easier) that user's group. This is of course a huge security hole, so be careful.
Hi,
give the FTP user write permission on the directory where you want to upload your files.
