I have a folder somewhere on my web server and a symlink to this folder from the public_html of another user, for the purpose of centralizing some content.
I can include files from within this symlinked folder perfectly.
However, I cannot access a file (e.g. test.html or test.php) directly.
If I try to access a file directly I get a forbidden error in the browser and in the log files I see:
Symbolic link not allowed or link target not accessible
I have tried editing httpd.conf as suggested on the internet, putting a .htaccess in the folder where the symlink resides, etc., but nothing has helped so far.
Additional info:
My webserver runs CentOS release 6.5 (Final)
Apache 2.2.27
Permissions on the parent folder of the linked folder: 777
Permissions on the linked folder: 777
I have set them from 644 to 777 for testing purposes.
And please note that I am a programmer running into problems, not a Linux expert :)
*** I still do not have a working solution. I can use the files just fine by including them; I just cannot call any files directly, e.g. images or stylesheets. I have no idea what other details to give that will help solve this problem.
The permissions are OK for the link, the folder and the parent folder.
Changing the owner of either the link or the folder does not give any results either.
I have put Options FollowSymLinks in a .htaccess in the folder with the link, the parent of that folder, the destination folder and the parent of the destination folder; none of it seems to help.
I just cannot access files directly, and I really need to.
It turned out that the symlink itself needed to be owned by the same user as the original linked folder
chown -R username: linkname
That did the trick for me
You can check the rights with
ls -la
Then it looks a little weird to see all the files owned by the actual user, except for the symlink, which is owned by the original user.
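A quick sketch of checking and fixing this; the paths /home/userB/public_html/shared and /home/userA are made up for illustration:
# list the directory containing the link; the owner column shows who owns the link itself, not the target
ls -la /home/userB/public_html
# change the owner of the symlink itself; -h stops chown from following the link to its target
sudo chown -h userA: /home/userB/public_html/shared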
Hope this helps others with a similar issue as well!
First of all, I would recommend including some details in your question, e.g. the permissions of the directories and files, which distro you are on, etc.
My crystal ball analysis:
- Your web server (Apache) runs as user www-data
- Your PHP runs under suexec as user A / user B.
- The www-data user cannot access the file because the permissions for www-data are not sufficient on the file or its parent dir (!).
You could do:
$ sudo -u www-data ls -lah /path/to/your/file
... to see what your web server user sees.
$ ls -lah /path/to/your/file
... to check what permissions are set on the file. (group/world readable?)
$ ls -dlah /path/to/your
... to check the permissions on the parent dir (www-data needs the x flag).
HTH,
j.
In a <Directory> block in the Apache config, or in an .htaccess file:
Options FollowSymLinks
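If you go the <Directory> route, here is a minimal sketch for the CentOS 6 / Apache 2.2 setup from the question; the target path /home/userA/content and the config file name are assumptions for illustration:
# append a Directory block to a vhost config file (path and file name are assumptions)
sudo tee -a /etc/httpd/conf.d/symlinked-content.conf <<'EOF'
<Directory "/home/userA/content">
    Options +FollowSymLinks
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>
EOF
# reload Apache on CentOS 6
sudo service httpd restart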
I have an issue with Linux ACLs. Here is my workflow:
Set ACL permissions on empty directory:
sudo setfacl -Rdm g:www-data:rw /var/www/mysite/html/vendor/
Change directory:
cd /var/www/mysite/html/
Install composer packages:
composer install
Verify installed file permissions:
ls -la vendor/
All the newly created files and folders belong to my user group instead of belonging to the www-data group like they should...
drwxrwxrwx+ 3 john john 4096
What am I missing here?
Note: If my user creates a file or a directory, the correct group permission will be applied. The problem only happens with the composer command.
I finally found what I was doing wrong. I was confusing file "ownership" and file "permissions".
setfacl is used to set default "permissions" for files created in a directory. What I actually needed was to set default "ownership". This is done by setting the "setgid flag" with the chmod command after properly setting the directory group and user ownership.
I wanted all newly created files in my project directory to belong to the user "john" and the group "www-data".
chown -R john:www-data /srv/www/myproject
Now we set the "setgid flag" on the directory and all newly created files will belong to john:www-data:
chmod g+s /srv/www/myproject
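If you also want existing subdirectories to behave the same way, a small sketch (the path is the one from above; whether you need it applied recursively is an assumption):
# apply the setgid bit to every existing directory under the project
sudo find /srv/www/myproject -type d -exec chmod g+s {} \;
# verify: an "s" in the group execute position (e.g. drwxrwsr-x) means the bit is set
ls -ld /srv/www/myproject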
That's all, and there's absolutely nothing wrong with setting the www-data group on your served files if you set everything else properly. In fact, the most upvoted answer related to Laravel file permissions on Stack Overflow (800+ upvotes) recommends this exact method. Those who disagree never provide a better working solution.
To conclude, Unix permissions are a complicated topic. Few people understand how to properly set permissions on a production server, and many fluent programmers are newbies when it comes to Linux. Take answers you read on SO with a grain of salt.
I have created a simple app using AngularJS. When I try to host that project on my website http://demo.gaurabdahal.com/recipefinder it shows the following error:
Forbidden
You don't have permission to access /recipefinder on this server.
Server unable to read htaccess file, denying access to be safe
But if I go to http://demo.gaurabdahal.com/ it displays the "access denied" message that I have printed, as expected. So why is it unable to open the AngularJS project "recipefinder"? If I put a simple HTML app there, it opens just fine.
The same AngularJS project works fine when I host it on GitHub (http://gaurabdahal.github.io/recipefinder).
I can't understand what's wrong.
I had this problem too. My advice is to look in your server's error log file. For me, the top directory for the project was not readable; the error log clearly stated this. A simple
sudo chmod 755 <site_top_folder>
fixed it for me.
Set group of your public directory to nobody.
This is a common problem with GoDaddy virtual server hosting when you bring up a new website.
Assuming you have SSH access to the server (you have to enable it in cPanel), log in to your account. Upon successful login, you will be placed in the home directory for your account. The DocumentRoot for your website is located in a subdirectory named public_html. GoDaddy defaults the permissions for this directory to 750, but those permissions are inadequate to allow Apache to read the files for the website. You need to change the permissions for this directory to 755 (chmod 755 public_html).
Copy the files for your website into the public_html directory (both scp and rsync work for copying files to a GoDaddy Linux server).
Next, make sure all of the files under public_html are world readable. To do this, use this command:
cd public_html
chmod -R o+r *
If you have other subdirectories (like css, js, and img), make sure they are world accessible by enabling both read and execute for world access:
chmod o+rx css
chmod o+rx img
chmod o+rx js
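If there are many nested subdirectories, the same thing can be done in one pass from inside public_html (an alternative sketch, assuming you want every subdirectory world-traversable):
# give world read+execute to every directory under the current one
find . -type d -exec chmod o+rx {} \;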
Last, you will need to have a .htaccess file in the public_html directory. GoDaddy enforces a rule that prohibits the site from loading if you do not have a .htaccess file in your public_html directory. You can use vi to create this file ("vi .htaccess"). Enter the following lines in the file:
Order allow,deny
Allow from all
Require all granted
This config will work for both Apache 2.2 and Apache 2.4. Save the file (ZZ), and then make sure the file has permissions of 644:
chmod 644 .htaccess
Works like a charm.
You need to run these commands in /var/www/html/ or any other directory that your project is on:
sudo chgrp -R GROUP ./
sudo chown -R USER:GROUP ./
find ./ -type d -exec chmod 755 {} \;
find ./ -type f -exec chmod 644 {} \;
In my case (apache web server) I use www-data for USER and GROUP
This sets every public folder's permissions to 755. Problem solved.
GoDaddy shared server solution
I had the same issue when trying to deploy a separate Laravel project at the subdomain level.
File structure
- public_html (where the main web app resides)
[works fine]
- booking.mydomain.com (folder for separate Laravel project)
[showing error 403 forbidden]
Solution
go to cPanel of your GoDaddy account
open File Manager
browse to the folder that shows 403 forbidden error
in the File Manager, right-click on the folder (in my case booking.mydomain.com)
select Change Permissions
select following checkboxes
a) user - read, write, execute
b) group - read, execute
c) world - read, execute
Permission code must display as 755
Click change permissions
In Linux:
find project_directory_name_here -type d -exec chmod 755 {} \;
find project_directory_name_here -type f -exec chmod 644 {} \;
This will reset the permissions of all files and folders inside project_directory_name_here.
In my case Apache was somehow configured wrong(?), so I had to set permissions on all parent dirs too. Just setting permissions on .htaccess (and its parent dir) didn't work.
OK, I recently ran into the same issue while working on a WordPress installation served by apache2 on Ubuntu 20.04.
I experienced this issue when I changed file ownership to another user:
Here's what worked for me:
$ sudo chown -R www-data:www-data /var/www/YOUR-DIRECTORY
Here's a bit more context into the issue:
The above command gives ownership of all the files [in that folder] to the www-data user and group. This is the user that the Apache web server runs as, and Apache will need to be able to read and write WordPress files in order to serve the website and perform automatic updates.
Be sure to point to your server’s relevant directory (replace YOUR-DIRECTORY with your actual folder).
You could run through this insightful article on digitalocean.
As for Apache running on Ubuntu, the solution was to check the error log, which showed that the error was related to folder and file permissions.
First, check the Apache error log:
nano /var/log/apache2/error.log
Then set the folder permissions so the directory is readable and executable:
sudo chmod 755 /var/www/html/
Also set the file permissions so .htaccess is readable:
sudo chmod 644 /var/www/html/.htaccess
Just my solution: I had extracted an archive, made some minor changes, and got the error above. I deleted everything, uploaded and extracted again, and it was back to normal business.
Important points in my experience:
every resource accessed by the server must be in an executable and readable directory, hence the xx5 in every chmod in other answers.
most of the time the webserver (apache in my case) is running neither as the user nor in the group that owns the directory, so again xx5 or chmod o+rx is necessary.
But the greater conclusion I reached is to start from little and build up to more.
For example, if
http://myserver.com/sites/all/resources/assets/css/bootstrap.css
yields a 403 error, see if http://myserver.com/ works, then sites, then sites/all, then sites/all/resources, and so on.
It will help if your server has directory indexes enabled:
In Apache: Options +Indexes
This instruction might also be in the .htaccess of your webserver public_html folder.
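For example, a hedged one-liner, assuming your DocumentRoot is a public_html folder in your home directory and that .htaccess files there are allowed to set Options (AllowOverride Options); both are assumptions:
# enable directory listings for the document root via .htaccess
echo "Options +Indexes" >> ~/public_html/.htaccess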
I had the same problem on Fedora and found that the problem was SELinux.
To test whether SELinux is the problem, run:
sudo setenforce 0
To make the change permanent, edit /etc/sysconfig/selinux and change
SELINUX=enforcing
to
SELINUX=disabled
or add rules to SELinux to allow HTTP access.
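Rather than disabling SELinux entirely, a sketch of the labelling route (the /var/www/html path is an assumption; adjust it to your DocumentRoot):
# label the content so httpd is allowed to read it
sudo chcon -R -t httpd_sys_content_t /var/www/html
# or restore the default labels for the standard web root
sudo restorecon -Rv /var/www/html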
I had the same problem on a rackspeed server after changing the PHP version in cPanel. It turned out that this also changed the permissions of the folder... I set the permissions of the folder to 755 with
chmod 755 folder_name
"Server unable to read htaccess file" means just that. Make sure that the permissions on your .htaccess file are world-readable.
Hope you are good. I have XAMPP on Fedora and changed the owner of /opt/lampp/htdocs to root. I did this because whenever someone creates a new folder through sharing, they don't have permission to dynamically create folders or files or to write images. So I ran:
chmod -R 777 /opt/lampp/htdocs
But whenever the system restarts, I need to run this command again. To avoid running it again and again, I changed the owner of /opt/lampp/htdocs and ran:
chmod -R 777 /opt/lampp/htdocs
Now, whenever the server restarts, the assigned permissions don't need to be set again and again. That part is resolved.
I still have an issue, though: old directories can be written to, but if any network user creates a new directory under htdocs, that new directory's permissions need to be changed again.
Previously created, and this directory can be used by a script to create files:
drwxrwxrwx 2 root root 4096 2011-06-15 14:09 aaa
Newly created, cannot be used by a script to create images or write anything:
drwxr-xr-x 2 root root 4096 2011-06-17 15:17 aaaa
The drwxr-xr-x permissions on each newly created folder in htdocs are really annoying to me :(
Just to let you know, my htdocs owner and permissions are:
drwxrwxrwx 101 root root 4096 2011-06-17 15:17 htdocs
Why is this so? Can anybody please help me figure this problem out? I am anxiously waiting for a quick response.
First off, you should investigate what permissions you really need - chmodding everything to 777 is a security risk as it will allow any user to write inside of your web root.
However, to address your actual question of the default permissions when a new folder is created by a user, you want to adjust the default "umask" which determines such things.
This question has some information for changing it for the Apache user (if a "network user" is a user creating new files and directories through the httpd process):
Setting the umask of the Apache user
If you need to adjust it for other users or processes, the solution will be similar.
Good luck!
Edit
Since you're on Fedora, try this: (from the question I linked above)
[root ~]$ echo "umask 002" >> /etc/sysconfig/httpd
[root ~]$ service httpd restart
The first command will add that line to /etc/sysconfig/httpd, which is a permanent configuration file, and the second command will make it active.
You are tackling the problem from the wrong side. Restore your Apache configuration to use apache.apache as the default user/group, and set your Samba server to use those credentials when someone writes to your document root.
If you are using NFS or another POSIX-compatible filesystem, use chmod g+s to keep all files readable by your Apache server.
Try it:
#umask 000
have a good time!!
I moved from shared hosting to a VPS a few weeks ago and I'm having these annoying permission issues with WordPress. You know you can download and upgrade plugins (and WordPress itself) from the admin panel, but since I moved it started asking me for my FTP credentials, which is kinda slow when I have to update ~20 plugins.
I think this must be some kind of rights issue. I looked at the shared hosting WordPress files; they all belong to the user and group kovshenin (kovshenin:kovshenin), the files are -rw-r--r-- and the directories are drwxr-xr-x.
On my VPS apache runs under apache:apache and my files are kovshenin:kovshenin. What should I do to make them readable and writable by both kovshenin and apache?
Also, I changed the permissions to 0777 for all files and folders of my WordPress installation, which allowed me to install and delete plugins without FTP, but when I pushed the automatic upgrade to WordPress 2.8.1 it still asked for my FTP account. Is that a WP issue or did I miss something?
Thanks.
Update: I managed to run id and id www-data on the MediaTemple shared hosting. User kovshenin is in group kovshenin, and www-data is in group www-data. No more groups. What's the trick?
Another update: Okay, I added the apache user to the kovshenin group. My WordPress files are kovshenin:kovshenin with rw-rw-r-- permissions on files and drwxrwxr-x on directories, but something is still wrong. The apache user can access the files and folders, I can use the online Themes and Plugins editor in the WordPress admin panel, and I'm able to make changes to the .htaccess file from within WordPress, but plugin/theme installation still asks me for FTP credentials!
Any ideas? Thanks.
What should I do to make them readable and writable by both kovshenin and apache?
Create a new group, say "wordpress".
Add both the kovshenin and www-data users to the wordpress group.
Change the group owner of all the files to wordpress (using chgrp).
Make sure all the files are group writeable.
Set the g+s (setgid) permission bit on all the directories of interest.
Make sure kovshenin and apache's default umask includes group read & write permission.
The second-to-last step is the trick: it means that whenever kovshenin or apache creates a file in those directories, the group owner will be set to wordpress (instead of kovshenin or apache).
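A sketch of those steps as commands; the /var/www/wordpress path is an assumption, and the user and group names come from the question and the steps above:
sudo groupadd wordpress
sudo usermod -a -G wordpress kovshenin
sudo usermod -a -G wordpress apache    # or www-data, whichever user your web server runs as
sudo chgrp -R wordpress /var/www/wordpress
sudo chmod -R g+w /var/www/wordpress
# setgid on the directories so newly created files inherit the wordpress group
sudo find /var/www/wordpress -type d -exec chmod g+s {} \;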
You can give ownership to www-data, as described here.
Run the following command in your WordPress directory (sudo required):
sudo chown -Rf www-data *
Works for Apache.
Assuming your WordPress install directory is /var/www/html, use the following to mass-change all the files and directories to the proper permissions:
sudo find /var/www/html/ -type d -exec chmod 775 {} \;
sudo find /var/www/html/ -type f -exec chmod 664 {} \;
To mass-change the owner and group of everything use:
sudo chown -R <desired_username>:<desired_groupname> /var/www/html
I had the same problem and solved it by turning off PHP 'safe_mode' in Plesk; now WP can create folders and move files without any problems.
I hope this helps you.
Currently, adding define('FS_METHOD', 'direct'); to wp-config.php might do the trick. Not sure that would have worked in '09 though. See here for my similar case using nginx. I found that it was an essential step.
Simple question, but for some reason I couldn't find the exact answer on Google:
I have a fresh Ubuntu install on Slicehost, and would like to make a public directory in my home dir for a simple website containing a bunch of static HTML files. How do I do this? Is it just a matter of typing mkdir public_html and setting the permissions, or is there a cleaner way? (I remember in the past I've had issues where every time I copied a file into my public_html directory, I would have to manually set its permissions, which was quite frustrating.)
Assuming you've already installed apache, do the following:
sudo a2enmod userdir
sudo service apache2 reload
The first command enables the userdir apache mod, which does exactly what you want. The second reloads apache configurations so that it starts using the new configuration.
To install apache2:
sudo apt-get install apache2
Of course, you'll also need to make sure that the permissions on your public_html folder allow the www-data user to see the files in there -- 755 usually works well. To do this:
mkdir ~/public_html
chmod -R 755 ~/public_html
This will recursively (-R) go through your public_html and set the permissions to 755 (owner rwx; group and other r-x).
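To sanity-check from the server itself, a small sketch (it assumes curl is installed and that userdir has been enabled as described above):
# request the userdir URL for the current user and show only the response headers
curl -I http://localhost/~$(whoami)/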
The other answers are on the right track with mod_userdir, but using that will give your website the base URL http://www.yourdomain.com/~username/ - for instance, a file /home/username/public_html/index.html would be accessible as http://www.yourdomain.com/~username/index.html. If you want your files to be accessible under the domain root, as http://www.yourdomain.com/index.html for example, then you'll need to put the directive
DocumentRoot /home/username/public_html
in the Apache configuration file.
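A sketch of what that could look like as a virtual host on Ubuntu; the config file name /etc/apache2/sites-available/mysite.conf and the server name are placeholders, and the access directive shown is for Apache 2.4 (on 2.2, use Order allow,deny / Allow from all instead):
sudo tee /etc/apache2/sites-available/mysite.conf <<'EOF'
<VirtualHost *:80>
    ServerName www.yourdomain.com
    DocumentRoot /home/username/public_html
    <Directory /home/username/public_html>
        Options Indexes FollowSymLinks
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
EOF
sudo a2ensite mysite
sudo service apache2 reload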
You need to use mod_userdir for Apache, otherwise you need to set up symlinks from /var/www/ or wherever.
Your permissions issue is because Apache does not have read access to your files. You need to allow read access to www-data (or whatever the user is; distro-specific).