Restore permission denied on Ubuntu for files/folders - linux

I've recently uploaded a CakePHP web app to Apache2 on a Linux server on an Amazon EC2 instance. To edit files during development, I use FileZilla to open files, make changes locally, and then upload them. But to get around read/write permission errors, I use:
sudo chmod 777 /var/www/myFolder/ -R
Only problem is, once I'm done editing, I don't know how to revert to more restrictive permissions to avoid security issues.
I now need to set up virtual hosts, so to add a conf file in /etc/apache2 I need to make it writable. How do I make the folder unwritable again afterwards?

Give these two a whack:
sudo find /var/www/yourfolder -type d -print0 | xargs -0 chmod 755
sudo find /var/www/yourfolder -type f -print0 | xargs -0 chmod 644
The first one recursively sets the correct permissions for directories, and the second one the correct permissions for files. It's pretty much the standard permission set for Apache servers.
Edit: if you applied permissions recursively to /etc/apache2 and need to change them back, replace /var/www/yourfolder in the commands above with /etc/apache2.
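To sanity-check what those two commands produce, here is a throwaway demonstration in a scratch directory (the paths are made up for the demo; stat -c is the GNU coreutils form):

```shell
# Scratch-directory demo of the 755/644 scheme (demo paths only)
demo=$(mktemp -d)
mkdir -p "$demo/site/css"
touch "$demo/site/index.php"
# -print0 / xargs -0 keeps this safe for names containing spaces
find "$demo/site" -type d -print0 | xargs -0 chmod 755
find "$demo/site" -type f -print0 | xargs -0 chmod 644
dir_mode=$(stat -c '%a' "$demo/site/css")
file_mode=$(stat -c '%a' "$demo/site/index.php")
echo "dirs=$dir_mode files=$file_mode"   # prints: dirs=755 files=644
rm -rf "$demo"
```

Directories keep the execute bit (755) because that is what lets the webserver traverse into them; files get 644 so only the owner can write.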


PhpStorm needs 777 permission to work [duplicate]

I'm using the Apache web server, which has its owner set to _www:_www. I never know what the best practice is with file permissions, for example when I create a new Laravel 5 project.
Laravel 5 requires the /storage folder to be writable. I found plenty of different approaches to make it work, and I usually end up making it 777 recursively. I know that's not the best idea though.
The official doc says:
Laravel may require some permissions to be configured: folders within
storage and vendor require write access by the web server.
Does it mean that the web server needs access to the storage and vendor folders themselves too or just their current contents?
I assume that changing the owner instead of the permissions is much better. I changed the ownership of all Laravel's files recursively to _www:_www, and that made the site work correctly, as if I had chmodded it to 777. The problem is that now my text editor asks me for a password each time I want to save any file, and the same happens if I try to change anything in Finder, for example copy a file.
What is the correct approach to solve these problems?
1. Change chmod
2. Change the owner of the files to match that of the web server, and perhaps set the text editor (and Finder?) to skip asking for a password, or make them use sudo
3. Change the owner of the web server to match the OS user (I don't know the consequences)
4. Something else
Just to state the obvious for anyone viewing this discussion.... if you give any of your folders 777 permissions, you are allowing ANYONE to read, write and execute any file in that directory.... what this means is you have given ANYONE (any hacker or malicious person in the entire world) permission to upload ANY file, virus or any other file, and THEN execute that file...
IF YOU ARE SETTING YOUR FOLDER PERMISSIONS TO 777 YOU HAVE OPENED YOUR
SERVER TO ANYONE THAT CAN FIND THAT DIRECTORY. Clear enough??? :)
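One quick way to audit whether anything world-writable was left behind is a find sweep; a sketch in a scratch directory (in real use you would point it at /var/www):

```shell
demo=$(mktemp -d)
touch "$demo/safe" "$demo/bad"
chmod 644 "$demo/safe"
chmod 777 "$demo/bad"
# -perm -o+w matches anything the "other" class can write to; symlinks are skipped
find "$demo" -perm -o+w ! -type l
```

Anything this prints is writable by every user on the machine and deserves a second look.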
There are basically two ways to setup your ownership and permissions. Either you give yourself ownership or you make the webserver the owner of all files.
Webserver as owner (the way most people do it, and the Laravel doc's way):
assuming www-data (it could be something else) is your webserver user.
sudo chown -R www-data:www-data /path/to/your/laravel/root/directory
If you do that, the webserver owns all the files and is also the group, and you will have some problems uploading files or working with files via FTP, because your FTP client will be logged in as you, not as your webserver. So add your user to the webserver's user group:
sudo usermod -a -G www-data ubuntu
Of course, this assumes your webserver is running as www-data (the Homestead default), and your user is ubuntu (it's vagrant if you are using Homestead).
Then you set all your directories to 755 and your files to 644...
SET file permissions
sudo find /path/to/your/laravel/root/directory -type f -exec chmod 644 {} \;
SET directory permissions
sudo find /path/to/your/laravel/root/directory -type d -exec chmod 755 {} \;
Your user as owner
I prefer to own all the directories and files (it makes working with everything much easier), so, go to your laravel root directory:
cd /var/www/html/laravel   # assuming this is your current root directory
sudo chown -R $USER:www-data .
Then I give both myself and the webserver permissions:
sudo find . -type f -exec chmod 664 {} \;
sudo find . -type d -exec chmod 775 {} \;
Then give the webserver the rights to read and write to storage and cache
Whichever way you set it up, you then need to give the webserver read and write permissions for storage, cache and any other directories the webserver needs to upload to or write to (depending on your situation), so run the commands from bashy above:
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
Now, you're secure and your website works, AND you can work with the files fairly easily
The permissions for the storage and vendor folders should stay at 775, for obvious security reasons.
However, both your computer and your server's Apache need to be able to write in these folders. For example, when you run commands like php artisan, your computer needs to write to the log files in storage.
All you need to do is give ownership of the folders to Apache:
sudo chown -R www-data:www-data /path/to/your/project/vendor
sudo chown -R www-data:www-data /path/to/your/project/storage
Then you need to add your user (referenced by its username) to the group to which the server's Apache belongs, like so:
sudo usermod -a -G www-data userName
NOTE: Most frequently, the group name is www-data but in your case, replace it with _www
We've run into many edge cases when setting up permissions for Laravel applications. We create a separate user account (deploy) for owning the Laravel application folder and executing Laravel commands from the CLI, and run the web server under www-data. One issue this causes is that the log file(s) may be owned by www-data or deploy, depending on who wrote to the log file first, obviously preventing the other user from writing to it in the future.
I've found that the only sane and secure solution is to use Linux ACLs. The goal of this solution is:
To allow the user who owns/deploys the application read and write access to the Laravel application code (we use a user named deploy).
To allow the www-data user read access to Laravel application code, but not write access.
To prevent any other users from accessing the Laravel application code/data at all.
To allow both the www-data user and the application user (deploy) write access to the storage folder, regardless of which user owns the file (so both deploy and www-data can write to the same log file for example).
We accomplish this as follows:
All files within the application/ folder are created with the default umask of 0022, which results in folders having drwxr-xr-x permissions and files having -rw-r--r--.
sudo chown -R deploy:deploy application/ (or simply deploy your application as the deploy user, which is what we do).
chgrp www-data application/ to give the www-data group access to the application.
chmod 750 application/ to allow the deploy user read/write, the www-data user read-only, and to remove all permissions to any other users.
setfacl -Rdm u:www-data:rwx,u:deploy:rwx application/storage/ to set the default permissions on the storage/ folder and all subfolders. Any new folders/files created in the storage folder will inherit these permissions (rwx for both www-data and deploy).
setfacl -Rm u:www-data:rwX,u:deploy:rwX application/storage/ to set the above permissions on any existing files/folders.
This worked for me:
cd [..LARAVEL PROJECT ROOT]
sudo find . -type f -exec chmod 644 {} \;
sudo find . -type d -exec chmod 755 {} \;
sudo chmod -R 777 ./storage
sudo chmod -R 777 ./bootstrap/cache/
Only if you use npm (Vue, compiling Sass, etc.), also add this:
sudo chmod -R 777 ./node_modules/
What it does:
Change all file permissions to 644
Change all folder permissions to 755
For storage and bootstrap/cache (special folders Laravel uses for creating and executing files, not reachable from outside), set permissions to 777 for everything inside
For the node_modules folder, same as above
Note: you may not be able to, or may not need to, use the sudo prefix; it depends on your user's permissions, group, etc.
Change the permissions for your project folder to enable read/write/exec for any user within the group owning the directory (which in your case is _www):
chmod -R 775 /path/to/your/project
Then add your OS X username to the _www group to allow it access to the directory:
sudo dseditgroup -o edit -a yourusername -t user _www
The Laravel 5.4 docs say:
After installing Laravel, you may need to configure some permissions.
Directories within the storage and the bootstrap/cache directories
should be writable by your web server or Laravel will not run. If you
are using the Homestead virtual machine, these permissions should
already be set.
There are a lot of answers on this page that mention using 777 permissions. Don't do that. You'd be exposing yourself to hackers.
Instead, follow the suggestions by others about how to set permissions of 755 (or more restrictive). You may need to figure out which user your app is running as by running whoami in the terminal and then change ownership of certain directories using chown -R.
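A small sketch of that diagnosis step. Note that whoami reports your shell user, while the web app usually runs as a different one, so checking the worker processes is more reliable (the process names below are common examples, not a guarantee of what is running on your box):

```shell
# who am I in this shell?
me=$(whoami)
echo "shell user: $me"
# which users own web-server / PHP worker processes, if any are running?
ps -o user= -C nginx,apache2,httpd,php-fpm 2>/dev/null | sort -u || true
```

Whatever user the second command prints is the one that needs read (and, for storage, write) access to your application files.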
This is what worked for me:
cd /code/laravel_project
php artisan cache:clear
php artisan config:clear
sudo service php7.4-fpm stop
sudo service nginx stop
sudo chown -R $USER:www-data storage
sudo chown -R $USER:www-data bootstrap/cache
chmod -R 775 storage
chmod -R 755 bootstrap/cache
sudo service php7.4-fpm start && sudo service nginx start
inspired by https://stackoverflow.com/a/45673457/470749
If you do not have permission to use sudo as so many other answers require...
Your server is probably a shared host such as Cloudways.
(In my case, I had cloned my Laravel application into a second Cloudways server of mine, and it wasn't completely working because the permissions of the storage and bootstrap/cache directories were messed up.)
I needed to use:
Cloudways Platform > Server > Application Settings > Reset Permission
Then I could run php artisan cache:clear in the terminal.
Most folders should be the normal "755" and files "644".
Laravel requires some folders to be writable by the web server user. You can use these commands on Unix-based OSes.
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
I also follow the way to have user as owner, and user is member of www-data.
My sequence of command is a bit different:
cd /var/www/html/laravel-project-root
sudo chown -R $USER:www-data .
sudo find . -type f -exec chmod 664 {} \;
sudo find . -type d -exec chmod 775 {} \;
sudo find . -type d -exec chmod g+s {} \; <----- NOTE THIS
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
Please note the particularity of this answer: I am the only one (here) adding the group bit to every folder. This way, if someone or something creates a new subfolder, it automatically gets www-data as its group.
This matters when we are forced to deploy some userland files preloaded via FTP: new subfolders are then always owned by the www-data group, even when created by the FTP client.
Note also that
sudo chgrp -R www-data storage bootstrap/cache
is not needed if you run ALL of the commands in the list. But if you want to fix an FTP deploy, for example, and you didn't execute the 2nd line, then it is needed.
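The effect of that g+s (setgid) bit can be seen in a scratch directory: a subdirectory created afterwards inherits the parent's group rather than the creating user's primary group:

```shell
demo=$(mktemp -d)
mkdir "$demo/app"
chmod g+s "$demo/app"          # setgid on the parent directory
mkdir "$demo/app/uploads"      # simulate an FTP client creating a subfolder
parent_grp=$(stat -c '%G' "$demo/app")
child_grp=$(stat -c '%G' "$demo/app/uploads")
echo "parent=$parent_grp child=$child_grp"
rm -rf "$demo"
```

On a server where app/ has group www-data, the new uploads/ folder would come out with group www-data too, which is the whole point.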
As posted already:
All you need to do is give ownership of the folders to Apache:
but I added -R to the chown command:
sudo chown -R www-data:www-data /path/to/your/project/vendor
sudo chown -R www-data:www-data /path/to/your/project/storage
Add to composer.json
"scripts": {
"post-install-cmd": [
"chgrp -R www-data storage bootstrap/cache",
"chmod -R ug+rwx storage bootstrap/cache"
]
}
These then run automatically after composer install.
The solution posted by bgles is spot on for me in terms of correctly setting permissions initially (I use the second method), but it still has potential issues for Laravel.
By default, Apache will create files with 644 permissions. So that's pretty much anything in storage/. So, if you delete the contents of storage/framework/views, then access a page through Apache you will find the cached view has been created like:
-rw-r--r-- 1 www-data www-data 1005 Dec 6 09:40 969370d7664df9c5206b90cd7c2c79c2
If you run "artisan serve" and access a different page, you will get different permissions because CLI PHP behaves differently from Apache:
-rw-rw-r-- 1 user www-data 16191 Dec 6 09:48 2a1683fac0674d6f8b0b54cbc8579f8e
In itself this is no big deal, as you will not be doing any of this in production. But if Apache creates a file that subsequently needs to be written by the user, it will fail. And this can apply to cache files, cached views and logs when deploying with a logged-in user and artisan. A simple example is artisan cache:clear, which will fail to delete any cache files that are www-data:www-data 644.
This can be partially mitigated by running artisan commands as www-data, so you'll be doing/scripting everything like:
sudo -u www-data php artisan cache:clear
Or you'll avoid the tediousness of this and add this to your .bash_aliases:
alias art='sudo -u www-data php artisan'
This is good enough and is not affecting security in any way. But on development machines, running testing and sanitation scripts makes this unwieldy, unless you want to set up aliases to use 'sudo -u www-data' to run phpunit and everything else you check your builds with that might cause files to be created.
The solution is to follow the second part of bgles advice, and add the following to /etc/apache2/envvars, and restart (not reload) Apache:
umask 002
This will force Apache to create files as 664 by default. In itself, this can present a security risk. However, on the Laravel environments mostly being discussed here (Homestead, Vagrant, Ubuntu) the web server runs as user www-data under group www-data. So if you do not arbitrarily allow users to join www-data group, there should be no additional risk. If someone manages to break out of the webserver, they have www-data access level anyway so nothing is lost (though that's not the best attitude to have relating to security admittedly). So on production it's relatively safe, and on a single-user development machine, it's just not an issue.
Ultimately as your user is in www-data group, and all directories containing these files are g+s (the file is always created under the group of the parent directory), anything created by the user or by www-data will be r/w for the other.
And that's the aim here.
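The umask effect can be checked in isolation. In a subshell (so the change does not leak into your session), a umask of 002 makes newly created files 664:

```shell
demo=$(mktemp -d)
(
  umask 002                        # what the Apache envvars change applies server-wide
  touch "$demo/made-by-umask-002"
)
mode=$(stat -c '%a' "$demo/made-by-umask-002")
echo "$mode"                       # 666 & ~002 = 664
rm -rf "$demo"
```

That 664 is exactly the group-writable mode needed for the shared-group scheme above.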
edit
On investigating the above approach to setting permissions further, it still looks good enough, but a few tweaks can help:
By default, directories are 775 and files are 664 and all files have the owner and group of the user who just installed the framework. So assume we start from that point.
cd /var/www/projectroot
sudo chmod 750 ./
sudo chgrp www-data ./
First thing we do is block access to everyone else, and make the group to be www-data. Only the owner and members of www-data can access the directory.
sudo chmod 2775 bootstrap/cache
sudo chgrp -R www-data bootstrap/cache
To allow the webserver to create services.json and compiled.php, as suggested by the official Laravel installation guide. Setting the group sticky bit means these will be owned by the creator with a group of www-data.
find storage -type d -exec sudo chmod 2775 {} \;
find storage -type f -exec sudo chmod 664 {} \;
sudo chgrp -R www-data storage
We do the same thing with the storage folder to allow creation of cache, log, session and view files. We use find to explicitly set the directory permissions differently for directories and files. We didn't need to do this in bootstrap/cache as there aren't (normally) any sub-directories in there.
You may need to reapply any executable flags, and delete vendor/* and reinstall composer dependencies to recreate links for phpunit et al, eg:
chmod +x .git/hooks/*
rm -rf vendor/*
composer install -o
That's it. Except for the umask for Apache explained above, this is all that's required without making the whole projectroot writeable by www-data, which is what happens with other solutions. So it's marginally safer this way in that an intruder running as www-data has more limited write access.
end edit
Changes for Systemd
This applies to the use of php-fpm, but maybe others too.
The standard systemd service needs to be overridden, the umask set in the override.conf file, and the service restarted:
sudo systemctl edit php7.0-fpm.service
Use:
[Service]
UMask=0002
Then:
sudo systemctl daemon-reload
sudo systemctl restart php7.0-fpm.service
I installed Laravel on an EC2 instance and spent 3 days fixing the permission errors before finally solving them.
So I want to share this experience with others.
user problem
When I log in to the EC2 instance, my username is ec2-user and my usergroup is ec2-user.
The website, however, runs under the httpd user apache:apache,
so we should set the permissions for apache.
folder and file permission
A. folder structure
first, you should make sure that you have such folder structure like this under storage
storage
  framework
    cache
    sessions
    views
  logs
The folder structure can differ according to the Laravel version you use.
My Laravel version is 5.2; you can find the appropriate structure for your version.
B. permission
At first, I saw instructions to set 777 under storage to remove the "file_put_contents: failed to open stream" error.
So I set permission 777 on storage:
chmod -R 777 storage
But the error was not fixed.
Here, you should consider one thing: who writes the files to storage/sessions and views?
Not ec2-user, but apache.
Yes, right.
The apache user writes the files (session files, compiled view files) to the sessions and views folders.
So you should give apache write permission to these folders.
By default, SELinux says the /var/www folder should be read-only for the apache daemon.
For this, we could disable SELinux enforcement:
setenforce 0
This can solve the problem temporarily, but it also stopped MySQL from working for me,
so it is not a good solution.
Instead, you can set a read-write context on the storage folder (remember to setenforce 1 to test it out):
chcon -Rt httpd_sys_content_rw_t storage/
Then your problem will be fixed.
and don't forget this
composer update
php artisan cache:clear
These commands are useful before or after the changes above.
I hope this saves you some time.
Good luck. Hacken
I have used this snippet for more than 3 years.
laravel new demo
cd demo
sudo find ./ -type f -exec chmod 664 {} \;
sudo find ./ -type d -exec chmod 775 {} \;
sudo chgrp -Rf www-data storage bootstrap/cache
sudo chmod -Rf ug+rwx storage bootstrap/cache
sudo chmod -Rf 775 storage/ bootstrap/
php artisan storage:link
And #realtebo's recommendation looks good; I will try it:
sudo find . -type d -exec chmod g+s {} \; <----- NOTE THIS
This was written in 2017 for around version 5.1~5.2. Use some common sense before you use this on later versions of Laravel.
I decided to write my own script to ease some of the pain of setting up projects.
Run the following inside your project root:
wget -qO- https://raw.githubusercontent.com/defaye/bootstrap-laravel/master/bootstrap.sh | sh
Wait for the bootstrapping to complete and you're good to go.
Review the script before use.
I had the following configuration:
NGINX (running user: nginx)
PHP-FPM
And I applied the permissions correctly as #bgies suggested in the accepted answer. The problem in my case was php-fpm's configured running user and group, which were originally apache.
If you're using NGINX with php-fpm, you should open php-fpm's config file:
nano /etc/php-fpm.d/www.conf
And replace the user and group options' values with the ones NGINX is configured to work with; in my case, both were nginx:
...
; Unix user/group of processes
; Note: The user is mandatory. If the group is not set, the default user's group
; will be used.
; RPM: apache Choosed to be able to access some dir as httpd
user = nginx
; RPM: Keep a group allowed to write in log dir.
group = nginx
...
Save it and restart nginx and php-fpm services.
For Laravel developers, directory issues can be a bit of a pain. In my application, I was creating directories on the fly and moving files into them successfully in my local environment. Then on the server, I was getting errors while moving files into the newly created directories.
Here are the things that I have done and got a successful result at the end.
sudo find /path/to/your/laravel/root/directory -type f -exec chmod 664 {} \;
sudo find /path/to/your/laravel/root/directory -type d -exec chmod 775 {} \;
chcon -Rt httpd_sys_content_rw_t /path/to/my/file/upload/directory/in/laravel/project/
While creating the new directory on the fly, I used the command mkdir($save_path, 0755, true);
After making those changes on production server, I successfully created new directories and move files to them.
Finally, if you use File facade in Laravel you can do something like this:
File::makeDirectory($save_path, 0755, true);
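The shell-side analogue of mkdir($save_path, 0755, true) is mkdir -p with an explicit mode; sketched here in a scratch directory with a made-up upload path:

```shell
demo=$(mktemp -d)
save_path="$demo/uploads/2017/invoices"   # hypothetical upload path
# -p creates any missing parents; -m sets the mode of the final directory
mkdir -p -m 755 "$save_path"
mode=$(stat -c '%a' "$save_path")
echo "$mode"                              # prints: 755
rm -rf "$demo"
```

Note that -m only applies to the final directory; intermediate parents are created with the default mode.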
If you have code changes in some files mixed with permission changes, it might be easier to set up the right permissions and commit again than to try to pick out the files that only have permission changes.
You will then be left with only the code changes.
Solution 1
We will give 755 permissions to the storage and bootstrap/cache folders:
sudo chmod -R 755 bootstrap/cache
sudo chmod -R 755 storage
Solution 2
We will give ownership of all files and folders to the webserver user:
sudo chown -R www-data:www-data /var/www/your-project-path
Set 644 permissions for all files and 755 for all directories:
sudo find /var/www/your-project-path -type f -exec chmod 644 {} \;
sudo find /var/www/your-project-path -type d -exec chmod 755 {} \;
Then we give proper read and write permissions on the folders Laravel needs to write to:
cd /var/www/your-project-path
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
All of the above applies to an Ubuntu server.
I found an even better solution to this.
It's caused by PHP running as another user by default.
So to fix this, do:
sudo nano /etc/php/7.0/fpm/pool.d/www.conf
then edit:
user = "put the user that owns the directories"
group = "put the group that owns the directories"
then:
sudo systemctl reload php7.0-fpm
I'd do it like this:
sudo chown -R $USER:www-data laravel-project/
find laravel-project/ -type f -exec chmod 664 {} \;
find laravel-project/ -type d -exec chmod 775 {} \;
Then finally, you need to give the webserver permission to modify the storage and bootstrap/cache directories:
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
sudo chown -R $USER:www-data your_directory_name_under_storage_app
This gives your user and the server the required permissions for file access.
macOS Big Sur
In my case artisan couldn't access
/var/www/{project_name}/storage to create files, e.g. in the logs folder,
so I had to go to
/var manually and create the structure of folders that Laravel needs to make the symlink.
Add access to folder
sudo chgrp -R $USER /var/www/project_name
Then I could use
php artisan storage:link without any problems.
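For reference, what php artisan storage:link creates is just a symlink from public/storage to storage/app/public; roughly, in a scratch directory:

```shell
demo=$(mktemp -d)
mkdir -p "$demo/storage/app/public" "$demo/public"
# roughly what `php artisan storage:link` does under the hood
ln -s "$demo/storage/app/public" "$demo/public/storage"
target=$(readlink "$demo/public/storage")
echo "$target"
rm -rf "$demo"
```

Knowing this makes it obvious which folders have to exist (and be accessible) before the command can succeed.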

What is the lowest privileged user that node.js can run as on ubuntu?

I don't know very much about the different privilege levels.
What is the minimum user level required to run node.js, if all I need to do is serve static pages from a directory?
Likewise, what is the minimum user level required for the Dropbox daemon to run alongside node.js?
EDIT: For example, in this guide: https://www.linode.com/docs/security/securing-your-server/
One creates a new user to administer the system using this command:
usermod -a -G sudo exampleuser
How does one create a low level, less privileged user that Dropbox or node.js can run as ?
EDIT: I'm not sure which file permissions to set on which folders. I want Dropbox to be able to run and modify the contents of its folder. I want node.js to also have permission to read and execute files in that folder.
A lower level user would not be part of the sudo group. In general you need read access to any files, but not necessarily write.
Typically it is best practice to create a generic user with its own group in isolation.
Then you would setup a directory tree for just that user with its files.
You can either make the files group readable for the special group or world readable. The directories will also need execute permissions.
For the situation you are describing, I would run dropbox as a separate user as well and then make the dropbox folder either group or world readable. The dropbox user should also not be part of the sudo group.
Let's say you have 3 users:
Your primary login user called "user"
node user
dropbox user
Each user is in its own group with the same name. I'll assume your dropbox folder is named ~/Dropbox.
chown -R dropbox:node ~/Dropbox
find ~/Dropbox -type d -print0 | xargs -0 chmod 750
find ~/Dropbox -type f -print0 | xargs -0 chmod 640
mkdir ~/node
# copy files in here
chown -R node:node ~/node
find ~/node -type d -print0 | xargs -0 chmod 750
find ~/node -type f -print0 | xargs -0 chmod 640
Note that this turns off the executable bit on all binaries, so you may have to manually fix those. Also note that chmod and chown both require root permissions to modify other user files, so you'll probably need to add some sudo to those commands, but it was already getting long.
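A scratch-directory sketch of the 750/640 layout above, using find -exec … + so that names with spaces (common in a Dropbox folder) are handled safely:

```shell
demo=$(mktemp -d)
mkdir -p "$demo/Dropbox/Camera Uploads"
touch "$demo/Dropbox/Camera Uploads/photo 1.jpg"
# directories need the execute bit so group members can traverse them
find "$demo/Dropbox" -type d -exec chmod 750 {} +
find "$demo/Dropbox" -type f -exec chmod 640 {} +
dir_mode=$(stat -c '%a' "$demo/Dropbox/Camera Uploads")
file_mode=$(stat -c '%a' "$demo/Dropbox/Camera Uploads/photo 1.jpg")
echo "dirs=$dir_mode files=$file_mode"   # prints: dirs=750 files=640
rm -rf "$demo"
```

With 750/640 the owner has full access, the group can read (and traverse directories), and everyone else is locked out entirely.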

Linux Wordpress can't not write wp-config file

I installed the latest versions of Apache2 / PHP / MySQL on my PC.
In the directory /srv/www/htdocs I created a directory wordpress with all the WordPress files.
Then, when I tried to create the wp-config file through the web interface, I got this error:
Sorry, but I can't write the wp-config file.
I tried this command to change the group of /srv/www/htdocs/wordpress:
chown -R root:root /srv/www/htdocs/wordpress
But it did not work. After some research, I saw a lot of people saying to change the group to www-data, but I do not see www-data when using this command:
cut -d: -f1 /etc/group
Does anyone know what I am doing wrong?
Sorry about my poor English.
This is what worked for me. (I'm a beginner and I'm using Debian 8)
First I checked the Apache config file to find the group that apache uses. In my case it was www-data.
grep ^Group /etc/apache2/httpd.conf
I checked my /etc/group file and the group www-data was there.
Then I changed the group ownership of my wordpress directory from root to www-data.
chown -R root:www-data /var/www/html/wordpress
I changed the permissions of my wordpress folder to 755 for directories and 644 for files, following the recommendations of other sites.
find /var/www/html/wordpress -type d -exec chmod 755 {} \;
find /var/www/html/wordpress -type f -exec chmod 644 {} \;
But since we want to give write permissions to the group www-data for the wordpress folder and subfolders, I changed the permissions for directories to 775:
find /var/www/html/wordpress -type d -exec chmod 775 {} \;
Then it worked: "All right, sparky! You've made it through this part of the installation. WordPress can now communicate with your database. If you are ready, time now to…"
Notes: The username and password of my MySQL database were root/root. The username I was using to log in to my computer was root, and the password was something different from root. Just in case, I changed my password to root, so the credentials of MySQL and my local account are the same. I don't know if, by having the same name (root), they are the same account.
Sounds like www-data is not the group name used by apache on your system. To find what it actually uses, try the following:
ps xO gid | head -n 1; ps xO gid | grep httpd
(That's a capital O, not a zero). The column GID (probably the second column) is the numeric group ID that apache is running under. Look up its name in /etc/group.
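Once you have the numeric GID, getent resolves it to a name without reading /etc/group by hand (GID 0 is used here only because it exists on every Linux system):

```shell
# resolve a numeric GID to a group name; prints something like root:x:0:
getent group 0
name=$(getent group 0 | cut -d: -f1)
echo "$name"
```

In practice you would substitute the GID you saw in the ps output.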
unable to write to wp-config.php file.
I had this problem on Ubuntu 20.04 with WordPress 5.x. I solved the issue with these steps:
First, change the group ownership of the WordPress directory from root to www-data.
sudo chown -R root:www-data /var/www/html/wordpress
Then change your WordPress folders permission by this command.
sudo find /var/www/html/wordpress -type d -exec chmod 755 {} \;
Then change your WordPress files permission by this command.
sudo find /var/www/html/wordpress -type f -exec chmod 644 {} \;
Run the installation again.
Even though I had implemented the chown and chmod steps, the issue persisted. Then I found another scenario where this same error can occur: an SELinux context issue (a missing write-permission context).
To solve that, you can change the Apache/HTTPD write context within the web-content folder like this (specific to the Debian environment, as mentioned in this question):
chcon -R -t httpd_sys_rw_content_t /src/www/htdocs
OR (on CentOS/Fedora):
chcon -R -t httpd_sys_rw_content_t /var/www/html
You can validate the output like this (on CentOS/Fedora):
# pwd
# /var/www
Notice the output of below command having changed RW (Read/Write) context on html (web-content) folder:
# ls -ltrZ
# drwxrwxr-x. 2 apache apache system_u:object_r:httpd_sys_script_exec_t:s0 4096 Jul 9 20:45 cgi-bin
# drwxrwxr-x. 5 apache apache system_u:object_r:httpd_sys_rw_content_t:s0 4096 Jul 14 19:28 html
NOTE: Do validate the Path, Web Server User-Names, and SELINUX write-context as per your environment before executing above commands.
You need permission. Just open the terminal and execute: sudo chmod -R 777 /opt/lampp/htdocs/yourfolder

Magento file permissions change after system backup - Cause 500 error

I'm having problems with Magento when doing a system backup. Every time I do a system backup Magento changes the file permissions and causes a 500 server error when the backup has completed and the admin screen is reloaded.
The problem is the same as in this unanswered question (I am not setting maintenance mode on): https://stackoverflow.com/questions/13107963/magento-file-permissions-changing-to-chmod-666-after-system-backup
Can anyone tell me how to stop this from happening. It's a pain to have to reset the permissions every time I do a backup.
The problem comes about because Magento Backup sets permissions on files.
The offending piece of work is lib/Mage/Archive/Helper/File.php.
In it you find a function: public function open($mode = 'w+', $chmod = 0666)
This causes permissions to be changed globally.
If you must use Magento's site crippling backup, then you must run a script to set the file/folder permissions back.
ssh commands (can be run as a shell script)
For Magento where PHP is running through FastCGI, suPHP, or LSAPI:
find . -type f -exec chmod 644 {} \;
find . -type d -exec chmod 755 {} \;
chmod 500 pear
chmod 500 mage #for magento 1.5+
For Magento where PHP is running as a DSO module under Apache:
find . -type f -exec chmod 644 {} \;
find . -type d -exec chmod 755 {} \;
chmod o+w var var/.htaccess app/etc
chmod 550 pear
chmod 550 mage #for magento 1.5+
chmod -R o+w media
For Magento to function, var/ and media/ may need to be recursively set to 777
From MagentoCommerce permissions settings page which also has a php script linked about halfway down that can be run to set permissions.
The other option is to dump Magento's backup system and do a roll-your-own approach with tar and mysqldump. Note: this will require plenty of free space to temporarily store the tar file till you download it unless you exclude the media/ folder and do the media folder backup in a different manner.
tar -czf magentobu.tgz --exclude="public_html/var/*" --exclude="public_html/media/catalog/product/cache/*" public_html
mysqldump -u $USER -p$PASS $DBASE > magentodb.sql
And copy the resulting files over to your offsite storage, whether it's your workstation, an S3 bucket, or a VPS.
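The two commands above can be wrapped into a small date-stamped script. A sketch, with assumed paths and a made-up `backup_site` name; the exclude list mirrors the tar line above, and the mysqldump half is left as a comment since credentials are site-specific:

```shell
#!/bin/sh
# Sketch: date-stamped roll-your-own backup. Paths, DB credentials, and the
# exclude list are assumptions -- adjust them to your install.
backup_site() {
  src="$1"   # path to the docroot, e.g. /home/user/public_html
  dest="$2"  # where to write the archive
  name=$(basename "$src")
  stamp=$(date +%Y%m%d)
  tar -czf "$dest/magentobu-$stamp.tgz" \
      --exclude="$name/var/*" \
      --exclude="$name/media/catalog/product/cache/*" \
      -C "$(dirname "$src")" "$name"
  # Database half (run alongside the tar; $USER/$PASS/$DBASE as in the text):
  #   mysqldump -u "$USER" -p"$PASS" "$DBASE" > "$dest/magentodb-$stamp.sql"
}
```

Dating the filenames means old archives aren't overwritten, so you can keep a short rotation on the server before pulling them offsite.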
Another option is to set up a system you can rsync your Magento install to, along with a MySQL dump file pulled over once daily. Doing this, you get a continuous, up-to-the-minute full site mirror that doesn't take up extra storage on the server while you wait to pull the backup offsite.

Setting up permissions for WordPress on Amazon EC2 (Amazon Linux)

I set up WordPress on an Amazon EC2 instance. It's using Amazon Linux and is a standard setup (just php5 and mysql).
WordPress works fine, but there are some permission issues. Specifically, I can't upload media, update permalinks, plugins, etc. I have no write permission under the ec2-user, and because I uploaded all the files over WinSCP, the current owner is ec2-user.
My question is what's the best way to correct this issue? I could probably fix it by changing ownership of all folders/files to root, but that's not a very elegant or dynamic solution.
The path to my web directory is /var/www/html. Can I allow the ec2-user the correct permissions? Perhaps by having a group that both the Apache user and ec2-user share?
Any ideas would be appreciated
See http://blog.david-jensen.com/development/wordpress-amazon-ec2-apache-permissions-wordpress/ among other Google results. He looks to have had good luck:
I have been doing my best to figure out the Amazon EC2 Apache setup of
permissions to enable WordPress to be able to manage all of the files
on my Amazon EC2 instance without WordPress asking for FTP permissions
when I try to upload a plugin or theme via the Admin site. I ended up
having to give file and group ownership of the files in my html folder
to apache user for WordPress to run correctly.
http://www.chrisabernethy.com/why-wordpress-asks-connection-info/ and
its comments helped me reach this conclusion.
From the webpage:
Run
sudo chown -R apache:apache /vol/html
I then set permissions to what the hardening WordPress guide recommends for my html root as all my WordPress files are there as I am running MultiSite with multiple domains.
find /vol/html/ -type d -exec chmod 755 {} \;
find /vol/html/ -type f -exec chmod 644 {} \;
As apache doesn’t have a login I feel this is worth the risk though there is probably a better way to do this. I then added ec2-user to the apache group and changed the permissions of the wp-content folder to have group write permission 775.
sudo usermod -a -G apache ec2-user
sudo chmod -R 775 /vol/html/wp-content
This allows FileZilla or any other program logged in as ec2-user the ability to change files and folders in the wp-content folder only. If anyone has a better way of doing this I would like to know. I am only using SSH and SFTP to access the server with key files.
I set the owner to ec2-user:apache, then perform the hardening, then adjust the group read+write permissions for the folders.
sudo chown -R ec2-user:apache /vol/html
sudo chmod -R 755 /vol/html
sudo find /vol/html/ -type d -exec chmod 755 {} \;
sudo find /vol/html/ -type f -exec chmod 644 {} \;
sudo chgrp -R apache /vol/html
sudo chmod -R g+rw /vol/html
sudo chmod -R g+s /vol/html
Then edit /wordpress-install/wp-config.php and define FS_METHOD:
define('FS_METHOD', 'direct');
Now wordpress can update/upload, etc. And you can still SFTP files without changing the permissions every time.
I tried the solution provided in the answer by #markratledge for my AWS EC2 instance (Amazon Linux).
WordPress (apache) was fine, but SFTP (ec2-user) was giving permission errors.
Then I tried the following:
I added ec2-user to the apache group:
sudo usermod -a -G apache ec2-user
Next I set 'apache' as the owning user and 'ec2-user' as the owning group for the WordPress installation directory (/var/www/html in my case):
sudo chown -R apache:ec2-user /var/www/html
Finally, WordPress was happy and I could SFTP too. Thanks!
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/hosting-wordpress.html
To fix file permissions for the Apache web server
Some of the available features in WordPress require write access to
the Apache document root (such as uploading media though the
Administration screens). The web server runs as the apache user, so
you need to add that user to the www group that was created in the
LAMP web server tutorial.
Add the apache user to the www group:
[ec2-user wordpress]$ sudo usermod -a -G www apache
Change the file ownership of /var/www and its contents to the apache user:
[ec2-user wordpress]$ sudo chown -R apache /var/www
Change the group ownership of /var/www and its contents to the www group:
[ec2-user wordpress]$ sudo chgrp -R www /var/www
Change the directory permissions of /var/www and its subdirectories to add group write permissions and to set the group ID on future subdirectories:
[ec2-user wordpress]$ sudo chmod 2775 /var/www
[ec2-user wordpress]$ find /var/www -type d -exec sudo chmod 2775 {} \;
Recursively change the file permissions of /var/www and its subdirectories to add group write permissions:
[ec2-user wordpress]$ find /var/www -type f -exec sudo chmod 0664 {} \;
Restart the Apache web server to pick up the new group and permissions:
[ec2-user wordpress]$ sudo service httpd restart
Stopping httpd:     [ OK ]
Starting httpd:     [ OK ]
I came across this question searching for the answer. I set all user and group ownership to Apache. However, if I want to upload something via FTP, I have to SSH in, change ownership to ec2-user, upload the file, and change it back. I figured that was a small price to pay to keep the permissions at WordPress's recommended settings.