What permissions can I give my logs for Laravel to be able to read from them? - linux

I am trying to display my logs on my website to verified users in Laravel based on my role based access control.
$file = fopen("/var/log/auth.log", "r") or die();
$content = fread($file, filesize("/var/log/auth.log"));
fclose($file);
This hits me with an error:
fopen(/var/log/auth.log): failed to open stream: Permission denied
I can see that Laravel does not have the correct read permissions for this file, and I do not want to do a typical chmod -R 777 for security reasons. I am using nginx, but Laravel executes under php-fpm.
What user-group does my site execute in? What permissions should I give that user-group on my log files?

Try:
chown {your_user}:nginx /var/log/auth.log
chmod ug+rwx /var/log/auth.log
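If you only need read access to the log, a lighter-weight option is to find out which user your php-fpm pool runs as and grant that user read access through a group or an ACL instead of changing the file's mode. This is a hedged sketch: the www-data user, the php7.4-fpm service name, and the Debian/Ubuntu defaults (auth.log owned root:adm, mode 0640) are assumptions here.
ps -o user,group,cmd -C php-fpm        # confirm which user the pool workers run as
sudo usermod -a -G adm www-data        # adm can already read auth.log on Debian/Ubuntu (this also exposes other adm-readable logs)
sudo systemctl restart php7.4-fpm      # group changes only apply to newly started processes
# or, file-specific and without touching group membership:
sudo setfacl -m u:www-data:r /var/log/auth.log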

For this situation it is strongly recommended not to change permissions with something like chmod 0777 (the same as 777) or chmod 0755 (the same as 755), in order to avoid security vulnerabilities.
The files that the web server actually writes to live under your storage directory. You can simply change their owner to the web-server user (Apache or Nginx); in most cases that is www-data.
Also don't forget the bootstrapped cache files (configuration, services and packages) under the bootstrap/cache directory.
sudo chown -R www-data:www-data storage/ bootstrap/cache/
After this, when you want to run some artisan commands, you can either run them with sudo or make your current user the owner:
sudo chown -R $USER:$USER storage/ bootstrap/cache/
After running your command(s) you can revert the owner back to www-data (the 1st command).
The advantage of this method is that the ownership change will not be tracked by the version control system.

Related

Linux: How can I SSH connect using Apache user?

As a web developer I always have the problem when updating PHP (and other) files from an SSH client, because I am logged in as a user or simply root.
After that update I always have to run manually from a terminal 'chown -R apache:apache *' to make the files accessible.
I tried to make a user ID and add it to the group 'apache', and add the apache user to the group of my user ID. That works only for existing files on the server file system, because newly created files have permissions rwxr--r--, which does not allow writing by my user even though it is in the 'apache' group.
I'd like to make a login (shell is not needed) for the Apache user, so I can use an SSH based file browser like Forklift to login as Apache or use sshfs to mount as Apache user.
Another way would be to set a umask so that newly created files from the sshfs mount or a file browser (mounted with my user ID, not root) get permission rwxrwxr-x (i.e. 0775) by default.
Is there a way I can upload files to the server (updating existing ones or creating new ones) without having to worry about permissions for Apache?
You have to set the setgid bit on the shared directory.
For example, do the following steps:
adduser hugo                      # create your personal user
addgroup apache                   # create the group (skip if it already exists)
usermod -a -G apache hugo         # add hugo to the apache group
mkdir /tmp/example                # the shared directory
chown hugo:apache /tmp/example    # owned by hugo, group apache
chmod g+s /tmp/example            # setgid: new files inherit the apache group
su hugo
cd /tmp/example
touch my_file
ls -l                             # my_file is created with group apache
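The setgid bit only controls which group new files inherit (my_file above should show group apache); whether those files are group-writable still depends on the umask of the process creating them. A small hedged follow-up in hugo's shell:
umask 0002                                # e.g. in hugo's ~/.profile: new files become rw-rw-r--
touch another_file && ls -l another_file  # group apache (from g+s) and group-writable (from the umask)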

PhpStorm needs 777 permission to work [duplicate]

I'm using the Apache Web Server, which has its owner set to _www:_www. I never know what the best practice is with file permissions, for example when I create a new Laravel 5 project.
Laravel 5 requires the /storage folder to be writable. I found plenty of different approaches to make it work, and I usually end up recursively chmod-ing it to 777. I know it's not the best idea though.
The official doc says:
Laravel may require some permissions to be configured: folders within
storage and vendor require write access by the web server.
Does it mean that the web server needs access to the storage and vendor folders themselves too or just their current contents?
I assume that what is much better, is changing the owner instead of permissions. I changed all Laravel's files permissions recursively to _www:_www and that made the site work correctly, as if I changed chmod to 777. The problem is that now my text editor asks me for password each time I want to save any file and the same happens if I try to change anything in Finder, like for example copy a file.
What is the correct approach to solve these problems?
1. Change chmod
2. Change the owner of the files to match those of the web server, and perhaps set the text editor (and Finder?) to skip asking for a password, or make them use sudo
3. Change the owner of the web server to match the OS user (I don't know the consequences)
4. Something else
Just to state the obvious for anyone viewing this discussion.... if you give any of your folders 777 permissions, you are allowing ANYONE to read, write and execute any file in that directory.... what this means is you have given ANYONE (any hacker or malicious person in the entire world) permission to upload ANY file, virus or any other file, and THEN execute that file...
IF YOU ARE SETTING YOUR FOLDER PERMISSIONS TO 777 YOU HAVE OPENED YOUR
SERVER TO ANYONE THAT CAN FIND THAT DIRECTORY. Clear enough??? :)
There are basically two ways to setup your ownership and permissions. Either you give yourself ownership or you make the webserver the owner of all files.
Webserver as owner (the way most people do it, and the Laravel doc's way):
assuming www-data (it could be something else) is your webserver user.
sudo chown -R www-data:www-data /path/to/your/laravel/root/directory
If you do that, the webserver owns all the files and is also the group, and you will have some problems uploading or working with files via FTP, because your FTP client will be logged in as you, not your webserver. So add your user to the webserver user group:
sudo usermod -a -G www-data ubuntu
Of course, this assumes your webserver is running as www-data (the Homestead default), and your user is ubuntu (it's vagrant if you are using Homestead).
Then you set all your directories to 755 and your files to 644...
SET file permissions
sudo find /path/to/your/laravel/root/directory -type f -exec chmod 644 {} \;
SET directory permissions
sudo find /path/to/your/laravel/root/directory -type d -exec chmod 755 {} \;
Your user as owner
I prefer to own all the directories and files (it makes working with everything much easier), so, go to your laravel root directory:
cd /var/www/html/laravel    # assuming this is your current root directory
sudo chown -R $USER:www-data .
Then I give both myself and the webserver permissions:
sudo find . -type f -exec chmod 664 {} \;
sudo find . -type d -exec chmod 775 {} \;
Then give the webserver the rights to read and write to storage and cache
Whichever way you set it up, you then need to give the webserver read and write permissions for storage, cache and any other directories the webserver needs to upload to or write to (depending on your situation), so run the commands from bashy above:
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
Now, you're secure and your website works, AND you can work with the files fairly easily
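To sanity-check the result, a hedged sketch (it assumes the www-data group and that you also ran the chgrp/chmod lines on storage and bootstrap/cache above):
id                                 # your user should now list www-data among its groups
ls -ld storage bootstrap/cache     # both should show group www-data with group write
sudo -u www-data touch storage/logs/_perm_test && sudo -u www-data rm storage/logs/_perm_test   # webserver can create and remove files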
The permissions for the storage and vendor folders should stay at 775, for obvious security reasons.
However, both your computer and your Apache server need to be able to write in these folders. For example, when you run commands like php artisan, your computer needs to write to the log files in storage.
All you need to do is give ownership of the folders to Apache:
sudo chown -R www-data:www-data /path/to/your/project/vendor
sudo chown -R www-data:www-data /path/to/your/project/storage
Then you need to add your user (referenced by its username) to the group to which the Apache server belongs, like so:
sudo usermod -a -G www-data userName
NOTE: Most frequently, the group name is www-data but in your case, replace it with _www
We've run into many edge cases when setting up permissions for Laravel applications. We create a separate user account (deploy) for owning the Laravel application folder and executing Laravel commands from the CLI, and run the web server under www-data. One issue this causes is that the log file(s) may be owned by www-data or deploy, depending on who wrote to the log file first, obviously preventing the other user from writing to it in the future.
I've found that the only sane and secure solution is to use Linux ACLs. The goal of this solution is:
To allow the user who owns/deploys the application read and write access to the Laravel application code (we use a user named deploy).
To allow the www-data user read access to Laravel application code, but not write access.
To prevent any other users from accessing the Laravel application code/data at all.
To allow both the www-data user and the application user (deploy) write access to the storage folder, regardless of which user owns the file (so both deploy and www-data can write to the same log file for example).
We accomplish this as follows:
All files within the application/ folder are created with the default umask of 0022, which results in folders having drwxr-xr-x permissions and files having -rw-r--r--.
sudo chown -R deploy:deploy application/ (or simply deploy your application as the deploy user, which is what we do).
chgrp www-data application/ to give the www-data group access to the application.
chmod 750 application/ to allow the deploy user read/write, the www-data user read-only, and to remove all permissions to any other users.
setfacl -Rdm u:www-data:rwx,u:deploy:rwx application/storage/ to set the default permissions on the storage/ folder and all subfolders. Any new folders/files created in the storage folder will inherit these permissions (rwx for both www-data and deploy).
setfacl -Rm u:www-data:rwX,u:deploy:rwX application/storage/ to set the above permissions on any existing files/folders.
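To confirm the ACLs took effect, and to see the default entries that new files will inherit, you can read them back with getfacl (same application/ path assumed as above):
getfacl application/storage/      # expect user:www-data:rwx and user:deploy:rwx entries,
                                  # plus matching default: lines that new files will inherit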
This worked for me:
cd [..LARAVEL PROJECT ROOT]
sudo find . -type f -exec chmod 644 {} \;
sudo find . -type d -exec chmod 755 {} \;
sudo chmod -R 777 ./storage
sudo chmod -R 777 ./bootstrap/cache/
Only if you use npm (VUE, compiling SASS, etc..) add this:
sudo chmod -R 777 ./node_modules/
What it does:
Change all file permissions to 644
Change all folder permissions to 755
For storage and bootstrap/cache (special folders used by Laravel for creating and executing files, not reachable from outside), set the permission to 777 for everything inside
For the Node.js modules, same as above
Note: you may not be able to, or may not need to, use the sudo prefix; it depends on your user's permissions, group, etc.
Change the permissions for your project folder to enable read/write/exec for any user within the group owning the directory (which in your case is _www):
chmod -R 775 /path/to/your/project
Then add your OS X username to the _www group to allow it access to the directory:
sudo dseditgroup -o edit -a yourusername -t user _www
The Laravel 5.4 docs say:
After installing Laravel, you may need to configure some permissions.
Directories within the storage and the bootstrap/cache directories
should be writable by your web server or Laravel will not run. If you
are using the Homestead virtual machine, these permissions should
already be set.
There are a lot of answers on this page that mention using 777 permissions. Don't do that. You'd be exposing yourself to hackers.
Instead, follow the suggestions by others about how to set permissions of 755 (or more restrictive). You may need to figure out which user your app runs as: whoami in the terminal tells you the user for CLI/artisan commands, while web requests run as the web server or php-fpm user. Then change ownership of certain directories using chown -R.
This is what worked for me:
cd /code/laravel_project
php artisan cache:clear
php artisan config:clear
sudo service php7.4-fpm stop
sudo service nginx stop
sudo chown -R $USER:www-data storage
sudo chown -R $USER:www-data bootstrap/cache
chmod -R 775 storage
chmod -R 755 bootstrap/cache
sudo service php7.4-fpm start && sudo service nginx start
inspired by https://stackoverflow.com/a/45673457/470749
If you do not have permission to use sudo as so many other answers require...
Your server is probably a shared host such as Cloudways.
(In my case, I had cloned my Laravel application into a second Cloudways server of mine, and it wasn't completely working because the permissions of the storage and bootstrap/cache directories were messed up.)
I needed to use:
Cloudways Platform > Server > Application Settings > Reset Permission
Then I could run php artisan cache:clear in the terminal.
Most folders should be normal "755" and files, "644"
Laravel requires some folders to be writable for the web server user. You can use this command on unix based OSs.
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
I also follow the approach of having my user as the owner, with that user a member of www-data.
My sequence of commands is a bit different:
cd /var/www/html/laravel-project-root
sudo chown -R $USER:www-data .
sudo find . -type f -exec chmod 664 {} \;
sudo find . -type d -exec chmod 775 {} \;
sudo find . -type d -exec chmod g+s {} \; <----- NOTE THIS
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
Please note what is particular about this answer: I am the only one here adding the group bit to every folder. This way, if someone or something creates a new subfolder, it automatically gets www-data as its group.
This matters when we are forced to deploy some userland files preloaded over FTP: new subfolders are then always owned by the www-data group, even if created by the FTP client.
Note also that
sudo chgrp -R www-data storage bootstrap/cache
is not needed if you run ALL of the commands in the list. But if you want to fix an FTP deploy, for example, and you didn't execute the 2nd line, then it is needed.
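A quick way to check that the setgid bit actually landed is to look for the lowercase 's' in the group position of the mode (a hedged check, run from the project root used above):
ls -ld storage bootstrap/cache      # drwxrwsr-x -- the 's' means g+s is set
find . -type d ! -perm -g+s         # prints nothing if every folder got the bit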
As posted already:
All you need to do is give ownership of the folders to Apache:
but I added -R to the chown command:
sudo chown -R www-data:www-data /path/to/your/project/vendor
sudo chown -R www-data:www-data /path/to/your/project/storage
Add to composer.json
"scripts": {
"post-install-cmd": [
"chgrp -R www-data storage bootstrap/cache",
"chmod -R ug+rwx storage bootstrap/cache"
]
}
These scripts then run automatically after composer install.
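A hedged usage example (the flags are optional; the point is simply that Composer's post-install-cmd event fires at the end of the install):
composer install --no-dev -o      # the chgrp/chmod lines above run automatically afterwards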
The solution posted by bgles is spot on for me in terms of correctly setting permissions initially (I use the second method), but it still has potential issues for Laravel.
By default, Apache will create files with 644 permissions. So that's pretty much anything in storage/. So, if you delete the contents of storage/framework/views, then access a page through Apache you will find the cached view has been created like:
-rw-r--r-- 1 www-data www-data 1005 Dec 6 09:40 969370d7664df9c5206b90cd7c2c79c2
If you run "artisan serve" and access a different page, you will get different permissions because CLI PHP behaves differently from Apache:
-rw-rw-r-- 1 user www-data 16191 Dec 6 09:48 2a1683fac0674d6f8b0b54cbc8579f8e
In itself this is no big deal as you will not be doing any of this in production. But if Apache creates a file that subsequently needs to be written by the user, it will fail. And this can apply to cache files, cached views and logs when deploying using a logged-in user and artisan. A facile example being "artisan cache:clear" which will fail to delete any cache files that are www-data:www-data 644.
This can be partially mitigated by running artisan commands as www-data, so you'll be doing/scripting everything like:
sudo -u www-data php artisan cache:clear
Or you'll avoid the tediousness of this and add this to your .bash_aliases:
alias art='sudo -u www-data php artisan'
This is good enough and is not affecting security in any way. But on development machines, running testing and sanitation scripts makes this unwieldy, unless you want to set up aliases to use 'sudo -u www-data' to run phpunit and everything else you check your builds with that might cause files to be created.
The solution is to follow the second part of bgles' advice, and add the following to /etc/apache2/envvars, then restart (not reload) Apache:
umask 002
This will force Apache to create files as 664 by default. In itself, this can present a security risk. However, on the Laravel environments mostly being discussed here (Homestead, Vagrant, Ubuntu) the web server runs as user www-data under group www-data. So if you do not arbitrarily allow users to join www-data group, there should be no additional risk. If someone manages to break out of the webserver, they have www-data access level anyway so nothing is lost (though that's not the best attitude to have relating to security admittedly). So on production it's relatively safe, and on a single-user development machine, it's just not an issue.
Ultimately as your user is in www-data group, and all directories containing these files are g+s (the file is always created under the group of the parent directory), anything created by the user or by www-data will be r/w for the other.
And that's the aim here.
edit
On investigating the above approach to setting permissions further, it still looks good enough, but a few tweaks can help:
By default, directories are 775 and files are 664 and all files have the owner and group of the user who just installed the framework. So assume we start from that point.
cd /var/www/projectroot
sudo chmod 750 ./
sudo chgrp www-data ./
First thing we do is block access to everyone else, and make the group to be www-data. Only the owner and members of www-data can access the directory.
sudo chmod 2775 bootstrap/cache
sudo chgrp -R www-data bootstrap/cache
To allow the webserver to create services.json and compiled.php, as suggested by the official Laravel installation guide. Setting the setgid bit means these will be owned by the creator, with a group of www-data.
find storage -type d -exec sudo chmod 2775 {} \;
find storage -type f -exec sudo chmod 664 {} \;
sudo chgrp -R www-data storage
We do the same thing with the storage folder to allow creation of cache, log, session and view files. We use find to explicitly set the directory permissions differently for directories and files. We didn't need to do this in bootstrap/cache as there aren't (normally) any sub-directories in there.
You may need to reapply any executable flags, and delete vendor/* and reinstall composer dependencies to recreate links for phpunit et al, eg:
chmod +x .git/hooks/*
rm -rf vendor/*
composer install -o
That's it. Except for the umask for Apache explained above, this is all that's required without making the whole projectroot writeable by www-data, which is what happens with other solutions. So it's marginally safer this way in that an intruder running as www-data has more limited write access.
end edit
Changes for Systemd
This applies to the use of php-fpm, but maybe others too.
The standard systemd service needs to be overridden, the umask set in the override.conf file, and the service restarted:
sudo systemctl edit php7.0-fpm.service
Use:
[Service]
UMask=0002
Then:
sudo systemctl daemon-reload
sudo systemctl restart php7.0-fpm.service
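You can check that the override is active straight away, and then confirm the effect on a file the application writes (a hedged check; the unit name varies with your PHP version):
systemctl show php7.0-fpm.service --property=UMask    # should print UMask=0002
# then trigger a request that writes a cache/log file and check it comes out group-writable:
ls -l storage/logs/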
I installed Laravel on an EC2 instance and spent 3 days fixing the permission error before finally getting it right.
So I want to share this experience with others.
user problem
When I logged in to the EC2 instance, my username was ec2-user and my user group was ec2-user.
The website runs under the httpd user: apache:apache,
so we should set the permissions for apache.
folder and file permission
A. folder structure
First, you should make sure that you have a folder structure like this under storage:
storage
  framework
    cache
    sessions
    views
  logs
The folder structure can differ according to the Laravel version you use; my Laravel version is 5.2, and you can find the appropriate structure for your version.
B. permission
At first, I saw instructions to set 777 under storage to remove the "file_put_contents: failed to open stream" error.
So I set 777 permissions on storage:
chmod -R 777 storage
But the error was not fixed.
Here you should consider one thing: who writes files to storage/sessions and views?
It is not ec2-user, but apache.
Yes, right.
The apache user writes the files (session files, compiled view files) to the sessions and views folders.
So you should give apache write permission to these folders.
By default, SELinux says the /var/www folder should be read-only for the apache daemon.
So we could set SELinux to permissive mode:
setenforce 0
This can solve the problem temporarily, but it made MySQL stop working for me,
so it is not a good solution.
You can set a read-write context to the storage folder with: (remember to setenforce 1 to test it out)
chcon -Rt httpd_sys_content_rw_t storage/
Then your problem will be fixed.
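Keep in mind that chcon only relabels the files as they are now, and the label can be lost on a filesystem relabel. To make it persistent you can record a file-context rule and reapply it; a hedged sketch that assumes the semanage tool is installed and that the project lives under /var/www/project (adjust the path, and use whichever read-write type chcon accepted above):
sudo semanage fcontext -a -t httpd_sys_rw_content_t "/var/www/project/storage(/.*)?"
sudo restorecon -Rv /var/www/project/storage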
And don't forget these:
composer update
php artisan cache:clear
These commands will be useful before or after.
I hope this saves you some time.
Good luck. Hacken
I have used this snippet for more than 3 years.
laravel new demo
cd demo
sudo find ./ -type f -exec chmod 664 {} \;
sudo find ./ -type d -exec chmod 775 {} \;
sudo chgrp -Rf www-data storage bootstrap/cache
sudo chmod -Rf ug+rwx storage bootstrap/cache
sudo chmod -Rf 775 storage/ bootstrap/
php artisan storage:link
And #realtebo's recommendation looks good; I will try it:
sudo find . -type d -exec chmod g+s {} \; <----- NOTE THIS
This was written in 2017 for around version 5.1~5.2. Use some common sense before you use this on later versions of Laravel.
I decided to write my own script to ease some of the pain of setting up projects.
Run the following inside your project root:
wget -qO- https://raw.githubusercontent.com/defaye/bootstrap-laravel/master/bootstrap.sh | sh
Wait for the bootstrapping to complete and you're good to go.
Review the script before use.
I had the following configuration:
NGINX (running user: nginx)
PHP-FPM
And applied the permissions correctly as #bgies suggested in the accepted answer. The problem in my case was php-fpm's configured running user and group, which were originally apache.
If you're using NGINX with php-fpm, you should open php-fpm's config file:
nano /etc/php-fpm.d/www.conf
And replace the user and group options' values with the ones NGINX is configured to run as; in my case, both were nginx:
...
; Unix user/group of processes
; Note: The user is mandatory. If the group is not set, the default user's group
; will be used.
; RPM: apache Choosed to be able to access some dir as httpd
user = nginx
; RPM: Keep a group allowed to write in log dir.
group = nginx
...
Save it and restart nginx and php-fpm services.
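A quick way to confirm the pool workers picked up the new identity after the restart (hedged; the master process normally still shows root):
ps aux | grep '[p]hp-fpm'      # worker lines should now show nginx in the first column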
For Laravel developers, directory issues can be a little bit of a pain. In my application, I was creating directories on the fly and moving files into them successfully in my local environment. Then on the server, I was getting errors while moving files into a newly created directory.
Here are the things I did to get a successful result in the end.
sudo find /path/to/your/laravel/root/directory -type f -exec chmod 664 {} \;
sudo find /path/to/your/laravel/root/directory -type d -exec chmod 775 {} \;
chcon -Rt httpd_sys_content_rw_t /path/to/my/file/upload/directory/in/laravel/project/
While creating the new directory on the fly, I used the command mkdir($save_path, 0755, true);
After making those changes on the production server, I could successfully create new directories and move files into them.
Finally, if you use File facade in Laravel you can do something like this:
File::makeDirectory($save_path, 0755, true);
If you have code changes in some files and permission changes as well, it might be easier to set the right permissions and commit again than to try to pick out the files that only have permission changes.
You will then be left with only the code changes.
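If the noise is purely about mode bits, note that the only permission change git records is the executable bit, and you can tell git to ignore it per repository (a hedged option; it also hides genuine +x changes):
git config core.fileMode false      # stop recording executable-bit changes as diffs
git status                          # permission-only changes should no longer appear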
Solution 1
We will give 755 permissions to the storage and bootstrap/cache folders:
sudo chmod -R 755 bootstrap/cache
sudo chmod -R 755 storage
Solution 2
We will give ownership of all files and folders to the webserver user:
sudo chown -R www-data:www-data /var/www/your-project-path
Set 644 permissions for all files and 755 for all directories:
sudo find /var/www/your-project-path -type f -exec chmod 644 {} \;
sudo find /var/www/your-project-path -type d -exec chmod 755 {} \;
Then we give proper read and write permissions for the storage and cache folders:
cd /var/www/your-project-path
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
All of the above applies to an Ubuntu server.
I found an even better solution to this.
It's caused by PHP running as another user by default.
So to fix this, do:
sudo nano /etc/php/7.0/fpm/pool.d/www.conf
then edit these lines:
user = "put user that owns the directories"
group = "put user that owns the directories"
then:
sudo systemctl reload php7.0-fpm
I'd do it like this:
sudo chown -R $USER:www-data laravel-project/
find laravel-project/ -type f -exec chmod 664 {} \;
find laravel-project/ -type d -exec chmod 775 {} \;
then finally, you need to give the webserver permission to modify the storage and bootstrap/cache directories:
sudo chgrp -R www-data storage bootstrap/cache
sudo chmod -R ug+rwx storage bootstrap/cache
sudo chown -R $USER:www-data your_directory_name_under_storage_app
This gives the required permissions to both your user and the server for file access.
Mac OS Big Sur
In my case artisan couldn't access
/var/www/{project_name}/storage to create files, e.g. in the /logs folder,
so I had to manually go to
/var and create the structure of folders that Laravel needs to make the symlink.
Add access to the folder:
sudo chgrp -R $USER /var/www/project_name
Then I could use
php artisan storage:link without any problems.

Add permission to two users (my apache server and myself)

I want my PHP script to be able to create, edit, and delete files, so I need to give it permissions to do so in Linux.
I've done this following one of the Stack Overflow answers, with this code:
sudo chown -R www-data:www-data .
But when I do so, I lose my user access to files - so I can't open them with gedit for example until I change permissions back like so:
sudo chown -R igor /var/www/html/demo/myDir
I think I need to give permission to Apache, but leave my access as well. I feel there is some easy answer to make it work, but I can't find one. Any suggestions?
You are changing the owner of the files; if you want to change the permissions of the files without changing the owner, you need to use chmod.
For example, if you want read, write and execute on the current folder you can use: chmod 777 .
If what you want is for the two users to have the same permissions over the folder, you could add your user to the www-data group (assuming you are in the files' folder):
sudo usermod -a -G www-data youruser
sudo chgrp -R www-data .
sudo chmod -R 770 .
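One thing that often trips people up: the usermod change only applies to new login sessions, so log out and back in (or use newgrp) before testing. A hedged check:
id                   # www-data should appear in your groups list after re-login
newgrp www-data      # or start a subshell with the new group active immediately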

Chown a specific folder without root privileges

I need to chown a file to some other user, and make sure it is unwritable again. Sounds complicated, but it will mainly look like this:
cd /readonly
wget ...myfile
cd /workdir
chmod -R 444 /readonly
chown -R anotheruser /readonly
ls /readonly # OK
echo 123 > /readonly/newfile # Should not be allowed
cat /readonly/myfile # OK
chmod 777 /readonly # Should not be allowed
In SunOS I saw something similar to this; I remember not being able to delete the files disowned to Apache, but I could not find anything similar in Linux, as chown requires root privileges.
The reason I need this: I will fetch some files from the web and make sure they are unchangeable by the rest of the script; only root can change them. The script definitely cannot run as root.
On many *nixes (Linux, at the very least), this will be impossible.
chown is a privilege restricted to root, since otherwise you could pawn off your files on other users to avoid quota restrictions.
In a related case, it would also pose something of a semantic problem if arbitrary users could chown files to themselves to gain access.
More precisely, you can chown files that you own to change their group ownership information, but you can only change user ownership if you are root.
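For illustration (hedged; the group and file names are just examples), the distinction looks like this for a non-root user who is a member of www-data:
chown :www-data myfile        # allowed: you may change the group to one you belong to
chown otheruser myfile        # fails with 'Operation not permitted' unless you are root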
In any case, chown is the wrong hammer for this particular nail.
chmod, which you are already using, is the correct way to make a file read-only within a script.
The chmod 444 that you are already doing will protect against accidental modifications to the files.
You cannot "freeze" or otherwise render permissions static as a Unix/Linux user without elevating to root privileges (at which point, you can chown them to root:root and no one other than root can change permissions or ownership on them).
In terms of script design, you should not need to be more restrictive than this.
If your script is haphazardly chmoding or rm -fing files, then you have much more serious correctness problems to worry about than ensuring that the downloaded data is safe and sound.

Setting Drupal root directory to 777

A company we are working with is having permissions issues when uploading files (via FTP). We found a workaround of setting everything to 777 (not my first choice, but ease of use trumps security here).
The problem with this is that Drupal breaks upon putting the root directory as 777.
Why is this? How can I change that?
Typically your files directory should be:
chmod -R 775 files
But also make sure your owner and group are correct. The owner in this case should be your FTP user, and the group should be the Apache user.
chgrp -R apache_user files
chown -R ftp_user files
Are you having problems uploading files with FTP, or with Drupal? Drupal needs write permission on sites/default/files to save images, CSS, etc.
Maybe it's a problem with the owner too?
Check this page: http://drupal.org/node/244924
The problem with this is that Drupal breaks upon putting the root
directory as 777.
Actually you do not need to change the Drupal root directory, only the sites/default/files directory.
If you want to do that the easiest way, change this directory's permissions to 777:
cd <your Drupal root>/sites/default
chmod -R 777 files
The secure way is to set your WWW user (e.g. www-data) as the owner of this directory:
cd <your Drupal root>/sites/default
chown -R www-data files
chmod -R 775 files
Also you can add your group (e.g. my_group):
chgrp -R my_group files
