Not able to delete directory - Linux

I am having a frequent problem with my web hosting (it's shared).
I am not able to delete or change permissions for a particular directory. The response is:
Cannot delete. Directory may not be empty
I checked the permissions and they look OK. There are hundreds of files in this folder which I don't want.
I contacted support and they solved it, saying it was a permission issue. But it reappeared. Any suggestions?
The server is Linux.

You can't rmdir a directory with files in it. You must first rm all files and subdirectories. Many times, the easiest solution is:
$ rm -rf old_directory
It's entirely possible that some of the files or subdirectories have permission limitations that might prevent them from being removed. Occasionally, this can be solved with:
$ chmod -R +w old_directory
But I suspect that's what your support people did earlier.

This could also be because your FTP client isn't showing hidden files (like cache files, or any hidden files your application might create), and those hidden files are preventing you from deleting the directory. Though in your case I'm not sure that's the cause; it could be a permission issue with your hosting provider, e.g. the web server running as another user (like apache or www) combined with your directories having global write permissions.
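If you can run a PHP script on the host, one way to reveal entries your FTP client hides is a quick listing from PHP's point of view; a minimal sketch, where old_directory is a placeholder for your actual path:
<?php
// List every entry in the directory, including dotfiles (.htaccess,
// cache files, etc.) that an FTP client may be configured to hide.
// "old_directory" is a placeholder path, not from the question.
print "<pre>" . print_r(scandir('old_directory'), true) . "</pre>";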

I assume that's a response from an FTP server?
Usually, a message from an FTP server means what it says. If it says the directory is not empty, there may be files you cannot see in the directory, which may be any of:
Your PHP/JSP/ASP/whatever scripts may run under a different user account, thus creating files which you may not be able to see or delete
Is your hosting's web interface run under your FTP account? There might be conflicting permissions there if you manage some files from the web interface and others via FTP.
Hosting server/operating system files created unintentionally, e.g. by the hosting's web interface
If the files come from a script, write a one-time throw-away script that deletes the files and that directory, then upload and execute it (see the sketch below).
And just to be sure: some FTP servers don't support direct directory deletion; you need to delete all the files first. Is that the case?
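For the throw-away script route, a minimal PHP sketch (assuming a PHP-capable host; old_directory is a placeholder for the real path, and the script should be deleted after one use):
<?php
// One-time cleanup script: upload it next to the stubborn directory,
// open it once in the browser, then delete the script itself.
// It runs as the web server user, so it can remove files that your
// FTP account cannot. "old_directory" is a placeholder path.
function removeDir($dir) {
    foreach (scandir($dir) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $path = $dir . DIRECTORY_SEPARATOR . $entry;
        if (is_dir($path) && !is_link($path)) {
            removeDir($path);  // recurse into subdirectories
        } else {
            unlink($path);     // remove files and symlinks
        }
    }
    rmdir($dir);               // the directory is empty now
}
removeDir(__DIR__ . '/old_directory');
echo 'done';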

Related

Retrieving files from a NAS on a Linux network in PHP

I'm working on a PHP project where a particular feature has to access files stored in an external directory: Network Attached Storage (Linux). Let's say the path is /volume1/accounts, and this is mounted on the Linux server where my site is hosted using Apache. I will have to retrieve files from that directory. Is there a way in PHP to do that? My client says it has already been mounted.
No matter what I do, I can't access it using these test codes:
print "<pre>".print_r(scandir("/volume1/accounts/"), true)."</pre>";
print "<pre>".print_r(scandir("192.168.0.233/volume1/accounts"), true)."</pre>";
print "<pre>".print_r(scandir("192.168.0.233:/volume1/accounts"), true)."</pre>";
How am I supposed to do it? Please help me.
Generally, the PHP engine runs with the Apache server's privileges. If the mounted directory grants no permissions or ownership to the Apache user, PHP won't be able to list its files. Could you try creating a directory under /volume1/accounts/ and changing its ownership and permissions? If the Apache server runs with apache:apache ownership, change the directory's ownership to match.
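A quick way to confirm this from PHP's side is to check which user the engine runs as and whether the mount point is even visible to it; a minimal diagnostic sketch, assuming the POSIX extension is available (the path comes from the question):
<?php
// Diagnostic: which user does PHP run as, and can it see the mount?
$dir = '/volume1/accounts/';
$user = posix_getpwuid(posix_geteuid());  // needs the POSIX extension
echo 'PHP runs as: ' . $user['name'] . "\n";
var_dump(file_exists($dir));   // false => mount not visible to this process
var_dump(is_readable($dir));   // false => ownership/permission problem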

Linux file permissions and Java problems (permission retention)

I run servers on my Linux server (Ubuntu), and there's a bit of a problem. It may seem simple to fix, but I don't think it is. The servers run under my username (server), while others access certain files as different users via FTP. Because the server runs under my username, whenever a plugin creates new files, the others do not have permission to edit them.
I have tried putting the users into groups and then allowing group access to that folder (even for new files), but had no luck. Every time they need to edit the files, I have to chmod -R 777 it.
I thought about running the servers under their usernames, but that would produce complications. Is it actually possible to make new files retain the permissions of the parent (or a top folder)? None of the solutions I've found seem to work.
Not for users, but for groups. You can:
chmod g+s parent_dir
chgrp shared_group parent_dir
If you create files inside it, those files will have the group of the folder (shared_group). Note that the setgid bit only controls which group new files get; their permission bits still come from the creating process's umask, so that umask needs to grant group write (e.g. 002) for the other users to be able to edit them.
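To sanity-check the setgid behavior, a file created inside the directory should report the shared group. A minimal sketch in PHP (any language would do; parent_dir and shared_group come from the commands above, and the POSIX extension is assumed):
<?php
// After chmod g+s and chgrp, a new file should inherit shared_group.
$file = 'parent_dir/test_file';
touch($file);                               // create an empty file
$group = posix_getgrgid(filegroup($file));  // look up the file's group name
echo 'group of new file: ' . $group['name'] . "\n";  // expect shared_group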

vsftpd issue regarding file permissions and user rights

My system is:
Ubuntu 10.04 / Apache2
The question is related to the software vsftpd, an FTP server for Linux (https://security.appspot.com/vsftpd.html).
I have installed vsftpd and it works fine. I am having an issue, though, trying to understand why users are able to delete files which are owned by root. I have set up the FTP server with the option "local_enable=YES" and also "chroot_local_user=YES" so that the users cannot navigate outside their home directory.
The strange thing is that if a file is owned by root, the FTP users are still able to delete it. Is a user able to delete any file in their home directory regardless of who owns it?
I want to prevent users from being able to delete files, or to allow other users read-only access to the home directories of other users.
If anyone knows the vsftpd software and can help, I'd be most grateful,
yours,
Rob
Have you checked the permissions of the files, and of the directory containing them? On Linux, deleting a file requires write permission on the containing directory, not on the file itself, so a user who can write to their own home directory can delete a root-owned file inside it. And if all files belong to the same group, and the group has read and write privileges, any user in that group can modify the files through FTP.

LAMP: Recommended Directory and File Permissions

My project resides on a shared Linux hosting server. The hosting provider, of course, has already set up the necessary directory and file ownership relative to other server users. My concern for now is how to set up permissions within my domain so my users can have read access to the files and folders they should, while still letting my scripts retain read/write access where needed.
Question: What would be the recommended permissions on:
Public files and folders (read only?)
Folders where files uploaded from forms are stored
Files and folders where GD and cache files are being written into
Folders where my server-side scripts are stored (I use mainly PHP)
My WWW root folder (where index.php resides)
This is a perfect example of where you need the principle of least privilege. Allow the web server's user read-only access to read-only content, and allow writing only to the directories/files that absolutely need to be written. Explicitly deny access to things you don't want people to read (config files, .htaccess, anything with paths/IP addresses/passwords), and don't enable any extra processing you're not using (CGI executables, server-side includes).
The best way to do it is to start by denying everything and slowly opening things up as you go. First try serving static content and see what the minimal set of Apache directives/modules and filesystem ownerships and permissions is to get it working. Then try some read-only PHP scripts. Then some read-write PHP scripts. Then DB connectivity, and so on; you get the idea. It's a very tedious process, and you want to plan ahead the sorts of things you want to test; I tend to write long scripts of wget commands that try to do both good and bad things to the server. Make one change, restart, rerun the script, and see what changed since the last run. Observe, modify, analyze, until you can't stand looking at it anymore ;)
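As a concrete starting point on a shared host where you may have no shell access, a baseline can be applied from a PHP script; a minimal sketch, assuming 0755 directories, 0644 files, and group-writable upload/cache folders (the modes and the "uploads"/"cache" names are assumptions to adapt, not fixed recommendations, and PHP can only chmod files its own user owns):
<?php
// Apply a least-privilege baseline under the docroot:
// directories 0755, files 0644, writable areas 0775.
$docroot = __DIR__;
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($docroot, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($it as $path => $info) {
    chmod($path, $info->isDir() ? 0755 : 0644);
}
// Upload and cache folders need write access for the web server user;
// these names are placeholders for your own writable directories.
foreach (['uploads', 'cache'] as $writable) {
    if (is_dir($docroot . '/' . $writable)) {
        chmod($docroot . '/' . $writable, 0775);
    }
}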

CruiseControl, deployment, folder permissions

We're using CruiseControl.NET; it builds the version, creates a zip file, then 15 minutes later unzips the file on the integration server. But when the folder gets to the integration server, the security permissions on one of the folders are often totally hosed. Even the domain admin and folder owner can't open the folder in Explorer. After a reboot the folder permissions are fine; we can delete the folder, redeploy the zip file, and it's OK.
Does anyone have any idea what is messing up the folder permissions, or how?
Any tools to use to diagnose/watch what exactly is messing them up?
Have you tried using psexec from Sysinternals to unzip the file on the remote machine rather than the build machine?
Also, it seems to me that rather than unzipping the zip you could just copy the files directly to the remote server. I'm not seeing the reason to zip it and then just unzip it.
