File permissions changing on save (using root) - Linux

Using a fresh installation of CentOS 6.2, when I connect to the server (SFTP mount with Nautilus) and edit files, no matter what permissions a file had before, they are reset to 700: read, write, and execute for the owner only.
When SSHing directly into the machine and editing files on the command line, the permissions are not changed.
The files I am editing are website scripts sitting in my Apache folders.
Why is this behavior happening? Any suggestions are welcome.

Your SFTP client might be downloading and re-uploading your files when you edit them, so the re-created file gets its permissions from the uploading process rather than keeping the originals. Change your umask if you want different permissions, or SSH in and use a proper editor if you want to keep the permissions.
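For reference, the umask masks permission bits off newly created files; a quick shell sketch:
umask        # print the current mask; a mask of 077 yields 700 directories and 600 files
umask 022    # new files: 666 & ~022 = 644; new directories: 777 & ~022 = 755
Whether an SFTP session honors this depends on how the server is set up; with OpenSSH's internal-sftp, for example, a mask can be forced via its -u option in sshd_config.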

Related

vsftpd user doesn't see any data on Linux RHEL/CentOS

I tried creating a vsftpd server on RHEL 8 and CentOS. My FTP users can log in to the server, but they only see a list of directories and are able to navigate to any directory.
Users cannot create a directory or file.
Users cannot see any files in any directory.
I changed the permissions with chmod 777 and changed the ownership, but nothing works.
It was the SELinux config. I had to change /etc/selinux/config, setting enforcing to disabled.
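For reference, the change amounts to editing one line and rebooting, or using setenforce for the current boot:
# /etc/selinux/config
SELINUX=disabled     # was: SELINUX=enforcing

setenforce 0         # switches to permissive immediately, but only until the next reboot
If you would rather keep SELinux enforcing, a narrower alternative worth trying is the FTP boolean: setsebool -P ftpd_full_access on.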

How can I CHMOD files/directories on Windows Azure?

I am using FileZilla FTP to right-click and change a directory's file permissions, as I do on many other sites/servers. For some reason this is not working on Windows Azure. FileZilla outputs "500 'SITE CHMOD 777 (mydirectory)': command not understood".
Any ideas?
The Windows Azure portal has a "Console" for websites where you can execute some shell commands. One of them appears to be chmod (fileutils) 4.1. I was able to modify the permissions on a folder using this:
chmod -R 744 myfolder
I found a hack solution to delete files on Azure:
1. Stop your website from the management console (https://manage.windowsazure.com).
2. Open up the FTP site in FileZilla.
3. Rename the directory that has the problem to anything else (possibly an optional step, I don't know).
4. Delete the renamed directory.
5. Restart your website.
That seems to do it.
Windows Azure Websites runs on Windows Server, so file permissions don't work like they do on Linux (as @SLaks already mentioned).
However, the account your scripts (PHP/ASP.NET/node.js) are executed under has full access to the folder /site/wwwroot, as does your FTP user. That means your PHP can perform all file access operations with full privileges: read, write, delete, create files, and create directories.
What you cannot do, and what cannot be changed, is execute scripts (which 0777 would allow on Linux).

Users can't upload files, even with permissions set to them using vsftpd

I have a cloud-hosted Linux solution. I had vsftpd working on it, but after having issues and tinkering with a lot of settings, I now have a problem where users can log in using FTP, connect to the correct home directory, navigate within it, and download files, but they cannot upload files to the server. They get a timeout error, which appears to be a permissions error, but I can't narrow it down any more than that; /var/log/syslog gives nothing away.
The folders belong to the users. The parent www folder is set to 555. Can anyone help with this issue at all?
Cheers,
T
Try setting the permissions to 755; 555 doesn't allow anyone to write. Are your user and group different?
You may also need to enable logging on the FTP server: the timeout may hide other errors besides permission denied.
To get extended logging, change these variables in your vsftpd config file:
dual_log_enable=YES
log_ftp_protocol=YES
xferlog_enable=YES
syslog_enable=NO
and check the log file name there.
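With dual_log_enable=YES, vsftpd keeps two logs in parallel; assuming the stock locations (your config may override them):
tail -f /var/log/vsftpd.log    # vsftpd-format log, includes the log_ftp_protocol output
tail -f /var/log/xferlog       # wu-ftpd style transfer log from xferlog_enable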
Create a folder inside the user's folder (for example /var/www/user1/upload), set its permissions to 777 (chmod 777 /var/www/user1/upload), then upload files into that folder.

db.* files in /home from Perforce?

I see several db.* files in my /home directory, and it seems they come from Perforce. For example: db.archmap, db.bodtext, db.change, db.changex.
Are these files useful? Can I delete them? They are making my /home directory messy.
You have started a server using your home directory as the Perforce server's P4ROOT folder. Those files are generated when the server starts, and they cannot be deleted unless you want to hose your server installation. It's not clear to me how you've started the server instance, so I'll try to cover multiple bases with my answer.
If you want to start up the server under your own account, you should set the P4ROOT environment variable and point it to where you want the server to store its files. Alternatively, when you start the server, you can specify the root folder on the command line using the -r option:
p4d -r /home/mark/p4server
which would put the server's files into the directory called 'p4server' off of my home directory.
Typically it is best to run the Perforce server as a user dedicated to running Perforce. I use a user called 'perforce' and set P4ROOT (and other variables) in that user's environment. If you cannot use a separate user, it might be easier to use the -r command-line option mentioned above; a sketch follows.
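A minimal sketch of that setup; the root path and port here are assumptions, not values from the question:
export P4ROOT=/srv/perforce/root    # where the server keeps its db.* and journal files
export P4PORT=1666                  # port for the server to listen on
p4d -d                              # start p4d as a background daemon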
Those files are server files only, not client files, so it is safe to delete them; but if you start the server back up, it will recreate them. You might want to uninstall the server instead.
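If you do decide to clean up, a hedged sketch, assuming you are sure you don't need that server instance:
p4 admin stop    # cleanly stop the running server, if it is up (requires super access)
rm ~/db.*        # remove the metadata files from your home directory
rm ~/journal     # p4d also writes a journal file in P4ROOT, if one exists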
Unless you are running a beta version, they have p4sandbox coming soon (maybe in the beta, I forget), which MAY create those files. I don't have a beta version, so I can't verify what new files the client may or may not create.
You can check the documentation here to see what these files do/are for.

How to allow file uploading outside home directory with SSH?

I'm running a Fedora 8 Core server. SSH is enabled and I can log in with Transmit (an FTP client) on port 22. When logged in, I can successfully upload files to the user's home directory. Outside the home directory I can only browse files, not upload or change anything. How can I allow file uploading to a specific directory outside the user's home directory?
An easy method is to grant the user rights to the folder you want them to be able to upload to, then add a symlink (ln -s) from their home folder to the destination, as sketched below.
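A minimal sketch of that approach; the path /var/www/uploads and the user 'alice' are placeholders:
sudo chown alice /var/www/uploads              # grant the user rights to the target folder
ln -s /var/www/uploads /home/alice/uploads     # make it reachable from her home directory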
You can also just use
scp file user@server:/path
which will let you upload to any directory you have permissions for:
file is the file to copy,
user and server should be obvious,
/path is any destination path on the server which you have rights to; /home/user/ would likely be your default home folder.
You need to make those directories writable by the proper users, or (easier) by that user's group. This is of course a huge security hole, so be careful.
Give the FTP user write permission on the directory where you want to upload your files.
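For example, a group-based sketch; the directory and group names are hypothetical:
sudo chgrp ftpusers /var/www/uploads    # a group that contains the FTP user
sudo chmod g+w /var/www/uploads         # let that group write to the directory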
