How to delete a target folder created using scp, as the target user - Linux

I have three machines: A, B, and C. Only machine B has access to A and C. I have root access on machine A, root access on machine B, and user-level access on machine C.
1.2.3.4 is the IP address assigned to machine B.
When I run the following from machine A:
scp -pr ./logs/ root@1.2.3.4:/common/tftpboot/
it creates a folder named logs inside <machine C>:/common/tftpboot/.
I have given read, write, and execute permissions to user, group, and others (machine A falls under others) with chmod 777 tftpboot.
Now, after copying the logs folder, I am not able to delete <machine C>:/common/tftpboot/logs/ as the user of machine C, even though that user has set 777 permissions on the /common/tftpboot/ folder, because the logs folder was created by others, i.e. machine A's root.
So I want scp to copy the folder as a whole (not individual files), and I still want user C to be able to delete the folder created by machine A's scp after analysing the logs.
Currently I have to ssh to machine B from machine A, and only then am I able to delete the logs folder created by scp.
Can anybody help me do this?

Before doing the scp, I changed the permissions of the logs folder to 777, i.e. chmod -R 777 ./logs, and now I can delete the folder created by scp on machine C. Since scp -p preserves the source permissions, the copied tree arrives world-writable, which is what lets the machine C user remove it.
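A minimal sketch of that working sequence, run from machine A with the paths and address from the question:

chmod -R 777 ./logs                               # make every file and subfolder world-writable
scp -pr ./logs/ root@1.2.3.4:/common/tftpboot/    # -p preserves the 777 modes on the copy

Afterwards the machine C user can run rm -rf /common/tftpboot/logs, since deleting entries only requires write permission on the containing directories, which the preserved 777 modes provide.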

Related

Restrict users from storing files in the home directory in Linux

We have a RHEL server that multiple users access through an application. RStudio running on these servers defaults the workspace to the user's /home folder. Though separate space is provided for individual users, they tend to store files in /home, filling it up.
Is there any way to restrict users from storing data in their home folders, either at the server level or the RStudio level, that would force them to use the provided location?
Though there are options to change the default workspace for all users, due to the large number of teams, each with its own sensitive data, it is not possible to have a shared folder as the default location.
You could create a group without write permissions on the home folder and start RStudio through the command sg, which allows you to start it with a group ID that has reduced permissions.
The ls -l command displays directory contents in long format. The long format contains both permissions and ownership.
# ls -l
With chown you can change the owner and group associated with a file/directory (-R == recursive)
# sudo chown -R user01:groupA Directory
By setting the owner and a single group, others will (if so configured) be restricted from accessing the files/folders.
The chmod command is used to modify the various permissions/restrictions.
# sudo chmod -c ug=rwx,o= file1
Specifically:
-c == report if a change is made
u == user
g == group
rwx == read, write, execute
o == others
o= (nothing after the =) == no permissions
To create a new group you can use groupadd
# sudo groupadd rstudiogroup
You will have to set the newly created group as the group owner of the save destination folder, and finally start the software through the command sg
# sudo sg rstudiogroup -c rstudio
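Putting the pieces together, a sketch of the sequence described above; the path /data/rstudio_workspace is an illustrative assumption, not a location from the question:

sudo groupadd rstudiogroup                           # create the dedicated group
sudo chown -R :rstudiogroup /data/rstudio_workspace  # hypothetical save destination, owned by the group
sudo chmod -R 770 /data/rstudio_workspace            # group members can read and write there
sg rstudiogroup -c rstudio                           # start RStudio with that group ID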

Changing default file permissions in Linux

I work on CentOS 7.
For some time I have had a problem with the FTP /home/students directory, whose access permissions are set to 750. When I create a file as the students user, the file's permissions are 644 (read/write for the owner and read-only for other users). But when the students user receives files by SFTP (with SSH-key authentication), the permissions of these files are 600.
Can the access permissions be imposed by the one who uploads the file by SFTP?
How can I make the default permissions for files received by SFTP 644 automatically?
Thank you
You should modify /etc/ssh/sshd_config so that the internal SFTP server applies a umask to uploads (OpenSSH's internal-sftp takes a umask via -u; a umask of 0022 yields 644 for new files):
Subsystem sftp internal-sftp -u 0022
Then reload the sshd configuration:
sudo systemctl reload sshd
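To verify the change, upload a throwaway file over SFTP and inspect its mode; test.txt and the server name are made-up placeholders:

echo 'put test.txt' | sftp students@server   # upload over SFTP (stdin acts as batch input)
ssh students@server 'ls -l test.txt'         # should now show -rw-r--r-- (644)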

Google cloud scp permission denied

I am trying to transfer files to my Google Cloud hosted Linux (Debian) instance via secure copy (scp). I did exactly what the documentation says for connecting from a local machine to the instance: https://cloud.google.com/compute/docs/instances/connecting-to-instance.
Created an SSH key pair
Added the public key to my instance
I can log in successfully with:
ssh -i ~/.ssh/my-keygen [USERNAME]@[IP]
But when I want to copy files to the instance I get a "permission denied" message.
scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/var/www/html/
It looks like the user I log in with has no permission to write files, so I already tried changing the file permissions of /var/www/, but this still gives the permission denied message.
I also tried adding the user to the root group, but this still gives the same problem.
usermod -G root myuser
The command line should be
scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/var/www/html/
Assuming your files are in the local /path/to/directory/ and /var/www/html/ is on the remote server.
The permissions do not allow writing to /var/www/html/, but writing to /tmp/ should work. You can then copy the files to the desired destination with sudo, i.e. with root privileges.
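A sketch of that two-step approach, reusing the placeholders from the question:

scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/tmp/
ssh -i ~/.ssh/my-keygen [USERNAME]@[IP] 'sudo mv /tmp/directory /var/www/html/'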
If SSH isn't working, install the gcloud CLI and run the following locally: gcloud compute scp --recurse /path/to/directory [INSTANCE_NAME]:~ --tunnel-through-iap (gcloud compute scp addresses the VM by instance name rather than by IP). This will put the directory in your /home/[USERNAME]/ folder. Then log into the console and use sudo to move the directory to /var/www/html/.
For documentation, see https://cloud.google.com/sdk/gcloud/reference/compute/scp.
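The final move can also be done without opening the console; a sketch with the same placeholders:

gcloud compute ssh [INSTANCE_NAME] --tunnel-through-iap --command 'sudo mv ~/directory /var/www/html/'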

Sending files to a virtual machine with SCP

I am trying to transfer files over to my virtual machine.
I tried the command
scp files user@xxx.xx.xx.xxx:/home/user/directory
I am then asked to enter the password for user@xxx.xx.xx.xxx.
When I enter the password, the output is:
scp: /home/user/directory/filename: Permission denied
I thought perhaps I don't have the correct permissions or rights to the files?
So I checked the rights for each file, and they are
-rwxr-xr-x
I'm not really sure what I need to do to correctly scp my files over to my virtual machine.
Make sure that user exists on both machines and that it has permission to write to the destination directory. This means the destination directory must either be a) world-writable, b) writable by a group that user belongs to, or c) owned by user.
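A quick way to check which of those cases applies, using the host and path from the question:

ssh user@xxx.xx.xx.xxx 'ls -ld /home/user/directory'            # inspect owner, group, and mode of the destination
ssh user@xxx.xx.xx.xxx 'sudo chown user /home/user/directory'   # one possible fix, assuming user has sudo rights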

mysqldump from a Linux machine to a mounted Windows folder on a remote server

I am trying to do a mysqldump from a local Linux machine to a Windows folder that has been mounted on the system. This is the command I am using in the terminal:
mysqldump -u root -plinuxsux myDB -t LOG > /mounted folder/path/blah/myDB.sql
I am getting the following error:
/mounted folder/path/blah/myDB.sql: Permission denied
I checked the permissions of the folder on the Windows side, and there is a specific user that I created called Sys003 that has full control of that folder.
Do I need to put that user name (and password) into the command above to get it to work? And if so, how do I do that? Thanks.
The problem is that the user actually running the mysqldump command does not have permission to write to the destination folder.
One solution might be switching to the Sys003 user and running the mysqldump again:
normal_prompt> su Sys003
password...
Sys003_prompt> mysqldump...
Another option is running mysqldump as your normal user, then copying the dump as Sys003:
normal_prompt> mysqldump... > /local/dump.sql
normal_prompt> su Sys003
password...
Sys003_prompt> cp /local/dump.sql /mounted_folder/path/blah/myDB.sql
Be careful, since your Sys003 user might not be authorized to run mysqldump, but that's a totally different question :)
It was an error in the /etc/fstab file: I had the mount configured with a different user than Sys003. Once I set the user to Sys003 with their password, it worked.
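For illustration, the fstab entry would look something like the line below; the server, share, and password are made up, and only the Sys003 user name comes from the question:

//winserver/share /mounted_folder cifs username=Sys003,password=secret 0 0

Putting the password in a separate credentials file via the credentials= option, instead of inline in /etc/fstab, is the safer variant.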
