scp a file to a different user on a remote server from local - Linux

Generally, I log in to a server xyz.com using my own credentials (myuserid@xyz.com);
my home is /home/user/myuserid/.
After login, I do "su - someuser" to access the files.
I would like to copy a file from my local machine to a directory owned by someuser, e.g. /abc/someuser/temp.
For this, I am using
scp somefile.txt myuserid@xyz.com:/abc/someuser/temp/
It asks for myuserid's password and then says: /abc/someuser/temp/ permission denied.
What command should I use to copy a file to the other (su) user on the remote host?

You'll have to use someuser's credentials to do the scp:
scp somefile.txt someuser@xyz.com:/abc/someuser/temp/
Alternatively, you can give myuserid write permission on someuser's directory.
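If you don't know someuser's password for scp but can still "su - someuser" on the server (as in the question), another option is to stage the file in your own home directory and move it after switching users. This is only a sketch, reusing the paths from the question:
scp somefile.txt myuserid@xyz.com:/home/user/myuserid/
ssh myuserid@xyz.com
su - someuser        # enter someuser's password, as you already do
cp /home/user/myuserid/somefile.txt /abc/someuser/temp/    # someuser must be able to read the staged file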

Related

Shell command to copy a file from one server to a remote server as a different user

I have a server serverA with a user "akotha" on it, and there is another user "mqm". I can switch to "mqm" by typing sudo su - mqm, but I don't know the password of the mqm user. All I want is to copy a file from my local server to serverA and place it in a folder that only mqm has write access to.
Can you please let me know the command to fulfill my requirement?
You can use SSH and the secure copy command:
$ scp path/to/local/file mqm@ip_address_of_server_A:~/directory
but if you don't have the password of 'mqm', you can send the file to user 'akotha' instead and then change the file permissions.
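A sketch of that two-step route, reusing the sudo su - mqm step from the question (the final destination is a placeholder, since the real path isn't given):
$ scp path/to/local/file akotha@ip_address_of_server_A:~/
$ ssh akotha@ip_address_of_server_A
$ sudo su - mqm
$ cp /home/akotha/file /path/only/mqm/can/write/    # placeholder path; mqm must be able to read the staged file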

Google Cloud scp permission denied

I am trying to transfer files to my Google Cloud hosted Linux (Debian) instance via secure copy (scp). I did exactly what the documentation says about connecting from a local machine to the instance: https://cloud.google.com/compute/docs/instances/connecting-to-instance.
Created an SSH key pair
Added the public key to my instance
I can log in successfully with:
ssh -i ~/.ssh/my-keygen [USERNAME]@[IP]
But when I want to copy files to the instance I get a "permission denied" message.
scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/var/www/html/
It looks like the user I log in with has no permission to write files, so I already tried changing the file permissions of /var/www/, but this still gives the permission denied message.
I also tried adding the user to the root group, but this still gives the same problem.
usermod -G root myuser
The command line should be
scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/var/www/html/
Assuming your files are in the local /path/to/directory/ and /var/www/html/ is on the remote server.
The permissions do not allow writing to /var/www/html/. Writing to /tmp/ should work; then you can copy the files with sudo to the desired destination with root privileges (sketched below).
If SSH isn't working, install the gcloud CLI and run the following locally: gcloud compute scp --recurse /path/to/directory [IP] --tunnel-through-iap. This will dump the directory into your /home/[USERNAME]/ folder. Then log into the console and use sudo to move the directory to /var/www/html/.
For documentation, see https://cloud.google.com/sdk/gcloud/reference/compute/scp.
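A minimal sketch of the /tmp workaround above (the key, [USERNAME], [IP], and /var/www/html/ come from the question; the directory name is illustrative):
scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/tmp/
ssh -i ~/.ssh/my-keygen [USERNAME]@[IP]
sudo mv /tmp/directory /var/www/html/    # the final write into /var/www/html/ is done as root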

Copy files from a Linux server using an SSH client with a different user name

I have a Linux machine with an SSH server installed; I can access the server using the username "ubuntu". The SSH server blocks clients that try to connect with the "root" username.
So a connection can be made with:
ssh -i mykey ubuntu@myserver
I can get files that belong to "ubuntu" using:
scp -i mykey ubuntu@myserver:<file location> ./
However, what I really want is to get files that belong to the "root" user (note: I can't access the server as "root", for obvious security reasons).
So is there a way to download files that are owned by "root"?
I was thinking of doing some magic on the server side that enables me to do that. (I don't know how :) )
If this helps: I have root access and I can also create files on the server side, but I'm not allowed to change permissions on files owned by root (if someone got hold of these files I'd be fired).
You can try a monster like this:
ssh -i mykey ubuntu@myserver 'sudo cat /path/to/file | uuencode remotefile' | uudecode -o path/to/local
You need uuencode and uudecode on the corresponding hosts.
Or, if the file is text, you can skip the uuencode part.
PS: see the related topic.
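A simpler variant (a sketch; it assumes the ubuntu user can run sudo over a non-interactive session without a password prompt) is to let sudo read the file and capture the output locally, or to stream a whole directory through tar:
ssh -i mykey ubuntu@myserver 'sudo cat /path/to/file' > path/to/local
ssh -i mykey ubuntu@myserver 'sudo tar czf - /path/to/dir' | tar xzf -    # whole directory, preserving layout
If sudo does need a password, add -t to ssh so it can prompt; in that case the uuencode approach above is the safer way to keep binary output intact.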
You could do it the other way around.
Log into the machine that has the file with
ssh -i mykey ubuntu@myserver
then gain superuser privileges
sudo su
and then copy the files you want back to your own machine (assuming it accepts SSH connections):
scp /the_file_you_want ubuntu@myhost:/the_location_and_filename_you_want
You can find some other ways here:
https://unix.stackexchange.com/questions/106480/how-to-copy-files-from-one-machine-to-another-using-ssh
Enable SSH on your local machine
(on Fedora; for Ubuntu you can easily find the command on Google)
service sshd start
From your local machine
ssh -i mykey ubuntu@myserver
change to root
su
enter the password
and copy the files using scp
scp somefile.extension randomuser@localmachine:/some/path/
I hope it helps.

Changing user to root when connected to a Linux server and copying files

My script is written in a way that doesn't allow connecting to the server directly as root. The script basically copies files from a server to my computer and it works, but I don't have access to many files because only root can access them. How can I connect to the server as a regular user and then copy its files by switching to root?
Code I want to change:
sshpass -p "password" scp -q -r username@74.11.11.11:some_directory copy_it/here/
In other words, I want to be able to remotely copy files which are only accessible to root on the remote server, but I don't wish to access the remote server via ssh/scp directly as root.
Is it possible using only ssh, without sshpass?
If I understand your question correctly, you want to be able to remotely copy files which are only accessible to root on the remote machine, but you don't wish to (or can't) access the remote machine via ssh/scp directly as root. And a separate question is whether it could be done without sshpass.
(Please understand that the solutions I suggest below have various security implications and you should weigh up the benefits versus potential consequences before deploying them. I can't know your specific usage scenario to tell you if these are a good idea or not.)
When you ssh/scp as a user, you don't have access to the files which are only accessible to root, so you can't copy all of them. So you need to instead "switch to root" once connected in order to copy the files.
"Switching to root" for a command is accomplished by prefixing it with sudo, so the approach would be to remotely execute commands which copy the files via sudo to /tmp on the remote machine, changes their owner to the connected user, and then remotely copy them from /tmp:
ssh username@74.11.11.11 "sudo cp -R some_directory /tmp"
ssh username@74.11.11.11 "sudo chown -R username:username /tmp/some_directory"
scp -q -r username@74.11.11.11:/tmp/some_directory copy_it/here/
ssh username@74.11.11.11 "rm -r /tmp/some_directory"
However, sudo prompts for the user's password, so you'll get a "sudo: no tty present and no askpass program specified" error if you try this. So you need to edit /etc/sudoers on the remote machine to authorize the user to use sudo for the needed commands without a password. Add these lines:
username ALL=NOPASSWD: /bin/cp
username ALL=NOPASSWD: /bin/chown
(Or, if you're cool with the user being able to execute any command via sudo without being prompted for password, you could instead use:)
username ALL=NOPASSWD: ALL
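A cautionary aside on this step: it's safer to edit /etc/sudoers with visudo, which syntax-checks the file before saving (a broken sudoers can lock you out of sudo entirely). If you want, you can also pin a rule to the exact arguments used above, for example (a sketch matching the cp command in this answer):
username ALL=NOPASSWD: /bin/cp -R some_directory /tmp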
Now the above commands will work and you'll be able to copy your files.
As for avoiding sshpass, you could instead use a public/private key pair, in which a private key on the local machine is matched against a public key on the remote machine to authenticate the user, rather than a password.
To set this up, on your local machine, type ssh-keygen. Accept the default file (/home/username/.ssh/id_rsa). Use an empty passphrase. Then append the file /home/username/.ssh/id_rsa.pub on the local machine to /home/username/.ssh/authorized_keys on the remote machine:
cat /home/username/.ssh/id_rsa.pub | ssh username@74.11.11.11 \
"mkdir -m 0700 -p .ssh && cat - >> .ssh/authorized_keys && \
chmod 0600 .ssh/authorized_keys"
Once you've done this, you'll be able to use ssh or scp from the local machine without password authorization.
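As an aside, on systems that ship OpenSSH's ssh-copy-id helper, the same key-installation step can usually be done in one command:
ssh-copy-id username@74.11.11.11
With the key in place and the sudoers rules above, the four copy commands run end to end without sshpass or any password prompt.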

mysqldump from a Linux machine to a mounted Windows folder on a remote server

I am trying to do a mysqldump from a local Linux machine to a Windows folder that has been mounted on the system. This is the command I am using in the terminal:
mysqldump -u root -plinuxsux myDB -t LOG > /mounted folder/path/blah/myDB.sql
I am getting the following error:
/mounted folder/path/blah/myDB.sql: Permission denied
I checked the permissions of the folder on the Windows side, and there is a specific user that I created called Sys003 that has full control of that folder.
Do I need to put that user name (and password) into the command above to get it to work? And if so, how do I do that? Thanks.
The problem is that the user actually running the mysqldump command does not have permission to write to the destination folder.
One solution might be to change to the Sys003 user and run the mysqldump again:
normal_prompt> su Sys003
password...
Sys003_prompt> mysqldump...
Another option is to run mysqldump as your normal user, then copy the dump as Sys003:
normal_prompt> mysqldump... > /local/dump.sql
normal_prompt> su Sys003
password...
Sys003_prompt> cp /local/dump.sql /mounted_folder/path/blah/myDB.sql
Be careful, since your Sys003 user might not be authorized to run mysqldump, but that's a totally different question :)
It turned out to be an error in the /etc/fstab file: I had the mount configured with a different user than Sys003. Once I set the user to Sys003 with the right password, it worked.
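For reference, a CIFS entry of that shape in /etc/fstab might look something like this (a sketch only; the share path, mount point, and credentials file are placeholders, not taken from the question):
//windowsserver/share  /mounted_folder  cifs  credentials=/etc/cifs-sys003.cred,uid=youruser  0  0
where the credentials file contains username=Sys003 and password=... lines, so the password doesn't sit directly in /etc/fstab.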
