Backup files and folders from remote host using Rsync Nodejs - node.js

I want to create a backup script using Node.js and a cron job. I use the npm rsync package to make copies of my files and folders. The code works on the local drive, but I can't connect to the remote source host:
const Rsync = require('rsync');

const rsync = new Rsync()
    .flags("e")
    .source("192.168.1.140:/home/test/YDA")
    .destination("../Desktop/fff/");
How could I provide a username and password for the remote host?
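rsync over SSH does not accept a password on the command line, so the usual approach is to put the username in the source string and authenticate with an SSH key. A minimal sketch with the node-rsync package (the user, host and key path below are placeholders; shell() sets the remote shell, i.e. rsync's -e option):
const Rsync = require('rsync');

const rsync = new Rsync()
    .shell('ssh -i /home/me/.ssh/id_rsa')               // authenticate with an SSH key instead of a password
    .flags('az')                                        // archive mode + compression
    .source('backupuser@192.168.1.140:/home/test/YDA')  // the username goes in the source string
    .destination('../Desktop/fff/');

rsync.execute(function (error, code, cmd) {
    if (error) {
        console.error('rsync failed:', error);
    } else {
        console.log('rsync finished:', cmd);
    }
});
If password authentication is unavoidable, wrapping the shell with sshpass, e.g. .shell('sshpass -p yourpassword ssh'), is a common workaround, but an SSH key is the safer option.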

Related

Copying files from a linux machine to an aws ec2 instance

I want to write a Jenkins pipeline in which, at a particular step, I have to copy a few zip files from a different Linux machine. The pipeline will be running on an AWS EC2 agent.
I have to copy the zip files from the Linux machine to the AWS EC2 instance.
I tried a few ways to handle this using curl and scp but was not able to achieve it. Is there a better way to do it?
With curl I am facing a "connection reset by peer" error. Please help.
I would use scp for this task. Here's an example of me copying over a file called foo.sh to the remote host:
scp -i mykey.pem foo.sh "ec2-user@ec2-123-123-123-123.compute-1.amazonaws.com:/usr/tmp/foo.sh"
In the example:
mykey.pem is my .pem file
foo.sh is the file I want to copy across
ec2-user is the user on the host
123-123-123-123 is the (fake) public IP address of the host
/usr/tmp/foo.sh is the location where I want the file to be
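If the pipeline instead needs to pull the zip files from the other Linux machine onto the EC2 agent, the same syntax works in the other direction, assuming you have an SSH key or credentials for that machine (the user, host and paths below are placeholders):
scp -i otherkey.pem "user@linux-host:/path/to/archives/*.zip" /usr/tmp/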

SSH access parent host folder

After connecting to a remote server (A) through ssh, is it possible to access the host's folders/files?
This server A has access to another server (B) which I can't access from my computer. I need to run some commands on B using some config files that are on my computer.
I ended up copying the files from my computer to A using scp and running the commands there.
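For anyone in the same situation, that copy step is just a plain scp from the local machine to A, followed by running the commands over the normal ssh session, roughly like this (user, host and paths are placeholders):
scp ./configs/*.conf userA@serverA:/home/userA/configs/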

How to copy files from Amazon EFS to my local machine with shell script?

I have a question regarding file transfer from Amazon EFS to my local machine with a simple shell script. The manual procedure I follow is:
Copy the file from EFS to my Amazon EC2 instance using sudo cp
Copy from EC2 to my local machine using scp or FileZilla (drag and drop)
Is there a way it can be done by running a shell script in which I give two inputs: the source file address and the destination directory?
Can the two steps be reduced to one, i.e. directly copying from EFS to the local machine?
You should be able to mount the EFS file system on your local machine and access the remote file system directly.
http://docs.aws.amazon.com/efs/latest/ug/mounting-fs.html
Once mounted, you can access and edit the remote files locally, using your machine's resources.
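For reference, an NFS mount of an EFS file system typically looks something like the following (the file system DNS name and mount point are placeholders, the machine must be able to reach the EFS mount target, e.g. from within the VPC or over a VPN, and the linked guide lists the full recommended mount options):
sudo mkdir -p /mnt/efs
sudo mount -t nfs4 -o nfsvers=4.1 fs-12345678.efs.us-east-1.amazonaws.com:/ /mnt/efs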
While scp can work, you would need to keep the local and remote copies in sync all the time.
Hope it helps.

Using SCP command to download files from Linux server to client server

I'm creating files on a Linux server that I'm logged into, and I'm adding the ability for the user to download these files from the Linux server to the connecting computer. I'm writing a script and using the scp command to download these files:
scp data.txt user@usraddress:/home/usr
However, I don't want "user@usraddress:/home/usr" to be just my computer. I want whoever is logged onto the Linux server to be able to download these files. Is there a way to get the address of the connecting computer?
How would I do this?
Forgive me if this seems elementary, I'm very new to scripting.
When you open a remote session on a GNU/Linux machine, the ssh server sets the environment variable SSH_CONNECTION, which contains the client IP address, client port, server IP address and server port. You can use this variable, together with the $USER variable, to fill in those parameters (${SSH_CONNECTION%% *} keeps only the first field, the client's IP address):
scp data.txt $USER@${SSH_CONNECTION%% *}:/home/$USER
Note that, as far as I know, you can't assume the client's home directory is under /home. As said by chepner, you can omit the destination directory to use the default location, the home directory:
scp data.txt $USER@${SSH_CONNECTION%% *}:

Using rsync to keep two servers in sync

I have two AWS EC2 instances between which I'm trying to implement a two-way sync. So if a file or folder on server1 is created or updated, it should be synced to server2. If it's a new folder, it should be created on the other server. The problem I'm having is that I can't get rsync to create the folders on the 'local' server.
For example, server 1: /rootdir/1/2/3/4, where directories 3 and 4 do not exist on server2. When I run rsync on server2 I want those new directories to be created.
Here is the code I'm trying to use, running from Server2:
sudo rsync -avzP -e "ssh -i /home/ec2-user/.ssh/Key.pem" ec2-user@IPADDRESS_OF_SERVER1:/rootdir/1/2/ /rootdir/1/2
I'm not getting an error but the directories aren't being copied.
I also tried -r but it made no difference.
I finally figured out what I was doing wrong. The servers were configured with a non-standard port and I needed to tell rsync which port to use.
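For anyone hitting the same problem, the port goes into the ssh transport command passed via -e, roughly like this (2222 is a placeholder for the actual port):
sudo rsync -avzP -e "ssh -i /home/ec2-user/.ssh/Key.pem -p 2222" ec2-user@IPADDRESS_OF_SERVER1:/rootdir/1/2/ /rootdir/1/2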
