How to copy files from Amazon EFS to my local machine with shell script? - linux

I have a question regarding file transfer from Amazon EFS to my local machine with a simple shell script. The manual procedure I follow is:
Copy the file from EFS to my Amazon EC2 instance using sudo cp
Copy from EC2 to my local machine using scp or FileZilla (drag and drop)
Is there a way this can be done by running a shell script in which I give two inputs: the source file path and the destination directory?
Can the two steps be reduced to one, i.e. directly copying from EFS to my local machine?

You should be able to mount the EFS file system on your local machine and access the remote files directly.
http://docs.aws.amazon.com/efs/latest/ug/mounting-fs.html
With a mount, you can work on the remote files locally using your own machine's resources.
While SCP can work, you would have to keep the local and remote copies in sync yourself.
Hope it helps.
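If mounting EFS locally is not an option (it normally requires a VPN or Direct Connect into the VPC), the two manual steps can still be collapsed into a single scp that goes through the EC2 instance. A minimal sketch, assuming the EFS file system is already mounted on the instance at /mnt/efs and the file is readable by ec2-user; the key path and host name are placeholders:

#!/bin/bash
# Usage: ./efs_fetch.sh <path-on-efs> <local-destination-dir>
SRC="$1"                                                  # e.g. /mnt/efs/data/report.csv
DEST="$2"                                                 # e.g. ~/Downloads
KEY="$HOME/.ssh/my-ec2-key.pem"                           # placeholder SSH key
HOST="ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com"   # placeholder instance address

# Pull the file from the instance (which sees the EFS mount) straight to the local directory
scp -i "$KEY" "$HOST:$SRC" "$DEST/"

If the file is only readable by root on the instance, you may still need to sudo cp it to a location ec2-user can read (or adjust permissions) before the scp step.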

Related

Transferring files from my Local windows pc to my Linux VM

So I am new to tech, and as my previous posts suggest, I am working with OCI. Currently I run a Linux 8 VM on OCI. My goal is to run Terraform scripts on the VM and have the resources created in OCI.
Current problem:
The .tf files will be written on my local Windows 10 machine and saved in a local directory. I need a way of transferring these local files to a directory on my Linux machine in order to execute them!
Is anybody good with OCI? Is there capability for an SFTP transfer using WinSCP? I'm just not sure where to start. Anybody with good advice, please aid me!
It depends on your OCI network configuration.
If your OCI compute VM is in a public subnet and you have an internet gateway, then you can use ssh to connect to it (using PuTTY, for instance). That means you can also use scp, which lets you copy files over ssh. As you mentioned, WinSCP lets you connect to your OCI compute VM using ssh and scp or SFTP. After installing it, you can create a new connection using the public IP of your OCI compute VM and the private key.
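For a one-off copy, a plain scp from a Windows terminal can also work; a sketch, where the target directory name is a placeholder and the key path matches the rsync example below:
scp -i "D:/my_folder/oci_api_key.pem" -r D:/my_folder/tf_configs opc@<oci_vm_ip>:/home/opc/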
My personal preference is to use MobaXterm to connect over ssh to my OCI compute VMs. Once connected to a remote host using ssh, the left pane directly displays a file browser for the remote host. Drag-and-dropping a file there initiates an SFTP transfer automatically.
Please also note that the scp protocol has been considered outdated since 2019; SFTP or rsync can be used instead. Using MobaXterm, this can be done by opening a new terminal tab (which is local to your Windows machine) and typing the rsync command you want, for instance rsync -v -P -e 'ssh -i "D:/my_folder/oci_api_key.pem"' /cygdrive/d/my_folder/*.tf opc@<oci_vm_ip>:/home/opc/my_folder
-v increases verbosity, to display more information. -P displays partial progress for each file transferred. -e lets you specify which command to use to run rsync; in this case I use ssh and pass the private key. More options are available and you can check them by typing man rsync.
If your OCI compute VM is in a private subnet, you would need to set up a bastion VM in a public subnet, then first access the bastion and from there the VM. Here is a blog post about how to achieve that using PuTTY and WinSCP: https://www.ateam-oracle.com/ssh-tunnel-to-a-private-vm-using-a-bastion-host-in-oci
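As a sketch of that private-subnet case with a recent OpenSSH client, a ~/.ssh/config entry (MobaXterm and WinSCP have equivalent tunnel settings) can chain the two hops so scp still works in one command; the IP addresses are placeholders and I assume the same key is accepted on both hosts:

Host oci-bastion
    HostName <bastion_public_ip>
    User opc
    IdentityFile D:/my_folder/oci_api_key.pem

Host oci-private
    HostName <private_vm_ip>
    User opc
    IdentityFile D:/my_folder/oci_api_key.pem
    ProxyJump oci-bastion

With that in place, scp D:/my_folder/main.tf oci-private:/home/opc/my_folder/ copies straight to the private VM.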

Copying files from a linux machine to an aws ec2 instance

I want to write a Jenkins pipeline in which, at a particular step, I have to copy a few zip files from a different Linux machine. The pipeline will be running on an AWS EC2 agent.
I have to copy the zip files from the Linux machine to the AWS EC2 instance.
I tried a few ways to handle this using curl and scp but was not able to achieve it. Is there a better way to achieve it?
With curl I am facing a "connection reset by peer" error. Please help.
I would use scp for this task. Here's an example of me copying over a file called foo.sh to the remote host:
scp -i mykey.pem foo.sh "ec2-user@ec2-123-123-123-123.compute-1.amazonaws.com:/usr/tmp/foo.sh"
In the example:
mykey.pem is my .pem file
foo.sh is the file I want to copy across
ec2-user is the user on the host
123-123-123-123 is the (fake) public IP address of the host
/usr/tmp/foo.sh is the location where I want the file to be
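For the Jenkins part, the same scp idea can live in a small script (or an sh step) that the EC2 agent runs. A rough sketch, assuming key-based SSH access from the agent to the source machine; the user, host, key and paths are placeholders:

#!/bin/bash
# Run on the EC2 agent: pull the zip files from the source Linux machine
SRC_HOST="builduser@source-machine.example.com"      # placeholder user/host
SRC_DIR="/data/artifacts"                            # placeholder directory holding the zips
KEY="/var/lib/jenkins/.ssh/source_machine.pem"       # placeholder key available to the agent

scp -i "$KEY" "$SRC_HOST:$SRC_DIR/*.zip" ./

A "connection reset by peer" error usually points at a security group or firewall closing the connection rather than at the copy command itself, so check that port 22 is open from the agent to the source machine.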

Access shared folder using a specific windows credential

I'm currently working with a requirement: download a file from the database, then write it to a shared folder. Temporarily, I'm working with a path on my local machine:
File.WriteAllBytes(path, content);
My problem is that the shared folder is on a Windows machine and only a specific account is allowed to write to this folder.
Now I know the basics of impersonation, but I don't know if it is possible to impersonate inside a Docker container on a Linux machine.
In short, I want to deploy my application in a Linux container and then write a file to a Windows shared folder with limited access.
Is the folder on the host or mounted on the host? If so, you can map the host folder into the container, e.g.:
C:\> echo Hello > c:\temp\testfile.txt
C:\> docker run -v c:/temp:/tmp busybox cat /tmp/testfile.txt
c:/temp being a local path on the host
/tmp being the path in the container.
More details here: volume-shared-filesystems
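If the share instead lives on a separate Windows machine, another option is to mount it on the Linux host with the dedicated account using CIFS and then map that path into the container. A rough sketch, assuming cifs-utils is installed; //winserver/shared, svc_writer, MYDOMAIN and myapp are placeholders:

# Mount the Windows share on the Linux host as the account that is allowed to write to it
sudo mount -t cifs //winserver/shared /mnt/shared \
    -o username=svc_writer,password='secret',domain=MYDOMAIN,uid=1000,gid=1000

# Expose the mounted share to the container
docker run -v /mnt/shared:/app/output myapp

A credentials file (the credentials= mount option) is preferable to putting the password on the command line.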

How can I backup Google Drive into AWS Glacier?

I want to back up whatever new file or folder is added to my Google Drive into AWS Glacier through a Linux EC2 instance.
I have gone through some AWS Glacier clients, but they are for uploading files from and downloading files to the local system.
https://www.cloudwards.net/best-backup-tools-amazon-glacier/
Rclone may be able to help you. Rclone is a command line program to sync files and directories to and from:
Google Drive
Amazon S3
Openstack Swift / Rackspace cloud files / Memset Memstore
Dropbox
Google Cloud Storage
Amazon Drive
Microsoft OneDrive
Hubic
Backblaze B2
Yandex Disk
SFTP
The local filesystem
https://github.com/ncw/rclone
Writing the steps (may be helpful to someone)
We need to create remotes for Google Drive and Amazon S3
I'm using an Ubuntu server on an AWS EC2 instance.
Download the appropriate file from https://rclone.org/downloads/ - Linux ARM - 64 Bit (in my case)
Copy the downloaded file from local to the server (using the scp command) and extract the file, OR extract the file locally and copy the extracted files to the server (because I was facing problems extracting it on the server)
ssh into the ubuntu server.
Go inside the folder - rclone-v1.36-linux-amd64 (in my case)
Execute the following commands:
Copy binary file
$ sudo cp rclone /usr/bin/
$ sudo chown root:root /usr/bin/rclone
$ sudo chmod 755 /usr/bin/rclone
Install manpage
$ sudo mkdir -p /usr/local/share/man/man1
$ sudo cp rclone.1 /usr/local/share/man/man1/
$ sudo mandb
Run rclone config to setup. See rclone config docs for more details.
$ rclone config
After executing the rclone config command, choose the number/letter of the option you want to select. Once you reach the Use auto config? part, enter N (as we are working on a remote server).
Paste the link you got into a local browser, copy the verification code and enter the code in the terminal.
Confirm, by entering y
Enter n to create another remote for Amazon S3, and repeat the same procedure.
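Once both remotes are configured, the backup itself is a single command that can be scheduled with cron. A rough sketch, assuming the remotes were named gdrive and s3backup and the bucket name is a placeholder; objects written to S3 can then be transitioned to Glacier with a bucket lifecycle rule (newer rclone versions also have an --s3-storage-class option):
$ rclone sync gdrive: s3backup:my-backup-bucket/drive-backup -v
rclone copy can be used instead of sync if files deleted from Google Drive should stay in the backup.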
Use the following links for various rclone commands and options:
https://rclone.org/docs/
https://linoxide.com/file-system/configure-rclone-linux-sync-cloud/

Mounting a folder from other machine in linux

I want to mount a folder which is on some other machine onto my Linux server. To do that, I am using the following command:
mount -t nfs 192.xxx.x.xx:/opt/oracle /
which fails with the following error:
mount.nfs: access denied by server while mounting 192.xxx.x.xx:/opt/oracle
Does anyone know what's going on? I am new to Linux.
Depending on what distro you're using, you simply edit the /etc/exports file on the remote machine to export the directories you want, then start your NFS daemon.
Then on the local PC, you mount it using the following command:
mount -t nfs {remote_pc_address}:/remote/dir /some/local/dir
Please try with your home directory; as far as I know, you can't mount anything directly onto the root directory like that.
For more reference, find full configuration steps here.
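For example, on the remote machine the export could look like the following (the client network, export options and service name are examples and vary by distro), after which the share is mounted into a dedicated directory on the local server rather than /:

# On the remote machine: allow the client network to mount /opt/oracle
echo "/opt/oracle 192.xxx.x.0/24(rw,sync,no_subtree_check)" | sudo tee -a /etc/exports
sudo exportfs -ra                    # re-export everything listed in /etc/exports
sudo systemctl restart nfs-server    # the service is nfs-kernel-server on Debian/Ubuntu

# On the local server: mount into an empty directory instead of /
sudo mkdir -p /mnt/oracle
sudo mount -t nfs 192.xxx.x.xx:/opt/oracle /mnt/oracle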
