unable to copy file from local machine to ec2 instance in ansible playbook - linux

So I am running
scp -i ~/Downloads/ansible-benchmark.pem ~/Documents/cis-playbook/section-1.yaml ubuntu@ec2-18-170-77-90.eu-west-2.compute.amazonaws.com:~/etc/ansible/playbooks/
to transfer an Ansible playbook I created with VSCode (the section-1.yaml file), but I am getting the error
scp: /home/ubuntu/etc/ansible/playbooks/: No such file or directory
The directory definitely exists on the EC2 instance, and I did install Ansible, but for some reason it isn't recognising the directory.

First, check whether
/home/ubuntu/etc/ansible/playbooks/
is actually available on the target. If that path does not exist on the target, create the folder first and then do the copy.
You can also refer to this question: Ansible: find file and loop over paths
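As a minimal sketch under the question's own paths (the key and hostname are the ones above; note that ~ on the remote side expands to /home/ubuntu, which is why the error reports /home/ubuntu/etc/ansible/playbooks/), create the directory over SSH first and then re-run the copy:
ssh -i ~/Downloads/ansible-benchmark.pem ubuntu@ec2-18-170-77-90.eu-west-2.compute.amazonaws.com 'mkdir -p ~/etc/ansible/playbooks'
scp -i ~/Downloads/ansible-benchmark.pem ~/Documents/cis-playbook/section-1.yaml ubuntu@ec2-18-170-77-90.eu-west-2.compute.amazonaws.com:~/etc/ansible/playbooks/
If the intended destination was actually /etc/ansible/playbooks, drop the leading ~ and make sure the copying user can write there.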

Related

GCP Filestore error modifying shared folder contents with nodejs script under non-root user

I want to write a program that writes log files into a shared folder on a GCP Compute Engine instance. I used GCP Filestore to mount the NFS folder in an Ubuntu VM. After creating the folder, I noticed that I couldn't use cp to copy a file into it unless I used sudo. When I ran the Node.js script, it also returned a permission-denied error. However, I don't want to run the Node.js script as root. Is there a way to modify the permissions so that I can write to the shared folder as the default, non-root user?
I changed the permissions of the shared folder to 777, but it didn't work; I still cannot write to the folder.
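One common first step (a sketch, not an answer from this thread; /mnt/filestore and the user name ubuntu are placeholders) is to give ownership of the mounted directory to the non-root user outright rather than only loosening its mode, then confirm the write works:
sudo chown -R ubuntu:ubuntu /mnt/filestore      # placeholder mount point and user
touch /mnt/filestore/permission-test.log        # should now succeed without sudo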

How to copy Pyramid project?

How can I copy a Pyramid project to another computer? I was following Pyramid's SQLAlchemy + URL dispatch tutorial and finished the first four chapters. I wasn't home, so I copied the directory with my project to my laptop to be able to keep going through it. I opened the directory with VSCode, and it automatically picked up the virtual environment that was installed (with python3 -m venv env). I tried a few commands but got this error:
bash: env/bin/pip: /home/user/path/to/project/env/bin/python3.5: bad interpreter: No such file or directory
This got me thinking: how am I supposed to copy my project to another machine? Obviously, simply copying the directory doesn't work (because of the virtual environment).
Virtual environments are not relocatable. Recreate your virtual environment from scratch.
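A minimal sketch of recreating it on the new machine (assuming the project ships a setup.py or requirements file, as the projects in the SQLAlchemy + URL dispatch tutorial do; env is the environment directory name from the question):
cd /path/to/project
rm -rf env                       # discard the copied, non-relocatable environment
python3 -m venv env              # create a fresh one with this machine's Python
env/bin/pip install --upgrade pip setuptools
env/bin/pip install -e .         # or: env/bin/pip install -r requirements.txt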

why '~' tilde directory is created automatically in home directory on AWS EC2 instance

I have some Python scripts running via crontab on an AWS EC2 instance. Every day I find a "~" directory in my home directory, and I don't know why this is happening.
I have to manually remove the ~ (tilde) directory from my home directory.
When I run these scripts on my local Ubuntu machine, they work fine.
There is a bug in one of the scripts you are running that is creating this.
In typical shells, ~ is used to refer to the user's home directory, but somewhere it is being used in a context where it is not actually expanded. Because these are Python scripts, you may need to handle that expansion manually.
See: How to get the home directory in Python?
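A quick illustration of the failure mode (a sketch; the paths are hypothetical): when ~ appears inside quotes, or is assembled into a path by a program rather than by the shell, it is not expanded, and mkdir -p happily creates a directory literally named ~ in the current working directory, which for a cron job is usually the home directory:
mkdir -p "~/logs"                # creates a literal ./~/logs instead of /home/<user>/logs
mkdir -p ~/logs                  # unquoted: the shell expands ~ to the home directory
In Python, build such paths with os.path.expanduser("~/logs") (see the linked question) rather than passing a literal "~" through to subprocess calls or open().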

Can not copy files From Azure VM to local Windows

I want to copy a file from an Azure Linux VM to my local Windows PC. I remember I could do this perfectly with the same command before, but now when I run it the output says 100% done, yet when I go to the tmp directory I don't see the file there.
Here is the cmd I give on Linux VM:
scp -r mlopenedx@138.91.116.170:/edx/var/log/tracking/tracking.log /tmp/
And this is output I get:
tracking.log 100% 70KB 70.0KB/s 00:00
But when I look in the tmp folder, I don't see the file. Can anyone suggest an answer?
I have tried things like giving the home folder ~/ as the destination instead of /tmp/.
I also tried the command below:
sudo scp -i ~/.ssh/id_rsa mlopenedx@MillionEdx:/edx/var/log/tracking/tracking.log /tmp/
The easiest way to do this is to run pscp from Windows, like this:
pscp mlopenedx@LINUXVMIP:/edx/var/log/tracking/tracking.log c:/someExistingFolder/tracking.log
To have the pscp command you need to install PuTTY.
Your command looks wrong, because one of the paths needs to be a valid Windows path such as C:/Folder/Folder/File.ext. If you are executing that command from the Linux VM and 138.91.116.170 is the Linux VM's IP address, then you are copying the file locally - try looking for the log file in /tmp/ on that Linux machine. For this to work from the remote Linux VM to your local Windows PC, you would need a public IP for the Windows machine or some sort of tunnel that allows the connection.
Also, you are passing -r (recursive copy) while pointing at a single file.
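As a sketch of both checks (the IP address, user, and remote path come from the question; C:\Temp is a placeholder destination folder that must already exist, and -i with a .ppk key is only needed for key-based logins):
ls -lh /tmp/tracking.log        # run on the Linux VM: the earlier scp most likely copied the file here
pscp -i C:\keys\mlopenedx.ppk mlopenedx@138.91.116.170:/edx/var/log/tracking/tracking.log C:\Temp\tracking.log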

Blank SSHFS mount folder

I am attempting to mount a remote directory located on my web server to a directory in my Xubuntu installation hosted in VirtualBox.
I'm using the following command syntax:
sshfs root@*.*.*.*:/var/www Desktop/RemoteMount
Using the file manager, I navigate to the Desktop/RemoteMount directory but find it entirely blank. The SSHFS command above executed with no indication of an error.
Completely by chance, I use the terminal to long list the contents of the Desktop/RemoteMount directory and it shows all the data I was expecting to see in the file manager.
Can anyone tell me why the file manager does not show my remotely mounted data and how I might fix it?
Thanks.
You are missing the local mount point.
sshfs -o idmap=user mika@192.168.1.2:/home/mika/remotepoint /home/mika/localmountpoint
And the local mount point folder needs to exist.
thanks Mika
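A sketch following that answer (*.*.*.* stands for the web server's address from the question; using an absolute, pre-created mount point also removes any doubt about which directory the file manager is actually showing):
mkdir -p ~/Desktop/RemoteMount
sshfs -o idmap=user root@*.*.*.*:/var/www ~/Desktop/RemoteMount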
