scp files from batch account user's folder to local machine - linux

I have access to a batch account (a headless user account) on a remote machine. To SSH to the batch account without a password, I appended the contents of my .ssh/authorized_keys to the batch account's .ssh/authorized_keys. SSHing to the remote machine as the batch account user works fine.
Now I need to copy certain files from this headless user account's directories to my machine. But whenever I run
scp batch_user_account@remote_machine:file_address local_machine_address
it asks for the batch_user_account's password, which I am not aware of.
I also tried offering my private key as the identity file, like:
scp -i ~/.ssh/id_rsa batch_user_account@remote_machine:file_address local_machine_address
But this also gives me a permission denied error for the batch user account's folder.
Am I doing something incorrect here?
Can anyone guide me here?
Thank you.

I tried the same task (copying files from the batch user account on the remote machine to another machine, A) using a different machine, B, instead of A, to see whether the error reproduced. To SSH to the batch account on the remote machine without a password from this new machine, I appended my .ssh/authorized_keys to the batch account's .ssh/authorized_keys. On this new machine, the command
scp batch_user_account@remote_machine:file_address local_machine_address
worked fine. So I realized there were permission issues I had to solve. Once I changed the permissions on the destination machine, the copy worked.
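For anyone hitting the same problem, a minimal sanity check might look like this (the destination path is a placeholder; adjust it to your setup):
ls -ld /path/to/destination
chmod u+rwx /path/to/destination
scp -i ~/.ssh/id_rsa batch_user_account@remote_machine:file_address /path/to/destination
The first two commands confirm and, if needed, fix write permission on the local destination directory; the last is the copy itself.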

Related

Unable to mount file on windows from Azure

I created a file share on Azure using "File Service" and then tried to mount it using "connect". It gave me the username localhost\xyz.
Two questions:
Why does the username start with "localhost" and not with "Azure"?
Why am I unable to mount it? Windows Security does not give any error; it just keeps returning to the credentials page.
P.S. TCP port 445 is working properly.
Here are a few workarounds that worked for us.
WAY-1
You can go directly to PowerShell on your machine and paste the connection script provided by your storage account.
WAY-2
You can click More choices and select Use a different account, then use the storage account name prefixed with AZURE\ as the username and a storage account key as the password (see the command-line sketch after these options).
WAY-3
You can map the file share directly by leaving Connect using different credentials unchecked.
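As a rough command-line sketch of WAY-2 (the storage account name, share name, and key are placeholders):
net use Z: \\mystorageaccount.file.core.windows.net\myshare /user:AZURE\mystorageaccount <storage-account-key>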
OUTPUT:
With each of the above approaches, the file share mounted successfully.
REFERENCES:
Mount SMB Azure file share on Windows

Why we get a pem file when creating a VM on Microsoft Azure?

I've recently been working on creating a cloud instance on Azure. Whenever I create a new VM for the service I need, it lets me download a .pem file. However, it seems like I can log in to the VM through SSH without using the .pem file.
Besides that, when I check the "authorized_keys" file on the new VM, it includes a public key, which is not the one in my local machine's "id_rsa.pub" file.
I'm wondering how I can log in without my public key being stored in the authorized_keys file.
I think this question is related to SSH. Thanks in advance!
Why we get a pem file when creating a VM on Microsoft Azure?
Disabling password logins to SSH is a common practice for SSH hardening [1,2]. The PEM file provided by default will help you achieve this.
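For reference, logging in with the downloaded key would look roughly like this (the key path, username, and address are placeholders):
ssh -i ~/Downloads/myvm_key.pem azureuser@<vm-public-ip>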
Besides that, when I check the "authorized_keys" file on the new VM,
it includes a public key, which is not the one on my local machine's
"id_rsa.pub" file
I believe you are viewing the file for another user or comparing the wrong keys.
I'm wondering how I could log in without the public key stored in the
authorized_keys file?
You could change which authorized_keys file is used by modifying the AuthorizedKeysFile directive in the /etc/ssh/sshd_config file.
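For illustration, the relevant lines in /etc/ssh/sshd_config might look like this (the key file path is only an example):
AuthorizedKeysFile .ssh/authorized_keys
PasswordAuthentication no
Remember to restart the SSH daemon (for example, sudo systemctl restart sshd) after editing the file.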

How to restrict folder permissions of Gitlab shell executor on Windows

I'm mostly new to GitLab and the Windows command line. I set up a GitLab runner on my Windows PC and it works well. However, I want to restrict it so that it can only access the folder I set it up in, and all of its subfolders. What is the most reliable way to do this?
Since the GitLab runner is installed as a service, you could:
create a second Windows account
use that account to register your service
gitlab-runner install --user ENTER-YOUR-SECOND-USERNAME --password ENTER-YOUR-SECOND-PASSWORD
protect the folder (and its content) you want with the second user account.
By default, the second account would not have read/write access to at least your own C:\user\login folder, provided you set the file and folder permissions accordingly.
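As a rough sketch (the account name and path are placeholders), you could explicitly deny the runner's account access to a sensitive folder with:
icacls "C:\Users\mylogin" /deny gitlab-runner-user:(OI)(CI)F
The (OI)(CI) flags make the deny rule inherit to files and subfolders, and F denies full control.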

azure linux vm recovery - unable to remote login

I forgot the username and password for a Linux (Ubuntu) VM. I tried "Reset Remote Access" from the portal, but it is not helping; after more than 30 minutes it is still shown as in progress. I also tried to do it via the Azure command line: I created a new user with a password, but I am unable to log in, and SSH says access denied. Should I do any additional steps?
After creating the new user, you should also reset your SSH configuration. You could refer to Reset Access and Manage Users and Check Disks with the Azure VMAccess Extension for Linux for detailed steps.
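For reference, the Azure CLI steps would look roughly like this (the resource group, VM name, username, and password are placeholders):
az vm user update --resource-group myResourceGroup --name myVM --username azureuser --password myNewPassword
az vm user reset-ssh --resource-group myResourceGroup --name myVM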

Allowing additional users to access an EC2 instance

I have set up an Amazon EC2 instance and am able to SSH into it. Can anyone please tell me how I could allow additional users to SSH into this instance from a different location?
Max.
I started out creating additional users, but it is pointless if you want to give them sudo access anyway, which you probably do. Giving them sudo access lets them do everything they would want to do anyway, so creating a separate user account was just a waste of time. Additionally, creating additional users is an onerous task, leads to a lot of different permission problems, and means you have to fiddle with the sudoers file to let them run sudo tasks without entering their password every time.
My recommendation is to get the new user to provide you with a public key and have them use the primary ubuntu or root account directly:
ssh-keygen -f matthew
Have them give you the .pub key file, and paste it into the .ssh/authorized_keys file on your EC2 server.
Then they can log in with their private key directly to the ubuntu or root account of your EC2 instance.
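For example, on the EC2 instance (reusing the key name from the ssh-keygen example above):
cat matthew.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys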
I store the authorized_keys file in my private GitHub account. The public keys are not very useful unless you have the private key component, so putting them on GitHub seems fine to me. I then make deployment of my centrally stored authorized_keys file part of my new server provisioning process.
Then remove their public key when they leave your employment; this will lock them out.
Create additional users at a *nix command prompt
useradd
Create a new rule in the security group applied to your instance, enabling SSH for the public IP range of your remote user.
For specific instructions check out: http://developer.amazonwebservices.com/connect/entry.jspa?externalID=1233.
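A rough sketch of those two steps (the username, security group ID, and CIDR range are placeholders):
sudo useradd -m -s /bin/bash newuser
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 22 --cidr 203.0.113.0/24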
