Secure File Transfer Using Cron

I have a cron job which backs up my database and FTPs it to my home machine. However, as I understand it, FTP is not a secure way to transfer files.
I've read quite a few things on this subject but none of them will work for me because:
I don't have shell access to my remote hosting account
I can connect to my remote host via sftp, but going the other way, sftp on my remote hosting account will not let me use the [-i identity file] option to authenticate with my home machine.
I don't have an SSL certificate and would rather not have to buy one.
What is the best way for me to transfer files from my remote server to my home machine? The method needs to be secure and scriptable.

If you don't have shell access, how would you update the cron job? I always use crontab -e, but without shell access you probably can't.
Anyway, if you can modify the cron job, you could have the file you want transferred placed in a non-web-accessible directory and then SFTP in from your home machine once a day - after the backup on the server should be complete - and have the home machine initiate the download.
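For concreteness, a minimal sketch of that pull approach, assuming the home machine is a Linux box with OpenSSH and key-based SFTP access to the hosting account (the hostname, user, key file and paths below are made-up examples):

    #!/bin/sh
    # pull-backup.sh - runs on the HOME machine and pulls last night's dump
    # from the non-web-accessible directory on the remote host
    sftp -i ~/.ssh/backup_key -b - backupuser@remote.example.com <<'EOF'
    get /home/backupuser/private/db-backup.sql.gz /home/me/backups/
    EOF

    # crontab entry on the home machine (03:30 daily, after the server-side backup finishes):
    #   30 3 * * * /home/me/bin/pull-backup.sh

Because the home machine initiates the connection, the remote account never needs the -i option or any credentials for your home network.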

Related

How can I manage files on Ubuntu server VM on Azure?

I have deployed a LAMP stack to a virtual machine in Azure, by following this tutorial:
https://learn.microsoft.com/en-us/azure/virtual-machines/linux/tutorial-lamp-stack
It's all up and running. However, I can't figure out how to manage the files on the server, and/or copy/upload files to the server.
I can ssh into the VM using the Azure Cloud shell, but I don't seem to have access to my local files if I do it that way. So I installed the Azure CLI on my local machine but when I try to open an ssh session to the server I get 'permission denied (publickey)'.
I've looked into secure copy - scp - and have tried connecting to the server with Putty and WinSCP, but the error that I get is 'No supported authentication methods available (server sent: publickey)'
I'm new to Apache and just can't figure out how to list the files on the server or manage them at all...
When you use secure copy (scp), there is one point to pay attention to. If you created the Azure VM with azureuser as the user, you can copy a file with the command scp /path/file azureuser@domainName:/home/azureuser/filename. Because you only have the permissions of the azureuser account, you can only copy files from outside into the VM directory /home/azureuser, whether you authenticate with a password or with the SSH public key.
Update
If you created the Azure VM with an SSH public key, you need to store the private key on the machine you want to connect from. For example, to connect to the VM from a local Windows 10 machine, the key should be stored in the directory "C:\Users\charlesx\.ssh". Then you can connect to the VM, including with the scp command.
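For instance, using the OpenSSH client included with recent Windows 10 builds, a copy from PowerShell might look like this (the key name, file paths and IP address are placeholders, not values from the question):

    # run from PowerShell on the local Windows machine
    scp -i C:\Users\charlesx\.ssh\id_rsa C:\data\site.tar.gz azureuser@203.0.113.10:/home/azureuser/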
I solved this by using PuTTY and WinSCP. Previously I had been using Azure Cloud Shell commands to create the VM and generate the SSH keys, so I could connect to the VM from Cloud Shell, but since I didn't know where the auto-generated keys were stored, I couldn't connect from my local machine.
My solution was to create the VM through the Azure portal UI. I used PuTTYgen to generate an SSH key pair on my local machine, then pasted the public key into the Azure UI when creating the VM. Once the VM was provisioned in Azure, I could connect to it using PuTTY and install LAMP and any other command-line stuff that way.
I also used WinSCP to copy the files to where I wanted - I could have done it on the command line with scp, but I'm a visual person and it was useful to be able to see the directory structure that had been created. So a combination of the two worked well for me.

How to access the physical Magento root directory in Bitnami on an Azure VM

I deployed Magento directly on the Microsoft Azure platform using Bitnami (Linux). Everything works fine, except that I can't access the physical directories of the Linux machine, and because of that I'm having trouble installing the theme.
If you want access to the filesystem inside the machine where you're running your application, you should connect to your Azure instance via SSH. For example, you can follow this guide.
Also, if you want to know how to obtain your credentials, you can follow this guide.
Get your key file or credentials to access the server via SSH/SFTP.
Confirm the SSH port (22 or 8888).
In Bitnami, Magento is located at /opt/bitnami/apps/magento/htdocs/.
Make sure the firewall settings allow the SSH port.
You can find all the information here.
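Once you have the key, a session from a local terminal might look like the sketch below (the key file, IP address and theme archive are placeholders; on Bitnami images the SSH user is usually bitnami):

    # connect over SFTP on the port confirmed above (22, or 8888 on some images)
    sftp -i ~/.ssh/bitnami_key.pem -P 22 bitnami@203.0.113.20
    # then, inside the sftp session:
    #   cd /opt/bitnami/apps/magento/htdocs/
    #   put my-theme.zip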

Azure VM - why do FTP transfers lead to a complete disconnect?

I have a virtual machine with the FTP server configured.
I'm transferring files in ACTIVE mode and at a random file I get disconnected.
I cannot reconnect to the FTP server nor connect remotely to the machine.
I have to restart the machine and wait a while to regain access.
What can I do in this situation to prevent the complete disconnect?
I ended up using Passive mode because Active mode kept failing, even though Passive mode does not suit me.
You need more than just those two ports open - the design of FTP (either passive or active) is that the FTP server sends data back on a randomised range of ports (see: http://slacksite.com/other/ftp.html), which presents a problem when using a stateless service like Azure's Load Balancing, where endpoints must be explicitly opened. This setup guide is the best place to see how to achieve what you want on an Azure VM: http://itq.nl/walkthrough-hosting-ftp-on-iis-7-5-a-windows-azure-vm-2/ (and it is linked from the SO post referenced by Grady).
You most likely need to open the FTP endpoint on the VM: this answer will give you some background on how to add endpoints: How to Setup FTP on Azure VM
You can also use PowerShell to add an endpoint: Add Azure Endpoint

Managing inter-instance access on EC2

We are in the process of setting up our IT infrastructure on Amazon EC2.
Assume a setup along the lines of:
X production servers
Y staging servers
Log collation and Monitoring Server
Build Server
Obviously we have a need to have various servers talk to each other. A new build needs to be scp'd over to a staging server. The log collator needs to pull logs from production servers. We are quickly realizing we are running into trouble managing access keys. Each server has its own key pair and possibly its own security group. We are ending up copying *.pem files from server to server, making a mockery of security. The build server has the access keys of the staging servers in order to connect via ssh and push a new build. The staging servers similarly have the access keys of the production instances (gulp!)
I did some extensive searching on the net but couldn't really find anyone talking about a sensible way to manage this issue. How are people with a setup similar to ours handling it? We know our current way of working is wrong. The question is - what is the right way?
Appreciate your help!
Thanks
[Update]
Our situation is complicated by the fact that at least the build server needs to be accessible from an external server (specifically, GitHub). We are using Jenkins and the post-commit hook needs a publicly accessible URL. The bastion approach suggested by @rook fails in this situation.
A very good method of handling access to a collection of EC2 instances is using a Bastion Host.
All machines you use on EC2 should disallow SSH access from the open internet, except for the Bastion Host. Create a new security group called "Bastion Host", and only allow port 22 incoming from the bastion to all other EC2 instances. All keys used by your EC2 collection are housed on the bastion host. Each user has their own account on the bastion host. These users should authenticate to the bastion using a password-protected key file. Once they log in they should have access to whatever keys they need to do their job. When someone is fired, you remove their user account from the bastion. If a user has copied keys off the bastion, it won't matter, because the instances can't be reached without first logging in to the bastion.
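As an illustration of day-to-day use under that design (the user names, host names and key files are invented):

    # 1. log in to the bastion with your personal, passphrase-protected key
    ssh -i ~/.ssh/alice_bastion alice@bastion.example.com

    # 2. from the bastion, hop to an internal instance using a key kept only on the bastion
    ssh -i ~/keys/staging.pem ec2-user@10.0.1.15

Because the internal security groups only accept port 22 from the bastion, a key copied off the bastion is useless from the open internet.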
Create two sets of keypairs, one for your staging servers and one for your production servers. You can give your developers the staging keys and keep the production keys private.
I would put the new builds onto S3 and have a Perl script running on the boxes to pull the latest build from your S3 buckets and install it on the respective servers. This way, you don't have to manually scp every build over. You can also automate this process with a continuous build tool that builds and uploads the artifacts to your S3 buckets. Hope this helps.
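The answer suggests a Perl script; a rough shell equivalent using the AWS CLI could look like this (the bucket name, paths and restart command are assumptions):

    #!/bin/sh
    # deploy-latest.sh - run on each staging/production box, e.g. from cron
    # pull the most recent build artifact from S3 and unpack it
    aws s3 cp s3://my-build-artifacts/latest/app.tar.gz /tmp/app.tar.gz
    mkdir -p /var/www/app
    tar -xzf /tmp/app.tar.gz -C /var/www/app
    # restart whatever serves the app; the command depends on your stack
    # sudo service httpd restart

Giving each instance an IAM role with read access to the bucket also removes the need to store AWS credentials on the boxes at all.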

CentOS: How can I create a secure user account to host a web-site?

On a fresh CentOS installation:
How can I create a separate user, other than root, to store the website hosting files?
How can I lock this user down to prevent malicious or bad things from happening?
How can I further protect the php file containing the DB connection strings?
What other security measures should I take to protect such a server, which is only used to serve a web app (or two)?
What other ways should I employ for sandboxing the web app?
I am running Centos on a VPS and want to use Apache or Lighttpd as the web server.
Thank you.
One best practice is to only run the services you actually need on a box facing the internet. So if you only need Apache and a database, run only Apache and the database on that machine. Use long, random passwords for the maintenance user, and do not allow direct root login.
Regarding the user: add a user with useradd and block shell access for that user (usermod -s, setting the login shell to /sbin/nologin). Usually a service account for running the web server is created when the web server is installed. If you restrict that account's permissions to the web server's home and logging directories, you should be fine.
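A minimal sketch of those steps on CentOS (the account name and document root are examples, and it assumes Apache runs under the apache group):

    # create a dedicated account with no interactive shell to own the site files
    useradd -m -d /var/www/mysite -s /sbin/nologin sitehoster

    # let the web server group read the files, and nobody else
    chown -R sitehoster:apache /var/www/mysite
    find /var/www/mysite -type d -exec chmod 750 {} \;
    find /var/www/mysite -type f -exec chmod 640 {} \;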
Regarding protecting the database: you can create a DB user account that doesn't have DROP or CREATE privileges, but since your application needs access to the database, anyone acting with the privileges of your web server or application will have access to the data in the database as well.
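For example, with MySQL a restricted application account could be created like this (the database name, user and password are placeholders):

    # grant the web app only the day-to-day privileges it needs
    mysql -u root -p -e "CREATE USER 'webapp'@'localhost' IDENTIFIED BY 'change-me';
      GRANT SELECT, INSERT, UPDATE, DELETE ON mydb.* TO 'webapp'@'localhost';"

Keeping the PHP file with the connection string outside the document root, owned by and readable only to the web server account, also limits what a compromised upload directory can expose.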
