I know very little about Azure, but I am looking for a cloud server where I can have clients SFTP their files to us. It will be used primarily for data storage. The only requirement is that the files be sent over SFTP (not FTP).
Does anyone have any experience with this? How difficult is this to set up? Is this even possible?
You can find step-by-step instructions on how to set up a regular FTP site on a Windows Azure VM here - http://nicoploner.blogspot.com/2010/12/ftp-server-on-windows-azure-from.html
Here's how to set up SFTP on Windows Server (applies to Azure VMs as well) - http://www.digitalmediaminute.com/article/1487/setting-up-a-sftp-server-on-windows
Yes, you can set up an Azure VM role running Windows Server and then install an SFTP server on it.
You can also set up a Linux VM role and just use the SFTP support that ships natively with OpenSSH.
Depending on what you are doing, you may want to use a RESTful service that points back to Blob Storage instead. This is not SFTP, but it does go over HTTPS and you get all the benefits of Azure Blob Storage directly.
Here are a couple of options and additional resources:
1) You can install an SFTP server on Windows Server
https://winscp.net/eng/docs/guide_windows_openssh_server
This uses an OpenSSH package on GitHub from Microsoft.
2) You can use an Ubuntu VM
As @Bart Czernicki mentioned, OpenSSH is built into Linux, and it comes with SFTP out of the box. Customize your implementation using the /etc/ssh/sshd_config file (see the sketch after this list of options).
3) SFTP Gateway
We have a product on the Azure Marketplace called SFTP Gateway that might help. (Disclosure: I work for Thorn Technologies.)
This is a good option for launching an SFTP server without having to build it from scratch. It also has a web interface for managing users, to help minimize the time spent at the SSH terminal.
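Going back to option 2, here is a minimal sshd_config sketch, assuming you put your client accounts in a group called sftponly (the group name and paths are just placeholders, not anything from the question):
# /etc/ssh/sshd_config - change the existing "Subsystem sftp" line to use internal-sftp
Subsystem sftp internal-sftp
Match Group sftponly
    ChrootDirectory /home/%u
    ForceCommand internal-sftp
    AllowTcpForwarding no
# Note: the chroot directory itself must be owned by root; give users a writable subdirectory.
# Reload with: sudo systemctl restart ssh   (or sshd, depending on the distro)
This locks those accounts to file transfer only (no interactive shell), which fits the requirement of accepting SFTP rather than FTP.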
BTW, although this wasn't asked in the original question, you might want to consider moving data to a durable storage layer (Azure Blob Storage). One approach would be to use incron to listen for file events. Once a file is done transferring via SFTP, use the Azure CLI to copy the file to Azure Blob Storage, and then delete the file from disk on success. This is the approach we used to build SFTP Gateway.
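A minimal sketch of that incron-plus-CLI flow, assuming an upload directory of /home/sftpuser/uploads, a storage account called mystorageaccount with an "uploads" container, and the account key in an environment variable (all of these names are placeholders):
# Hypothetical incrontab entry: run the script when a file finishes writing
/home/sftpuser/uploads IN_CLOSE_WRITE /usr/local/bin/push-to-blob.sh $@/$#

# /usr/local/bin/push-to-blob.sh (sketch)
#!/bin/bash
set -euo pipefail
FILE="$1"
az storage blob upload \
    --account-name mystorageaccount \
    --account-key "$AZURE_STORAGE_KEY" \
    --container-name uploads \
    --name "$(basename "$FILE")" \
    --file "$FILE" \
  && rm -f "$FILE"   # delete the local copy only if the upload succeeded
IN_CLOSE_WRITE fires when the client closes the file, which is a reasonable (though not bullet-proof) signal that the SFTP transfer has finished.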
Hope this helps!
We have Windows Server 2016 Azure Virtual Machines using managed disks.
I am trying to create an Azure Data Factory pipeline that will let me copy certain files from a folder on the hard drives of those VMs to our Azure SQL Server. I was quite surprised to see no ADF connectors available for Azure VMs; then I checked Logic Apps - same issue, no connectors available for Azure VMs there either.
Then I did some Googling to find out how, in general, you can access an Azure VM file structure from outside (without using Remote Desktop) and was even more surprised to see that there isn't any info out there about this (not even that it can't be done).
Is it possible for me to access the file system of my Windows Server 2016 Azure VM without using Remote Desktop? The VMs are using managed disks, if that makes any difference.
You can SSH into the VM (on a Windows Server VM this assumes an SSH server such as OpenSSH is installed) and then use rsync to download or upload files:
rsync -au --progress your_user_name@ip.ip.ip.ip:/remote_dir/remote_dir/ /local_dir/local_dir/
Otherwise, you can install Dropbox on the VM and on your local computer; transferring small files through the shared Dropbox folder is very fast.
Here are some instruction slides on the Azure storage system and their Storage Explorer App.
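As a rough alternative sketch (AzCopy is not mentioned above, so treat this as an assumption): you could run AzCopy on the VM itself to push the files into Blob Storage, and then point your Data Factory pipeline at the blob container, since ADF does have a Blob Storage connector. The account, container, and SAS token here are placeholders:
azcopy copy "C:\data\exports\*.csv" "https://mystorageaccount.blob.core.windows.net/vm-exports?<SAS-token>"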
I have a web-based .NET application whose artifacts are being uploaded to Azure through the FTP Upload task. The issue is that it does upload the artifact, but it is a zip file. How can I have it unzipped at the target location, as there is no option for unzipping in the FTP Upload task?
I do not have the FQDN or IP of the Azure cloud server as it is PaaS-based infrastructure; all I have is the FTP location.
You cannot unzip a file on an FTP server. No matter what client/library/framework you are using, the FTP protocol simply does not allow that.
See also:
Can we unzip file in FTP server using C#
How to unzip files via an FTP connection?
Based on my understanding, if you want to use the Azure DevOps FTP Upload task, you need an FTP server address, username, and password.
If that is the case, you could use the Azure Logic Apps FTP (file added or modified) trigger to extract the file.
If that does not work for you and Azure Storage is acceptable, my workaround is to use the Azure File Copy task to copy the file to your Azure storage. Then you can control it yourself; for example, you could use an Azure Functions blob trigger to extract the file with your own code.
The question is quite vague, but it sounds like you might be trying to upload to an Azure Web App, which has FTP and also zip-deploy functionality that uses the Kudu interface.
https://learn.microsoft.com/en-us/cli/azure/webapp/deployment/source?view=azure-cli-latest#az-webapp-deployment-source-config-zip
This Azure CLI command will push your zip and deploy/unpack it into the Web App for you, for example:
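The resource group, app name, and zip path in this sketch are placeholders:
az webapp deployment source config-zip --resource-group MyResourceGroup --name my-web-app --src ./artifacts/drop.zip
This goes through Kudu's zip-deploy endpoint, so the zip is unpacked into the Web App for you; note that it authenticates with your Azure credentials (az login or a service principal) rather than the FTP credentials.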
PS: It's impossible to FTP without a DNS name or IP, so you will have one of them specified in the FTP location you've been given.
Is there a way to enable antimalware monitoring for a cloud service? With the latest release, do we still need a PowerShell startup task for cloud services, where PowerShell invokes the antimalware XML template?
I could not find proper documentation on how to enable this from a Cloud Services solution perspective.
Here are some of the questions:
Do I still need a PowerShell startup task, and do I configure it in the CSDEF file?
Where should I place the XML template in the cloud project? In the PowerShell script we need to give the location of the XML file; should the XML file and the PowerShell script be at the same level, i.e. in the same directory?
If we enable monitoring of the antimalware service for cloud services, we need to give a storage account. Is there a way to pick the storage account dynamically based on the environment we are deploying to? At the end of the day, I'm looking for an automated way of setting up monitoring in the Production and UAT environments.
Use the PowerShell cmdlet Set-AzureServiceAntimalwareExtension to enable antimalware in your cloud service. Here's some more info:
Set-AzureServiceAntimalwareExtension
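A rough sketch of what that can look like in a deployment script. The service name, storage account, and file names are placeholders, and the exact parameter set may vary by module version, so check Get-Help Set-AzureServiceAntimalwareExtension before relying on it:
# Classic (Azure Service Management) PowerShell module
$xmlConfig = [xml](Get-Content ".\AntimalwareConfig.xml")   # your antimalware XML template
$storageCtx = New-AzureStorageContext -StorageAccountName "myuatdiagstorage" -StorageAccountKey $storageKey
Set-AzureServiceAntimalwareExtension -ServiceName "MyCloudService" `
    -AntimalwareConfiguration $xmlConfig `
    -Monitoring ON `
    -StorageContext $storageCtx
Because the service name and storage account are just script parameters here, your release pipeline can pass different values for UAT and Production, and you no longer need a PowerShell startup task in the CSDEF for this.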
For installing any software in a cloud service, the approach I've found to be best is to implement the OnStart() method in the WebRole.cs class of a project that you've deployed. (http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.serviceruntime.roleentrypoint.onstart.aspx)
You can do something like:
Download the software you need, or reference a file you've bundled with the code.
Install it.
Configure it.
Run it.
For the big picture (monitoring environments), I wouldn't spend too much time on anti-virus/anti-malware software. No one can install that if they can't get access to your machine in the first place.
Things you can do to lock down your machine/monitor it:
Make sure all your endpoints are locked down. Only expose ports that need to be used, for example, port 80 for HTTP.
Use SSL for HTTP.
Install something like Bosun (http://bosun.org/) or Opserver (https://github.com/opserver/Opserver) to monitor CPU, RAM, network connections etc.
(Note: Tried installing Bosun on a Windows Cloud Service earlier this week and not all the metrics seem to be reporting.)
This might not be so much of a programming question, but still...
I need to get a site that is currently hosted in Azure down to a local development environment. Is there any way to do that? Any tools or such?
Thanks in advance!
Not currently. Once the cloud service deployment package has been handed over to the Azure Fabric controller, there is no way to reclaim it, even if you submit a support ticket. The closest you can get to this is either to upload packages to Windows Azure Blob Storage first and deploy from there, or to enable Remote Desktop and copy the files from inside the VM to an external storage account.
My suggestion would be to do one of the following:
If you have RDP enabled, you can remote in and grab the files
Otherwise, I would suggest creating a support case and having Microsoft help you get out the files: https://support.microsoft.com/oas/default.aspx?&c1=501&gprid=14928&&st=1&wfxredirect=1&sd=gn
I have a webrole I'd like to host in IIS for the time being.
Does anyone know how involved this is, considering that I still want the Azure Storage functions of the site to work?
Azure Storage (tables, blobs, queues) only runs on the actual Windows Azure environment in the cloud. There is a simulated development environment that runs a facsimile on a local SQL Server database, but that is only meant for development purposes and cannot be used for running an actual site.
Theoretically, you could run your web app locally and connect to Azure Storage over the internet (e.g. by using the REST API), but latency would almost certainly be too high for any interactive site.
So, if you want to be able to run your site on premises in your own IIS environment, you will need to remove all the Azure-specific platform dependencies and build in non-Azure alternatives. For Azure Storage, you could either use a relational database (SQL Server, MySQL) or look at a NoSQL/document database.
If you want to move it to IIS then tijmedvdk's answer is correct.
If your goal is to run it in your own data center, then you should consider the Azure Appliance (http://www.microsoft.com/windowsazure/appliance/), which allows you to run Azure applications on premises without making any changes.
This answer seems misleading. Windows Azure is a platform that provides several services, and you can choose from the services that you want to use.
In essence, a Windows Azure role instance is just a virtual machine with:
* Windows Server 2008 R2
* IIS 7.5
So if you have an application that you are currently hosting in Azure and you want to host it in IIS, I don't see much of a problem there.
If you are using Storage, the only problem might be that the storage account settings were in the WebRole or service configuration files, but you can change your app logic to take the appropriate settings from other config files.
I have created Windows desktop applications that, for several reasons, use Azure Storage, and I also think of that as a great advantage of cloud computing.