Azure VM - why FTP transfers lead to complete disconnect? - azure

I have a virtual machine with the FTP server configured.
I'm transferring files in ACTIVE mode and at a random file I get disconnected.
I cannot reconnect to the FTP server nor connect remotely to the machine.
I have to restart the machine and wait a while to regain access.
What can I do in this situation to prevent the complete disconnect?
I ended up using the Passive mode, even though it does not suit me because the Active mode kept failing.

You need more than just those two ports open - the design of FTP (either passive or active) is that the FTP server will send data back on a randomised range of ports (see: http://slacksite.com/other/ftp.html) which presents a problem when using a stateless service like Azure's Load Balancing that requires Endpoints that must be explicitly opened. This setup guide is best to see how to achieve what you want on an Azure VM: http://itq.nl/walkthrough-hosting-ftp-on-iis-7-5-a-windows-azure-vm-2/ (and is linked from the SO post referenced by Grady).

You most likely need to open the FTP Endpoint on the VM. This answer will give you some background on how to add endpoints: How to Setup FTP on Azure VM
You can also use PowerShell to add an endpoint: Add Azure Endpoint

Related

SFTP/FTP Service on Azure

We are using Azure WebApps and we have this requirement: An external automated client will be connecting to us a few times a day through FTP and drop a small size file < 1KB, we need to act on the data and update our DB accordingly.
Unfortunately, we have no control on the external client and the client will need to communicate via FTP. To me this should have been a RESTful call.
I am fine with setting a Windows Server with IIS as a VM to act as an FTP server or on Ubuntu, but that means a maintenance of a VM for this small requirement.
Are you aware of any Azure specific service that helps in this situation?

SMB access to on-premise resource from Azure Web App via Virtual Network

We have a setup where we have both VMs and Web Apps in Azure connected to our on-premise resources via a point-to-site virtual network.
We have a folder on premise with access to Everyone open (both on the share and NTFS) and the Azure VMs that are on that virtual network are able to browse to the share without difficulty.
The web apps are not able to access them however.
I'm assuming the following line in this article explains the reason, but I'm looking to confirm this is not possible:
The work required to secure your networks to only the web apps that need access prevents being able to create SMB connections. While you can access remote resources this does not include being able to mount a remote drive.
Coming out of the logs from the attempt from the website to access it:
Taking the C# code out of the picture, trying to get the directory listing from the powershell console on the web app:
I've also tried this with Hybrid Connections, and am getting closer - once it's setup and attached to the Web App, I'm able to tcping the SMB port from the powershell console (which is further than I can get when using the VNET), but it's still unable to list a directory:
Any thoughts? Anyone doing anything similar?
The tcping result is actually misleading - you are really pinging a local port hosted on your web app (hence the tcping results of ~1ms). Tcping doesn't actually test the full tunnel for Hybrid Connections because the tunnel is a TCP level data relay only (that is, it does not send TCP headers, etc., over the tunnel, only payload) and tcping does not send any data; it simply verifies that the TCP handshake succeeded.
Unfortunately, the article is correct - SMB will not work at all in your Web App. There are security layers in place that will block the attempt.

Is a jump server necessary with Remote Desktop to Azure VM?

We are setting up a new Network which includes a VM in Azure. I can connect to this via RDC.
However, our security guy wants me to access it through a second VM for security reasons. In other words, I first connect with RDC to a "jump server" (which is just another VM in Azure) and then from there, use RDC to connect to the second server.
Is this actually adding a layer of security? It seems to me that unless the RDC on my local machine had somehow gotten a virus or gotten hacked, that there is no benefit to the jump server.
If it's a security requirement, the best approach is deploying all your VMs inside a Virtual Network. After that, configure a Point to Site VPN connection on Azure and install the client inside your operating system.
Using that, all your connections to the VMs are done through a secure path to Azure.

How to upload a solution from local system to a server accessible via RDP

I've developed a solution and tested it. It's uploaded to Azure using the convenient method of publishing XML file. Now I realize that it's supposed to be put in on-premise local server (it's an internal application not requiring access to the Internet).
When I go to the server, I use the RDP to access a system. In there, I execute a connection to another RDP. The second system is the one hosting both SQL Server and IIS where the application will reside.
Is it at all possible to construct such a publish XML? If so - how? If not - what should I request from the IT department to open/install on the innermost RDP so I can shove in my stuff by the oh-my-god-I'm-so-lazy press of a button?
You must install webdeploy on your host machine. With IIS and webdeploy installed you can use the same publishing techniques as you did with Azure.
I think it goes without saying that you must have direct access to the host. If it's on an external network you have to open webdeploy's and IIS ports; if you do not want to open these externally I recommend a VPN (maybe basic point-to-point) that will create a direct line between your dev and host machines.

Active FTP on Azure virtual machine

I have setup FTP in IIS 8.0 on an Azure windows server 2012 virtual machine.
After following the instructions in this post (http://itq.nl/walkthrough-hosting-ftp-on-iis-7-5-a-windows-azure-vm-2/) I've been able to make FTP work fine in passive mode, but it fails when trying to connect in active mode from FileZilla. The FTP client can connect to the server in active mode but fails with a timeout error message when trying to execute the LIST command.
I carefully checked that the 20 and 21 endpoints are set on the Azure VM without pointing to a probe port and that the Windows firewall allows external connections to VM ports 20 and 21.
I can't figure out why active mode doesn't work while passive mode works fine.
I know there are other users with this issue.
Please, is there someone who has succeeded in setting up active FTP on an Azure VM?
This previous response is incorrect. https://stackoverflow.com/a/20132312/5347085
I know this because I worked with Azure support extensively. The issue has nothing to do with the server not being able to connect to the client, and my testing method eliminated client side issues as a possibility.
After working with Azure support for 2 weeks, their assessment of the problem was essentially that “Active Mode FTP uses a series of random ports from a large range for the data channel from the client to the server. You can only add 150 endpoints to an Azure VM so you couldn’t possibly add all those ports and get Active FTP working 100%. In order to do this you would need to use “Instance level public IP” and essentially bypass the endpoint mechanism all together and put your VM directly on the internet and rely entirely on the native OS firewall for protection.”
If you HAVE to use Active Mode FTP on Azure and are ok with putting your VM on a public IP, he provided this link:
https://azure.microsoft.com/en-us/documentation/articles/virtual-networks-instance-level-public-ip/
UPDATE: Official response from Azure Support:
Josh,
First of all thanks for your patience on this. As I mentioned in my last email I was working with our Technical Advisors which are Support Escalation Engineers on reproducing this environment in Azure. Our tests were configured using WS_FTP 7.7 (Your version 7.1) and WS_FTP 12 client as well as the Windows FTP client. The results of our testing were the same as you are experiencing. We were able to log in to the server, but we get the same Command Port/List failures.
As we discussed previously Active FTP uses a random port for the data plane on the client side. The server connects via 21 and 20, but the incoming port is a random ephemeral port. In Passive FTP, this can be defined and therefore endpoints can be created for each port you use for part of the data plane.
Based on our extensive testing yesterday I would not expect any other Active FTP solution to work. The escalation Engineer that assisted yesterday also discussed this with other members of his team and they have not seen any successful Active FTP deployments in Azure.
In conclusion, my initial thoughts have been confirmed with our testing and Active FTP will not work in the Azure environment at this time. We are always striving to improve Azure’s offering so this may be something that will work in the future as we continue to grow.
You will need to move to a passive FTP setup if you are going to host this FTP server in Azure.
When using active ftp, the client initiates the connection to port 21 on the FTP server. This is the command or control channel and this connection usually succeeds. However, the FTP server then attempts to open port 20 on the client. This is the data channel. This channel is used for all data transfers, including directory listings.
So, in your case, active FTP isn't working because the server can't initiate a connection to the client. This is either a problem on the server (outbound firewall rule) or on the client itself. This is usually a good thing because you don't want internet-based servers to be able to open connections on client machines.
In passive mode there is a clear client/server distinction where the client initiates connections to the server. Passive mode is recommended so if you got that working I'd stick with that.
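As an added illustration of the data-channel mechanics discussed in the first answer and in the active/passive explanation just above (this is not code from the question or any answer), the following Python decodes the PORT command and the 227 reply that negotiate the FTP data connection, then checks each against a simple model: the server admits inbound connections only on an explicit endpoint set, and the client (assumed to sit behind NAT or a host firewall) accepts no unsolicited inbound connections. The addresses, ports and endpoint set are constructed sample values.

```python
import re
from dataclasses import dataclass

# Constructed FTP wire samples (not captured from any real server).
PORT_CMD = "PORT 10,0,0,5,195,80"                           # client: connect to 10.0.0.5:50000
PASV_REPLY = "227 Entering Passive Mode (40,1,2,3,27,16)"   # server: connect to 40.1.2.3:6928
PASV_REPLY_OUT = "227 Entering Passive Mode (40,1,2,3,117,48)"  # server: 40.1.2.3:30000

# Constructed endpoint set: control port 21 plus a pinned passive range of 50 ports,
# well under the 150-endpoint limit quoted by Azure support in the answer above.
SERVER_ENDPOINTS = {21} | set(range(6900, 6950))

_HOSTPORT = re.compile(r"(\d+),(\d+),(\d+),(\d+),(\d+),(\d+)")


@dataclass(frozen=True)
class DataChannel:
    initiator: str   # "server" (active mode) or "client" (passive mode)
    host: str        # side that must accept the inbound data connection
    port: int


def parse_data_channel(line: str) -> DataChannel:
    """Decode the h1,h2,h3,h4,p1,p2 tuple shared by PORT and the 227 reply."""
    m = _HOSTPORT.search(line)
    if not m:
        raise ValueError(f"no host/port tuple in: {line!r}")
    fields = [int(x) for x in m.groups()]
    if not all(0 <= x <= 255 for x in fields):
        raise ValueError("tuple fields must be 0..255")
    host = ".".join(str(x) for x in fields[:4])
    port = fields[4] * 256 + fields[5]          # port is sent as two bytes
    if line.upper().startswith("PORT "):
        return DataChannel("server", host, port)  # active: server connects out
    if line.startswith("227"):
        return DataChannel("client", host, port)  # passive: client connects in
    raise ValueError("expected a PORT command or a 227 reply")


def admitted(ch: DataChannel, server_endpoints: set,
             client_accepts_inbound: frozenset = frozenset()) -> tuple:
    """Return (ok, reason) for the data connection under the endpoint model."""
    if ch.initiator == "client":                 # passive: inbound to the server
        ok = ch.port in server_endpoints
        return ok, ("passive data port has an endpoint" if ok
                    else f"server port {ch.port} has no endpoint")
    ok = ch.port in client_accepts_inbound       # active: inbound to the client
    return ok, ("client accepts inbound" if ok
                else f"client port {ch.port} not reachable from the server")
```

Running the three samples through `admitted(..., SERVER_ENDPOINTS)` shows why pinning the passive range and opening endpoints for it works, while an active-mode PORT to an ephemeral client port is never admitted.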