Access FTP via HTTP? - .htaccess

We have an external secure FTP server that we want to access through HTTPS (our infrastructure does not support FTPS). I know that's possible but I don't know how. I'm looking for something like this:
ftp://ftp.mozilla.org/pub/mozilla.org/zz
http://ftp.mozilla.org/pub/mozilla.org/zz
Thanks!

To add some clarification: FTP and HTTP are, as SLaks said, two entirely different things. The links you have posted use two separate protocols: one is FTP, and one is HTTP. You appear to be getting confused by the second link because it still has "ftp" in it. What is happening there is that "ftp.mozilla.org" is simply the domain name of that server. The pages themselves look similar because there is no actual page you are referencing (you are visiting the directory itself) and there is no default page specified in that directory (for example, no index.html).
The default behavior in this case is to simply list the directory contents, which is pretty much what the ftp protocol does anyway.
So:
You will need to either install a web server program (not an FTP server program!) on the FTP server (the physical box) and let users download files using the HTTP(S) protocol, or, as SLaks suggested, create your own proxy (or find one that exists) that receives commands over the HTTP protocol and transforms them into the equivalent FTP commands, which are then sent to the FTP server.
Personally, I recommend the former, as it is less complicated.
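If you go the web-server route, a minimal sketch of an Apache virtual host exposing the same directory the FTP daemon serves could look like this (the hostname, paths, and certificate files are illustrative, and the Require syntax assumes Apache 2.4):

<VirtualHost *:443>
    ServerName ftp.example.com
    # Serve the same directory tree the FTP daemon exposes
    DocumentRoot /srv/ftp/pub
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/ftp.example.com.crt
    SSLCertificateKeyFile /etc/pki/tls/private/ftp.example.com.key
    <Directory /srv/ftp/pub>
        # Directory listings stand in for the FTP listing
        Options +Indexes
        Require all granted
    </Directory>
</VirtualHost>

Users then fetch files with https://ftp.example.com/... instead of ftp://.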

FTP and HTTP are two different protocols that have nothing to do with each other.
You need to run an HTTP server.
You can either run an HTTP server that exposes the same files (like Mozilla does), or write an HTTP proxy for the FTP server.

Sounds like you are looking for a web-based FTP client. http://www.net2ftp.com/ is a good place to start, but you will have to configure the tunnel appropriately within your network. A solution like net2ftp will tunnel traffic to and from the server over HTTP by running scripts locally.
You will also want to remember that there are other file protocols your network administrator can open up aside from SFTP/FTP. Ask them about a private SSH key alternative, which would avoid a public-facing web-based FTP server/client solution.
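If the SSH route is open to you, the workflow is just key-based SFTP; a rough sketch (the user name and host are placeholders):

ssh-keygen -t ed25519 -f ~/.ssh/ftp_transfer_key
# hand ftp_transfer_key.pub to the server's administrator, then:
sftp -i ~/.ssh/ftp_transfer_key user@ftp.example.com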

Related

How to setup forward proxy on Windows server for outgoing HTTP and HTTPS requests?

I have a Windows Server 2012 VPS running a web app behind Cloudflare. The app needs to initiate outbound connections based on user actions (e.g. upload image from URL). The problem is that this 'leaks' my server's IP address and increases the risk of DDoS attacks.
So I would like to prevent my server's IP from being discovered by setting up a forward proxy. So far my research has shown that this is no simple task, and would involve setting up another VPS to act as a proxy.
Does this extra forward proxy VPS have to be running Windows? Are there any paid services that could act as a forward proxy for my server (like Cloudflare's reverse proxy system)?
Also, it seems that the suggested IIS forward proxy plugin, Application Request Routing, does not work for HTTPS.
Is there a solution for both types of outgoing (HTTPS + HTTP) requests?
I'm really lost here, so any help or suggestions would be appreciated.
You are correct in needing a "Forward Proxy". A good analogy for this is the proxy settings your browser has for outbound requests. In your case, the web application behaves like a desktop browser and can be configured to make the resource request through a proxy.
Often you can control this for individual requests at the application layer. An example of doing so with C#: C# Connecting Through Proxy
As far as the actual proxy server: No, it does not need to run Windows or IIS. Yes, you can use a proxy service. The vast majority of proxy services are targeted towards consumers and are used for personal privacy or to get around network restrictions. As such, I have no direct recommendations.
Cloudflare actually has recommendations regarding this: https://blog.cloudflare.com/ddos-prevention-protecting-the-origin/.
Features like "upload from URL" that allow the user to upload a photo from a given URL should be configured so that the server doing the download is not the website origin server.
This may be a more comfortable risk mitigator, as it wouldn't depend on a third party proxy service. A request for upload could be handled as a web service call to a dedicated "file downloader" server. Keep in mind that if you have a queued process for another server to do the work, and that server is hosted in the same infrastructure, both might be impacted by a DDoS, depending on the type of DDoS.
Your question implies that you may be comfortable using a non-Windows server. Many programs (most web servers, in fact) can operate as a proxy, but they suffer from the same problem as ARR: lack of support for the HTTP CONNECT verb, which is used by modern browsers to start an HTTPS connection before issuing a GET. Squid is very popular, open source, and can proxy a connection to just about anything, but it is not trivial to set up. Apache also has support for this in mod_proxy_connect; I have no experience with it and the online documentation isn't very robust, but it's Apache, so it may be worth the extra investigation.
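For completeness, a minimal Apache forward-proxy sketch with mod_proxy_connect looks roughly like this (the module paths, listening port, and allowed client IP are illustrative; the Require syntax assumes Apache 2.4):

LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so
LoadModule proxy_connect_module modules/mod_proxy_connect.so

Listen 3128
<VirtualHost *:3128>
    ProxyRequests On          # forward-proxy mode
    AllowCONNECT 443          # permit CONNECT so HTTPS can pass through
    <Proxy "*">
        # Only the web app's server should be allowed to use this proxy
        Require ip 203.0.113.10
    </Proxy>
</VirtualHost>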

intercept all request from local machine and create specific responses for them

We need to extend a pretty old application that does a lot of integration testing against remote data sources, but these test cases weren't properly written, so there is no easy way to change them to use network-independent stubs. Is it possible to create some kind of script for Unix that will listen for specific requests, say to google.com/api/123, and if such a request is found, not allow it to go forward but instead return some value that we previously mapped to that URL?
You most likely need to set up a proxy server like Squid, set up redirection, and route all traffic through that proxy.
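If a full Squid setup is more than you need, a lighter alternative (a different technique: a DNS override rather than a proxy) is to point the hostname at a local stub server via /etc/hosts and serve canned responses from it. A sketch, with purely illustrative paths:

# /etc/hosts on the test machine
127.0.0.1   google.com

# Apache virtual host that serves the canned responses
<VirtualHost *:80>
    ServerName google.com
    DocumentRoot /var/stubs/google.com
    # /var/stubs/google.com/api/123 holds the body you mapped to that URL
</VirtualHost>

Note that this only works cleanly for plain HTTP; HTTPS requests will fail certificate validation unless you also install a trusted test certificate.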

Setup virtual hosts file to host the source code from remote server

I would really appreciate your support with the inquiry below.
Current Situation:
I have a web app (containing a module to upload documents) on a Linux Apache server "A" that can only be reached over HTTP from the intranet.
Required:
Another Linux Apache server "B" is required to host the same web app, while maintaining the source code on server "A" only. Server "B" can be reached over HTTP from both the internet and the intranet.
Blocking points:
Under the current circumstances we are unable to host the website on server "B" directly (which would seem like the logical solution).
Question:
Is it possible to set up the virtual hosts in the httpd.conf file for such a requirement?
Research:
Most of my findings were posts about deploying a load-sharing/load-balancing solution (not my objective) or setting up a two-way synchronization process between "A" and "B" (a last-resort solution).
Googled strings:
share website between two servers, host website on two servers, virtual host to another server, run single website on multiple servers setup, virtual host for website on another server, host a website on two different servers, setup two linux servers to host the same website
Server Details:
Server A:
Server IP: 192.168.xxx.xxx (accessible through the intranet only)
Hosts the website source code
Apache server
OS: RHEL5
Server B:
Accessible through the intranet and internet
Apache server
OS: Same as A (RHEL5)
Summing up what you've probably found yourself by now: unfortunately, there are two things that are called proxying. The one you are interested in is called a reverse proxy, in which B will take requests and forward them to A. The client never sees that A even exists. There are a few security concerns, depending on what angle of security you look at:
server A only ever sees requests from B, not the original client, so any IP-based restrictions you want should be configured on server B.
The usually mentioned security concern is that a (forward) proxy will ask arbitrary servers for things on behalf of the client, so it masks the client's identity. I don't think you need to worry about this as long as you put ProxyRequests Off to disable forward proxying.
Server A might accidentally reveal its IP, which you might not be comfortable with. When B passes back to the client the answer it has received from A, it will not look at the payload. So, if you return HTML documents, they had better contain only relative paths. I think this might be the problem you are having: if your code still contains references to 192.168.x.y, those won't work for the external client. If you are changing paths (i.e. you have something like ProxyPass /somepath http://internal-server/otherpath), things become even more complicated, so try to avoid that. (In general, your backend application needs to know what its publicly visible URIs are. How to do this depends on the application.)
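Putting that together, the virtual host on server B would look something like this sketch (the public hostname is illustrative; the internal address is your server A):

<VirtualHost *:80>
    ServerName webapp.example.com
    # Never act as a forward proxy
    ProxyRequests Off
    # Keep the public Host header so the app sees its public name
    ProxyPreserveHost On
    # Forward everything to server A and rewrite redirects coming back
    ProxyPass        / http://192.168.xxx.xxx/
    ProxyPassReverse / http://192.168.xxx.xxx/
</VirtualHost>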

Forwarding or exporting a client certificate in IIS6/7

Currently, our program runs on JBoss and sits behind an Apache reverse proxy. Apache handles verifying the client certificate. We have the +ExportCertData option set in Apache, and then we use
RequestHeader set SSL_CLIENT_CERT "%{SSL_CLIENT_CERT}e"
to put the cert in the header field SSL_CLIENT_CERT before forwarding to JBoss. Our application in JBoss then reads the cert looking for the SubjectAltName to get the e-mail address, which we use to save the user the step of entering it.
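For reference, the Apache side described above boils down to something like the following sketch (the backend address is illustrative; it assumes mod_ssl, mod_headers, and mod_proxy are loaded):

<VirtualHost *:443>
    SSLEngine on
    # ... plus the usual SSLCertificateFile/SSLCertificateKeyFile lines ...
    SSLVerifyClient require
    # Export the PEM client cert as the SSL_CLIENT_CERT environment variable
    SSLOptions +ExportCertData
    # Copy it into a request header for the backend to read
    RequestHeader set SSL_CLIENT_CERT "%{SSL_CLIENT_CERT}e"
    ProxyPass        / http://localhost:8080/
    ProxyPassReverse / http://localhost:8080/
</VirtualHost>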
Now, we will have to live behind IIS, and will need similar functionality to this. What we really care about is extracting the email address from the SubjectAltName. In an ideal world, IIS would provide the same information as apache, so we wouldn't have to modify our application code too much. But if it's not possible, other options are good as well.
Some other notes:
We will probably need to support IIS6 and IIS7. It would be nice to have one solution that works across both, but not necessary
We are currently using IIRF to forward requests that go to a certain virtual directory, but I would be interested in hearing other solutions that could accomplish what we're looking for along with forwarding to our application server.
Just throwing Apache in front of IIS isn't going to be a solution because we have to share the box with other programs that use IIS, and they might be wary of such a solution. Also, we can't just run on a different port because firewall restrictions only allow ports 80 and 443.
Any ideas how to make this possible? Let me know if there's any more information I can provide.

How can I download a file over multiple interfaces in OS X or Linux?

I have a large file I want to download from a server I have root access to. I also have several different, concurrent internet connections from my machine to the server at my disposal.
Do you know of any protocol, (S)FTP client, HTTP client, AFP client, or any other file transfer protocol server and client combination that supports multithreaded downloads over different connections?
One option would be the "old fashioned" multi-part file:
split -b 50m hugefile multiparthugefile_
That will create multiparthugefile_aa, multiparthugefile_ab, and so on. To rejoin them, use the cat command:
cat multiparthugefile_* > hugefile_rejoined
To actually transfer the files using different interfaces, the wget --bind-address=ADDRESS flag should work:
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host.
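Putting the two together, a sketch of the transfer itself (the local addresses and the URL are made up, and it assumes the parts are exposed at an HTTP URL on the server):

# on the client, one wget per interface, run in parallel
wget --bind-address=192.168.1.10 http://server/multiparthugefile_aa &
wget --bind-address=10.0.0.10    http://server/multiparthugefile_ab &
wait
cat multiparthugefile_* > hugefile_rejoined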
This problem seems like something BitTorrent is positioned to do well, but I'm not sure exactly how you would do this.
Perhaps create a temporary tracker (or use something like OpenBitTorrent.com), and run multiple clients locally - as long as the clients support the LAN transfer feature, each client would grab different parts from the server and share them with the (local) clients. You'd end up with multiple copies of the file locally, but it would only be transferred over the internet once.
Any of these? You'll need a webserver hosting the same file on all the interfaces though.
In the case of HTTP or HTTPS, as long as the server supports range requests, you can fetch the ranges separately and stitch them together. I started working on a use case like the one you describe. If you are still interested, here is a link to my repository: https://github.com/m0hithreddy/MID.
The program (MID) uses the SO_BINDTODEVICE socket option to bind to a specific interface, so in most cases you require superuser permissions or the CAP_NET_RAW capability (which the root user has).
MID determines the network interfaces to use for the download and adopts a two-step split for downloading the content.
First step: the file is divided among the network interfaces (in real time).
Second step: the file is further divided among several HTTP range requests issued from that particular interface (note: the server should support range requests in the first place to make any of this possible).
MID supports the HTTP and HTTPS protocols.
Cheers :)
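The same range-request idea can also be sketched with plain curl, if you would rather not build anything (the source addresses, URL, and split point are illustrative, and the server must honor Range headers):

curl --interface 192.168.1.10 --range 0-49999999 -o part1 https://server/hugefile &
curl --interface 10.0.0.10    --range 50000000-  -o part2 https://server/hugefile &
wait
cat part1 part2 > hugefile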
HTTP - check out one of the various download managers (e.g. Firefox with the http://www.downthemall.net/ extension).
There are also FTP downloaders that support multiple streams.
