Windows 10 IIS Simultaneous Download Problem

Tested with the IIS FTP server on Windows 10 Home.
Twenty Android devices attempt to download 1 GB of files at the same time, but at most two downloads ever run simultaneously.
The FTP site's maximum-connections setting is 4294967295, and all twenty devices log in over FTP with the same account, so I don't think the account setup is the problem.
My goal is to support more than 10 simultaneous downloads.

As far as I know, IIS on Windows 10 has a built-in connection limit: it supports only three clients connected at the same time.
If you want to increase this number, use Windows Server instead.
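For reference, a site's FTP connection limit can be adjusted with appcmd; the site name below is a placeholder, and on client SKUs the built-in cap applies regardless of this setting:

```shell
:: Raise the per-site FTP connection limit (run from an elevated prompt).
:: "Default FTP Site" is an assumed name -- substitute your own site.
%windir%\system32\inetsrv\appcmd.exe set site "Default FTP Site" ^
    /ftpServer.connections.maxConnections:4294967295
```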

Related

Azure B1ls nonfunctional upon creation

I have an almost-static site that I was happy hosting on a Storage blob. However, I need a PHP script to run to support email communication through the contact HTML form.
So I decided to buy the smallest VM, B1ls, which has 1 vCPU and 0.5 GB of memory. I RDP to the server and, to my astonishment, I cannot even open one file or folder or Task Manager without waiting endlessly before getting "Out of memory... please try to close programs or restart".
The Azure team should not sell such a VM if it is nonfunctional from the get-go. Note that I installed ZERO programs on it.
All I want is PHP and the site set up on IIS, plus a certificate. No database or any other programs will run.
What should I do?
Apparently it is because "B1ls is supported only on Linux", based on the notes on their page:
https://learn.microsoft.com/en-us/azure/virtual-machines/windows/sizes-general

Log FTP connection failures

I have an FTP server (IIS) in the cloud. It sometimes serves text files gigabytes in size, and customers are complaining about connection failures and download/upload failures.
Is there any way I can log every failed (negative) action performed against my FTP server?
I have tried IFtpLogProvider in .NET, but it does not give me a valid FTP status. For example, if I start an upload or download from a client and then disconnect the network, it still records status 226, which indicates a successful transfer.
Either I am missing something with IFtpLogProvider or I have misunderstood the codes.
Is there any other way to record all FTP transactions so I can investigate the issues my customers are facing?
I made a silly mistake: I had not enabled FTP Extensibility under Windows Features for IIS. Once it was enabled, IFtpLogProvider started working.
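Independent of IFtpLogProvider, the standard IIS FTP W3C logs can be mined for failures after the fact. A minimal sketch, with a made-up log excerpt; the parser takes its column order from the #Fields directive rather than assuming one:

```python
# Hypothetical IIS FTP W3C log excerpt (the field order is read from the
# "#Fields:" directive, so the parser adapts to whatever order IIS writes).
SAMPLE = """\
#Fields: date time c-ip cs-username cs-method cs-uri-stem sc-status sc-win32-status
2023-05-01 10:00:01 10.0.0.5 ftpuser RETR /data/big.txt 226 0
2023-05-01 10:02:17 10.0.0.9 ftpuser STOR /data/up.txt 426 64
2023-05-01 10:03:44 10.0.0.9 ftpuser PASS - 530 1326
"""

def failed_transfers(log_text):
    fields = []
    failures = []
    for line in log_text.splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]   # remember the column layout
            continue
        if line.startswith("#") or not line.strip():
            continue                    # skip other directives / blanks
        row = dict(zip(fields, line.split()))
        status = int(row["sc-status"])
        win32 = int(row.get("sc-win32-status", "0"))
        # FTP statuses >= 400 are errors; a nonzero Win32 status also
        # signals a failed or interrupted operation.
        if status >= 400 or win32 != 0:
            failures.append(row)
    return failures

for row in failed_transfers(SAMPLE):
    print(row["time"], row["cs-method"], row["cs-uri-stem"], row["sc-status"])
```

Status 426 (transfer aborted) and 530 (login failed), plus any nonzero sc-win32-status, are the entries worth investigating.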

How to handle lots of file downloads from my linux server

I have a 50 MB file hosted on my dedicated Linux server, and almost 50K users download this file each day (roughly 2.5 TB of traffic a day).
There are lots of crashes, and users report that sometimes the file can't be downloaded at all because the server is overloaded.
Can someone help me calculate what server/bandwidth/anything else I need to handle that?
Is there any solution where I can host the file and pay per download?
Is there any setting or change I can make on my server that will help fix this issue?
My current server specification is:
2 × Intel Xeon E5-2620 v2 (6 cores @ 2.10 GHz each)
128 GB REG ECC RAM
256 GB SSD
1 IP address
1 Gbit/s port, shared bandwidth
I'll appreciate any help from you guys.
Thank you very much.
Your hardware configuration should probably be fine, at least if the downloads are more or less evenly distributed over the day.
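A back-of-the-envelope check of that assumption, using decimal megabytes and the figures straight from the question:

```python
# Load estimate: 50,000 downloads/day of a 50 MB file.
downloads_per_day = 50_000
file_bytes = 50 * 10**6                                  # 50 MB (decimal)

total_bytes_per_day = downloads_per_day * file_bytes
avg_bits_per_second = total_bytes_per_day * 8 / 86_400   # 86,400 s per day

print(f"daily volume: {total_bytes_per_day / 1e12:.1f} TB")
print(f"average rate: {avg_bits_per_second / 1e6:.0f} Mbit/s")
```

An average of roughly 231 Mbit/s means daily peaks will routinely saturate a shared 1 Gbit/s port, which matches the reported failures.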
One of the most efficient HTTP servers for serving static content is nginx. Take a look at its guide on serving static content.
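A minimal nginx sketch along the lines of that guide; the paths and names are placeholders, not a tested configuration:

```nginx
server {
    listen 80;
    sendfile   on;   # let the kernel copy the file straight into the socket
    tcp_nopush on;   # fill packets before sending (used with sendfile)

    location /downloads/ {
        root /var/www;   # serves files from /var/www/downloads/
    }
}
```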
If that doesn't help, you should consider Amazon S3, which is probably the most popular file hosting solution with a reasonable price tag.
This is how not to make the file available for download:
data = read_file(filename)
echo data
You want to use sendfile(2) so the kernel streams the file directly into the socket instead of copying it through userspace.
Each server has its own mechanism for invoking sendfile(2); with Apache httpd this is mod_xsendfile and its associated response header (X-Sendfile). You'll find that moving to this lets you not only handle your current userbase but also add many more without worry.
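To make the contrast with the read-then-echo pseudocode concrete, here is a small self-contained sketch in Python: socket.sendfile() uses the sendfile(2) system call where the platform supports it, and a socketpair stands in for a real client connection:

```python
import os
import socket
import tempfile
import threading

# Create a 1 MiB test file to stand in for the download.
payload = os.urandom(1024 * 1024)
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name

# A connected socketpair stands in for an accepted client connection.
server_sock, client_sock = socket.socketpair()

received = bytearray()

def drain():
    # Consume bytes like a downloading client would.
    while len(received) < len(payload):
        chunk = client_sock.recv(65536)
        if not chunk:
            break
        received.extend(chunk)

t = threading.Thread(target=drain)
t.start()

# socket.sendfile() invokes os.sendfile(2) where available, so the kernel
# copies file pages straight into the socket buffer instead of looping
# read()/send() through userspace.
with open(path, "rb") as src:
    sent = server_sock.sendfile(src)

t.join()
server_sock.close()
client_sock.close()
os.unlink(path)
```

Where sendfile(2) is used, the file's bytes never pass through the application as data; the kernel moves them from the page cache into the socket buffer.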

FTP suddenly refuses connection after multiple & sporadic file transfers

I have an issue that my idiot web host support team cannot solve, so here it is:
When I'm working on a site and uploading many files here and there (small files, most of them a few dozen lines at most; mostly PHP and JS, with some PNG and JPG files), after multiple uploads in a very short timeframe the FTP chokes on me. It cuts me off with a "connection refused" error from the server end, as if I were brute-force attacking the server or trying to overload it. Then after 30 minutes or so it seems to work again.
I have a dedicated server with inmotion hosting (which I do NOT recommend, but that's another story - I have too many accounts to switch over), so I have access to all logs etc. if you want me to look.
Here's what I have as settings so far:
My own IP is on the whitelist in the firewall.
FTP settings allow a maximum of 2000 connections at a time (which I am nowhere near hitting; most of the accounts I manage myself, without client access allowed).
Broken Compatibility ON
Idle time 15 mins
On regular port 21
Regular FTP (not SFTP)
Access to a sub-domain of a major domain
Anyhow, this is very frustrating because I have to pause my web development work in the middle of an update. Restarting FTP in WHM doesn't seem to resolve it right away either; I just have to wait. However, when I access the website directly through the browser, or use ping/traceroute to reach it, there's no problem. Only the FTP is cut off.
The FTP server is configured for such behavior. If you cannot change its configuration (or switch to another FTP server program on the server), you can't avoid it.
For example, vsftpd has many such configuration switches.
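For illustration, these are the kinds of vsftpd limits being referred to; the values are examples, not recommendations:

```ini
# vsftpd.conf excerpt -- connection and session limits
max_clients=200            # total simultaneous clients served
max_per_ip=10              # simultaneous connections allowed per source IP
idle_session_timeout=900   # seconds before an idle session is dropped
```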
Switching to something else like scp or SSH-based transfers should help.
(I'm not sure that calling your web support team idiots will.)

Improve file download performance in Windows 2003

I have a WatchGuard X1250e firewall and a fast network setup at pryme.net in Ashburn, VA. I have Verizon FiOS at the office (50 Mbit/s) and did a test download from a URL they provided; I get 5.8 MB/s from their test file (probably served off Linux). But from my servers running Windows 2003 behind the firewall (X1250e), using a plain packet filter for HTTP (no proxy, a very basic config), I am only getting 2 MB/s from my rack.
What do I need to do to serve downloads FAST? If I have a network with no bottlenecks, 50 Mbit/s service to my computer, and GigE connectivity in the rack, where is this slowdown? We tried bypassing the firewall and the problem remains, so I presume it's something in Windows 2003. Is there anything I can tweak to push downloads at 6 MB/s instead of 2 MB/s?
Any tips or tricks? Things to configure to get better HTTP download performance?
Pryme.net test URL:
http://209.9.238.122/test.tgz
My download URL:
http://download.logbookpro.com/lbpro.exe
Thank you.
It could be the Windows server itself. You could test for bottlenecks in memory, disk access, network access, etc. using PerfMon or MSSPA.
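One bottleneck worth ruling out while profiling (an assumption about this setup, not something confirmed here): Windows Server 2003 has no TCP receive-window auto-tuning, so a single stream is capped at roughly window / round-trip time:

```python
# Single-TCP-stream throughput ceiling = receive window / round-trip time.
window_bytes = 65_535    # Windows 2003-era default window (no auto-tuning)
rtt_seconds = 0.030      # assumed ~30 ms office-to-datacenter round trip

ceiling_bytes_per_sec = window_bytes / rtt_seconds
print(f"{ceiling_bytes_per_sec / 1e6:.1f} MB/s")   # close to the observed 2 MB/s
```

If the math matches your measured latency, the fix is a larger window with TCP window scaling enabled on the receiving side.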
