I'm interested in your approach to storing sensitive data in the cloud in your projects.
I want to prepare a proof of concept for a feature that has to archive files in Azure. We are talking about something around 30 GB of new files per day, at 10-2000 MB per file.
My first idea was to use Azure Storage to store those files.
The files would be sent to storage through an Azure App Service, so I have to prepare some Web API.
Based on this idea, I am curious whether there will be any problems with sending such big files to the Web API.
Any tips on what else I should consider?
The default maximum request size for ASP.NET is 4 MB, so you'll need to increase that limit if you want to allow uploading and downloading of larger files.
You can do this by setting the maxRequestLength and maxAllowedContentLength values in your web.config file.
Note: the maxRequestLength value is in kilobytes, but the maxAllowedContentLength value is in bytes.
Here is an example for increasing the request limits to 2000 MB, matching the largest files in your scenario:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.web>
    <!-- 2000 MB in kilobytes (2000 * 1024) -->
    <httpRuntime maxRequestLength="2048000" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- 2000 MB in bytes (2000 * 1024 * 1024) -->
        <requestLimits maxAllowedContentLength="2097152000" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
You could also make the upload action asynchronous (async Task) and stream the content, mainly for better handling of large files; see the sketch below.
Here is an article about uploading and downloading large files with Web API and Azure Blob Storage that you can refer to.
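As an illustration, here is a minimal sketch of such an action in classic Web API 2, assuming the Azure.Storage.Blobs client library; the controller name, the "archive" container, and the "StorageConnection" connection-string name are placeholders, not anything prescribed by the article:
// A minimal sketch, assuming the Azure.Storage.Blobs NuGet package.
// The "archive" container and the "StorageConnection" connection-string
// name are placeholder values.
using System.Configuration;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Azure.Storage.Blobs;

public class ArchiveController : ApiController
{
    private static readonly BlobContainerClient Container =
        new BlobContainerClient(
            ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString,
            "archive");

    // PUT api/archive?fileName=report.pdf
    [HttpPut]
    public async Task<IHttpActionResult> Put(string fileName)
    {
        // Stream the request body into the blob instead of loading
        // the whole file into memory on the web server.
        using (var body = await Request.Content.ReadAsStreamAsync())
        {
            BlobClient blob = Container.GetBlobClient(fileName);
            await blob.UploadAsync(body, overwrite: true);
        }
        return StatusCode(HttpStatusCode.Created);
    }
}
Note that classic ASP.NET may still buffer the whole request before your action runs, so for really large files it is worth looking at a custom IHostBufferPolicySelector, or letting clients upload directly to Blob Storage with a SAS token instead of proxying everything through the Web API.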
I have an ASP.NET website that I'm trying to enable Static Compression for. My website has the following compression configuration.
<httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files" staticCompressionEnableCpuUsage="0" staticCompressionDisableCpuUsage="100" staticCompressionIgnoreHitFrequency="true">
  <clear/>
  <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" staticCompressionLevel="10" dynamicCompressionLevel="3" />
  <scheme name="deflate" dll="%Windir%\system32\inetsrv\gzip.dll" staticCompressionLevel="10" dynamicCompressionLevel="3" />
  <staticTypes>
    <clear/>
    <add mimeType="text/*" enabled="true" />
    <add mimeType="message/*" enabled="true" />
    <add mimeType="application/x-javascript" enabled="true" />
    <add mimeType="application/javascript" enabled="true" />
    <add mimeType="*/*" enabled="false" />
  </staticTypes>
</httpCompression>
<urlCompression doStaticCompression="true" doDynamicCompression="false" dynamicCompressionBeforeCache="false" />
I do not want to enable dynamic compression. According to Microsoft documentation,
Unlike static compression, IIS 7 performs dynamic compression each time a client requests the content, but the compressed version is not cached to disk.
My web server is already fairly heavily loaded with processes, so dynamic compression would be an unwanted burden. Static Compression, on the other hand, is appealing because the compressed files are cached on disk.
However, even after continuously refreshing the localhost page (Ctrl+F5) and watching the compression directory for more than 15 minutes, nothing is being cached.
Also, none of the relevant files (css/js/html) are being returned with a gzip compression header.
Both dynamic and static compression are installed. Dynamic is turned off. If I turn on dynamic compression, I start seeing the gzip HTTP response headers come back.
What am I missing? Why does Static Compression refuse to work?
On IIS 10, I had this problem and tracked it down to a bad URL Rewrite rule. The static assets lived in C:\inetpub\wwwroot\MyProject\wwwroot, and the rewrite rule was changing ^assets/(.*) to ./{R:1}, so IIS was looking at the top of MyProject and not finding the file. But when IIS then handed the request off to the .NET app, the app would see the file and serve it. So the two symptoms were:
gzip worked only when dynamic compression was enabled (because the .Net app was serving the files).
turning off runAllManagedModulesForAllRequests (on the modules element) caused our static files to become 404 errors, basically surfacing the problem of IIS not seeing the file.
To fix it I changed the rewrite rule's target from ./{R:1} to ./wwwroot/{R:1}, as in the reconstruction below.
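For illustration, here is roughly what the corrected rule could look like in web.config; the pattern and target come from the description above, while the rule name is made up:
<system.webServer>
  <rewrite>
    <rules>
      <!-- Map /assets/* requests to the physical wwwroot subfolder so
           IIS finds the file itself and static compression can apply. -->
      <rule name="Assets">
        <match url="^assets/(.*)" />
        <action type="Rewrite" url="./wwwroot/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>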
Have you looked at this: https://blogs.msdn.microsoft.com/friis/2017/09/05/iis-dynamic-compression-and-new-dynamic-compression-features-in-iis-10/
There's not much context to go on in your question, but this is what worked for me.
In my case the file was cached by ASP.NET MVC because it was a bundle of multiple JS files. I guess IIS can see that it's not a static file on disk, and that's the reason it's treated as dynamic.
The link I posted also helps you see what IIS actually does with your JS file, so you can find out why it isn't compressing it.
I also saw a line in the link you posted:
Unlike static compression, IIS 7 performs dynamic compression each time a client requests the content, but the compressed version is not cached to disk. This change is made because of the primary difference between static and dynamic content. Static content does not change. However, dynamic content is typically content that is created by an application and therefore changes often, such as Active Server Pages (ASP) or ASP.NET content. Since dynamic content should change often, IIS 7 does not cache it.
Also try to read this post: https://forums.iis.net/t/1071156.aspx
I want to limit upload and download bandwidth in IIS. I set maxBandwidth to 2000. This limit works correctly for downloads, but it doesn't apply when uploading a file.
So how can I limit upload bandwidth in IIS?
Note
In my web.config file I set:
<system.web>
  <httpRuntime maxRequestLength="2147483647" executionTimeout="3600" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="4249967295" />
    </requestFiltering>
  </security>
</system.webServer>
When I remove these lines the upload is limited, but when I add them the upload bandwidth is unlimited.
This is by design, the maxBandwidth setting is for download only.
IIS doesn't support any bandwidth restrictions for uploads.
maxRequestLength and maxAllowedContentLength have nothing to do with bandwidth, they apply to the maximum total size of a request.
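For reference, maxBandwidth lives on a site's <limits> element in applicationHost.config and is measured in bytes per second, so a value of 2000 throttles downloads to roughly 2 KB/s. A sketch, with the site name and id as placeholders:
<configuration>
  <system.applicationHost>
    <sites>
      <!-- maxBandwidth is in bytes per second and throttles
           responses (downloads) only, not incoming request bodies. -->
      <site name="MySite" id="1">
        <limits maxBandwidth="1048576" />
      </site>
    </sites>
  </system.applicationHost>
</configuration>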
If I configure IIS to use data compression for static files, the first client usually receives uncompressed content, with later clients getting compressed content. Presumably IIS compresses the file in the background and caches it for later requests.
However, I'd prefer the first client to also receive compressed content. That is: I'd prefer to trade latency for bandwidth. Is there any way I can configure IIS to do this?
Well, it actually works a little bit differently. IIS is not compressing the file in the background; instead, it has a threshold for deciding whether it should compress the content at all. This prevents it from spending CPU resources and cache storage on infrequently requested content. By default, IIS will only compress content when it receives two requests for that content within 10 seconds.
You can change these defaults through the frequentHitThreshold and frequentHitTimePeriod attributes of the <serverRuntime /> element in web.config (see the configuration reference on iis.net). I've not tested it, but I expect that just setting frequentHitThreshold to 1 will give you the desired result.
<configuration>
  <system.webServer>
    <serverRuntime frequentHitThreshold="1" />
  </system.webServer>
</configuration>
Hope this helps.
I'm trying to cache the JSON output of an HTTP Handler (NOT an ASP.NET page, so the Page-level OutputCache directive does not apply). I'm running IIS 7.5 on Windows Server 2008.
I've added lines to the Web.config to enable caching in user mode (not kernel mode, as I need authentication):
<system.webServer>
  <caching enabled="true" enableKernelCache="false">
    <profiles>
      <!-- cache content according to full query string for 12 hours -->
      <add varyByQueryString="*" duration="12:00:00" policy="CacheForTimePeriod" extension=".aspx" />
    </profiles>
  </caching>
  <urlCompression dynamicCompressionBeforeCache="true" />
</system.webServer>
<location path="Content">
  <system.webServer>
    <!-- cache content that's requested twice in 30 minutes -->
    <serverRuntime enabled="true" frequentHitThreshold="2" frequentHitTimePeriod="00:30:00" />
  </system.webServer>
</location>
The content does successfully cache, but it lives only for 60 seconds. I have looked all over the various config files (Web.config, applicationHost.config, machine config) for some sort of 60-second TTL, but I'm at a loss.
I suspected that the cache scavenger might be eating my cache entries each time it runs. I modified the registry key so the scavenger runs less often; that did not help.
I also suspected that IIS was overaggressively clearing out the cache because the machine is using a lot of its physical RAM. This particular server sits at about 66% physical RAM usage. I attempted to allocate a static amount (1 GB) to the output cache, rather than letting IIS manage the cache, but that was also unsuccessful.
I believe this is the same question as asked on this Stack Overflow page but that guy never got an answer.
Thanks in advance.
EDIT: I was finally able to solve this problem by modifying the OutputCacheTTL and ObjectCacheTTL registry values, as described in this very helpful article. It seems the Microsoft documentation is rather incomplete.
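For anyone hitting the same 60-second wall, the fix boils down to adding two DWORD values and restarting IIS. A sketch of the commands; the registry path below is the classic InetInfo parameters location and should be verified against the linked article before applying (43200 seconds = 12 hours, matching the cache profile above):
REM Assumed path - verify against the linked article before applying.
reg add "HKLM\SYSTEM\CurrentControlSet\Services\InetInfo\Parameters" /v OutputCacheTTL /t REG_DWORD /d 43200 /f
reg add "HKLM\SYSTEM\CurrentControlSet\Services\InetInfo\Parameters" /v ObjectCacheTTL /t REG_DWORD /d 43200 /f
iisreset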
I am attempting to upload files to my SharePoint 2010 server, running on IIS 7, via the SharePoint client object model. The issue I have found is that the file size limit is, well, limiting. I have read quite a few posts on the subject, and it seems I'm running into an issue separate from those posted previously. After some experimentation with different methods, I finally found that the limit I am hitting right now comes from the following setting in my web.config:
<system.web>
  <httpRuntime maxRequestLength="2097151" />
</system.web>
Originally it was set at 51000 or so. I tried to use the 2-gig value that I have seen listed elsewhere as the theoretical maximum, but when I do that the site won't load, and the returned error states that the valid range for this setting is 0-2097151. I am wondering if there is somewhere else that this maximum allowed range is being set? It seems strange that it is so low; this basically limits any file upload I could provide to only 2 megs, which is smaller than SharePoint's configured upload limit of 50 megs.
The maxRequestLength value is measured in kilobytes, not bytes, so you have already set it to 2 GB (2097151 KB / 1024 / 1024 ≈ 2 GB). The 0-2097151 range is not arbitrary: 2097151 KB is the largest request size that still fits in a signed 32-bit byte count (2147483647 / 1024 ≈ 2097151).
I had the same problem, and I found that on some IIS setups you also have to add the following:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>
This page covers large file uploads in IIS and ASP.NET in more detail:
http://ajaxuploader.com/large-file-upload-iis-asp-net.htm