IIS7 Output Caching - only lives in cache for 60 seconds?

I'm trying to cache the JSON output of an HTTP Handler (NOT an ASP.NET page, so the Page-level OutputCache directive does not apply). I'm running IIS 7.5 on Windows Server 2008.
I've added lines to the Web.config to enable caching in user mode (not kernel mode, as I need authentication):
<system.webServer>
  <caching enabled="true" enableKernelCache="false">
    <profiles>
      <!-- cache content according to full query string for 12 hours -->
      <add varyByQueryString="*" duration="12:00:00" policy="CacheForTimePeriod" extension=".aspx" />
    </profiles>
  </caching>
  <urlCompression dynamicCompressionBeforeCache="true" />
</system.webServer>
<location path="Content">
  <system.webServer>
    <!-- cache content that's requested twice in 30 minutes -->
    <serverRuntime enabled="true" frequentHitThreshold="2" frequentHitTimePeriod="00:30:00" />
  </system.webServer>
</location>
The content does successfully cache, but it lives for only 60 seconds. I have looked all over the various config files (Web.config, applicationHost.config, machine.config) for some sort of 60-second TTL, but I'm at a loss.
I suspected that the cache scavenger might be eating my cache entries each time it runs. I modified the registry key so the scavenger runs less often; that did not help.
I also suspected that IIS was overaggressively clearing out the cache because the machine is using a lot of its physical RAM; this particular server has about 66% physical RAM saturation. I attempted to allocate a static amount (1 GB) to the output cache, rather than allowing IIS to manage the cache, but that was also unsuccessful.
I believe this is the same question as asked on this Stack Overflow page, but that question never got an answer.
Thanks in advance.
EDIT: I was finally able to solve this problem by modifying the OutputCacheTTL and ObjectCacheTTL registry values, as described in this very helpful article. It seems the Microsoft documentation is rather incomplete.
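For reference, a minimal sketch of that registry change in C# (run elevated). The key path below is the one commonly cited for these values; treat it as an assumption and verify it against the article. Both values are DWORDs measured in seconds, so 43200 matches the 12-hour cache profile above:

using Microsoft.Win32;

class OutputCacheTtl
{
    static void Main()
    {
        // Assumed registry location for the IIS cache TTLs (verify first).
        using (RegistryKey key = Registry.LocalMachine.CreateSubKey(
            @"SYSTEM\CurrentControlSet\Services\InetInfo\Parameters"))
        {
            key.SetValue("OutputCacheTTL", 43200, RegistryValueKind.DWord); // 12 hours, in seconds
            key.SetValue("ObjectCacheTTL", 43200, RegistryValueKind.DWord);
        }
    }
}

The new TTLs may not take effect until IIS is restarted.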

Related

Approach for archiving big files in Azure Storage

I'm interested in what approach you use for storing sensitive data in the cloud in your projects.
I want to prepare a proof of concept for a feature that has to archive files in Azure. We are talking about something around 30 GB of new files per day, at 10-2000 MB per file.
My first idea was to use Azure Storage to store those files.
Files should be sent to storage via an Azure App Service, so I have to prepare some Web API.
Based on this idea, I am curious whether there will be any problems with sending such big files to the Web API.
Any tips on what else I should consider?
The default request size for ASP.NET is 4MB, so you’ll need to increase that limit if you want to allow uploading and downloading of larger files.
You can do this by setting the maxRequestLength and maxAllowedContentLength values in your web.config file.
Note: the maxRequestLength value is in KB, but the maxAllowedContentLength value is in bytes.
Here is an example that increases the request limits to 2000 MB:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.web>
    <!-- 2000 MB in kilobytes -->
    <httpRuntime maxRequestLength="2048000" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- 2000 MB in bytes -->
        <requestLimits maxAllowedContentLength="2097152000" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
You could use async Task methods, mainly for better support when handling large files, as in the sketch below.
Here is an article about uploading and downloading large files with Web API and Azure Blob Storage that you could refer to.
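As a starting point, here is a minimal sketch of that async approach, assuming the Azure.Storage.Blobs package and a hypothetical "archive" container name; the idea is to stream each upload to Blob Storage instead of buffering whole files in memory:

using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class ArchiveUploader
{
    private readonly BlobContainerClient _container;

    public ArchiveUploader(string connectionString)
    {
        // "archive" is a placeholder container name for this sketch.
        _container = new BlobContainerClient(connectionString, "archive");
    }

    public async Task UploadAsync(string blobName, Stream content)
    {
        BlobClient blob = _container.GetBlobClient(blobName);
        // UploadAsync streams the content and splits large files into
        // blocks, which suits the 10-2000 MB range described above.
        await blob.UploadAsync(content, overwrite: true);
    }
}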

IIS Static Compression does not gzip or cache files

I have an ASP.NET website that I'm trying to enable Static Compression for. My website has the following compression configuration.
<httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files" staticCompressionEnableCpuUsage="0" staticCompressionDisableCpuUsage="100" staticCompressionIgnoreHitFrequency="true">
  <clear/>
  <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" staticCompressionLevel="10" dynamicCompressionLevel="3" />
  <scheme name="deflate" dll="%Windir%\system32\inetsrv\gzip.dll" staticCompressionLevel="10" dynamicCompressionLevel="3" />
  <staticTypes>
    <clear/>
    <add mimeType="text/*" enabled="true" />
    <add mimeType="message/*" enabled="true" />
    <add mimeType="application/x-javascript" enabled="true" />
    <add mimeType="application/javascript" enabled="true" />
    <add mimeType="*/*" enabled="false" />
  </staticTypes>
</httpCompression>
<urlCompression doStaticCompression="true" doDynamicCompression="false" dynamicCompressionBeforeCache="false" />
I do not want to enable dynamic compression. According to Microsoft documentation,
Unlike static compression, IIS 7 performs dynamic compression each time a client requests the content, but the compressed version is not cached to disk.
My web server is fairly heavily loaded with processes, so this would be an unwanted burden. But static compression is appealing because the compressed files are cached on disk.
However, even after continuously refreshing the localhost page (Ctrl+F5) and watching the compression directory for 15+ minutes, nothing is being cached.
Also, none of the relevant files (css/js/html) are being returned with a gzip compression header.
Both dynamic and static compression are installed. Dynamic is turned off. If I turn on dynamic compression, I start seeing the gzip HTTP response headers come back.
What am I missing? Why does Static Compression refuse to work?
IIS 10
I had this problem and tracked it down to a bad URL Rewrite rule. The static assets were living in C:\inetpub\wwwroot\MyProject\wwwroot and the rewrite rule was changing ^assets/(.*) to ./{R:1}, so IIS was looking at the top of MyProject and not finding the file. But then when it handed the request off to the .NET app, the app would see the file and serve it. So the two symptoms were:
gzip worked only when dynamic compression was enabled (because the .NET app was serving the files);
turning off runAllManagedModulesForAllRequests (on the modules element) caused our static files to become 404 errors, basically surfacing the problem of IIS not seeing the file.
To fix it I changed the rewrite rule from ./{R:1} to ./wwwroot/{R:1}.
Have you looked at this: https://blogs.msdn.microsoft.com/friis/2017/09/05/iis-dynamic-compression-and-new-dynamic-compression-features-in-iis-10/
There's not much context to see from your question... but for me this worked.
In my case the file was cached by ASP.NET MVC because it's a bundle of multiple JS files. I guess IIS can see it's not a static file on disk, and that's the reason it's treated as dynamic.
The link I posted also helps you see what it actually does with your JS file, so you can find out why it's not doing compression.
I also saw a line in the link you posted:
Unlike static compression, IIS 7 performs dynamic compression each time a client requests the content, but the compressed version is not cached to disk. This change is made because of the primary difference between static and dynamic content. Static content does not change. However, dynamic content is typically content that is created by an application and therefore changes often, such as Active Server Pages (ASP) or ASP.NET content. Since dynamic content should change often, IIS 7 does not cache it.
Also try to read this post: https://forums.iis.net/t/1071156.aspx

NetScaler/IIS: 413 Entity Too Large

I am facing an issue where I am getting a 413 Request Entity Too Large whenever I post/put JSON to our servers running IIS 7.5 through a Citrix NetScaler.
We have tried to set the aspnet:MaxJsonDeserializerMembers to 30000, 40000 and 512000, like so:
<appSettings>
  <add key="aspnet:MaxJsonDeserializerMembers" value="xxx" />
</appSettings>
as well as setting the <jsonSerialization maxJsonLength="xxx"/>
But neither brought any resolution.
Setting the aspnet:MaxJsonDeserializerMembers in our local test environment, where we don't have a Citrix NetScaler, works just fine.
Are there any settings in the NetScaler that I should know of? Or are there some IIS settings I have to be aware of as well? Considering that this works in our local test environments, I am leaning towards the latter, but I want all bases covered.
Edit: After further investigation, it surely seems that the NetScaler is the source, as we can post to the API from behind the NetScaler.
As it turns out, it was actually a combination of the two products.
Internally we use SSL and client certificates, which means we needed to configure an IIS property called "uploadReadAheadSize".
http://forums.asp.net/t/1702122.aspx?cannot+find+uploadReadAheadSize+in+applicationHost+config+in+IIS7+5
This is done in applicationHost.config or through the IIS Manager.
...
<system.webServer>
  <serverRuntime uploadReadAheadSize="{BYTES}" />
</system.webServer>
...
We used 10 MB = 10485760 bytes for now, which has proven to be enough. Since this defaults to 48 KB, you may hit the limit rather fast.

Write requests to log file in IIS

I have a cabinet which consists of several servers, handling quite a bit of traffic.
I need to construct a system to keep statistics, and I'm struggling to find out whether it's possible, or makes sense, to make a null-request to a server. That is, calling something like http://XXX.XXX.XXX.XXX?objectid=9563828&sreq=2854&nc=29291947829 and letting IIS do nothing with it, except adding the request to the log.
As mentioned, my servers handle a lot of traffic, and every bit of CPU power and every byte I can save counts in the long run and saves money.
At the moment, my plan is to make IIS return nothing on 404 errors, but I'm not sure if this is the best approach. And are all requests logged (having caching in mind)?
Theories or suggestions, please?
Solved - it is possible by adding
<configuration>
  <system.webServer>
    <httpErrors existingResponse="PassThrough" />
  </system.webServer>
</configuration>
in the configuration file. The reason I didn't succeed at first is that this setting is apparently only supported in IIS 7.0 and later, and I was testing it on version 6.5.
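For completeness, a sketch of the "do nothing except log" endpoint itself, assuming ASP.NET is available on the servers: a trivial handler that returns an empty 204, so the query string (objectid, sreq, nc, ...) still lands in the IIS log while almost no bytes are sent back. NullLoggingHandler is a hypothetical name, and the handler still has to be registered under <handlers> in Web.config.

using System.Web;

public class NullLoggingHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // IIS logs the request line as usual; the response body stays empty.
        context.Response.StatusCode = 204; // No Content
    }
}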

IIS 7 httpruntime maxRequestLength limit of 2097151

I am attempting to upload files to my SharePoint 2010 server running on IIS 7 via the SharePoint client object model. The issue I have found is that the file size limit is, well... limiting. I have read quite a few posts on the subject, and it seems as though I'm running into an issue that is separate from those that I have found posted previously. After some experimentation and trying different methods, I have finally found that the limit I am hitting right now is due to the following config setting in my web.config:
<system.web>
  <httpRuntime maxRequestLength="2097151" />
</system.web>
Originally it was set at 51000 or so. I tried to put in the 2-gig value that I have seen listed elsewhere as the theoretical maximum, but when this is done the site won't load, and the returned error states that the valid range for this setting is 0-2097151. I am wondering if there is somewhere else that this maximum allowed range is being set? It seems strange that it is so low; this basically limits any file upload I could provide to being only 2 megs, which is smaller than the SharePoint configuration's upload limit of 50 megs.
The maxRequestLength value is measured in kilobytes, so you have already set it to 2 GB (2097151 KB / 1024 / 1024 ≈ 2 GB).
I had the same problem, but I found that you have to add
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>
as well, for IIS 7 and up:
http://ajaxuploader.com/large-file-upload-iis-asp-net.htm
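If the web.config limits keep getting in the way, the client object model's File.SaveBinaryDirect is also worth a look, since it streams the file to the server instead of embedding it in a single request body. A minimal sketch, with placeholder URLs and paths:

using System.IO;
using Microsoft.SharePoint.Client;

class LargeFileUploader
{
    static void Upload(string siteUrl, string serverRelativeUrl, string localPath)
    {
        using (var ctx = new ClientContext(siteUrl))
        using (var stream = System.IO.File.OpenRead(localPath))
        {
            // Fully qualified to avoid the System.IO.File name clash;
            // streams the upload rather than buffering the whole file.
            Microsoft.SharePoint.Client.File.SaveBinaryDirect(
                ctx, serverRelativeUrl, stream, true);
        }
    }
}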
