Azure Website slow to serve static JS/CSS but not binary (IIS)

I have an Azure Website/Web App that is incredibly slow to serve static JS and CSS files but seems perfectly fine serving binary files.
To test the problem I uploaded two 30MB files, one big.js and the other big.rar. The JS file downloads at around 100KB/s if I'm lucky. The RAR file downloads at around 4,000KB/s. The results are extremely consistent.
I've checked in Fiddler and gzip compression is occurring in both cases. As expected, the JS file is being sent with the MIME type application/x-javascript whereas the RAR file is being served as application/octet-stream.
I am struggling to understand this - why would IIS serve one type of static content so much slower than another?

We had this issue and were able to resolve it with the help of the Azure support team. The problem was that the slow files were served with Transfer-Encoding: chunked. They suggested that we force static compression to get around this.
We had to add the following to <system.webServer>:
<serverRuntime enabled="true" frequentHitThreshold="1" frequentHitTimePeriod="00:00:20" />
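For context, this is roughly where the element sits; a minimal web.config sketch, assuming you have no other system.webServer settings (keep whatever else is already in yours):
<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <!-- Compress static files on the first hit instead of waiting for the "frequent hit" threshold -->
    <serverRuntime enabled="true" frequentHitThreshold="1" frequentHitTimePeriod="00:00:20" />
  </system.webServer>
</configuration>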

To elaborate on John Tseng's answer (quoting from here):
As you saw earlier, IIS 7 caches the compressed versions of static files. So, if a request arrives for a static file whose compressed version is already in the cache, it doesn't need to be compressed again.
But what if there is no compressed version in the cache? Will IIS 7 then compress the file right away and put it in the cache? The answer is yes, but only if the file is being requested frequently. By not compressing files that are only requested infrequently, IIS 7 saves CPU usage and cache space.
By default, a file is considered to be requested frequently if it is requested two or more times per 10 seconds.
So, the reason your users are being served an uncompressed version of the JavaScript file is that it didn't meet the default threshold for being compressed; in other words, the JavaScript file was not requested twice within 10 seconds.
To control this, there is one attribute to change on the <serverRuntime> element, which controls compression: frequentHitThreshold. For your file to be compressed the first time it is requested, change your <serverRuntime> element to look like this:
<serverRuntime enabled="true" frequentHitThreshold="1" />
This will have a small CPU cost if you serve many JavaScript files to a steady stream of users, but if traffic is heavy enough for that compression work to matter, the files will already be compressed and cached anyway.
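To verify the change took effect, you can request the file directly and inspect the response headers; a compressed, cached response should come back with Content-Encoding: gzip and a Content-Length rather than Transfer-Encoding: chunked (the hostname and path below are placeholders):
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" https://yoursite.azurewebsites.net/scripts/big.js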

Looks like this was an IIS 8.5 issue rather than something specific to Azure.
Now that the App Service upgrade to Windows Server 2016 appears to be complete, this workaround should no longer be needed.

Related

Azure CDN Standard Microsoft not compressing content

I'm using Azure CDN Standard Microsoft, and the files aren't compressed as expected and I don't know why. I've checked their troubleshooting guide and I don't see anything there that would cause trouble. Any ideas? I'm using a gunicorn server as the backend.
Apparently the CDN doesn't compress unless the content is already coming compressed from the web server.
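One quick way to compare what the origin (gunicorn here) and the CDN endpoint actually return is to request the same file from both with an Accept-Encoding header and check whether a Content-Encoding: gzip header comes back (the URLs below are placeholders):
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" https://your-origin.example.com/static/app.js
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" https://yourendpoint.azureedge.net/static/app.js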
Compression does depend on the file type itself; for more detail, see:
https://learn.microsoft.com/en-us/azure/cdn/cdn-improve-performance
What file type is it specifically?

JS Files not updating in Site Assets

Ok, I have never seen anything like this before and I'm hoping someone else has. I just finished patching our Dev and Test servers to the November 2017 CU (SharePoint 2013). Since then, any solutions that use JS injection from Site Assets are not updating. I'll make a change to the file, and the library reflects that I made the change, but when I load the page that uses the JS file, the changes are not reflected. Hard refreshes and full cache cleans are not affecting it. If I close and reopen my editor (VSCode), my changes are gone. When I look at the version history, the current version doesn't have my changes, but the previous version does. If I try to revert to that version, it doesn't take (it still shows the previous version of the file).
Here's where it becomes extra weird. I have deleted the entire file from the library. Reset IIS (heck, I even rebooted the server at one time). It somehow still loads the file. The file is no longer in the library, but the server is still serving it up to the browser. I have confirmed it is not getting it from another location as the Dev tools are showing the file is located in the Asset Library the file was deleted from. Even users who have never accessed the site before are still getting that file in their browser.
This isn't limited to a single site either. I have other developers in different sub sites (same site collection) that are having the same issues.
Anyone seen this before?
Looks like your web application has the BLOB cache enabled, which is causing files to be served from the cache.
There are two ways to fix this:
1) The heavy-handed way would be to flush the BLOB cache using the PowerShell commands below:
$webApp = Get-SPWebApplication "<WebApplicationURL>"
[Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
This will flush all the files in the BLOB cache. Files are usually cached based on the max-age attribute value, which is why your file is still being served even though you deleted it from the source.
2) The surgical-knife approach would be to append a query string, like https://sitecollurl/siteassets/app.js?v=1.1, to the file references (usually in master pages, page layouts, web part references, script links, etc., wherever the file is referenced). Appending a query string forces the browser to download the newer version of the file. I would prefer this approach, as it will not unnecessarily clear other files from the BLOB cache.
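For example, a script link in a master page or page layout would just gain a version parameter (the value is arbitrary; bump it whenever the file changes):
<script type="text/javascript" src="/SiteAssets/app.js?v=1.2"></script>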

JSON must be no more than 1000000 bytes

We have a Jenkins-Chef setup with a QA build project to a website for a client. The build gets the code from Bitbucket, and a script uploads the cookbooks from the Chef Client to the Chef Server.
These builds ran fine for a long time. Two days ago the automated and manual builds started failing with the following error (taken from the Jenkins console output):
Updated Environment qa
Uploading example-deployment [0.1.314]
ERROR: Request Entity Too Large
Response: JSON must be no more than 1000000 bytes.
From what I understand, JSON files are supposed to be related to Node.js, which is what the developers use on this web server.
We looked all over the config files for Jenkins, the Chef server and the QA server. We couldn't find a way to change the 1 MB limit that is causing this error.
We tried changing client_max_body_size, but that didn't work.
We checked the JSON files' sizes; none of them reach this limit.
Any idea where we can find a solution? Can this limit be changed? Is there anything we can do (Infrastructure wise) or should this be fixed from the developer side?
So first of all, the 1 MB value is more or less hardcoded; the Chef server is not intended to store large objects.
What happens is that before a cookbook is uploaded, a JSON file with its information is created. As this file will be stored in the database and indexed, it should not grow too large, to avoid performance problems.
The idea is to upload to the Chef server only what is absolutely necessary: strip version-control directories, any IDE build/project files, etc.
The simplest way to achieve this is the chefignore file. It has to be created just under the cookbook_path.
Its content is a list of wildcard patterns to ignore while uploading the cookbook, so an example could be:
*/.svn/* # To strip subversion directories
*/.git/* # To strip git directories
*~ # to ignore vim backup files
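For placement, the layout looks roughly like this (the cookbook name is taken from the build output above; the rest is illustrative):
cookbooks/                 <- your cookbook_path
    chefignore             <- ignore rules apply to every cookbook below
    example-deployment/
        metadata.rb
        recipes/
        ...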

Process uploaded animated gif for security

We are working on a website that allows animated GIF uploads. To ensure an image is indeed an image and contains no malware/virus/backdoor/trojan code or anything other than the image data itself, we try to recreate the original image.
However, this process takes some time when there are many frames. Is there any other way to ensure an uploaded animated GIF file is free of the issues mentioned above?
You can never 100% guarantee that a file does not contain malware; even with your approach, there is a chance that the GIF contains something malicious that is triggered simply by opening the image in a vulnerable viewer.
That said, the chances are low and you can expect these sort of bugs to be patched fairly quickly in most modern operating systems.
There are various checks you can do on uploaded files though that take less processing time:
Check that the file name extension is what you expect; ignore the content type at the upload stage, though, as it can be spoofed.
Virus scan all uploaded files with a virus scanner with up to date definitions.
Do not store the files in a location where they can be executed - e.g. do not store in the web root (www.example.com/uploads/image.aspx).
Serve the files via a program or script that reads them from storage as data and then streams the output to the browser (a minimal sketch follows below).
When serving the files, ensure the correct content type and, if possible, the filename extension are set. Use Content-Disposition to set the name the browser will use:
Content-Disposition: attachment; filename="fname.ext"
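As a minimal sketch of the last two points, assuming a Flask app (the framework, route and upload directory are illustrative assumptions, not something from the question):
import os
from flask import Flask, abort, send_file

app = Flask(__name__)
UPLOAD_DIR = "/var/uploads"  # outside the web root, so nothing here can be executed by the web server

@app.route("/images/<name>")
def serve_image(name):
    # Only serve the file types we expect, and never trust a client-supplied path
    if not name.lower().endswith(".gif"):
        abort(404)
    path = os.path.join(UPLOAD_DIR, os.path.basename(name))
    if not os.path.isfile(path):
        abort(404)
    # Explicit content type, plus Content-Disposition so the browser uses the name we choose
    response = send_file(path, mimetype="image/gif")
    response.headers["Content-Disposition"] = 'attachment; filename="%s"' % name
    return response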

bypass IIS xml file settings at file/folder level

Our site is currently set to pass all files with the .xml extension through the ASP.NET worker process, because all the XML files on the site are currently generated dynamically on request by writing the output directly into the response stream.
However, we now have a requirement to add a file which is much larger and takes several minutes to generate this way. I wrote a console app to generate the file and set it to run nightly, but because of the global IIS setting directing XML files through the ASP.NET worker process, it's not being served properly.
I can't seem to find a way to make an exemption for the treatment of a single file in the IIS settings. Is there any other way we can do it?
Cheers,
Matt
As far as I know, this is not possible for a single file, but you can do it for a whole folder.
You simply place a web.config file in the folder in question and configure the settings you need there.
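A minimal sketch of such a per-folder web.config, assuming the goal is to let the static file handler serve *.xml in that folder; the name in the remove element is hypothetical and must match whatever your site-level mapping is actually called:
<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <handlers>
      <!-- Hypothetical name: remove the site-level mapping that routes *.xml through ASP.NET -->
      <remove name="XmlManagedHandler" />
      <!-- Serve *.xml in this folder as plain static files -->
      <add name="XmlStatic" path="*.xml" verb="GET" modules="StaticFileModule" resourceType="File" requireAccess="Read" />
    </handlers>
  </system.webServer>
</configuration>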
