Azure CDN Standard Microsoft not compressing content

I'm using Azure CDN Standard Microsoft, and my files aren't being compressed as expected; I don't know why. I've checked the troubleshooting guide and don't see anything there that would cause trouble. Any ideas? I'm using a gunicorn server as the backend.

Apparently this CDN tier won't serve compressed content unless the content already arrives compressed from the web server (the origin).
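One way to check that is to probe the gunicorn origin directly and see what it sends. A minimal sketch using only the standard library; the URL is hypothetical, so point it at your own backend (bypassing the CDN):

```python
import urllib.request

# Hypothetical origin URL -- point this at your gunicorn backend,
# bypassing the CDN, to see what the origin itself sends.
url = "https://my-origin.example.com/static/app.js"

req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    encoding = resp.headers.get("Content-Encoding")
    print("Content-Encoding from origin:", encoding or "(none)")
    # "(none)" would mean the origin isn't compressing, which -- per the
    # note above -- would explain the CDN serving uncompressed content.
```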

Compression also depends on the file type itself; for more detail, please see this URL:
https://learn.microsoft.com/en-us/azure/cdn/cdn-improve-performance
What file type is it specifically?

Related

Remotely upload a file to WordPress from a Node.js instance

I’m running a node app on Heroku. It gathers a chunk of data and sticks it into a set of JSON files. I want to POST this data to my WordPress server (right now it’s on SiteGround, but it’s moving to WP Engine).
I thought the WordPress REST API would provide what I wanted, but after reading the docs I’m not so sure.
Does anyone have any advice on this? It’s not the kind of thing I’ve done before.
Naturally, I could download the generated files and manually upload them in the right place… but I want it to be automated!
Can anyone point me in the right direction?
Looked at WordPress REST API but don’t think that’s the answer.
One option would be to set up a cron job on your WordPress site to download the already-generated JSON files. It could fetch them over HTTP if they're publicly available (or protected by an API key), or even over SFTP.
You could also go the opposite way: set up a cron job on the server running your node app and use SFTP to push the updated files onto your WordPress site, as in the sketch below.
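A minimal sketch of that push approach, assuming the paramiko library; the host, credentials, and paths are all made up:

```python
import paramiko

# Hypothetical host, credentials, and paths -- substitute your own.
HOST = "example.wpengine.com"
USER = "deploy"
PASSWORD = "secret"

transport = paramiko.Transport((HOST, 22))
transport.connect(username=USER, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    # Push the generated JSON file into the site's upload directory.
    sftp.put("data/latest.json", "/wp-content/uploads/latest.json")
finally:
    sftp.close()
    transport.close()
```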
If you are really committed to using the WordPress REST API, the closest thing that comes to mind (I'm not a WordPress expert) would be to upload the JSON files as media objects: https://developer.wordpress.org/rest-api/reference/media/.
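A rough sketch of that media upload, assuming the requests library plus a hypothetical site URL and application password; note that WordPress may also need to be configured to allow the JSON MIME type for uploads:

```python
import requests

# Hypothetical site URL and application password -- substitute your own.
WP_SITE = "https://example.com"
AUTH = ("api_user", "app-password-here")

with open("latest.json", "rb") as f:
    resp = requests.post(
        f"{WP_SITE}/wp-json/wp/v2/media",
        auth=AUTH,
        headers={
            # The media endpoint takes the raw file body plus a filename hint.
            "Content-Disposition": 'attachment; filename="latest.json"',
            "Content-Type": "application/json",
        },
        data=f,
    )
resp.raise_for_status()
print("Uploaded:", resp.json()["source_url"])
```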

Prevent Azure App Service from viewing backend configuration

I am working on a project that has us deploying to an Azure Web Site.
The code is working overall, and now we are focusing more on security.
Right now we are having an issue where back-end configuration files are visible via a direct URL.
Examples (links won't work):
https://myapplication.azurewebsite.net/foldername/FileName.xml (this file is in a folder contained within the root application)
https://myapplication.azurewebsite.net/vApp/FileName.css (this file is part of a virtual application subfolder)
I have found this to be true with multiple extensions and locations.
Extensions like:
.css
.htm
.xml
.html
the list likely goes on
I understand that certain files are downloaded to the client side and that those can't be stopped. However, back-end XML files are something we don't pass to the client (especially if a file has connection strings in it).
I did read a similar article, Azure App Service Instrumentation Profiling?
However, this didn't directly relate to my issue.
Any insight would be extremely helpful.
Do not store sensitive information in flat files, especially under your site root. Even if you get your web.config just right, you're still one botched commit away from disaster.
Use Application Settings instead, that's what they're for.
https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-configure
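For instance, App Service surfaces Application Settings to the running process as environment variables, so the app reads them instead of a file under the site root. A minimal sketch, assuming a Python app; the setting name DB_CONNECTION_STRING is made up:

```python
import os

# App Service injects Application Settings as environment variables,
# so the secret never lives in a file under the site root.
# "DB_CONNECTION_STRING" is a hypothetical setting name.
conn_str = os.environ.get("DB_CONNECTION_STRING")
if conn_str is None:
    raise RuntimeError("DB_CONNECTION_STRING is not configured in Application Settings")
```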

SmartDL for FTP

I need Python code that can download files from an FTP server. I'm looking for a built-in multi-part download manager to help me retrieve files faster. I tried SmartDL, but the problem is I don't know how to retrieve files from an FTP server with it. I also used add_basic_authentication to make sure I'm passing the right credentials. Please help me with a solution.
I have no problem using any other solution/package that supports multi-part downloads.
P.S. I need to save the downloaded files to object storage in the cloud. Each file may be around 300MB, and I need to download 20TB of data in total.
Thanks in anticipation.
Take a look at ftplib; it's a simple FTP library that will let you download files from an FTP server.
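A minimal download sketch with ftplib, with made-up host, credentials, and paths. Note that ftplib itself does single-stream transfers; a true multi-part scheme would need several connections each resuming at a different offset:

```python
from ftplib import FTP

# Hypothetical server and paths -- substitute your own.
HOST = "ftp.example.com"
USER = "user"
PASSWORD = "secret"

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)  # basic authentication
    with open("archive.bin", "wb") as out:
        # RETR streams the remote file in blocks to the callback.
        # retrbinary's rest= argument can resume a transfer at an offset,
        # which is the building block for a multi-connection scheme.
        ftp.retrbinary("RETR /remote/archive.bin", out.write)
```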

Azure Website slow to serve static JS/CSS but not binary

I have an Azure Website/Web App that is incredibly slow to serve static JS and CSS files but seems perfectly fine serving binary.
To test the problem I uploaded two 30MB files, one big.js and the other big.rar. The JS file downloads at around 100KB/s if I'm lucky. The RAR file downloads at around 4,000KB/s. The results are extremely consistent.
I've checked in Fiddler and gzip compression is occurring in both cases. As expected, the JS file is being sent with the MIME type application/x-javascript whereas the RAR file is being served as application/octet-stream.
I am struggling to understand this - why would IIS serve one type of static content so much slower than another?
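To reproduce that comparison, here's a rough timing sketch; the URLs are hypothetical, and the measurement is crude since it ignores connection setup:

```python
import time
import urllib.request

def throughput_kbs(url: str) -> float:
    """Download a URL once and return average speed in KB/s."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        size = len(resp.read())
    return size / (time.monotonic() - start) / 1024

# Hypothetical test files mirroring the experiment above.
for url in ("https://example.azurewebsites.net/big.js",
            "https://example.azurewebsites.net/big.rar"):
    print(url, f"{throughput_kbs(url):.0f} KB/s")
```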
We had this issue and were able to resolve it with the help of the Azure support team. The issue was that the slow files were using Transfer-Encoding: chunked. They suggested that we force static compression to get around this.
We had to add the following to <system.webServer>:
<serverRuntime enabled="true" frequentHitThreshold="1" frequentHitTimePeriod="00:00:20" />
To elaborate on John Tseng's answer (quoting from here):
As you saw earlier, IIS 7 caches the compressed versions of static files. So, if a request arrives for a static file whose compressed version is already in the cache, it doesn’t need to be compressed again.

But what if there is no compressed version in the cache? Will IIS 7 then compress the file right away and put it in the cache? The answer is yes, but only if the file is being requested frequently. By not compressing files that are only requested infrequently, IIS 7 saves CPU usage and cache space.

By default, a file is considered to be requested frequently if it is requested two or more times per 10 seconds.
So, the reason your users are being served an uncompressed version of the JavaScript file is that it didn't meet the default threshold for being compressed; in other words, the file was not requested twice within 10 seconds.
To control this, there is one attribute we must change on the <serverRuntime> element, which controls compression: frequentHitThreshold. In order for your file to be compressed when it is requested once, change your <serverRuntime> element to look like this:
<serverRuntime enabled="true" frequentHitThreshold="1" />
This will slightly increase CPU usage if you serve many JavaScript files and have frequent visitors, but if you have enough traffic for that compression load to matter, the files are most likely already compressed and cached anyway!
It looks like this was an IIS 8.5 issue, not something Azure-specific.
Now that the App Service upgrade to Windows Server 2016 is complete, this workaround should no longer be needed.

How to deploy a Unity build to Windows Azure?

I am making a Unity game for Facebook, and since I am trying out Windows Azure, I would like to deploy two files (an HTML file and a .unity3d file) to this service. Could anybody help me do it?
Thanks in advance
Alejandro
Use a free tool like AzCopy (http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx) to publish the files as public blobs to a public Storage Container.
If that doesn't work because you need to set some MIME type information, you can use Azure Websites (http://www.windowsazure.com/en-us/develop/net/common-tasks/publishing-with-git/). You would need to include a web.config to define the additional MIME types for the web server to use.
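If you go the Storage route, here is a rough upload sketch with the modern azure-storage-blob Python SDK (rather than the AzCopy tool linked above); the connection string, container name, and MIME type are all placeholders:

```python
from azure.storage.blob import BlobServiceClient, ContentSettings

# Placeholder connection string and container -- substitute your own.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("game")

# Upload the build with an explicit content type, since the MIME type
# is the usual sticking point for .unity3d files.
with open("build.unity3d", "rb") as f:
    container.upload_blob(
        name="build.unity3d",
        data=f,
        overwrite=True,
        content_settings=ContentSettings(content_type="application/vnd.unity"),
    )
```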
Most servers require no configuration at all: you just upload the .unity3d file and the accompanying HTML file. On Azure, however, that isn't enough; you have to add a custom MIME type.
I have written a complete blog post covering this: http://poojabaraskar.com/deploy-unity3d-game-in-windows-azure/
