CloudFront compression does not invalidate? - amazon-cloudfront

I have been adjusting my AWS Cloudfront settings trying to optimize my site.
I tried turning compression on (a YSlow recommendation) and it corrupted the rendering of my site.
So I turned compression off and ran an invalidation on the whole directory tree, but the problem persists. I have had to turn the CDN off so my site will render.
Just for kicks I invalidated again and turned the CDN back on after waiting a bit, but it is still sending me compressed JS and CSS files.
What did I miss?
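For reference, a full-tree invalidation like the one described can be scripted. A minimal sketch of the request body, assuming boto3 is available; the distribution ID is a placeholder and the actual API call is left commented out:

```python
import time

def invalidation_batch(paths):
    """Build the CreateInvalidation request body for a set of paths.
    Paths must start with '/'; the single path '/*' invalidates everything."""
    assert all(p.startswith("/") for p in paths)
    return {
        "Paths": {"Quantity": len(paths), "Items": list(paths)},
        "CallerReference": str(time.time()),  # must be unique per request
    }

# invalidate the whole tree, as described above:
batch = invalidation_batch(["/*"])

# boto3 usage (requires AWS credentials; distribution ID is hypothetical):
# import boto3
# boto3.client("cloudfront").create_invalidation(
#     DistributionId="EDFDVBD6EXAMPLE", InvalidationBatch=batch)
```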

How to upload a static HTML site to S3 and use it as a web server

Prerequisites:
- You should have an IAM user with S3 access, with the username and password ready.

Some caveats:
- Minify the files yourself [including CSS/JS and HTML] [there is no web server to do that]. You can use grunt [preferably] or any online tool like http://www.willpeavy.com/minifier/
- Gzip the files if you want to enable compression, with the command below [remember to minify the files before you do this step]:
  gzip -9 file1.min.css file2.min.css
  This will produce two files like file1.min.css.gz and file2.min.css.gz. Now remove the ".gz" extension with the help of the mv command, like: mv file1.min.css.gz file1.min.css [similarly for file2.min.css.gz]

Steps:
- Sign in to your AWS account and create an S3 bucket like mywebsite.com
  ○ Actions -> Create Bucket
- Right click on the S3 bucket, go to Properties, and click on Enable website hosting. Here you will have to enter the index document. This is the root file of your website. If it is located at the root and is named index.html then simply write index.html
- Go inside the bucket and click on Actions -> Upload. On the popup, select your folder / drag and drop the files that you want.
- Next, click on Set Details [tick the default selection and then click on Set Permissions].
- Check "Make Everything public" and click on Set Metadata.
- Click on Add metadata and add two things:
  Key: Cache-Control    Value: max-age=2592000 [this number is in seconds; modify it according to your needs]
  Key: Content-Encoding    Value: gzip
- Now click on Start Upload and, at the end, you can access the site via the S3 link of the index page.
Voila, you're good to go!
Thanks: #karan-shah
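The gzip-then-rename caveat above can be sketched in Python (the filename is a throwaway example):

```python
import gzip
import os
import shutil
import tempfile

def gzip_for_s3(path):
    """Gzip a minified asset in place and drop the .gz suffix, so S3 can
    serve it under the original name with Content-Encoding: gzip."""
    gz_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(gz_path, "wb", compresslevel=9) as dst:
        shutil.copyfileobj(src, dst)
    os.replace(gz_path, path)  # same effect as: mv file1.min.css.gz file1.min.css

# demo with a throwaway file in a temp directory:
tmp = tempfile.mkdtemp()
css = os.path.join(tmp, "file1.min.css")
with open(css, "w") as f:
    f.write("body{margin:0}")
gzip_for_s3(css)
with gzip.open(css, "rb") as f:
    print(f.read().decode())  # body{margin:0}
```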

Related

How to prevent users from browsing certain files of my website

I have recently launched a website on GoDaddy hosting. I have kept some images and JavaScript files used in the website in separate folders. I want to prevent users from browsing those images and files by simply appending the folder and file name to the website URL. For example:
www.example.com/images/logo.png
If I understand correctly, you want to have an HTML file with images that shouldn't be accessible on their own? If yes, then it cannot be done. You can check for the correct HTTP Referer header, but it can easily be faked, and it also makes the files inaccessible to browsers that don't send the referrer or have it disabled for "privacy" reasons.
If you want the hidden files to be accessible only by server-side scripts or ftp/scp, then you can try using .htaccess (if GoDaddy runs on Apache) with the correct configuration: https://httpd.apache.org/docs/2.2/howto/access.html
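For that server-side-only case, a minimal .htaccess sketch (Apache 2.2 syntax, matching the linked howto; adjust to your layout):

```apache
# .htaccess placed in the protected folder — deny all direct HTTP access;
# server-side scripts and ftp/scp can still read the files
Order deny,allow
Deny from all
```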
Another way could be hiding those files and creating a one-shot token like this:
<img src=<?pseudocode GEN_TOKEN("file.jpg") ?> /> with another script serving the hidden files only for a generated token, then deleting the token from the DB. Nevertheless, this will not protect anybody from downloading or accessing these files if they really want to...
But anyway, try to clarify your question...
If you are keeping images/files in a folder which is open to the public, I guess you kept them in that folder on purpose: you want the public to access those images and files.
How would the public know the image file names? Disable directory content listing for your website.
I am not aware of which language you are using on the web server, but in ASP.NET you may write a module/middleware which can intercept incoming requests and, based on your logic (e.g. authentication and authorization), restrict access. All modern languages support this kind of functionality.
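As an illustration of the intercept-and-restrict idea (here sketched as Python WSGI middleware rather than ASP.NET; the cookie check and paths are made up):

```python
def restrict_assets(app):
    """WSGI middleware sketch: block direct requests to /images/ unless a
    (hypothetical) session cookie marks the user as logged in."""
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "")
        logged_in = "session=" in environ.get("HTTP_COOKIE", "")
        if path.startswith("/images/") and not logged_in:
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return middleware

# tiny demo app so the middleware can be exercised without a real server
def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

wrapped = restrict_assets(demo_app)
statuses = []
wrapped({"PATH_INFO": "/images/logo.png"}, lambda s, h: statuses.append(s))
wrapped({"PATH_INFO": "/images/logo.png", "HTTP_COOKIE": "session=abc"},
        lambda s, h: statuses.append(s))
print(statuses)  # ['403 Forbidden', '200 OK']
```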

Azure CDN update for WebApp

I have set up an Azure CDN that points to my web app. While I am changing my style sheet and redeploying the web app, the styles are updating immediately. So is there no requirement to purge in this case? Does the CDN automatically update styles from the web app in this case?
I am working according to this article
https://azure.microsoft.com/en-in/documentation/articles/cdn-websites-with-cdn/
If the URL of the resource remains the same, the CDN servers (and the browsers) are free to cache them. So, if you are using CDN, you need to force a URL change every time the file content changes (commonly done by adding a version string).
Since it is working for you, either your files are not getting served from the CDN at all, or somehow the URL is getting updated.
Look at the URL from where your style sheet is getting fetched (network tab in the browser's debugger). Make sure the URL path is actually from the CDN and not your website directly.
If you have an MVC.NET app and you are using System.Web.Optimization.BundleCollection for style bundles, it adds a query parameter to the URL embedded in the HTML and changes it if the file contents change. This ensures that stale cached copies of the resources are not used.
See CDN and bundle caching sections at http://www.asp.net/mvc/overview/performance/bundling-and-minification
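A minimal sketch of the version-string idea, assuming you can hash each file's contents at deploy time (path and content are hypothetical):

```python
import hashlib

def versioned_url(path, content):
    """Append a short content hash so the URL changes whenever the file
    contents change, forcing CDN and browser caches to refetch."""
    v = hashlib.md5(content).hexdigest()[:8]
    return f"{path}?v={v}"

# same content -> same URL (cacheable); changed content -> new URL (refetched)
print(versioned_url("/styles/site.min.css", b"body{margin:0}"))
```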
No, the CDN does not automatically update the CSS for the web app.
To be safe, you should always purge.
The CDN is a global service; that you saw the CSS update doesn't mean everyone else sees the update. Another IP address might still have the old CSS cached.
Besides, the Cache-Control header also plays a role here.

Reference to JS file prompts logon

I want to reference a javascript file at the bottom of a page template.
I added the .js file under cmssitemanager / development / javascript files and then added the following to the master template (portal engine):
<script src="/CMSScripts/Custom/FOO/BAR.js"></script>
When I open the page in any browser, the javascript in the file isn't executing.
When I try to access the file directly in the browser at domain.com/CMSScripts/Custom/FOO/BAR.js, I get redirected to the Kentico logon page.
I don't see how/where I can specify security for the CMSScripts directory, but the user should not have to be logged in to access this file.
Any suggestions?
Update with info from first answer:
We are not currently using Windows Authentication, and I verified that the application pool user account has file-system permissions. I also verified that the application can add/delete/modify files using the cmssitemanager > administration > system > files > "test files" utility which resulted in OK status for creating/deleting/modifying folders and files.
I do have Check files permission checked in cmssitemanager > settings > system > files > security. Unchecking this option does not change the behavior, including after restarting both the application and the windows services.
The site is valid and has the correct license however there was NOT a domain alias defined. I added one however this did not change the behavior.
We have not made any changes to the web.config save for connection string information. If you are referring to this documentation on CMSUseTrailingSlashOnlyForExtensionLess in the web.config, this was not previously in there however adding it did not change anything. To that end, I wasn't seeing the site trying to redirect to domain.com/CMSScripts/Custom/FOO/BAR.js/ so I truly think it's tied to some kind of security ACL that is out of order.
Here is the solution provided by user FroggEye on the Kentico DevNet forum thread on this same issue:
Your best bet is to use the syntax below. Kentico does URL processing on all directories unless excluded, and you'd have to exclude the CMSScripts directory from processing, which isn't hard, but the code below is a better solution.
<script src="/CMSPages/GetResource.ashx?scriptfile=/CMSScripts/Custom/FOO/BAR.js" type="text/javascript"></script>
This will do two things:
- give you a standard place to call resources from (you can get stylesheets here as well)
- allow you to use the built-in minification functions.
Charles,
This is strange behavior; it does not normally do that.
Do you have any Windows auth going on? Have you possibly removed anonymous access (in IIS) to directories in the website folder?
Do you have Check files permissions turned on as a Kentico site setting? (Even though I'm fairly sure that won't affect requests to the JavaScript files in the JS module.)
Is it a valid site (enabled and running in CMSSiteManager) and domain name, with a correct license and alias?
Any custom modifications to the web.config file that possibly changed httpHandlers or modules? And are you running the correct extensionless URL settings in web.config?

Amazon CloudFront - Invalidating files by regex, e.g. *.png

Is there a way to have Amazon CloudFront invalidation (via the management console), invalidate all files that match a pattern? e.g. images/*.png
context -
I had set cache control for images on my site, but by mistake left out the png extension in the cache directive on Apache. So .gif/.jpg files were cached on users' computers but .png files were not.
So I fixed the Apache directive, and now my Apache server serves png files with appropriate cache-control directives. I tested this.
But CloudFront had in the past fetched those png files, so hitting those png files via CloudFront still returns them with NO cache control. End result: still no user caching for those png files.
I tried to set the invalidation in the Amazon CloudFront console as images/*.png. The console said completed, but I still do not get the cache-control directive on png files. --> Makes me believe that the invalidation did not happen.
I can set the invalidation for the complete image directory, but then I have too many image files --> I would get charged > $100 for this. So I'm trying to avoid that.
Changing image versions so that CloudFront fetches new versions is a painful exercise in my code; doing it for, say, 500 png files would be a pain. --> Trying to avoid it.
Listing individual png files is also a pain --> trying to avoid it as well.
Thanks,
-Amit
If your CloudFront distribution is configured in front of an S3 bucket, you can list all of the objects in the S3 bucket, filter them with a regex pattern (e.g., /\.png$/i), then use that list to construct your invalidation request.
That's what I do anyway. I hope this helps! :)
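A sketch of that approach, with the regex filtering shown against a hypothetical key list (the boto3 calls that would surround it are left commented, since they need credentials and a real bucket):

```python
import re

def matching_invalidation_paths(keys, pattern=r"^images/.*\.png$"):
    """Filter S3 object keys with a regex and turn the matches into
    CloudFront invalidation paths (which must start with '/')."""
    rx = re.compile(pattern, re.IGNORECASE)
    return ["/" + k for k in keys if rx.match(k)]

keys = ["images/a.png", "images/b.jpg", "css/site.css", "images/icons/c.PNG"]
print(matching_invalidation_paths(keys))  # ['/images/a.png', '/images/icons/c.PNG']

# listing the real keys would look roughly like (bucket name is hypothetical):
# import boto3
# resp = boto3.client("s3").list_objects_v2(Bucket="my-bucket")
# keys = [o["Key"] for o in resp["Contents"]]
```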

Server Side path to store uploads from website

For security, I want to upload employee contracts to a hidden server-side path. The upload works fine.
However, I also want to be able to download or view that file when I am logged in to the front end of my website, via a secret link. Is this possible?
Any other ideas?
Joomla website, hosted on Rack Space Cloud Sites.
Sure, upload the file to a directory outside the webroot, or a directory with a .htaccess file that contains "deny from all". Have a SQL table that maps a primary key to the file name, the file's content type, access-control information, and other metadata about the file. Then save the file under the primary key, so /uploads/1.
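If the directory has to stay inside the webroot, the .htaccess mentioned above could look like this (Apache 2.2 syntax):

```apache
# uploads/.htaccess — block all direct HTTP access; PHP can still read the files
Deny from all
```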
Then, to download the file, run a SQL query to figure out whether they should be able to download the file.
<?php
// ... (check the user's session / access rights first)
// table and column names are illustrative
$q = mysql_query("SELECT id, content_type FROM uploads WHERE id = " . intval($_GET['id']));
$q = mysql_fetch_array($q);
header("Content-Type: " . $q['content_type']);
readfile("./uploads/" . intval($q['id']));
?>
