Compress web files without losing their readability?

Can you compress all web files, such as HTML, CSS, JS, and PHP, without losing their readability?
I'm developing a dynamic website, and the client requires me to update the site quite frequently. The problem is that whenever I make any changes, I need to keep a .min version (for the webhost) as well as the original files (for development).
Keeping two sets of files is quite tedious and error-prone.
Is there a better way, such as a script that handles the minification when uploading to the host, so that I can reuse the same files for development?
Thanks in advance.

First, let's separate compression and minification. I assume we are only talking about minification here (removing spaces etc.) not compression like gzip.
There are two common ways to serve minified CSS/JS:
1. Edit the readable version, minify offline, and upload the minified files.
2. Edit and upload the readable files, but dynamically serve a minified version to users.
I agree that #1 is more tedious and error-prone, especially if you are sometimes forced to make changes to the minified version and then forget to port them back to your development copy.
There are many ways to achieve #2. If you're using PHP, I would suggest Minify (it not only minifies but also joins and compresses your CSS/JS to reduce the number of file requests). That way you can maintain one set of readable CSS/JS files on both your development and production sides, and let Minify take care of the rest.
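As a rough illustration of what such a tool does at the minification step, here is a naive whitespace-and-comment stripper in JavaScript. This is a sketch only: real minifiers like Minify or UglifyJS actually parse the source, while the regexes below are simplifying assumptions and will mangle edge cases such as string literals containing "//".

```javascript
// Naive minifier sketch: strips /* */ block comments, // line comments,
// and collapses runs of whitespace into single spaces.
// Real minifiers parse the code; this regex approach is illustrative only.
function naiveMinifyJs(source) {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, '')   // remove block comments
    .replace(/(^|\s)\/\/[^\n]*/g, '$1') // remove line comments
    .replace(/\s+/g, ' ')               // collapse whitespace
    .trim();
}

const input = `
// add two numbers
function add(a, b) {
  /* simple sum */
  return a + b;
}
`;
console.log(naiveMinifyJs(input));
// → function add(a, b) { return a + b; }
```

The readable file stays the single source of truth; the minified form is derived on demand, which is exactly what keeps the two copies from drifting apart.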

Related

Serving gzipped HTML that is a merge of two gzipped files and works in Chrome

I would like to have the top part of the page (from <html> until </head>) in one shared gzipped file and the rest of the page in another, and serve them as one gzipped HTML document.
Note that Chrome does not support multi-member gzip files (concatenated gzip streams).
I could keep the files uncompressed, then merge and compress them at runtime, but that would hurt performance, which is a big issue here (we are caching billions of files).
How can I merge gzipped files without creating a multi-member file, or serve an HTML file that is composed of two parts?
Is there any workaround for gzipping two files for Chrome?
Try https://varnish-cache.org. Its ESI feature (https://www.varnish-cache.org/docs/3.0/tutorial/esi.html) probably does exactly what you want. As an additional plus, you don't have to maintain your own caching software: decades of development have already produced well-established products that solve this.
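With ESI, the shared head fragment and the page-specific body are cached (and gzipped) as separate objects, and Varnish stitches them together per request, so you never store a multi-member gzip file. A minimal sketch, with hypothetical URLs, following the Varnish 3.0 docs linked above:

```
<!-- page template served by the backend -->
<esi:include src="/fragments/head.html"/>
<body>
  ...page-specific content...
</body>
</html>
```

On the Varnish side (3.0 syntax), you enable ESI processing in `vcl_fetch` for responses that contain ESI markup with `set beresp.do_esi = true;`. Each fragment is then cached independently and can be revalidated on its own schedule.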

Compressing and loading nodejs components efficiently

Is there a way to load the Bower components and node modules efficiently, i.e. in as little time as possible? Every time a user opens the app for the first time, it takes ages to download the components and render the page.
Currently it takes around 10-15 seconds to download everything when we run the application for the first time, which is not good at all!
I tried using gzip compression but failed to use it effectively.
Any help with the process or method would be great!
Thanks.
You would usually not want to deliver the individual files to the users.
Instead you can use a tool that minifies and bundles your dependencies. Users then only need to download one minified file (usually called vendor.js or something similar) that contains all the JavaScript needed. This reduces the number of requests and saves time.
You might want to take a look at grunt or other build tools as a possible solution for your problem.

Minimize Node js application .js files?

We have developed a desktop application using node-webkit and it works fine. My only doubt is whether we need to minify the .js files written as part of the Node.js server component. We usually minify the JavaScript written for the UI to reduce the payload when loading the related scripts for the HTML, and also to obscure the code so that it is hard to modify.
So do we need to perform a similar concatenation and minification process on the Node.js server-side .js files before sharing the node-webkit executable with the customer? Without minification of the Node.js files, the application works perfectly fine.
So, going back to my question: do we need to concatenate and minify the JavaScript for a Node.js application?
Minification is generally done to save bandwidth when downloading script files over the internet, so there isn't any real point in minifying the .js files on your server if they are never served anywhere.
I really doubt your server's storage needs to save a few kilobytes.

coffeescript + nodejs: caching the compiled server-side javascript using require

I'm running a node web server via "coffee my_server.coffee", and I load dependencies via
require './my_library.coffee'
My codebase is pretty big, and it's starting to take a significant amount of time for my server to start up, which I believe is due to coffeescript compilation... when I convert the entire thing to javascript, it loads much faster.
What's the least painful way of caching the compiled javascript, so that when I restart my server it only compiles files that I've edited since I last started it? Ideally it would be totally transparent... I just keep on requiring the coffeescript and it gets cached behind the scenes.
Alternatively, I could run "node my_server.js", and have a watcher on my directory that recompiles coffeescript whenever I edit it, but I don't really like this idea because it clutters up my directory with a bunch of js files, makes my gitignore more complicated, and means I have to manage the watcher function. Is there a way for me to have my cake (running the "coffee" executable and requiring coffee files) and eat it too (fast load times)?
Well, if you don't want to "clutter up your directory with a bunch of .js files", I think you're SOL. If you never store the .js files on disk, you need to compile the .coffee files to JavaScript on the fly every time. To my knowledge the coffee command does not compare mtimes between the .js and .coffee files, although in theory it could, in which case leaving the .js files around would help your situation. Given your preferences, the only thing I can suggest is:
1. Run a watcher that builds all your .coffee files into a separate build subdirectory tree.
2. Start your app with node build/app.js instead of coffee.
3. Ignore the build directory in your .gitignore.
You'll have to give up on running things via coffee though. Curious to see if others have better suggestions. My projects don't suffer from the startup time issue. SSDs and small projects help to keep that short and not annoying.

Concatenating javascript files on the fly in Liferay

I see a barebone.jsp file created (I guess by the MinifierFilter) for serving compressed and cached JS. I want to separate the development and production cases; in development, I not only don't want Liferay to cache the produced JavaScript file, I don't want this generated file to exist at all.
To be more precise, I want all JavaScript files to be concatenated on the fly. I always want to be able to edit any static file during development and see the results as soon as possible.
What is the easiest way to implement this?
Include the settings from portal-developer.properties in your portal-ext.properties. This disables minifiers, caching etc., and you can develop without the problems mentioned. You don't want these settings in production though, as all files will be loaded individually.
(Edit: including my comment from below in the answer:)
You find this file in webapps/ROOT/WEB-INF/classes.
All the *.fast.load parameters control the various minifiers (CSS, JS), but typically you want all of the parameters named in there.
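A minimal sketch of what this looks like in portal-ext.properties. The include line pulls in the whole developer preset; the individual properties shown are the kind of *.fast.load switches the answer refers to, but verify the exact names against your Liferay version's portal.properties:

```
# pull in the full developer preset shipped with Liferay
include-and-override=portal-developer.properties

# or set the individual switches (names as of Liferay 6.x; verify for your version)
javascript.fast.load=false
theme.css.fast.load=false
minifier.enabled=false
```

With these disabled, Liferay serves each JS/CSS file individually and unminified, so edits show up immediately on reload.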
