I have downloaded the OHIF code and built it by running yarn install in both the root folder and the viewer folder inside the platform folder, on Windows 10.
I can access the OHIF viewer successfully at localhost:3000 and also at ipaddress:3000, but I am not able to access it from a different machine using ipaddress:3000.
I am able to run various other applications on different ports and access them publicly, but in the case of OHIF it doesn't work.
Am I missing anything?
And one more problem: OHIF creates an almost 50 MB bundle from the 'yarn run dev' command, and every time I have to load 50 MB on page load. Is there any way to compress the OHIF library by building it?
am not able to access it from a different machine using ipaddress:3000
I believe you would need to tweak the webpack configuration: https://stackoverflow.com/a/35419631/1867984
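For reference, a minimal sketch of the relevant webpack-dev-server setting (the rest of the config is elided, and the port is assumed from your setup):

// webpack.config.js
module.exports = {
  // ...existing config...
  devServer: {
    host: '0.0.0.0', // listen on all network interfaces instead of localhost only
    port: 3000,
  },
};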
The webpack-dev-server is not meant to host the application for production use. Ideally, you build the files and host them using a production-ready web server. You have a lot of different options here. Check out: https://docs.ohif.org/deployment/recipes/static-assets.html
I have to load 50MB on page load. Is there any way to compress the OHIF library by building it?
If you run the production build with minification and tree shaking, the size drops to 15 MB. You can lower this further by pruning any extensions you don't intend to use.
Also, most of these assets aren't render blocking. Some code splitting is used to lazy load portions of the application after the critical portions are loaded. This could still be improved.
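As a sketch of the mechanism (the module and function names here are illustrative, not OHIF's actual ones), webpack splits a dynamic import() into its own chunk and only fetches it when the code runs:

// emitted as a separate chunk; downloaded on first use, not on page load
import('./heavyTool.js').then((mod) => {
  mod.activate(); // hypothetical entry point of the lazily loaded module
});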
I'm building a small project using Angular 7. When you run
ng serve
and a Node.js server is spun up to handle requests, does each request block until processing is complete? We are trying to evaluate how effective this would be in production, as opposed to using a more traditional application server.
Run ng build --prod to generate a "./dist" folder.
Then you have to put that on a web server.
You can use Angular Server-Side Rendering (SSR) to run it on a Node.js server.
You should not use ng serve for production because it uses webpack-dev-server, which is built for development only.
Github link
ng serve runs a webpack development server under the hood: a development server. It's made to mimic the production build so you can see your final application in an easy way.
If you didn't have that command, you would need to run a command like simplehttpserver after rebuilding your entire application on every change.
This is a convenience tool provided by the CLI to ease your development; in no case is it suited for production. This is a server without security, without optimization, without performance, without ... well, without anything that makes a server a server. By default, it doesn't even make your application accessible outside of your localhost. Not so useful for production ...
So, never, I repeat, never, use this command for your production server.
Run ng build --prod
It will generate minified code in the "dist" folder. You have to upload the contents of this "dist" folder to your web server. It will give faster responses when loading web pages.
For more details, please refer to the Angular deployment guide.
When using ng serve, you are spawning a backend Node.js environment with a web server to handle requests to your Angular application. That's great for live reloading and quick startup while developing. But needing such resources for static pages is unnecessary.
At the end of the day, Angular is just a framework with an opinion on how to build an SPA. No matter the framework or library you use, you will always end up with an index.xxx, JavaScript files, and other resource files from vendors or your own code. Only these matter to the browser loading the webpage.
Hence, you need to build your app to generate the static files that will be served (i.e. ng build --prod). Then you have 2 good options:
Choose a web server that will serve the files (e.g. nginx) on a dedicated server (or even a container).
Place the files behind a CDN provider. Since they are static, they will be cached and served to a browser requesting them based on its location.
I would opt for #2, since #1 forces you to keep resources running (CPU, RAM, HDD) for files that will not be requested that often. I say not often because your SPA will handle all routes itself in the client's browser (and will request a cache refresh at minimum once a day).
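If you do go with #1 and want to stay in the Node ecosystem rather than nginx, here is a minimal sketch with Express (the paths are illustrative):

// server.js: serve the built Angular app from ./dist
const express = require('express');
const path = require('path');

const app = express();
app.use(express.static(path.join(__dirname, 'dist')));

// fall back to index.html so the SPA router can handle client-side routes
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(8080);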
I have a Java Spring Boot backend and a React.js frontend. I need to place the compiled Node.js app into the "static" folder of my Spring Boot application so it can be served as static content. This is done using the command npm run build.
The problem with this is that the compilation is quite slow and takes several seconds before it's done. On the other hand, when I run my frontend app directly with "npm start", reflecting local code changes in my web browser takes only one second.
It's not acceptable for me to wait 10s or more until the build into my Spring Boot app is done. Is there a way to "link together" the Node.js project files without any optimisations, or to speed up the build?
You're referring to a common pain point with repeatable builds: dependency installation consumes too much time. The only known workaround is to use a cache. Here's an example. Well, some people check node_modules into source control, but that's just shooting yourself in the foot.
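One hedged illustration, assuming npm as the package manager: point npm at a cache directory that your CI preserves between builds, so installs hit the cache instead of the network.

npm ci --prefer-offline --cache .npm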
If you're feeling adventurous, you can also consider adding Squid as a proxy in your production build environment, which will help with faster Docker image downloads in addition to npm installs.
I'm using Admin-on-rest and deploying the production app to AWS S3.
I created the Admin-on-rest app with create-react-app, following Admin-on-rest's instructions.
To build the app, I run this script: npm run build
The main.js file is too big (5 MB), and the first load takes more than 5 minutes. (My internet speed test shows 3 MB/s.)
Is there any way to reduce the size of main.js file?
I'm reading about JS chunking, but it's not easy to apply to Admin-on-rest.
First things first: AOR itself does not have THAT much of an impact on application size, and optimising an app is a very different task.
The following are the steps I took when optimising my in-production app from roughly 2.8 MB down to a 400 KB served file size.
1) The first step is to know which parts of the code are causing the bloat. You can do this quite easily with Source Map Explorer.
https://www.npmjs.com/package/source-map-explorer
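As a hedged illustration (the path assumes a Create React App-style build output with source maps enabled):

npx source-map-explorer build/static/js/main.*.js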
2) After identifying the packages/modules, you should explore ways of reducing their size in your code, e.g.
Don't do
import _ from 'lodash'
DO
import find from 'lodash/find'
Reducing unnecessary imports is low-hanging fruit.
3) Check if there are smaller versions of the libraries available that you can plug in.
For instance, moment.js now has moment-mini, which tracks the main package and does not have all the internationalisation data in it. That shaved off about 400 KB of the bundle size for me.
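The swap is a one-liner (assuming moment-mini mirrors moment's API, which it tracks):

import moment from 'moment-mini'; // drop-in for `import moment from 'moment'`, minus locale data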
4) GZIP GZIP GZIP. The biggest impact on my file size yet.
I was using Nginx to serve my static files and I could simply add a config to it to gzip the package while in transit. Worked super smoothly.
If you are using something like Apache then you can also probably figure out a way to do this.
Another option is to eject from Create React App and then configure webpack to generate gzipped files at build time. This should theoretically lead to even smaller bundles, as tree shaking can be applied before compression. That was my experience just 2 months ago; Create React App might have since added a way to tell webpack to gzip the bundle. Do explore that avenue as well.
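As a minimal sketch of that setup (assuming compression-webpack-plugin is installed; the rest of the webpack config is elided):

// webpack.config.js
const CompressionPlugin = require('compression-webpack-plugin');

module.exports = {
  // ...existing config...
  plugins: [
    new CompressionPlugin({
      test: /\.(js|css|html|svg)$/, // which build assets to compress
      algorithm: 'gzip',
    }),
  ],
};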
This reduced my bundle size from 1.6 MB down to a ~400 KB served file.
5) Finally, you can move off React itself and use Preact, which is the hot new way to reduce bundle size. I haven't used this, but you can try it and document your findings, for the benefit of us all.
6) If you need your in-production app to be even smaller, then you will have to look at more advanced strategies, such as loading a dashboard/landing page with server-side rendering and loading the rest of the app bundle asynchronously once the dashboard is loaded.
Best of luck.
Edit on - 26/02/2018
Just discovered this --> https://github.com/jamiebuilds/react-loadable
More details here. This does not use react-loadable and instead teaches you how to use code splitting, which is supported by Create-React-App-bootstrapped pages out of the box.
https://scotch.io/tutorials/lazy-loading-routes-in-react
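A minimal sketch of route-level code splitting with react-loadable (the component names are illustrative):

import React from 'react';
import Loadable from 'react-loadable';

// Dashboard is emitted as its own chunk and fetched on first render
const LoadableDashboard = Loadable({
  loader: () => import('./Dashboard'),
  loading: () => <div>Loading...</div>,
});

Render <LoadableDashboard /> wherever the route matches; the rest of the bundle stays out of the initial download.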
I am able to deploy CoreCLR ASP.NET apps to Linux and have them run, hurray. To do this I am using
dnu publish --no-source -o <dest-dir>
which gives me a dest-dir full of many CoreCLR packages, one of which is a package for my published app specifically.
This folder is pretty big, around 50 MB for the simple Web Application Basic (no auth) described at https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-app-using-vscode/ .
Is there a sensible way to deploy to Linux without pushing so much around? Can I get rid of a bunch of those CoreCLR packages somehow? Is there a good way of deploying source-only and doing the work on the server (I may have seen something about this, but I lost it if I did)?
You are already publishing without the runtime (the --runtime option on dnu publish), which reduces the bundle size significantly.
You need to somehow get those packages onto the server. Even if you deploy only the sources, you'll have to restore, which will download the same amount of packages. Also, running from sources makes the application start significantly more slowly (depending on the number of dependencies).
However, if you publish the entire bundle once and your app's dependencies don't change, you can then upload only the folders corresponding to your own projects, instead of re-uploading all the dependencies.
I am currently working on a project that is using Dojo as the JS framework. It's a rather rich UI and as such is using (and thus loading) a lot of different .js files for the Dojo plug-ins.
When run on an Apache server on a Mac, the files (all around 1 KB) are served very quickly (1 or 2 ms) and the page loads pretty fast (<5 seconds).
When run on IIS on Windows 7, the files are served at an unbelievably slow rate (150 ms - 1 s), thus causing the page to take up to 3 minutes to load.
I have searched the internet to try to find a solution and have come up empty.
Anyone have any ideas?
Why not let Google serve the Dojo files for you?
The AJAX Libraries API is a content distribution network and loading architecture for the most popular, open source JavaScript libraries. By using the google.load() method, your application has high speed, globally available access to a growing list of the most popular, open source JavaScript libraries.
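Usage looks roughly like this (the version number is illustrative, and it assumes the https://www.google.com/jsapi loader script has already been included on the page):

// after the www.google.com/jsapi script has loaded
google.load('dojo', '1.6', {
  callback: function () {
    // Dojo is available here; dojo.require() individual modules as needed
  }
});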
What you need to do is build an optimized version of your code. That way you will have far fewer hits to your server (though I guess they'll still be slow until you discover the IIS problem). Out of the box, Dojo runs as individual files, which is great for development, but without running the build scripts to concatenate these files together the experience is poor.

The CDN does provide built profiles for Dojo base and certain profiles, like dijit.dijit. Doing a dojo.require on these profiles, in addition to the individual requires, would enable this after running a build. You would need to create layers for your own code as well; a sketch of such a build profile is below. The build scripts can also concatenate CSS and inline template files, remove comments and whitespace, etc.
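For illustration, a minimal sketch of a Dojo 1.x build profile that defines such a layer (the module names and paths are made up for the example):

// myapp.profile.js, consumed by Dojo's build scripts
dependencies = {
  layers: [
    {
      // everything myapp.main pulls in gets concatenated into this one file
      name: '../myapp/layer.js',
      dependencies: ['myapp.main']
    }
  ],
  prefixes: [
    ['dijit', '../dijit'],
    ['myapp', '../myapp']
  ]
};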
Have you actually tried measuring the load times on the intended target production server?
If you're just testing this on local development environments (or in development/test VMs), then I think you're comparing apples with oranges (pardon the pun :) ).