Yeoman working very slowly loading pages - node.js

I'm working on a project that uses Yeoman.
It had been working great on my machine until recently: some changes were made to the project (mainly introducing Angular) while I wasn't working on it for a month.
Since I came back, every page load has been taking around 2 minutes just to fetch the HTML and JS files!
CPU usage is between 30% and 50%, physical memory around 60%, and the computer is otherwise in good shape.
Other people working on the same project are getting very fast load times.
What could it be?
Thanks!
Igal

It sounds like the problem is on your side, so I would just delete the whole mess and check it out again; I assume you use some kind of version control.
If that doesn't work, it could be a caching problem, but it is hard to say with the current information.
Do you have anything that the others don't?
I would also make sure to update Yeoman, Node.js etc. to the latest versions, or at least to the same versions your co-workers are using.
For extra performance you can disable the force option in the Gruntfile, but seeing as the other people on your project have no problems, this is unlikely to be the cause of your situation.
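If your Gruntfile does set the force option, it is usually a one-line setting near the top; a minimal sketch of what that might look like (your Yeoman-generated Gruntfile may set it differently or not at all):

    // Gruntfile.js (sketch): grunt.option('force', true) makes Grunt carry on after
    // task errors; the suggestion above is to disable it for extra performance.
    module.exports = function (grunt) {
      // grunt.option('force', true);   // disable this line if it is present
      grunt.initConfig({
        // ... the rest of the Yeoman-generated config ...
      });
    };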

Related

Angular SSR Hot reload not working, no error, hung

I added Angular Universal to my project, and upon saving it compiles successfully, though in the network tab I can see the request for whatever route I'm on gets stuck on pending and the website stays loading forever in the browser. This issue is not present with CSR (client-side rendering). Things I have tried that did not work:
I have cloned the same project elsewhere to see if it was some fat finger accident
I have not only added logic to stop DOM methods from being used (see the sketch at the end of this post), I have outright removed their use everywhere in the app
I added 3rd party libraries like Domino js to mock methods on Node
I started a completely blank angular project, added nothing to it other than angular universal and I get the same issue!
I will try it on another computer soon to see if it is just an issue on this machine.
What could the issue be? I don't get any errors, and after refreshing the page it works.
Is hot reload broken for me? Am I missing anything?
EDIT: I have now tried it on 2 other Windows machines and the result is the same. Perhaps this is a bug. Essentially this makes Angular useless to me.
Angular Version 12.1
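For reference, the DOM-guard logic mentioned in the second bullet above looks roughly like this; a minimal sketch, where the component and its contents are just placeholders:

    // Hypothetical component: DOM APIs are only touched in the browser,
    // never during the server-side render.
    import { Component, Inject, OnInit, PLATFORM_ID } from '@angular/core';
    import { isPlatformBrowser } from '@angular/common';

    @Component({ selector: 'app-example', template: '<p>example</p>' })
    export class ExampleComponent implements OnInit {
      constructor(@Inject(PLATFORM_ID) private platformId: Object) {}

      ngOnInit(): void {
        if (isPlatformBrowser(this.platformId)) {
          window.scrollTo(0, 0); // safe to use window/document here
        }
      }
    }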
I was able to get it to work by upgrading my Angular to version 14 and starting a fresh project with SSR added. Frustrating, to say the least, but I can finish my project; I copied most of my code back in from the project I was working on.
For whatever reason, Angular 11 and 12 were giving me this issue on multiple machines, even on fresh Angular projects, and never showed any errors. I will report back if the issue persists at any point during the rest of the code porting.
I had the same hassle when I initially integrated Angular Universal into my project.
After a while I settled on the following workaround:
SSR for single-page applications is useful in production for SEO, Open Graph, etc.
Angular Universal adds additional files:
main.server.ts
app.server.module.ts
The classic Angular approach (the old app.module.ts) can still be used alongside them.
So for the local development environment I keep using the classic "ng serve" command, and Angular keeps reloading normally on each file change.
SSR is compiled only for the production build.
Then, if I want to debug something related to SSR, I deploy the production build to the test environment and call the test URLs to see the results.
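The key point is that the SSR entry point is separate from the one "ng serve" bootstraps, so the dev server never touches it. For reference, the generated main.server.ts looks roughly like this (the exact contents vary by Angular version):

    // main.server.ts (sketch): only referenced by the "server" build target,
    // so "ng serve" (which bootstraps main.ts) is unaffected.
    import '@angular/platform-server/init';
    import { enableProdMode } from '@angular/core';
    import { environment } from './environments/environment';

    if (environment.production) {
      enableProdMode();
    }

    export { AppServerModule } from './app/app.server.module';
    export { renderModule } from '@angular/platform-server';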

Retroarch js emulator not working on webpage

I am trying to develop a simple web page that allows a user to play a retro game (like Mario) in their browser. For this I have decided to use the JS emulators that have been compiled from RetroArch using Emscripten. I have been told that some of the JS emulators currently available on the libretro website (https://buildbot.libretro.com/stable/1.7.0/emscripten/) do not work properly (example: the N64 JS emulator). So I am trying to use the older version available on play-roms.com, but I have not been able to make it work even after a lot of effort.
The problem
I am trying to replicate this game page so that it works locally on my machine: https://play-roms.com/nintendo-64/super-mario-64 Since it mostly depends on HTML, CSS and JS, I simply copied all the HTML, CSS and JS files, along with the emulator and .mem files. When I try to run them locally, they simply do not work. I get a constant warning in the console, in an infinite loop:
"RetroArch [libretro INFO] :: mupen64plus: Memory initialized"
This warning does not allow the game to load. Please note that I do not get any other warnings or errors in the console that are not already happening on the original Mario page of play-roms from which I copied the files.
I assume the problem is caused by some issue with the .mem file. Next, I tried to fetch the .mem file from the play-roms server itself (just for testing purposes), but that did not help either. (Please note that I am aware of CORS and know how to handle it.) I still get the same error even when the .mem file is fetched from play-roms.
I talked to someone who has worked in this area before, and he confirmed that he faced exactly the same "Memory initialized" infinite loop when he tried it. He could not solve it either.
Please note that copying some other website is not my goal. I am just trying to make the RetroArch JS emulators work for my website.
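One more thing that might be worth checking, purely as an assumption on my side: older Emscripten builds fetch the .mem memory-initializer file relative to the page by default, so if your copy of the emulator lives in a subfolder the runtime may look for the .mem file in the wrong place. You can point the runtime at it explicitly before the emulator script loads, roughly like this (the folder name is a placeholder):

    // Must be defined before the emulator's <script> tag is evaluated.
    var Module = {
      memoryInitializerPrefixURL: 'emulators/', // old Emscripten option: where to fetch the .mem file from
      print: function (text) { console.log(text); },
      printErr: function (text) { console.error(text); }
    };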

Debugging React/Node back-end

So, here's the situation. I've recently "inherited" a decent sized web application, built using React (and Redux) and compiled using webpack/babel. Two files are generated, app.js and server.js, both of which run on Node.
The original developer of the project was mostly "winging it" on the back-end (server.jsx and so on -> server.js), using console.log to figure out what was going on and then just gut-feeling the fix. That works on a smaller scale, but it will become problematic in time.
I can debug both of the generated files, but only app.js is ever mapped properly, meaning I can debug its source code. This also affects hot loading. Any breakpoints related to server.js will only trigger in the actual server.js file in IntelliJ, which is a completely unreadable mess, so that's not really an option.
I'm using IntelliJ (WebStorm for those of you who only use the web version), and I've tried to use every single guide I've come across to set it up, which usually comes down to babel-node, babel watchers or webpack-dev-server. The current app.js is run using webpack-dev-middleware, and debugging it in Chrome works like a charm, but for some reason it always just bundles in server.js and then fires when ready (in these Star Wars times).
I understand that it's hard for Chrome to get access to server.jsx, but surely there must be some way of setting up IntelliJ (or WebStorm) to do so? I'm more used to a Java or C# server side, so I'm a bit baffled that this isn't a straightforward, out-of-the-box option. Or maybe it is, and the initial setup is lacking?
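What usually makes server-side breakpoints bind is emitting a source map for the server bundle; a minimal sketch of the idea, with paths and file names guessed from the description above (loader syntax depends on your webpack version):

    // webpack.server.config.js (sketch)
    const path = require('path');

    module.exports = {
      entry: './src/server.jsx',      // placeholder path
      target: 'node',                 // keep Node built-ins out of the bundle
      devtool: 'source-map',          // emit server.js.map so breakpoints map back to server.jsx
      output: {
        path: path.resolve(__dirname, 'build'),
        filename: 'server.js',
      },
      module: {
        rules: [
          { test: /\.jsx?$/, exclude: /node_modules/, use: 'babel-loader' },
        ],
      },
    };

With a map sitting next to server.js, an IntelliJ/WebStorm Node.js run/debug configuration pointed at the bundle can usually resolve breakpoints set in the .jsx sources.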
PS. When using React (and Redux) for both the front end and the back end on Node, is it meant to be virtually impossible to distinguish between the two? Server.jsx is fairly obvious, but there are quite a few duplicated JavaScript files and dependencies, especially related to handling/building the Redux store.

Run Monogame exe without Visual Studio

This is probably a stupid question, but I am not the best with technology, so I figured I might as well ask. I am working on creating a website for myself and I would like to put my MonoGame work on there. Is there a way I could compress it all into one file for a person to download and then play? Or possibly make it playable via my website without having them download anything?
This is my first post on here, so sorry if this is not worded properly (it being 2:30 a.m. is not helping either). Thank you very much!
You cannot implement MonoGame as a game running inside your website. The closest you're going to get is having it work over a local network, and that has only been tested with XNA.
However, yes, if you compile a complete version of your game and zip it, that should work. As far as I've experienced, if you simply make sure to include MonoGame.Framework.dll, it should work without any further requirements (apart, of course, from the standard ones such as DirectX and the .NET Framework in general).
You might want to test this on a clean computer (a virtual machine would also work, I think). If that doesn't work, make an installer instead using the Visual Studio Publish feature. I've never had that fail before.

Dojo load time extremely slow on IIS

I am currently working on a project that uses Dojo as the JS framework. It's a rather rich UI, and as such it uses (and therefore loads) a lot of different .js files for the Dojo plug-ins.
When run on an Apache server on a Mac, the files (all around 1 KB) are served very quickly (1 or 2 ms) and the page loads pretty fast (<5 seconds).
When run on IIS on Windows 7, the files are served at an unbelievably slow rate (150 ms - 1 s), causing the page to take up to 3 minutes to load.
I have searched the internet to try to find a solution and have come up empty.
Anyone have any ideas?
Why not let Google serve the Dojo files for you?
The AJAX Libraries API is a content distribution network and loading architecture for the most popular, open source JavaScript libraries. By using the google.load() method, your application has high speed, globally available access to a growing list of the most popular, open source JavaScript libraries.
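Using the Loader, pulling Dojo from Google's CDN looks roughly like this (the version number is only an example):

    // Include the Google Loader first: <script src="https://www.google.com/jsapi"></script>
    // Then ask it for Dojo instead of serving every file from IIS yourself.
    google.load('dojo', '1.4', {
      callback: function () {
        // dojo is ready here; dojo.require still works for cross-domain modules
      }
    });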
What you need to do is build an optimized version of your code. That way you will have far fewer hits to your server (though I guess they'll still be slow until you track down the IIS problem). Dojo runs out of the box as individual files, which is great for development, but without running the build scripts to concatenate all of these files together, the experience is poor. The CDN does provide build profiles for Dojo base and certain layers, like dijit.dijit; doing a dojo.require on these profiles, in addition to the individual requires, would take advantage of this after running a build. You would need to create layers for your own code as well. The build scripts can also concatenate CSS and inline template files, remove comments and whitespace, etc.
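In practice that means requiring the built layers first, so the later individual requires are already satisfied; a rough sketch, where 'mycompany.layer' stands in for whatever layer your build profile defines:

    // After a Dojo build, each layer require resolves to one concatenated,
    // minified file instead of dozens of tiny ones:
    dojo.require('dijit.dijit');        // pre-built dijit base layer
    dojo.require('mycompany.layer');    // hypothetical layer holding your own modules

    // Individual requires still work; after the build they are effectively no-ops
    // because the modules already arrived in the layers above.
    dojo.require('dijit.form.Button');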
Have you actually tried measuring the load times on the intended target production server?
If you're just testing this on local development environments (or in development/test VMs), then I think you're comparing apples with oranges (pardon the pun :) ).
