I am working on a small hobby project, where I would really like some input and advice.
This is my first "real" node project, and I hope it will teach me a lot about node.js development. I am a .net developer by day, and have been for about 15 years professionally. I have had periods of doing Java as well. I have created small node.js projects to be used as micro services.
But this project can no longer be classified as a micro service ;-)
The purpose of the project is to sample some sensor data, and do some reporting. An idea I got from playing around with a PLC at university. I do that by sampling from a PLC, and emitting the data using ZeroMQ. My node.js server then listens for this sensor data, and stores it in a MongoDB.
I expose that data in a REST API. The REST API also exposes resources like batches and other things like authentication. On top of that I have an AngularJS app that defines the UI.
The one thing that really annoys me is that I want to globally assign which batch is running. I have a collection of batches, and one of them is the running one. There are two ways I see to do this, and both illustrate my novice status in the node.js world. All users should be able to see which batch is running, and I want to be able to easily tell from anywhere in the code as well.
1) Set a flag on the object in Mongo. This has a number of problems. The obvious one being performance. I receive sensor data 10 times a second, and I don't want to ask the database every time what batch to save it under.
2) Save the info on the global object. I really don't like this either. I don't like global state in my code.
What is a good pattern for doing something like this? Does my question make any sense?
Thanks in advance
You can expose a simple REST endpoint to set the active batch and call it once the app has started up and is ready to accept requests. For example:
app.put('/active-batch', function (req, res, next) {
    // Make sure req.body is defined
    app.set('active-batch', req.body);
    res.end();
});
Then everywhere in the code you can use:
app.get('active-batch');
app.set lets you save data that is globally accessible in your app, and app.get lets you read previously stored data.
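For example (a rough sketch of my own, not part of the original answer): wherever the sensor readings arrive, e.g. in the ZeroMQ message handler, the stored batch can be read back with app.get(). The SensorReading model and the handler name here are hypothetical.

const SensorReading = require('./models/sensor-reading'); // hypothetical Mongoose model

function onSensorMessage(app, reading) {
    // Read the batch that was stored globally via app.set('active-batch', ...)
    var activeBatch = app.get('active-batch');
    if (!activeBatch) {
        console.warn('No active batch set; dropping reading');
        return;
    }
    return SensorReading.create(Object.assign({}, reading, { batch: activeBatch._id }));
}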
Related
I have one project which is divided into multiple SPAs; five in total, two written in Angular, two in React, and one in Vue.js. I have an integrated server which serves the different files according to the routing. I need to share data from one app to another with as little database interaction as possible. This is a micro-frontends scenario. I hope this makes my problem clear.
Any help will be appreciated.
There are three ways with which you can share data:
URL: Query Params/Path Params (Only for small data like ID, filters, etc.)
Session Storage: Use this only if you are not navigating to another tab/window
Local Storage: Most convenient and preferred way
Of course, if you are persisting state to Local Storage, then you have to handle flushing of the state by yourself when the user logs out.
This is a somewhat painful process to handle. You will have to write code to manage serialization and deserialization of JSON to Local Storage. To make this easier, it helps to use the same state management solution across all micro-apps. I recommend using Redux or MobX for this. But if you are using Redux for React, NgRx for Angular, and Vuex for Vue, then you will not have a ready-made solution.
Also, when you are saving the state to Local Storage, either debounce it or do it lazily with a small delay, for performance reasons.
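For instance, a persistence helper along these lines could work (a sketch under my own assumptions; the storage key and delay are arbitrary):

const STORAGE_KEY = 'shared-app-state'; // hypothetical key shared by the micro-apps

function persistStore(store, delay = 500) {
    let timer = null;
    store.subscribe(() => {
        clearTimeout(timer);
        timer = setTimeout(() => {
            try {
                // Serialize the whole Redux state lazily, after changes settle down
                localStorage.setItem(STORAGE_KEY, JSON.stringify(store.getState()));
            } catch (err) {
                console.error('Failed to persist state', err);
            }
        }, delay);
    });
}

// On startup, another micro-app can rehydrate from the same key:
function loadPersistedState() {
    const raw = localStorage.getItem(STORAGE_KEY);
    return raw ? JSON.parse(raw) : undefined;
}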
We have been using micro-frontends for the last two years, and we use a mix of Local and Session Storage to do our thing. Luckily, we use Redux for all the apps, even with Vue, and that allows us to use redux-localstorage.
You can also use Cookies but it is generally better to avoid them.
1st, Custom element creation
I have worked on a micro-frontend, elements-based architecture using the @angular/elements module. I used the flow below.
For code, I have followed build-a-micro-frontend-application-using-angular-elements.
This gives you custom elements that behave like native HTML elements.
2nd, Another good approach is to use the library feature in Angular. You can write your components, directives, or pipes, then publish them or use them in other projects directly. With this approach, again, you can reuse the same code.
3rd, We can use an iframe, but nowadays this causes a lot of security issues in browsers.
Another option is to use a frontend event bus like EEV. Your application shell would be responsible for creating a shared event listener. Then each micro-frontend could emit events on that shared channel.
You could also use an RxJS Subject as a message bus within your App Shell and subscribe to it in the Micro Frontend Applications. Here's an example
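A rough sketch of that idea (my own illustration; the window property and event names are arbitrary choices):

import { Subject } from 'rxjs';
import { filter } from 'rxjs/operators';

// App Shell: create the bus once and expose it to the micro-frontends.
const bus = new Subject();
window.__appBus = bus;

// Micro-frontend A: publish an event on the shared channel.
window.__appBus.next({ type: 'USER_SELECTED', payload: { id: 42 } });

// Micro-frontend B: subscribe only to the events it cares about.
const sub = window.__appBus
    .pipe(filter(event => event.type === 'USER_SELECTED'))
    .subscribe(event => console.log('selected user', event.payload.id));

// Unsubscribe when the micro-frontend unmounts to avoid leaks:
// sub.unsubscribe();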
I hope that gives you a couple additional ideas.
I came across the same scenario, where I had to pass data from one micro-app to another. After a lot of R&D I found that event-based communication works best, where I transfer the data in the form of event objects.
Here is an example.
For sending data:
var event = new CustomEvent('userData', { "detail": { "id": id, ...rec } });
window.dispatchEvent(event);
And for receiving the data in the other app:
window.addEventListener("userData", function (e: CustomEvent) {
    this.id = e.detail.id;
    this.country = e.detail.country;
    this.contact = e.detail.contact;
    this.company = e.detail.company;
    this.changeDetectorRef.detectChanges();
}.bind(this));
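One caveat worth adding (my own note, not part of the original answer): keep a reference to the handler so you can remove it when the receiving component is destroyed, otherwise listeners pile up every time the micro-app re-mounts.

// e.g. in an Angular component
this.onUserData = (e) => {
    this.id = e.detail.id;
    this.changeDetectorRef.detectChanges();
};
window.addEventListener('userData', this.onUserData);

// later, in ngOnDestroy():
window.removeEventListener('userData', this.onUserData);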
This approach does not require any DB communication.
Hope this will resolve your query!!!
Happy Coding!!!
I am building a sports data visualization application with server-side rendering in React (ES6)/Redux/React-Router-Redux. At the top there is a class-based App component, and there are two different class-based component routes (everything under those is a stateless functional component), structured as follows:
App
|__ Index (/)
|__ Match (/match/:id)
When a request is made for a given route, one API call is dispatched, containing all information for the given route. This is hosted on a different server, where we're using Restify and Sequelize ORM. The JSON object returned is roughly 12,000 to 30,000 lines long and takes anywhere from 500ms to 8500ms to return.
Our application, therefore, takes a long time to load, and I'm thinking that this is the main bottleneck. I have a couple options in mind.
Separate this huge API call into many smaller API calls. Although, since JS is single-threaded, I'd have to measure the speed of the render to find out if this is viable.
Attempt lazy loading by dispatching a new API call when a new tab is clicked (each match has several games, all in new tabs)
Am I on the right track? Or is there a better option? Thanks in advance, and please let me know if you need any more examples!
This depends on many things including who your target client is. Would mobile devices ever use this or strictly desktop?
From what you have said so far, I would opt for "lazy loading".
Either way, you generally never want an app to force a user to wait at all, especially not for over 8 seconds.
You want your page to be sent and to show up with something that works as quickly as possible. This means you don't want to wait until all data resolves before your UI can be hydrated. (This is what would have to happen if you are truly server-side rendering, because in many situations your client application would be built and delivered at least a few seconds before the data is resolved and sent over the line.)
If you have mobile devices with spotty network connections, they will likely never see this page due to timeouts.
It looks like paginating and lazy loading based on accessing other pages might be a good solution here.
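As an illustration of that option (a rough sketch; the action types, the games slice of state, and the /api/games/:id endpoint are all hypothetical), a Redux thunk could fetch a game's data only the first time its tab is opened:

// Rough sketch of per-tab lazy loading with a Redux thunk
export function loadGameIfNeeded(gameId) {
    return (dispatch, getState) => {
        const { games } = getState();
        if (games.byId[gameId]) {
            return Promise.resolve(); // already loaded, nothing to fetch
        }
        dispatch({ type: 'GAME_REQUEST', gameId });
        return fetch(`/api/games/${gameId}`)
            .then(res => res.json())
            .then(game => dispatch({ type: 'GAME_SUCCESS', gameId, game }))
            .catch(error => dispatch({ type: 'GAME_FAILURE', gameId, error }));
    };
}

// In the tab component, dispatch only when the tab is actually clicked:
// onTabClick = () => this.props.dispatch(loadGameIfNeeded(this.props.gameId));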
In this situation you may also want to look into persisting the data and caching. This is a pretty big undertaking and might be more complicated than you would want. I know some colleagues who might use libraries to handle most of this stuff for them.
I'm trying to figure out how best to architect my app. For starters, I'm trying to understand typical practices with respect to where to put things: how the app should wire up to things like server.js, how server.js should work, and how you keep a persistent connection open for the website and any backend services or modules.
This is a general question, but let me try to be as specific as I can, since I am new to Node, to give this question a concrete basis.
Let's say I plan on designing a simple Express app.
I've got this kind of structure for example so far:
Right now in server.js I am just playing around with trying to connect to a MySQL database. So I've got a connection pool I'm creating, one call to the store to retrieve data, the require calls for the node-mysql driver I'm using, etc.
app.js just has very simple code; it's not modular yet, or even production ready, but that's just me playing with the code and spiking things out. So in it I have your typical stuff like setting the view engine, var app = express();, importing an Express route definition from another module in my routes\index.js, and so on.
If I'm going to keep a database connection open, or other things open, how is that best organized or done by convention?
If you look at this example code, he's moving the var app = express() definition into server.js: https://github.com/madhums/node-express-mongoose-demo/blob/master/server.js. He keeps an open connection running and started from there, which makes sense, hence "server".
So should app.js do much? What is the best practice or scope for what it should be doing? Once I start modularizing things out into their own .js files and node modules, how does app.js morph through all those refactorings? In other words, what is its role in the end, and is it very thin, just used to wire stuff up?
Then what should www.js, which the Express 4 generator now creates, contain, and what is its role?
It's kind of hard for me to start with just one aspect, so I'm going all over the place here. I just want to know common conventions for putting stuff in app.js vs. server.js, and then the best way to keep and manage open connections to things... both in the backend and the front-end, such as HTTP requests coming in. What should be the central point? Routes, of course, but then is app.js responsible for referencing the routes?
I have found a few resources such as this one, but I'm looking for more, so if you know any or have any input, please reply. I'm most interested in the discussion around app.js, server.js, connections, www.js, and where these particular parts should wire up to each other. I realize the rest is up to you, like how you want to name folders, etc.
There is no right way and (arguably) no wrong way. There are ways which are better than others, but then someone might say they don't like this way and that you should do it another way, and so on, until your project is past its deadline.
I often refer to this blog post about best practices when developing an express app.
You could also try one of the Yeoman generators. Choose one that suits most or all of your needs.
Bottom line: there is sadly still no single answer to the best structure of an app. I would recommend you pick something that works best for you (and your team) and stick with it. Consistency is the most important thing to keep in mind while developing, and the JavaScript community clearly lacks it.
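For what it's worth, one common split (a sketch following the express-generator convention; the route module and MySQL credentials are placeholders) is to have app.js build and export the Express app, while server.js owns the long-lived resources such as the HTTP server and the database connection pool:

// app.js -- builds and exports the Express app; no network or DB concerns here
var express = require('express');
var routes = require('./routes'); // hypothetical routes/index.js
var app = express();

app.use('/', routes);

module.exports = app;

// server.js -- owns the long-lived resources: the HTTP server and the MySQL pool
var http = require('http');
var mysql = require('mysql'); // node-mysql, as mentioned in the question
var app = require('./app');

var pool = mysql.createPool({ host: 'localhost', user: 'app', database: 'mydb' });
app.set('db', pool); // route handlers can reach the pool via req.app.get('db')

http.createServer(app).listen(3000, function () {
    console.log('listening on 3000');
});

With this split, app.js stays thin and only wires up middleware and routes, which is roughly where most projects end up after refactoring.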
I just started using node, backbone and mongoose not so long ago to create my first web app.
At the very beginning, I followed tutorials, and used backbone client side to define models. Those models mirror my mongoose schemas server side.
When I run schema.save() on one of my models, my data is automatically sent back to the client, with an _id.
But now that my app is almost finished, I realize that I don't really need to save anything, as the only thing I do is query an API, and the data doesn't have to be reused.
So my question is, what is the best way to keep the same mechanisms, but without saving anything?
The end reason is that I plan to run the app on an EC2 instance, and knowing that I don't need to save anything, I think it is more beneficial to reduce the IO usage by not having any database.
Thanks, and sorry if the question seems dumb.
I have a website which can have up to 500 concurrent viewers, with data updated every three seconds. Currently each user has an AJAX object which calls a web page every three seconds; that page queries a DB and returns the results.
What I would love to do is have each client open a socket to a node.js process. This node.js process would poll the DB every 3 seconds for updated data; if there were updated data, it would be announced (ideally as JSON) and each client would have the data pushed to it and update the page accordingly.
If this is possible, does anyone have a recommendation as to where I start? I am fairly familiar with JS but node.js seems to confuse me.
Thanks
I myself have quite a bit of experience with node.js.
It is absolutely doable and looks like the perfect use case for node.js.
I recommend starting with an Express tutorial and later on using socket.io.
I don't know which DBMS you are using, but there probably is a nice package for that as well. Just look through this list.
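For the polling-and-push part, a minimal sketch (my own; getLatestData() stands in for whatever DB query you end up using) with Express and socket.io could look like this:

var http = require('http');
var express = require('express');

var app = express();
var server = http.createServer(app);
var io = require('socket.io')(server);

var lastPayload = null;

// Poll the DB every 3 seconds and broadcast only when something changed
setInterval(function () {
    getLatestData().then(function (data) {    // hypothetical DB query returning a promise
        var payload = JSON.stringify(data);
        if (payload !== lastPayload) {
            lastPayload = payload;
            io.emit('update', data);          // pushed to every connected client
        }
    });
}, 3000);

io.on('connection', function (socket) {
    // Send the current snapshot to a newly connected client
    if (lastPayload) socket.emit('update', JSON.parse(lastPayload));
});

server.listen(3000);

// In the browser, each client just listens:
// var socket = io();
// socket.on('update', function (data) { /* update the page */ });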