LoopBack 4 Project Structure - Node.js

I come from an Express.js background and am pretty new to the LoopBack framework, especially LoopBack 4, which I am using for my current project. I have gone through the LoopBack 4 documentation a few times and made good progress setting up the project. Although the project runs as expected, I am not convinced by the project structure. Please help me solve the problem below.
As per the docs, database operations should live in repositories and routes in controllers. Now suppose my API contains a lot of business logic along with the database operations, say thousands of lines. That makes the controllers difficult to maintain, and it gets even harder if an API needs a version upgrade.
Is there any way to organise the code in controllers in a more scalable and
reusable manner? What if I add another service layer between the controllers
and repositories and put the business logic there? How do I implement that
correctly? Is there an official way to do this that is recommended by the
LoopBack community?
Thanks in advance!!

Is there any way to organise the code in controllers in a more scalable and reusable manner?
Yes, services can be used to abstract complex logic into their own separate class(es). Once defined, a service can be injected into the dependent controller(s), which can then call the respective service functions.
How the service is designed depends on your requirements, as LoopBack 4 does not enforce a strict design for it.
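For illustration, here is a minimal sketch of that layering. The names (OrderService, OrderController, OrderRepository) and the business rules are invented for this example; a real project will differ in the details.

    // src/services/order.service.ts -- business logic lives here, not in the controller
    import {injectable, BindingScope} from '@loopback/core';
    import {repository} from '@loopback/repository';
    import {OrderRepository} from '../repositories';

    @injectable({scope: BindingScope.TRANSIENT})
    export class OrderService {
      constructor(
        @repository(OrderRepository) private orderRepo: OrderRepository,
      ) {}

      // All business rules for placing an order sit here, so the controller
      // stays a thin HTTP layer and the logic can be reused or versioned freely.
      async placeOrder(customerId: string, items: object[]) {
        // ...validation, pricing, stock checks, etc.
        return this.orderRepo.create({customerId, items, status: 'NEW'});
      }
    }

    // src/controllers/order.controller.ts -- only routing/HTTP concerns
    import {service} from '@loopback/core';
    import {post, requestBody} from '@loopback/rest';
    import {OrderService} from '../services';

    export class OrderController {
      constructor(@service(OrderService) private orderService: OrderService) {}

      @post('/orders')
      async create(@requestBody() body: {customerId: string; items: object[]}) {
        return this.orderService.placeOrder(body.customerId, body.items);
      }
    }

The lb4 service command in the LoopBack CLI scaffolds this kind of service class and its binding for you, so a service layer is a supported pattern rather than a workaround.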

Related

How to classify services in microservices?

I am new to microservices. I come from a monolithic background; in my current environment I have different kinds of services for different purposes, like search, file, email, and notification. I have taken many courses, but in them the instructor separates each entity, gives it its own database, and creates an API for it (like a separate shopping cart entity and a product entity). It makes no sense to me. I don't understand the real-world use of microservices or how to split out a component to build its own microservice.
Can anyone give a real project example?
Thanks in advance
Read this and this. Also look here and here. I don't think anyone will give a link to a real working project, so you can try this.
I don't understand the real-world use of microservices
As you have mostly heard in those tutorials, the microservices architecture offers these advantages:
Smaller services are easier to maintain and develop.
You can scale specific services rather than the whole project (monolith). For example, you scale service-1 to 4 instances so that its request traffic is split across those 4 instances, service-2 to 2 instances, and so on (load balancing). These services may also be distributed across different servers and locations.
If one service fails, it does not bring down the whole system, since the services are independent.
Services can be reused for other scenarios or features.
A small team can work on each service, and it is easy to manage both the project and the development flow.
It also suffers from these disadvantages:
Each service is simple and small, but the system as a whole is complex, so the design phase is critical.
Performance can be worse, and it takes extra work to improve it (different types of caching at different levels).
Distributed transactions are complex and time-consuming to develop. Imagine a simple update that has to be propagated to other services; you have to consider failures and a rollback strategy (SAGA), as sketched below.
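To make the SAGA point more concrete, here is a rough, framework-free sketch of an orchestrated saga in which each step has a compensating action that is run in reverse order when a later step fails. All of the step names and the stubbed service calls are invented for this example.

    // A saga step pairs an action with a compensation that undoes it.
    interface SagaStep {
      name: string;
      action: () => Promise<void>;
      compensate: () => Promise<void>;
    }

    // Run the steps in order; on failure, roll back the completed ones in reverse.
    async function runSaga(steps: SagaStep[]): Promise<void> {
      const completed: SagaStep[] = [];
      try {
        for (const step of steps) {
          await step.action();
          completed.push(step);
        }
      } catch (err) {
        for (const step of completed.reverse()) {
          await step.compensate(); // e.g. release stock, refund a payment
        }
        throw err;
      }
    }

    // Stubbed service calls; real ones would be HTTP requests or queue messages.
    const reserveStock = async () => console.log('stock reserved');
    const releaseStock = async () => console.log('stock released');
    const chargeCard = async () => { throw new Error('payment declined'); };
    const refundCard = async () => console.log('card refunded');

    runSaga([
      {name: 'reserve-stock', action: reserveStock, compensate: releaseStock},
      {name: 'charge-card', action: chargeCard, compensate: refundCard},
    ]).catch(err => console.log('saga rolled back:', err.message));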
how to split out a component to build its own microservice
This is the most challenging part of microservices. You need to study Domain-Driven Design (DDD) in depth:
Decompose by subdomain
Decompose by Business Capabilities
Can anyone give a real project example?
There are many projects that implement microservices with different patterns. I think you have to start your own and get your hands dirty.

Multiple web services sharing single database using Sequelize

I'm trying to build a service architecture that includes two Node.js apps sharing the same database. The overall service architecture looks like the one below (simplified version).
I'm planning to use Sequelize as the ORM to access the database. As far as I know, if a service uses Sequelize, it needs model definitions to know the structure of the data tables. In my case, api and service will access the same database, which means they should share the same Sequelize models.
So here is the question: where should I put the shared Sequelize files? It seems I have two choices:
Put them in a common upper-level location (assuming the project structure is a monorepo) so that each app uses the same single set of files.
Maintain copies of the files in each app's project folder. In this case each app is independent (say I want to dockerize each app), but whenever the Sequelize files are modified, the same change has to be made in the other copy.
I'm not sure whether my understanding is correct. Is my question valid? If so, which is the better choice and practice? I appreciate your answers in advance.
There is no single correct answer; it depends on the specific situation, but sharing a database between multiple microservices is a bad design.
Sharing a database means tight coupling at the data level. The direct consequence is that when one service modifies the table structure, such as deleting the name field of the user table, it may break the APIs of all the other services that use the Sequelize user model. Every service then needs to update the model definition and modify the API implementation code.
If all of your services are maintained by one team, I suggest you choose the first solution: it costs less and is easier to maintain. If your services are maintained by different teams, the two solutions are actually similar, because whenever the table structure is modified, the application-layer model has to be updated, or at least verified to still work.
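If you do go with the first solution, one common monorepo layout is a small shared package that only defines the models and lets each app pass in its own Sequelize connection. The package and model names below are placeholders, not a prescribed structure.

    // packages/db-models/src/index.ts -- shared by both "api" and "service"
    import {Sequelize, DataTypes, Model} from 'sequelize';

    export class User extends Model {}

    // Each app calls this once with its own connection.
    export function initModels(sequelize: Sequelize) {
      User.init(
        {
          id: {type: DataTypes.INTEGER, autoIncrement: true, primaryKey: true},
          name: {type: DataTypes.STRING, allowNull: false},
          email: {type: DataTypes.STRING, unique: true},
        },
        {sequelize, tableName: 'users'},
      );
      return {User};
    }

    // apps/api/src/db.ts -- the "api" app wiring up the shared models
    import {Sequelize} from 'sequelize';
    import {initModels} from '@myorg/db-models'; // hypothetical workspace package name

    const sequelize = new Sequelize(process.env.DATABASE_URL ?? 'sqlite::memory:');
    export const models = initModels(sequelize);

Note that a table change still means updating this shared package and redeploying every app that uses it, which is exactly the coupling described above.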
Therefore, I recommend following microservice best practices: first split the database vertically according to the business model, and build the application APIs on top of that.
Core principles of microservices:
loose coupling
high cohesion

API Architecture - Business logic tightly coupled to routes?

To speed up development of my next Node API I was looking for a suitable framework. In the past I built my APIs with Express only.
One design pattern I have always found useful is to completely separate the business logic from route handling by putting it in services. Those services only accept the required information (like a user id or data) and return a promise resolving to the result of the operation.
This way it is easy to reuse these services in other routes, combine them, test them, or call them on schedules or other events, totally independent of endpoint calls. Routing and middleware take care of access control, error handling and responding.
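For reference, here is roughly what I mean; the user-related names and the db module are just placeholders in this sketch.

    // services/user.service.ts -- plain functions, no req/res, easy to reuse and test
    import {db} from '../db'; // placeholder data-access module

    export async function getUserProfile(userId: string) {
      const user = await db.users.findById(userId);
      if (!user) throw Object.assign(new Error('User not found'), {status: 404});
      return {id: user.id, name: user.name, email: user.email};
    }

    // routes/user.routes.ts -- thin handler: validation, access control, responding
    import {Router} from 'express';
    import {getUserProfile} from '../services/user.service';

    export const userRouter = Router();

    userRouter.get('/users/:id', async (req, res, next) => {
      try {
        res.json(await getUserProfile(req.params.id));
      } catch (err) {
        next(err); // a central error-handling middleware decides the response
      }
    });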
Looking at the documentation of those frameworks (Sails.js, KeystoneJS, ...) I mostly see the business logic tightly coupled to individual routes, directly accepting request objects and handling the responses. Only as an afterthought does there sometimes seem to be a way to extract "often used code" into helper functions.
Am I missing something? How come this pattern seems to be the standard of API design? Is this a best practice for a reason?
It might have to do with Node.js services being smaller in size. If you're coming from an enterprise background, you're well aware that mixing business logic with controller code doesn't fly in the long run. Perhaps small projects can get away with defying that, but once the size increases, you can't avoid the laws of physics. It's best to separate concerns and keep the codebase maintainable.
I'd also add that below services, it's good to have a separate layer that handles talking to outside process boundaries. That way, you can test business logic in isolation by providing appropriate test doubles for integrations. Here's a longer explanation of how it would work in a Node project: Organize Node.js API project using 3-layer architecture.
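As a rough sketch of what that lower layer and its test double could look like (the interface and all names are invented for this example):

    // The integration layer is hidden behind an interface...
    export interface PaymentGateway {
      charge(amountCents: number, cardToken: string): Promise<{id: string}>;
    }

    // ...so the business logic depends only on the interface, not on HTTP/SDK code.
    export class CheckoutService {
      constructor(private payments: PaymentGateway) {}

      async checkout(amountCents: number, cardToken: string) {
        if (amountCents <= 0) throw new Error('Invalid amount');
        const receipt = await this.payments.charge(amountCents, cardToken);
        return {receiptId: receipt.id, amountCents};
      }
    }

    // In a unit test, a hand-rolled test double replaces the real gateway.
    const fakeGateway: PaymentGateway = {
      charge: async () => ({id: 'test-receipt-1'}),
    };

    new CheckoutService(fakeGateway)
      .checkout(500, 'tok_123')
      .then(result => console.log(result.receiptId)); // test-receipt-1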

How to design a sails.js project with microservices architecture?

I learned about microservices from here
Now, I want to use microservices architecture in my next sails.js project.
One way I could think of is:
Breaking my one sails.js application into multiple small sails.js sub-projects/repositories.
Having one controller and model per sub-project. For example, if we consider a simple eCommerce app with entities such as User, Product, and Order, there will be a separate Sails.js repository for each of them with its respective model and controller. Each such sub-repository will then form one microservice.
Each sub-repository will then obviously have its own configs.
These microservices will then communicate with each other using some HTTP Node module.
Writing my own API gateway for routing in Node.js, which will be responsible for invoking methods/web services from these sub-repositories depending on the client request (see the sketch after this list).
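For point 5, I imagine the gateway could be as small as a Node server that forwards requests to the matching service by URL prefix. The ports and prefixes below are made up.

    // gateway.ts -- forwards /users/* and /orders/* to the matching service
    import http from 'node:http';

    // Placeholder service locations; in practice these come from config or discovery.
    const routes: Record<string, string> = {
      '/users': 'http://localhost:3001',
      '/orders': 'http://localhost:3002',
    };

    http.createServer(async (req, res) => {
      const url = req.url ?? '/';
      const prefix = Object.keys(routes).find(p => url.startsWith(p));
      if (!prefix) {
        res.writeHead(404).end('Unknown route');
        return;
      }
      try {
        // Node 18+ global fetch; request bodies and headers omitted for brevity.
        const upstream = await fetch(routes[prefix] + url, {method: req.method});
        res.writeHead(upstream.status, {
          'content-type': upstream.headers.get('content-type') ?? 'application/json',
        });
        res.end(await upstream.text());
      } catch {
        res.writeHead(502).end('Upstream service unavailable');
      }
    }).listen(3000, () => console.log('gateway listening on :3000'));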
Is this the best way, or is there an alternative way to design a project with a microservices architecture?
What is the best way to implement inter-service communication and an API gateway with Sails.js? If a microservice designed with the above approach gets bigger and I have to split it in two, how should the Sails.js models change?
The most important aspect of designing microservices is the separation of concerns, which means each microservice has a defined boundary within which it needs to work.
Each microservice is designed to do a defined piece of work, so first you need to find the independent functionalities in your project and try to create a microservice for each of them.
The most important thing to note is that you should start with a monolithic architecture, and if you identify that some functionality needs to be separated, you can then create a microservice out of it.
As far as Sails is concerned, it is a good candidate for MVC and for a monolithic project, but if the number of microservices is large it is not a good choice, because running a large number of microservices with Sails.js will consume more of your system's RAM. Sails.js internally uses many libraries that you will not need. You can build a simple microservice with just Node.js core modules, and it will consume less memory too.
Also, when each microservice handles a small piece of functionality, the amount of code is small and there is no need for an MVC architecture; you can use fewer libraries to create it (see the sketch below).
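For a sense of scale, a microservice built only on Node core modules can be as small as the sketch below; the notification endpoint is just an invented example.

    // notification-service.ts -- no framework, only the Node 'http' core module
    import http from 'node:http';

    const server = http.createServer((req, res) => {
      if (req.method === 'GET' && req.url === '/health') {
        res.writeHead(200, {'content-type': 'application/json'});
        res.end(JSON.stringify({status: 'ok'}));
        return;
      }
      if (req.method === 'POST' && req.url === '/notifications') {
        let body = '';
        req.on('data', chunk => (body += chunk));
        req.on('end', () => {
          // A real service would queue or send the notification here.
          console.log('notification requested:', body);
          res.writeHead(202).end();
        });
        return;
      }
      res.writeHead(404).end();
    });

    server.listen(4000, () => console.log('notification service on :4000'));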
Conclusion
If the number of services is small and you are not worried about system RAM, go for multiple Sails applications.
If the number of services is going to be large, try to build your services without Sails.
I agree with the previous answer, and I would add that Sails is a great candidate for clustering and for an environment where you may wish to scale horizontally to improve availability. I do not believe Sails is the right candidate for a microservice architecture; however, it is a likely fit for the application that consumes multiple services in its own right.
I use a message service to glue together multiple applications, with Sails consuming these messages in order to update a web page. I see those applications as offering smaller services with defined boundaries, and my Sails application as the front end, with the controller gluing together what is necessary to satisfy the requirements of the end user.

Choice of technical solution to handling and processing data for a Liferay Project

I am doing research to start a new project based on Liferay.
The project relies on a system that will require its own data model and a certain agility and flexibility in data management, as well as in how the data is visualized.
These are my options:
Using Liferay Expando fields to define my own data models. I would have to build the whole view layer myself.
Using the Liferay ECMS, adding patches that create structures and hooks allowing me to define master-detail data models. That makes the view side much easier (Velocity templates), but it is perhaps the "dirtiest" way.
Generating the data layer and service access with Hibernate and Spring (using Service Factory, for example).
Liferay Service Builder, which would be similar to the option of building the platform with Hibernate and Spring.
CRUD generation systems such as OpenXava or XMLPortletFactory.
And now my question: what is your advice? What advantages or disadvantages do you think each option would provide?
Thanks in advance.
I can't speak for the other CRUD generation systems but I can tell you about the Liferay approaches.
I would take a hybrid approach.
First, I would create the required data models as best as I can with the current requirements in Liferay Service Builder and maintain them there as much as possible. This would require that you rebuild and redeploy your plugin every time you changed the data model but would greatly enhance performance compared to all the other Liferay approaches you've mentioned. Service Builder in that regard is much more rigid and cannot be changed via GUI.
However, if for some reason you cannot use Service Builder to redefine your data models and you need certain aspects of them to be changeable via the GUI, you can also use Expandos to extend the models you've created with Service Builder. So it is the best of both worlds.
As for the other option, using the ECMS would be a specialized case, and I would only take that approach if there is a particular requirement it satisfies (like integration with the ECMS).
With that said, Liferay provides you many different ways to create your application. It ultimately depends on how you're going to use your application.
