We always need to perform validation and business logic on both the server and the client side. Since we are using Angular 2 and Node with TypeScript (TS), we have concluded that a large amount of the code and the class model will be identical, except for the persistence mechanism (a socket from client to server, SQL from Node to the database).
As the objects can persist themselves, we should be able to override just the save() method and write code like the one below:
<input [ngModel]="person.getSomeProperty()" (ngModelChange)="person.setSomeProperty($event)">
...
person.save()
person.save() would send the data to Node, which would rehydrate the class and call the same person.save(), but this time performing the DML against the database.
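To make the idea concrete, here is a minimal sketch of the shared-model approach (the class names, the transport object, and the db helper are all hypothetical; real code would use your socket client and SQL layer):

```javascript
// Shared base class: validation and business logic live here, so the
// same code runs on client and server. Only save() differs.
class PersonBase {
  constructor(data = {}) {
    this.name = data.name;
  }
  setName(name) {
    // shared validation, identical on both sides
    if (!name || !name.trim()) throw new Error('name is required');
    this.name = name.trim();
  }
  getName() { return this.name; }
  toJSON() { return { name: this.name }; }
  save() { throw new Error('save() must be overridden'); }
}

// Client side: serialize and hand off to an injected transport
// (e.g. a socket.io client; 'person.save' is an assumed event name).
class ClientPerson extends PersonBase {
  constructor(data, transport) {
    super(data);
    this.transport = transport;
  }
  save() { return this.transport.send('person.save', this.toJSON()); }
}

// Server side: rehydrate from the payload and run the real persistence
// (db here stands in for a pg client or query builder).
class ServerPerson extends PersonBase {
  constructor(data, db) {
    super(data);
    this.db = db;
  }
  save() { return this.db.insert('person', this.toJSON()); }
}
```

The server handler would do `new ServerPerson(payload, db).save()`, so both sides call the same method name on the same class hierarchy.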
From this perspective, I have some questions regarding the strategy:
* Does it sound reasonable to reuse code this way? The use cases do match.
* Is it correct to create a Node module with all the business logic (around 20 classes) and export it to be used by the components?
* Can I two-way bind to the same person.someMethod() from different components and views?
Related
Creating a new project with an auto-testing feature.
It uses basic Express.
The question is how to organize the code in order to be able to test it properly (with Mocha).
Almost every controller needs access to the database in order to fetch some data before proceeding. But while testing, reaching the actual database is unwanted.
There are two ways as I see it:
* Stubbing the functions that read from / write to the database.
* Building two separate controller builders, one of which will be used from the endpoints, the other from tests, just like this:
let myController = new TargetController(AuthService, DatabaseService...);
myController.targetMethod()
let myTestController = new TargetController(FakeAuthService, FakeDatabaseService...);
myTestController.targetMethod() // This method will use fake services which don't have any remote-connection functionality
Every dependency passed in is assigned to a private variable inside the controller's constructor. By going through that private variable, the controller doesn't need to care what kind of call it is: a test one or a production one.
Is that a good approach, or should it be remade?
Alright, it's considered good practice, as it is in fact the dependency injection pattern.
I'm working on a web app using ReactJS. The app includes several components, each with its own attributes defined in the state, and each with its own CRUD components. E.g. I have a Posts component, which has PostsList, NewPost, UpdatePost and PostInfo components. I also have a Users component, which includes UsersList, NewUser, UpdateUser and UserInfo components, and so on.
What I've noticed is that I have to repeat the same steps for each component (Posts and Users), and the only differences are the state attributes' names and types, and the APIs that manipulate the data.
So, how can I clean up my code to keep it as DRY as possible?
Thanks in advance.
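One common way to DRY this up is to factor the repeated CRUD wiring into a single generic module parameterized by the resource name, so PostsList and UsersList differ only in the config they are given. A minimal sketch (createCrudApi and the /api/... endpoints are assumptions, not your actual API):

```javascript
// Generic CRUD client: one factory serves every resource.
// The request function is injectable so it can be faked in tests.
function createCrudApi(resource, request = globalThis.fetch) {
  const base = `/api/${resource}`;
  return {
    list:   ()         => request(base),
    create: (data)     => request(base, { method: 'POST', body: JSON.stringify(data) }),
    update: (id, data) => request(`${base}/${id}`, { method: 'PUT', body: JSON.stringify(data) }),
    remove: (id)       => request(`${base}/${id}`, { method: 'DELETE' }),
  };
}

// Usage: the same factory covers posts, users, and any future resource.
const postsApi = createCrudApi('posts');
const usersApi = createCrudApi('users');
```

The same idea applies to the components themselves: one generic list/form component that takes the field names, labels, and one of these API objects as props.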
The project I'm working on uses the Feathers JS framework on the server side. Many of the services have hooks (or middleware) that make other calls and attach data before sending the result back to the client. I have a new feature that needs to query a database, but only for a few specific things, so I don't want to use the already built-out "find" method for this query: that "find" method runs many other unneeded hooks and calls to other databases to fetch data I don't need for this new feature.
My two solutions so far:
I could use the standard "find" query and add if statements in all the hooks that check for a specific string parameter passed in from the client side, so those hooks are deactivated for this specific call. But that seems tedious, especially if I find the same need in several other services that have already been built out.
I initialize a second service below my main service so if my main service is:
app.use('/comments', new JHService(options));
right underneath I write:
app.use('/comments/allParticipants', new JHService(options));
And then attach a whole new set of hooks for that service. Basically it's a whole new service whose only relation to the original is that the first part of its name is 'comments'. Since I'm new to Feathers, I'm not sure whether that is a performant or optimal solution.
Is there a better solution than those options? Or is option 1 or option 2 the most correct way to solve my current issue?
You can always wrap the population hooks into a conditional hook:
const hooks = require('feathers-hooks-common');
app.service('myservice').after({
  create: hooks.iff(hook => hook.params.populate !== false, populateEntries)
});
Now population will only run if params.populate is not false.
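To show what the conditional hook does without running Feathers, here is a self-contained sketch of the same logic (the iff and populateEntries below are simplified stand-ins for the real feathers-hooks-common versions):

```javascript
// Simplified iff(): run the hook only when the predicate holds,
// otherwise pass the context through untouched.
const iff = (predicate, hookFn) => context =>
  predicate(context) ? hookFn(context) : context;

// Stand-in for the real population work.
const populateEntries = context => {
  context.result.entries = ['...populated...'];
  return context;
};

const afterCreate = iff(ctx => ctx.params.populate !== false, populateEntries);

// Internal server-side call that skips population:
const skipped = afterCreate({ params: { populate: false }, result: {} });
// Normal call: population runs as before.
const populated = afterCreate({ params: {}, result: {} });
```

Server-side code can then opt out per call, e.g. `app.service('myservice').create(data, { populate: false })`, while existing callers are unaffected.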
I have a few PHP scripts which I am in the process of migrating to Node.js. I am using Sails.js for this and I would like to know how I can change databases for each request based on a request parameter.
Currently I have 3-4 identical PostgreSQL databases. Let's just say that each database corresponds to a different client.
Below is a segment of the current PHP script where the database connection is established:
$database = $_GET['db'];
$conn_details = "host=localhost port=5432 dbname=$database user=****** password=******";
$dbconn = pg_connect($conn_details);
Here you can see that the database name is coming from the request parameter "db".
I would like to have similar functionality in my Sails.js controller. I know that I can declare multiple databases in connections.js and have models use different databases, but what I am after is for the models to stay the same and only the database to change based on each request.
I have found 2 similar questions but they have both stayed unanswered for quite some time now. (Here and here)
I think you are looking for something like sub-apps:
sails-hook-subapps
But it's an experimental module, so I wouldn't recommend using it in production. Another option, also not a great one, is multiplying your models like this:
* One main model with all methods, attributes and "stuff"
* Many models with connection configs
In the 'parent' model you select which model the action should be sent to. For example, write a method:
getModel: function (dbName) {
  return models[dbName];
}
In the models object you store all the models with their different connections. I'm not sure how validators will work in this scenario; you need to test whether you will be required to do something like this in the child models:
attributes: parentModel.attributes
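A sketch of how that parent-model lookup might fit together (the models object, database names, and find stubs below are illustrative assumptions, not Sails APIs):

```javascript
// One model definition per connection, keyed by the database name that
// arrives in the request. The find stubs stand in for real Waterline models.
const models = {
  client_a: { connection: 'client_a_db', find: q => ({ db: 'client_a', q }) },
  client_b: { connection: 'client_b_db', find: q => ({ db: 'client_b', q }) },
};

function getModel(dbName) {
  const model = models[dbName];
  // fail fast on unknown names instead of silently hitting the wrong database
  if (!model) throw new Error(`Unknown database: ${dbName}`);
  return model;
}

// In a controller action the lookup would be driven by the request, e.g.:
//   const Person = getModel(req.param('db'));
//   Person.find({ id: 1 }).then(...);
```
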
In which layer of the MEAN stack is it best to load bulk data? I have about 200-800 entries of 2-3 different types (i.e. they would require 2-3 different Mongoose schemas).
Here are the options to load these data (feel free to point out any misunderstandings, I'm new):
Client side: Angular level
Automate lots of user inputs
Server side: Nodejs + Express + Mongoose
Define the schema in Mongoose, create the objects, save each one
Database side: Mongodb
Make a json file with the data, and import it directly into Mongo:
mongoimport -d db_name -c collection_name --jsonArray --file jsonfilename.json
The third way is the purest and perhaps the fastest, but I don't know if it's good to do it at such a low level.
Which one is the best? If there is not an optimal choice, what would be the advantages and disadvantages of each?
It depends on what you're bulk loading and if you require validations to be done.
Client side: Angular level
If you require the user to do the bulk loading and need some human-readable error messages, that's your choice.
Server side: Nodejs + Express + Mongoose
You can bulk import from a file
Expose a REST endpoint to trigger bulk import of your data
You can use Mongoose for validation (see validation in mongoose)
Mongoose supports creating multiple documents with one call (see Model.create)
Database side: Mongodb
Fast, No code needed
No flexible validation
I'd choose the option that best fits your understanding of the bulk data import: if it requires a UI, choose option 1 combined with option 2; if you see it as part of your "business" logic and you're importing data from an external file or want other systems to trigger the import, choose option 2; if you see it as a one-time action to import data, or you don't require any validation or logic related to the import, option 3 is the best choice.
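As a sketch of option 2's flow without a live database: validate each entry first, then hand the valid ones to a bulk-create function (with Mongoose the final step would be Model.create(docs); the schema shape and helper names below are hypothetical):

```javascript
// Minimal per-field validators, standing in for a Mongoose schema.
const schema = {
  name: v => typeof v === 'string' && v.length > 0,
  age:  v => Number.isInteger(v) && v >= 0,
};

function validateEntry(entry) {
  const errors = Object.entries(schema)
    .filter(([field, ok]) => !ok(entry[field]))
    .map(([field]) => `invalid ${field}`);
  return { entry, errors };
}

// Validate everything, bulk-create only the valid entries, and report
// the rejects so the caller (e.g. a REST endpoint) can surface them.
function bulkImport(entries, create) {
  const checked = entries.map(validateEntry);
  const valid = checked.filter(c => c.errors.length === 0).map(c => c.entry);
  const rejected = checked.filter(c => c.errors.length > 0);
  return Promise.resolve(create(valid)).then(created => ({ created, rejected }));
}
```

With a real model you would pass `docs => Model.create(docs)` as the create function and let Mongoose's own schema validation replace the hand-rolled checks.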
Loading the data via the client side will require you to write more code: you have to handle the import, send the data to the backend, and then handle it again in Node.js.
The fastest method of all would be to import the data directly using mongoimport.