Single Shared Database, Fluent NHibernate, Many clients - c#-4.0

I am working on an inventory application (C# .NET 4.0) that will simultaneously inventory dozens of workstations and write the results to a central database. To save myself having to write a DAL, I am thinking of using Fluent NHibernate, which I have never used before.
Is it safe and good practice to allow the inventory application, which runs as a standalone application, to talk directly to the database using NHibernate? Or should I be using a client-server model where all access to the database goes via a server, which then reads/writes to the database? In other words, if 50 workstations were being inventoried concurrently, there would be 50 active DB sessions. I am thinking of using GUID-comb for the PK IDs.

Depending on the environment in which your application will be deployed, you should also consider that direct database connections to a central server might not always be allowed, for security reasons.
Creating a simple REST service with WCF (using WebServiceHost) and simply POSTing or PUTting your inventory data (using HttpClient) might provide a good alternative.
As a result, clients can stay very simple and can easily be written for other platforms (Linux? Android?), and the server has full control over how and where the data is stored.
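A minimal sketch of such a service (the contract, type names, and endpoint URI here are invented for the example, not taken from the question):

using System;
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

[DataContract]
public class InventoryReport // hypothetical payload type
{
    [DataMember] public string MachineName { get; set; }
    [DataMember] public string Details { get; set; }
}

[ServiceContract]
public interface IInventoryService
{
    // Clients POST a JSON body to http://host:8080/inventory
    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "inventory",
               RequestFormat = WebMessageFormat.Json,
               ResponseFormat = WebMessageFormat.Json)]
    void SubmitInventory(InventoryReport report);
}

public class InventoryService : IInventoryService
{
    public void SubmitInventory(InventoryReport report)
    {
        // The server decides how and where the data gets persisted.
    }
}

class Program
{
    static void Main()
    {
        // WebServiceHost adds the webHttp (REST) behavior automatically.
        var host = new WebServiceHost(typeof(InventoryService),
                                      new Uri("http://localhost:8080/"));
        host.Open();
        Console.WriteLine("Listening... press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}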

It depends ;)
NHibernate has optimistic concurrency control out of the box, which is good enough for many situations. So if you just create data on 50 different stations, there should be no problem. If creating data on one station depends on data from all stations, it gets tricky, and a central server would help.
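For illustration, a minimal Fluent NHibernate mapping sketch combining the GUID-comb ID generation from the question with a version column for the optimistic concurrency mentioned above; the Asset entity and its properties are hypothetical:

using System;
using FluentNHibernate.Mapping;

public class Asset // hypothetical inventory entity
{
    public virtual Guid Id { get; protected set; }
    public virtual string MachineName { get; set; }
    public virtual int Version { get; set; }
}

public class AssetMap : ClassMap<Asset>
{
    public AssetMap()
    {
        // Comb GUIDs are generated client-side (no DB round-trip on insert)
        // and are mostly sequential, which avoids index fragmentation.
        Id(x => x.Id).GeneratedBy.GuidComb();
        Map(x => x.MachineName);
        // NHibernate compares this column on UPDATE and throws
        // StaleObjectStateException if another session changed the row first.
        Version(x => x.Version);
    }
}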

Related

NestJS Design Problem: How can I avoid creating a Node.js instance for each team?

I made a CRM app using NestJS with Node.js. I designed it so that each team has its own database, because every team's data is different and has no relation to other teams' data, and it also makes the backup process much easier.
However, now that I want to deploy my service, I've noticed that I must create a separate Node.js instance for each team, which makes RAM usage very high. For just 10 teams I may need around 500 MB of RAM, which will hurt me economically even in the short run.
Solutions
I used TypeORM in NestJS, so my first thought was to find a way to have multiple databases (not multiple connections) sharing the same schema, but to dynamically use one of them based on the request's scope and details. That seems like the best solution, since I could avoid creating another Node.js instance while still having a separate database for each team.
I read the NestJS and TypeORM documentation but didn't find any way to accomplish that. So my other solution was to just use one database for everyone and add something like a team_id column to each table to filter the data for each team.
Is it a good way?
Is there any other solution that uses one NestJS instance but the same schema across multiple databases?
I recommend using one database.
The database can have a table storing all of the teams, and the other tables will get a new team_id column, as you suggested.
One database per team has several disadvantages:
Multiple DB Connections
Since you need to use the same entities for all of the teams' databases, you cannot use a single database connection. For every incoming API request, the server will have to switch DB connections.
DB Configuration in TypeORM
For multiple databases, the configuration will look something like this:
imports: [
  ...,
  // one TypeOrmModule.forRoot() per team database, each with a unique name
  TypeOrmModule.forRoot({
    name: 'team1',
    type: 'postgres',
    host: '...',
    port: 5432,
    username: '...',
    password: '...',
    ...
  }),
  TypeOrmModule.forRoot({
    name: 'team2',
    type: 'postgres',
    host: '...',
    port: 5432,
    username: '...',
    password: '...',
  }),
  ...
]
If you need to add a new team, you have to update your code base to add a new DB for that team and redeploy your application. (Maybe you will also create the new database and run migrations?)
Backup
I agree with you that it's easier to back up a single team with multiple databases. But what about when you want to back up all teams? In most cases, I believe you will need to back up all teams, not just a specific one.
Teams Management
Where do you save a team's information? How do you know which team has which DB?
Maybe you save the teams somewhere (in a separate DB?). To know which database connection should be used for each request, you would need to make an extra query.
Cost
If there are 100 teams, are you going to create 100 databases? Also, each application has development and production environments, and in some cases more, like staging. Two environments will double the number of DBs.
Conclusion
Of course there will be ways to automate some of the items in the list above, and it's still possible to use multiple databases in NestJS + TypeORM for your project, but it doesn't look like a good approach or a worthwhile effort for your project.
I have seen some big multi-tenant applications (like Grafana), and they weren't using a multiple-databases strategy.
I don't know how you are storing users, but since you are speaking about teams, I suppose you have a place where users are stored and assigned to a team; could it be a table in a common login database?
A solution could be to bind each team to its own database: once a user logs in (using data from the common login database), you read which team they belong to and which database holds its data; then you can access the CRM data from the database bound to the user's team.

Is storing data on the Node.js server reliable?

I am learning how to use Socket.IO and Node.js. In this answer they explain how to store users who are online in an array in Node.js. This is done without storing them in the database. How reliable is this?
Is data stored on the server reliable? Does the data always stay the way it is intended?
Is it advisable to even store data on the server? I am thinking of a scenario where there are millions of users.
Is there always one instance of the server running, even when the app is served from different locations? If not, will storing data on the server introduce inconsistencies between the different server instances?
Congrats on your learning so far! I hope you're having fun with it.
Is data stored on the server reliable? Does the data always stay the way it is intended?
No, storing data on the server is generally not reliable enough unless you manage your server in its entirety. With managed services, storing data on the server should never be done, because it could easily be wiped by the party managing your server.
Is it advisable to even store data on the server? I am thinking of a scenario where there are millions of users.
It is not advisable at all; you need a DB of some sort.
Is there always one instance of the server running, even when the app is served from different locations? If not, will storing data on the server introduce inconsistencies between the different server instances?
Typically, the server is always running and has some basic information about its configuration stored locally; when scaling, hosted services can increase processing capacity automatically and handle load balancing in the background. Whenever the server retrieves data for you, it requests it from the database and loads it into RAM (memory). In the user example, you would store the user data in a table or document (relational vs. document-oriented databases) and then load it into memory to manipulate it.
Additionally, to learn more about your 'data inconsistency' concern, look up concurrency as it pertains to databases, and data race conditions.
Hope that helps!

Decision path for Azure Service Fabric Programming Models

Background
We are looking at porting a 'monolithic' 3-tier web app to a microservices architecture. The web app displays listings to consumers (think Craigslist).
The backend consists of a REST API that calls into a SQL DB and returns JSON for a SPA to build a UI (there's also a mobile app). Data is written to the SQL DB via background services (FTP + worker roles). There are also some pages that allow writes by the user.
Information required:
I'm trying to figure out how (if at all) Azure Service Fabric would be a good fit for a microservices architecture in my scenario. I know the pros/cons of microservices vs. a monolith, but I'm trying to figure out how the various microservice programming models apply to our current architecture.
Questions
Is Azure Service Fabric a good fit for this? If not, any other recommendations? Currently I'm leaning towards a bunch of OWIN-based .NET web sites, split up by area/service, each hosted on its own machine and tied together by an API gateway.
Which Service Fabric programming model would I go for? Stateless services with their own backing DB? I can't see how the Stateful or Actor model would help here.
If I went with Stateful services/Actors, how would I go about updating data as part of a maintenance or ad-hoc admin request? Traditionally we would simply log in to the DB and update the data, and the API would return the new data; but if it's persisted in memory and across nodes in a cluster, how would we update it? Would I have to expose all of this via methods on the service? Similarly, how would I import my existing SQL data into a stateful service?
For Stateful services/the Actor model, how can I 'see' the data visually, with an object explorer/UI? Our data is our gold, and I'm concerned about the lack of control over and visibility into it in the Reliable Services model.
Basically, is there some documentation on the decision path towards which programming model to go for? I could model a 'listing' as an Actor and have millions of those, sure; but I could also have a Stateful service that stores listings locally, or a Stateless service that fetches them from the DB. How does one decide which is the best approach for a given use case?
Thanks.
What is it about your current setup that isn't meeting your requirements? What do you hope to gain from a more complex architecture?
Microservices aren't a magic bullet. You mainly get four benefits:
You can scale and distribute pieces of your overall system independently. Service Fabric has very sophisticated tools and advanced capabilities for this.
You can deploy and upgrade pieces of your overall system independently. Service Fabric again has advanced capabilities for this.
You can have a polyglot system - each service can be written in a different language/platform.
You can use conflicting dependencies - each service can have its own set of dependencies, like different framework versions.
All of this comes at a cost and introduces complexity and new ways your system can fail. For example, your fast, compile-time-checked in-proc method calls now become slow (by comparison to an in-proc function call), failure-prone network calls. This is not specific to Service Fabric, by the way; it's just what happens when you go from in-proc method calls to cross-machine I/O, no matter what platform you use. The decision path here is a pro/con list specific to your application and your requirements.
To answer your Service Fabric questions specifically:
Which programming model do you go for? Start with stateless services and ASP.NET Core. That's going to be the simplest translation of your current architecture, and it doesn't require mucking around with your data layer.
Stateful services have a lot of great uses, but they're not necessarily a replacement for your RDBMS. A good place to start is hot data that can be stored in simple key-value pairs, is accessed frequently and needs to be low-latency (you get local reads!), and doesn't need to be data-mined. Some examples include user session state, cache data, and a "snapshot" of the most recent items in a data stream (like the most recent stock quote in a stream of stock quotes).
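As a hedged sketch of that hot key-value idea (the service name, dictionary name, key, and value types here are invented for the example), a stateful service might keep the latest stock quote per symbol in a Reliable Dictionary like this:

using System;
using System.Fabric;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.ServiceFabric.Data.Collections;
using Microsoft.ServiceFabric.Services.Runtime;

internal sealed class QuoteCache : StatefulService
{
    public QuoteCache(StatefulServiceContext context) : base(context) { }

    protected override async Task RunAsync(CancellationToken cancellationToken)
    {
        // Reliable Dictionary: replicated, transactional, and read locally.
        var quotes = await StateManager
            .GetOrAddAsync<IReliableDictionary<string, decimal>>("quotes");

        using (var tx = StateManager.CreateTransaction())
        {
            // Keep only the most recent quote per symbol.
            await quotes.AddOrUpdateAsync(tx, "MSFT", 52.10m, (symbol, old) => 52.10m);
            await tx.CommitAsync();
        }
    }
}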
Currently the only way to see or query your data is programmatically, directly against the Reliable Collections APIs. There is no viewer or "management studio" tool; you have to write (and secure) an API in each service that can display and query the data.
Finally, the Actor model is a very niche model. It serves specific purposes, but if you just treat it as a data store it will not work for you. In your example, a listing-per-actor design probably wouldn't work, because you couldn't query across the listings or even have multiple users reading the same listing simultaneously.

Getting data from Sharepoint to another server

I am currently working on a mobile concept.
We are running a SharePoint 2010 intranet solution, which is ONLY accessible within the company.
We want to make a mobile solution (for people outside) with data from the SharePoint server.
I would like to have the data moved, e.g. every 10-15 minutes through a cron job, to an external database which the mobile solution can access.
What is the easiest way to move the data? Using the web services, or are there other ways to do this?
Thank you in advance,
Jens
A possible solution is to code a timer job, a cron-like job scheduled by SharePoint that you can set to run every night, with some SharePoint object model code that extracts all the data and sends it to the other server. You can do this using ADO.NET or any equivalent technology (ORMs, etc.). This method pushes the data to the server.
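A rough sketch of such a timer job (SharePoint 2010 server object model; the site URL, list name, connection string, and target table are all placeholders/assumptions):

using System;
using System.Data.SqlClient;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class ExportJob : SPJobDefinition
{
    public ExportJob() : base() { }

    public ExportJob(string jobName, SPWebApplication webApp)
        : base(jobName, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        using (SPSite site = new SPSite("http://intranet"))   // placeholder URL
        using (SqlConnection conn = new SqlConnection("..."))  // external DB
        {
            conn.Open();
            SPList list = site.RootWeb.Lists["Listings"];      // hypothetical list
            foreach (SPListItem item in list.Items)
            {
                // Push each item to the external database.
                using (var cmd = new SqlCommand(
                    "INSERT INTO Listings (Id, Title) VALUES (@id, @title)", conn))
                {
                    cmd.Parameters.AddWithValue("@id", item.ID);
                    cmd.Parameters.AddWithValue("@title", item.Title);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}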
If you have limitations on connectivity, such as firewalls that only allow HTTP traffic, then you will definitely need to use either the web services or the client object model; this method pulls the data from the server.
The client object model is preferred over the web services because, among other features, it batches requests to make them more efficient, and its API is better for manipulating data.
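For example, a minimal client object model sketch (SharePoint 2010 managed CSOM; the site URL and list name are assumptions): everything queued with Load() goes over the wire in a single batch when ExecuteQuery() is called.

using System;
using Microsoft.SharePoint.Client;

class Export
{
    static void Main()
    {
        var ctx = new ClientContext("http://intranet");    // placeholder URL
        List list = ctx.Web.Lists.GetByTitle("Listings");  // hypothetical list
        ListItemCollection items = list.GetItems(CamlQuery.CreateAllItemsQuery());

        ctx.Load(items);    // queued locally, nothing sent yet
        ctx.ExecuteQuery(); // one round-trip executes the whole batch

        foreach (ListItem item in items)
            Console.WriteLine(item["Title"]);
    }
}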
Another option is to use SSIS to do the job, as described in this article:
http://msdn.microsoft.com/en-us/library/hh368261.aspx

Core Data - serve data to website?

I am designing a server that accepts network clients from native apps and can transact with them, resulting in data held on the server. I'm strongly considering using Core Data for this data store.
I also want a website to exist that could give users read-only access to information.
How can I achieve this sharing of data between separate processes (or even servers, potentially) using Core Data? Also, how can I actually pull info from a Core Data store to display on a website?
Core Data is not a database engine. It is an API for constructing the model layer of a Model-View-Controller (MVC) app. As such, it has no mechanisms for concurrency or other multiuser database features. You could certainly create a server with Core Data, but it would be a small, dedicated server that would support only a handful of clients.
The best design would be to use Core Data in the client apps but to serve the data using a dedicated server platform. You can send the information back and forth however you like, e.g. as JSON.
