Should I store data that I fetch from a third-party database? - node.js

I have an API that fetches airplane schedules from third-party databases. When the frontend shows the data that the API fetched, should the application take the data from a local database or fetch it from the third party each time?

I am considering this data as dynamic in nature and pretty critical too (airplane schedules).
I am also assuming that you are aggregating this data from a number of providers and that you have transformed it into a generic structure (a common format across all providers).
In my opinion, you should save it into a local database, with a timestamp indicating when the data was last refreshed.
Ideally you should display the last-refreshed time against each provider (or airline) on your site. You could also run a scheduler to refresh the data at regular intervals.
It would be nice to show that the next refresh is in "nn" minutes (with a countdown).
If you can afford to, you can let the user trigger a refresh themselves, but that is risky if you have a large number of concurrent users.

This is only my opinion.
If the API data/records are not subject to frequent change, then saving them to a local database is a good idea.
Users will fetch the data from the local database, and to keep that database up to date you can create another program (running on the server) that periodically fetches fresh data from the third-party API, along the lines of the sketch below. This way only a limited number of connections go out to the third-party API.
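
A minimal sketch of such a refresh worker, assuming Node 18+ (for the global fetch) and using placeholder provider URLs and a placeholder saveSchedules() persistence helper:

// refresh-worker.ts — periodically pulls schedules from each provider,
// normalizes them into a common format, and stores them with a timestamp.
// PROVIDERS and saveSchedules() are placeholders for your own setup.

interface FlightSchedule {
  provider: string;
  flightNumber: string;
  departure: string;   // ISO 8601
  arrival: string;     // ISO 8601
  refreshedAt: string; // when this record was last pulled
}

const PROVIDERS = [
  { name: "providerA", url: "https://api.provider-a.example/schedules" },
  { name: "providerB", url: "https://api.provider-b.example/schedules" },
];

// Replace with a real upsert into MongoDB/Postgres/etc.
async function saveSchedules(schedules: FlightSchedule[]): Promise<void> {
  console.log(`Saving ${schedules.length} schedules`);
}

async function refreshProvider(name: string, url: string): Promise<void> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`${name} responded with ${res.status}`);
  const raw = (await res.json()) as any[];

  // Map each provider's payload into the common format.
  const now = new Date().toISOString();
  const schedules: FlightSchedule[] = raw.map((r) => ({
    provider: name,
    flightNumber: r.flightNumber,
    departure: r.departure,
    arrival: r.arrival,
    refreshedAt: now,
  }));

  await saveSchedules(schedules);
}

// Refresh every 10 minutes; a failure for one provider does not block the others.
setInterval(() => {
  for (const p of PROVIDERS) {
    refreshProvider(p.name, p.url).catch((err) =>
      console.error(`Refresh failed for ${p.name}:`, err)
    );
  }
}, 10 * 60 * 1000);

The refreshedAt value is what the frontend would show as the "last refreshed" indicator, and the fixed interval gives you the countdown to the next refresh.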


How can I store the state of Node.js REST API?

I am building an API which will send data to another API once it has collected 10 hashes. The client sends 1 hash per hour.
For example:
The client POSTs a hash to the API.
The API needs to store it somewhere until the number of hashes reaches 10.
When the number of hashes reaches 10, the API needs to send the data to the other API and start from 0 again.
My question relates to the 2nd point. I could store the hashes in an array, but the problem is that the data will be lost if the server shuts down unexpectedly.
This is the only data I need to store in the API, so I don't want to use a database.
By the way, this is my first time developing an API, so I would be glad for your help.
Thanks in advance.
Sorry, but your only options for storing data are memory or disk.
If you store data in variables, you're using memory. It is fast and instant, but it is not durable, as you already said.
If you store data in a database, you're using disk storage. It is slower, but it is durable.
If you need durability, then a database (or some other form of disk persistence, as sketched below) is your only option. Alternatively, if you don't want to store the data on your own machine, you could use a cloud database such as Firebase.
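
As a minimal illustration of the disk option without a full database, the pending hashes could simply be written to a JSON file so they survive a restart (a sketch; the hashes.json filename is arbitrary):

// hash-store.ts — tiny disk-backed store for the pending hashes.
// Not a replacement for a real database, but it survives a process restart.
import { promises as fs } from "node:fs";

const FILE = "hashes.json";

export async function loadHashes(): Promise<string[]> {
  try {
    return JSON.parse(await fs.readFile(FILE, "utf8"));
  } catch {
    return []; // first run, or the file is missing/unreadable
  }
}

export async function addHash(hash: string): Promise<string[]> {
  const hashes = await loadHashes();
  hashes.push(hash);
  await fs.writeFile(FILE, JSON.stringify(hashes));
  return hashes; // caller can check whether the batch of 10 is complete
}

export async function clearHashes(): Promise<void> {
  await fs.writeFile(FILE, "[]");
}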
Maybe your problem can be solved with Redis.
I had a feature where I needed some of a user's information on the server side at runtime, and it could not be persisted in the database.
So I used Redis for it.
In simple terms, Redis keeps the information in its own in-memory store, and you can retrieve it whenever you need it.
Because the data lives outside your Node process, it is more robust than a hand-made in-memory cache.
I hope this helps you.
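
A rough sketch of the 10-hash batching on top of Redis, assuming the ioredis client, Node 18+ (for the global fetch), and a placeholder downstream URL:

// Accumulate hashes in a Redis list and flush the batch once it reaches 10.
// "pending:hashes" and the downstream URL are illustrative names only.
import Redis from "ioredis";

const redis = new Redis(); // defaults to localhost:6379
const KEY = "pending:hashes";
const BATCH_SIZE = 10;
const DOWNSTREAM_URL = "https://other-api.example/batches"; // placeholder

export async function handleIncomingHash(hash: string): Promise<void> {
  // RPUSH returns the new length of the list, so we know when we reach 10.
  const count = await redis.rpush(KEY, hash);

  if (count >= BATCH_SIZE) {
    const hashes = await redis.lrange(KEY, 0, -1);

    // Forward the batch, then reset the list. (For stricter guarantees you would
    // delete the key only after the downstream call succeeds, or use MULTI/EXEC.)
    await fetch(DOWNSTREAM_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ hashes }),
    });
    await redis.del(KEY);
  }
}

Because the list lives in Redis rather than in a process variable, the pending hashes are still there after the Node server restarts.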

Use localStorage instead of the database to avoid requests to the server

I am creating an application in which the user can post information as well as mark someone else's post as a favorite. When the user performs either of these actions I store the necessary information in the database, specifically in the document that holds the information linked to the user (name, surname, telephone number, etc.).
So when the user logs in to the page, I get all of that information with a single query and keep it in localStorage, which reduces the queries against the database. Then, in a different section, the user can see the posts they have created as well as the ones they have marked as favorites, very similar to what we commonly see in an online store.
I'm using Angular 6, Node.js and MongoDB. My question is the following:
Is this a correct and effective way to do it?
Or should I save it in the database and then perform the corresponding query each time to obtain it?
[Screenshot of the localStorage contents]
As you can see, I also save the token that I use to authenticate the user's queries, and obviously I do not store the password. I would like your opinions.
You should never consider localStorage an alternative to the database.
At some point you might have a huge amount of data, and the browser would struggle to load it all.
Bring the data you require from the server when you need it.
For small, temporary amounts of data (such as the auth token) you can use localStorage, as in the sketch below. Don't bring all the data in a single query just to save database operations; databases are built to handle that for you.
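
A minimal sketch of that split, keeping only the token and a small profile in localStorage and fetching posts/favorites from the server on demand (the /api/... paths are placeholders for your own Node.js routes):

// session.ts — keep only small, short-lived data in localStorage;
// larger collections are requested from the server when a view needs them.

interface Profile { name: string; surname: string; phone: string; }

export function saveSession(token: string, profile: Profile): void {
  localStorage.setItem("token", token);
  localStorage.setItem("profile", JSON.stringify(profile));
}

function authHeaders(): HeadersInit {
  return { Authorization: `Bearer ${localStorage.getItem("token") ?? ""}` };
}

// Posts and favorites are NOT cached in localStorage: they can grow without
// bound, so each view asks the server for exactly the slice it needs.
export async function getMyPosts(): Promise<unknown[]> {
  const res = await fetch("/api/users/me/posts", { headers: authHeaders() });
  if (!res.ok) throw new Error(`Failed to load posts: ${res.status}`);
  return res.json();
}

export async function getMyFavorites(): Promise<unknown[]> {
  const res = await fetch("/api/users/me/favorites", { headers: authHeaders() });
  if (!res.ok) throw new Error(`Failed to load favorites: ${res.status}`);
  return res.json();
}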

How can I clear my local database using Azure Mobile Services?

I'm using Azure Mobile Services and I want to clear the local database. How can I do that?
I have a problem with my local database: when I log out of the app and log in with another user, the data of the previous user is loaded for the current user, and I have no idea why this happens. I debugged the server side and the server returns the correct data, so I believe the problem is in the local database.
I'm using Azure Mobile Services and I want to clear the local database. How can I do that?
For deleting your SQLite file, you could follow Deleting the backing store. You could also leverage the capability provided by IMobileServiceSyncTable to purge records from your offline cache; for details, see Purging Records from the Offline Cache.
When I log out of the app and log in with another user, the data of the previous user is loaded for the current user, and I have no idea why this happens. I debugged the server side and the server returns the correct data
Since you did not provide details about your implementation (e.g. user log in/log out, user data management, etc.), I would recommend you check whether both your server side and client side enable a per-user data store. You could use Fiddler to capture the network traces when the other user logs in, make sure that the correct user identifier (e.g. UserId) is returned, and then check the query against your local database. Moreover, I would recommend you follow Adrian Hall's book on Data Projection and Queries.
You can delete all of the local DB files by doing the following.
// Requires System.IO and Microsoft.WindowsAzure.MobileServices
var dbFiles = Directory.GetFiles(MobileServiceClient.DefaultDatabasePath, "*.db");
foreach (var db in dbFiles)
{
    File.Delete(db); // remove every local SQLite store that was created
}
However, this deletes all local data each time and can cause performance issues, because after every wipe you get a fresh copy of the data from your Azure DB rather than using the cached copy in the device's SQLite DB.
We typically only use this for debugging, and the reason it's in a foreach is to capture all the databases that were created (refer to my last suggestion).
There are a few other things you could try to get around your core issue of data cross-over.
There's another reason you might be seeing this behaviour: with your PullAsync, are you passing it a query ID? Your PullAsync call should look similar to this.
public async Task GetAllFoo(string userId)
{
    // The query ID ("allFoo" + userId) keeps incremental-sync state separate per user.
    await fooTable.PullAsync("allFoo" + userId, fooTable.Where(f => f.userId == userId));
}
Note that the query ID is unique for each user. It is used primarily by the offline-sync portion of Azure, but in combination with the Where statement (be sure to import System.Linq), it should ensure that only the correct data is brought back.
You can find more information about this here.
Also, something you may want to consider: store a separate database for each userId. We're doing this for our app (with a company ID) so that each database is separate. If you do this and open the correct database on login, there's no chance of any data cross-over.

Multiple PouchDBs vs a single PouchDB

I created a CouchDB setup with multiple databases for use in my Ionic 3 app. When integrating it with PouchDB for client-side syncing I created a separate PouchDB for each of the databases, five PouchDBs in total. My questions:
Is it a good idea to store multiple PouchDBs on the client side, given the number of HTTP connections that would be created by syncing them? Or should I put all the CouchDB databases into one database and use a type field to separate the docs, so that only one PouchDB needs to be created and synced on the client?
Also, with the pouchdb-authentication plugin, authentication data is valid only for the database on which the signup/login methods were called; accessing the other databases returns unauthenticated.
I would say that if your PouchDBs are syncing in real time, it should be less expensive to reduce them to one and distinguish records by type.
Still, it is not that costly, and quite convenient, to set up multiple changes feeds, one per store (e.g. TodoStore, CommentStore, etc.), each with a filter function that passes only docs of the matching type into the store they belong to. The same can be achieved by filtering on the basis of design docs (I'm not sure whether that saves anything, at least in the browser).
A single changes feed distributing docs to the stores would probably be the cheapest solution, but I believe the filter function can't be changed after the changes feed is established, so it must know about all the stores (i.e. doc types) beforehand.
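
A rough sketch of the single-database variant: one local PouchDB syncing with one remote CouchDB, and a single changes feed dispatching docs to the right store by their type field (database names and doc types are illustrative):

// One PouchDB/CouchDB pair instead of five; docs carry a "type" field.
import PouchDB from "pouchdb";

const local = new PouchDB("app");
const remote = new PouchDB("https://couch.example.com/app");

// Continuous two-way sync over a single pair of HTTP connections.
local.sync(remote, { live: true, retry: true });

// One changes feed; each doc is routed to its store based on doc.type.
const stores: Record<string, (doc: any) => void> = {
  todo: (doc) => console.log("TodoStore received", doc._id),
  comment: (doc) => console.log("CommentStore received", doc._id),
};

local
  .changes({ since: "now", live: true, include_docs: true })
  .on("change", (change) => {
    const doc = change.doc as any;
    const handler = doc && stores[doc.type];
    if (handler) handler(doc);
  })
  .on("error", (err) => console.error("changes feed error", err));

Note that with pouchdb-authentication this also sidesteps the per-database login issue from the question, since there is only one remote database to authenticate against.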

Azure Change Feed Support and multiple local clients

We have a scenario where multiple clients would like to get updates from Document Db inserts, but they are not online all the time.
Example: Suppose there are three clients registered with the system, but only one is online at the moment. When the online client inserts/updates a document, we want each offline client, when it wakes up, to look at the change feed and update itself independently.
Is there a way for each client to maintain its own position in the feed for the same partition (from when it was last synced) and get the changes when it comes online, based on that last sync?
When using the change feed, you use a continuation token per partition. Change feed continuation tokens do not expire, so you can continue from any point. Each client can keep its own continuation token and read changes as needed when it wakes up; this essentially means that each client can keep its own feed position for each partition.
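
A rough sketch of the bookkeeping this implies; fetchChanges() below is a hypothetical stand-in for the SDK's change feed call (the exact method names differ between the old DocumentDB SDKs and @azure/cosmos), and the token store is a simple in-memory map for illustration:

// Each client persists its own continuation token per partition, so it can
// resume reading the change feed from where it left off after being offline.
// fetchChanges() is a hypothetical wrapper around the Cosmos DB change feed API.

interface ChangeFeedPage {
  documents: unknown[];
  continuation: string | null; // token to resume from on the next call
}

declare function fetchChanges(
  partitionKey: string,
  continuation: string | null
): Promise<ChangeFeedPage>;

// Per-client token store; in practice a local file or small table per client.
const tokens = new Map<string, string | null>(); // key: `${clientId}:${partitionKey}`

export async function catchUp(clientId: string, partitionKey: string): Promise<unknown[]> {
  const key = `${clientId}:${partitionKey}`;
  let continuation = tokens.get(key) ?? null;
  const received: unknown[] = [];

  // Page through everything written since this client's last sync.
  // Continuation tokens do not expire, so it does not matter how long it was offline.
  let page: ChangeFeedPage;
  do {
    page = await fetchChanges(partitionKey, continuation);
    received.push(...page.documents);
    if (page.continuation !== null) {
      continuation = page.continuation;
      tokens.set(key, continuation); // persist progress after each page
    }
  } while (page.documents.length > 0);

  return received;
}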
