Consume an external API into Kentico

What is the best way to consume an external API's data?
Do I need to create a new Web API project and set up routing?
In the past I used a web service data source and attached a repeater. That won't work here because I have an API instead of a web service.
Thanks much

You can try this: it's how I've converted my JSON/XML APIs (or anything really) into a Transformable object. Just clone this tool and adjust it to your needs:
https://devnet.kentico.com/marketplace/utilities/universal-api-viewer-(with-hierarchy-support)
A custom Data Source is still what you would want, since all a data source really does is return a DataTable. My tool there takes it a step further by assigning a hierarchy structure and pseudo page types so the Repeater can treat the items like items on the content tree.
If you have access to the external database itself, you can also use Kentico's ConnectionHelper class to pass in the external database connection string and run queries against it:
// The argument is the name of a connection string defined in web.config
GeneralConnection connection = ConnectionHelper.GetConnection("GetConnectionStringFromWeb.ConfigHere");
connection.Open();
// Run a plain SQL query against the external database and read the results into a DataSet
DataSet results = connection.ExecuteQuery(new QueryParameters("SELECT * FROM SomeTable", null, QueryTypeEnum.SQLQuery));

Related

CosmosDB return data from external API on read

I am attempting to write an Azure Cosmos DB integration (Core SQL API) that integrates with an external service to provide some of the query data. As an example, I need a query made on Cosmos DB to convert some of the data returned by the query (e.g. IDs) into real data by calling an external service via a REST API. This should only happen when querying certain columns.
I initially investigated using a JS stored procedure and/or a UDF to make this external call, but the JS environment seems to be extremely limited and doesn't provide any way to make external calls. I then tried this repository, https://github.com/Oblarg/cosmosdb-storedprocs-ts, which uses webpack to bundle all of node.js into the stored procedure, allowing node modules to be used in stored procedures. While this does allow some node modules to be used, whenever I try to use the "https", "fetch", or "axios" modules to make an HTTP GET request I get errors (the same code works fine in a normal node environment, but I'm not a JS expert and can't seem to work past these errors). After a day of attempts it seems like the stored procedure approach is not possible.
Is this the case or is there some way of making HTTP GET requests from a JS stored procedure? If not possible with stored procedures, are there any other techniques to achieve the requirement of reading data from a remote API when querying cosmos DB?
Thanks
There is no way to achieve this from Cosmos DB directly. For queries you also cannot use the change feed, since the documents don't change, so really your only option is to use a function or some preprocessor app to handle it; as you say, it's not ideal, but there is no other solution here. If it were an insert or an update then the change feed would let you do this, but for plain queries it's not possible.
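If you do go the function/preprocessor route, here is a minimal sketch (TypeScript, untested, with placeholder database, container, and endpoint names) of what such a pre-processing layer could look like: it runs the query through the Cosmos SDK and then resolves the returned IDs against the external REST API before handing the results back to the caller.

import { CosmosClient } from "@azure/cosmos";

// Placeholder connection details; substitute your own account, database and container.
const client = new CosmosClient(process.env.COSMOS_CONNECTION_STRING ?? "");
const container = client.database("mydb").container("items");

export async function queryWithEnrichment(): Promise<any[]> {
  // Run the normal Cosmos DB query first.
  const { resources } = await container.items
    .query("SELECT c.id, c.externalId FROM c")
    .fetchAll();

  // Then resolve each ID into real data by calling the external service
  // (hypothetical endpoint; uses the global fetch available in Node 18+).
  return Promise.all(
    resources.map(async (doc) => {
      const response = await fetch(`https://external.example.com/api/items/${doc.externalId}`);
      return { ...doc, externalData: await response.json() };
    })
  );
}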

Liferay Search Container

I have used the Liferay search container for displaying data for a custom entity and it is working.
I have another portlet where the data comes from a REST API, so is there any way that I can use the search container, or do I need to use a datatable for that?
My REST API is available both with and without pagination.
You can use SearchContainer with any data. Its use with the DB is more natural of course, but where the data comes from is irrelevant as long as the SearchContainer can access it. So you can build your own service that talks to the remote API and provides data to the SearchContainer. In case you haven't done that for your other service, see here for an example of how it can be built server side.

Fetching Initial Data from CloudKit

Here is a common scenario: the app is installed for the first time and needs some initial data. You could bundle it in the app and load it from a plist or a CSV file, or you could go get it from a remote store.
I want to get it from CloudKit. Yes, I know that CloudKit is not to be treated as a remote database but rather a hub. I am fine with that. Frankly I think this use case is one of the only holes in that strategy.
Imagine I have an object graph I need to get that has one class at the base and then 3 or 4 related classes. I want the new user to install the app and then get the latest version of this object graph. If I use CloudKit, I have to load each entity with a separate fetch and assemble the whole thing. It's ugly and not generic. Once I do that, I will go into change-tracking mode, listening for updates and syncing my local copy.
In some ways this is similar to the challenge that you have using Services on Android: suppose I have a service for the weather forecast. When I subscribe to it, I will not get the weather until tomorrow when it creates its next new forecast. To handle the deficiency of this, the Android Services SDK allows me to make 'sticky' services where I can get the last message that service produced upon subscribing.
I am thinking of doing something similar in a generic way: making it possible to hold a snapshot of some object graph, probably in JSON, with a version token, and then for initial loads, just being able to fetch those and turn them into CoreData object graphs locally.
Question is does this strategy make sense or should I hold my nose and write pyramid of doom code with nested queries? (Don't suggest using CoreData syncing as that has been deprecated.)
Your question is a bit old, so you probably already moved on from this, but I figured I'd suggest an option.
You could create a record type called Data in the Public database in your CloudKit container. Within Data, you could have a field named structure that is a String (or a CKAsset if you wanted to attach a JSON file).
Then on every app load, you query the public database and pull down the structure string that holds your class definitions and use it however you like. Since it's in the public database, all your users would have access to it. Good luck!

Sails.js - how to structure JSON-based live data output with existing static data in the model

In my Angular app, I want to display a table which contains the following
a) URL
b) Social share counts divided by different social networks
Using Sails.js, I already have the API created for the URLs; when the results show up I can display the URL. Now I'm confused about how to get the appropriate social counts showing right beside it.
Here's the API I'm using: https://docs.sharedcount.com/
By itself, I can see the JSON it produces.
But here are my questions:
Should I create a new API (model/controller) for the social count data, or include it in the model where I have the 'url' action defined?
If I create a new API, or include the social_counts as an action in the current one, what would my JSON query look like? To retrieve the URLs, I'm using the default blueprint API that Sails provides, so:
http://www.example.com/url/find?where={"title":{"contains":"mark"}}
I'm struggling a bit in terms of the thought process; it would be great to get input on this.
It depends on your app: is it going to store that data or just consume it? If it needs to store it, of course you need the API, for example for modifying or aggregating the data.
No, you can't do that. That shortcut method only works if you have the data in your database and let the Sails Waterline ORM and Blueprint API serve it.
Perhaps, if you only need to consume the data from the SharedCount API, you don't need to use Sails as a backend in this context; just use Angular as a client of that API. The exception is if you need to modify the data first and store it in your own database, in which case Sails will help with its Waterline ORM and Blueprint API.
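To make the client-side option concrete, here is a rough, untested TypeScript sketch. The SharedCount endpoint, query parameters, and response shape are only illustrative (check the docs linked above for the real ones), and the Url record fields are assumed; it pulls the URLs through the Sails blueprint route from the question and then looks up the counts for each one.

// Assumed shape of the records returned by the Sails blueprint API.
interface UrlRecord { id: number; title: string; url: string; }

const SHAREDCOUNT_KEY = "YOUR_API_KEY"; // placeholder

async function loadUrlsWithCounts(): Promise<Array<UrlRecord & { counts: unknown }>> {
  // Blueprint "find" route, exactly as used in the question.
  const urls: UrlRecord[] = await fetch(
    'http://www.example.com/url/find?where={"title":{"contains":"mark"}}'
  ).then((r) => r.json());

  // Look up share counts for each URL (endpoint and params illustrative).
  return Promise.all(
    urls.map(async (record) => {
      const counts = await fetch(
        `https://api.sharedcount.com/?url=${encodeURIComponent(record.url)}&apikey=${SHAREDCOUNT_KEY}`
      ).then((r) => r.json());
      return { ...record, counts };
    })
  );
}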

Restify: Passing User Data

I am writing a service using NodeJS + Restify. I have split each actual service into a separate file (which, I assume, is what everyone does). They are all going to use a MySQL database, so I thought I could open a single connection to the database that could be shared by each service, rather than opening a connection every time a request is made.
The problem is that I can't seem to find a way to pass user data. By user data I mean any custom data that would be accessible to every service callback invoked by the server.
I primarily use NodeJS + Express, but having looked through some of the Restify documentation, I believe you could use the authorization parser (listed under Bundled Plugins on their site).
I think that would be the most basic way to pass user data.
I haven't tested it, but I believe you'd just add this to use it:
// Parse the Authorization header into req.authorization
server.use(restify.authorizationParser());
You could then access the user data with:
// This is based on the structure of req.authorization in the documentation.
req.authorization.basic.user
I believe you could set new user data (when the user logs in or something) like this:
req.authorization.id = 'id';
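Putting it together, here is a small, untested TypeScript sketch of that idea. Note that newer Restify versions expose the bundled plugins under restify.plugins, whereas the snippet above uses the older restify.authorizationParser() form; the route name is just an example.

import * as restify from "restify";

const server = restify.createServer();

// Parse the Authorization header into req.authorization
// (restify.authorizationParser() on older versions, as shown above).
server.use(restify.plugins.authorizationParser());

server.get("/whoami", (req: restify.Request, res: restify.Response, next: restify.Next) => {
  // Inspect the parsed structure; for Basic auth it contains the supplied credentials.
  res.send({ auth: (req as any).authorization || null });
  return next();
});

server.listen(8080, () => console.log("listening on 8080"));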
