I know this is possible with the REST API, but is there any way to upload documents that are already valid JSON, without needing to deserialize them and provide an object?
This is purely a performance optimization. Our SQL query returns the object as JSON, and we would prefer not to spend resources deserializing it in .NET just to upload it to Azure Search.
After thinking through a custom JsonConverter scenario where I store a string and retrieve it, I have dismissed that option as probably not worthwhile.
No, the SDK does not support this scenario. You'd be better off writing a simple client to send your JSON to the REST API directly.
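For illustration, here is a minimal sketch of such a client, POSTing pre-serialized JSON straight to the index's docs endpoint. The service name, index name, API key, and api-version below are placeholder assumptions, and the JSON string from SQL is assumed to already carry the "@search.action" property the REST API requires on each document:

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class RawJsonIndexer
{
    private static readonly HttpClient Client = new HttpClient();

    // jsonFromSql: one or more comma-separated JSON documents from the SQL query,
    // each already containing an "@search.action" property (e.g. "upload").
    public static async Task UploadAsync(string jsonFromSql)
    {
        // The REST API expects { "value": [ <documents> ] }, so wrap the raw
        // JSON without ever deserializing it into .NET objects.
        string body = "{\"value\":[" + jsonFromSql + "]}";

        var request = new HttpRequestMessage(
            HttpMethod.Post,
            "https://my-service.search.windows.net/indexes/my-index/docs/index?api-version=2019-05-06");
        request.Headers.Add("api-key", "<admin-api-key>");
        request.Content = new StringContent(body, Encoding.UTF8, "application/json");

        HttpResponseMessage response = await Client.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}

Because the payload is treated as an opaque string, the only per-document work done in .NET is string concatenation.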
I am attempting to write an Azure Cosmos DB integration (Core SQL API) that integrates with an external service to provide some of the query data. As an example, I need a query made on Cosmos DB to convert some of the data returned by the query (e.g. IDs) into real data by calling an external service via a REST API. This should only happen when querying certain columns.
I initially investigated using a JS stored procedure and/or a UDF to make this external call, but the JS environment seems to be extremely limited and doesn't provide any way to make external calls. I then tried this https://github.com/Oblarg/cosmosdb-storedprocs-ts repository, which uses webpack to bundle all of Node.js into the stored procedure, allowing node modules to be used in stored procedures. While this does allow some node modules to be used, whenever I try to use the "https", "fetch", or "axios" modules to make an HTTP GET request I get errors (the same code works fine in a normal Node environment, but I'm not a JS expert and can't seem to work past these errors). After a day of attempts it seems the stored procedure approach is not possible.
Is this the case, or is there some way of making HTTP GET requests from a JS stored procedure? If it's not possible with stored procedures, are there any other techniques to achieve the requirement of reading data from a remote API when querying Cosmos DB?
Thanks
There is no way to achieve this from Cosmos DB directly. For queries you also cannot use the change feed, since the documents don't change, so really your only option is to use a function (e.g. an Azure Function) or some preprocessor app to handle it. As you say, it's not ideal, but there is no other solution here. If it were an insert or an update then the change feed would let you do this, but for plain queries it's not possible.
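For a rough sketch of that preprocessor approach using the Microsoft.Azure.Cosmos SDK (the container, query shape, and external URL below are illustrative assumptions, not a definitive design):

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class EnrichingQueryService
{
    private readonly Container _container;
    private readonly HttpClient _http = new HttpClient();

    public EnrichingQueryService(Container container) => _container = container;

    public async Task<string> GetEnrichedValueAsync(string id)
    {
        // 1. Run the plain Cosmos DB query to get the stored ID.
        var query = new QueryDefinition("SELECT c.externalId FROM c WHERE c.id = @id")
            .WithParameter("@id", id);
        FeedIterator<dynamic> iterator = _container.GetItemQueryIterator<dynamic>(query);

        while (iterator.HasMoreResults)
        {
            foreach (var item in await iterator.ReadNextAsync())
            {
                // 2. Resolve the ID into real data by calling the external REST API.
                string externalId = (string)item.externalId;
                return await _http.GetStringAsync(
                    "https://external-service.example.com/api/data/" + externalId);
            }
        }
        return null;
    }
}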
We are integrating a new system with DocuSign. The system is built using C# objects.
My question is: what is the best practice for interfacing with DocuSign - calling the DocuSign Web API methods directly, or including the DocuSign C# client library as a reference in our code and calling that?
Thanks!
I recommend using the C# client library. It will save you time and is very easy to use.
The code for it is also public on GitHub, so if for some reason you want to fork it and use it that way, you can do that too.
The DocuSign C# SDK saves you the bother of:
serializing the request objects into a JSON structure
sending the HTTPS request
deserializing the JSON responses into C# objects.
It also includes helper methods for implementing the OAuth JWT Grant flow.
These are all good reasons to use the SDK.
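As a rough sketch of what that looks like with the DocuSign.eSign SDK (the base path, access token, account ID, file, and recipient below are placeholders, and this assumes a token already obtained via the OAuth JWT Grant flow):

using System;
using System.Collections.Generic;
using System.IO;
using DocuSign.eSign.Api;
using DocuSign.eSign.Client;
using DocuSign.eSign.Model;

var apiClient = new ApiClient("https://demo.docusign.net/restapi");
apiClient.Configuration.DefaultHeader.Add("Authorization", "Bearer <access-token>");

var envelope = new EnvelopeDefinition
{
    EmailSubject = "Please sign this document",
    Documents = new List<Document>
    {
        new Document
        {
            // The SDK takes the document as a Base64 string (see the caveat below).
            DocumentBase64 = Convert.ToBase64String(File.ReadAllBytes("contract.pdf")),
            Name = "Contract",
            FileExtension = "pdf",
            DocumentId = "1"
        }
    },
    Recipients = new Recipients
    {
        Signers = new List<Signer>
        {
            new Signer { Email = "signer@example.com", Name = "Jane Doe", RecipientId = "1" }
        }
    },
    Status = "sent" // "sent" sends immediately; "created" saves a draft
};

var envelopesApi = new EnvelopesApi(apiClient);
EnvelopeSummary result = envelopesApi.CreateEnvelope("<account-id>", envelope);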
If you expect that you will regularly be sending documents above 20 MB in size, then you may want to implement the Envelopes::create call yourself. Why? Because the current version of the SDK Base64 encodes the documents you upload to DocuSign.
If you implement the Envelopes::create call yourself, you can send the documents in binary. This isn't so easy to do, but it is important if you have very large source documents. See DocuSign's example of how to send in binary mode.
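For a rough idea of what the binary call looks like in C# (hedged: the multipart layout follows DocuSign's documented format, but the URL, token, account ID, and envelope JSON here are placeholders):

using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", "<access-token>");

var form = new MultipartFormDataContent();

// Part 1: the envelope definition as JSON; the documents are listed here
// by ID and name, but their content travels in the later parts.
form.Add(new StringContent(
    "{\"emailSubject\":\"Please sign\",\"status\":\"sent\"," +
    "\"documents\":[{\"documentId\":\"1\",\"name\":\"Contract\",\"fileExtension\":\"pdf\"}]," +
    "\"recipients\":{\"signers\":[{\"email\":\"signer@example.com\",\"name\":\"Jane Doe\",\"recipientId\":\"1\"}]}}",
    Encoding.UTF8, "application/json"));

// Part 2: the raw PDF bytes, not Base64 encoded; the documentid parameter
// in Content-Disposition ties this part to document 1 in the definition.
var pdfPart = new ByteArrayContent(File.ReadAllBytes("contract.pdf"));
pdfPart.Headers.ContentType = new MediaTypeHeaderValue("application/pdf");
pdfPart.Headers.ContentDisposition = new ContentDispositionHeaderValue("file")
{
    FileName = "contract.pdf"
};
pdfPart.Headers.ContentDisposition.Parameters.Add(new NameValueHeaderValue("documentid", "1"));
form.Add(pdfPart);

// Inside an async method:
HttpResponseMessage response = await client.PostAsync(
    "https://demo.docusign.net/restapi/v2.1/accounts/<account-id>/envelopes", form);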
Added:
Size limits: 25 MB per API call. But documents in an API call that are Base64 encoded have a lot of overhead, so in this case the effective maximum document size is around 20 MB.
You can have multiple documents in an envelope. To include multiple large documents, create the envelope as a draft, then upload the additional documents as separate API calls.
See the API Limits document.
I need to set up an application in Azure and make two functions communicate (one written in C# and one written in JavaScript).
The C# function analyzes an XML feed, extracts the data into objects, and finally sends them to the JavaScript function as parameters.
I read that we can establish communication between the two functions using HTTP calls, but is it possible to do it with parameters?
If not, would you have any suggestions for achieving something like this properly? I'm getting started with Azure and I don't have enough visibility to know what is recommended in such a situation.
Thank you for your advice.
Yes, this is absolutely possible. How you do this is up to you. If you look at the default HTTP trigger templates, you can see that they take parameters (for example, as query string parameters). You can find more examples in the HTTP and webhook recipes documentation.
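As a minimal sketch of both sides of the pattern in C# (the function name and parameter are illustrative only; in the question the receiver would be the JavaScript function, but its HTTP trigger works the same way):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class ProcessFeedItem
{
    [FunctionName("ProcessFeedItem")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        ILogger log)
    {
        // Read a value passed by the calling function as a query-string parameter.
        string title = req.Query["title"];
        log.LogInformation("Received title: {Title}", title);
        return new OkObjectResult("Received: " + title);
    }
}

// The other function can then call it over HTTP, e.g.:
// await httpClient.GetAsync("https://myapp.azurewebsites.net/api/ProcessFeedItem" +
//     "?title=" + Uri.EscapeDataString(item.Title) + "&code=<function-key>");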
You can use other trigger types for cross-function communication as well. Take a look at this documentation for related best practices: https://learn.microsoft.com/en-us/azure/azure-functions/functions-best-practices#cross-function-communication
What is the best way to consume an external api's data?
Do I need to create a new Web API project and set up routing?
In the past I used a web service data source and attached a repeater. This won't work here because I have an API instead of a web service.
Thanks much
You can try this: it's how I've converted my JSON/XML APIs (or anything, really) into a transformable object. Just clone this tool and adjust it to your needs:
https://devnet.kentico.com/marketplace/utilities/universal-api-viewer-(with-hierarchy-support)
A custom data source is still what you would want to use, as all a data source really does is return a DataTable; my tool takes it a step further by assigning a hierarchy structure and pseudo page types so the repeater can treat the items like items on the content tree.
After reading the above: if you have access to the external database, you can use Kentico's ConnectionHelper class, pass in the external database's connection string, and run queries against it:
// Get a connection from the connection string (e.g. pulled from web.config)
GeneralConnection ConnectionObj = ConnectionHelper.GetConnection("GetConnectionStringFromWeb.ConfigHere");
ConnectionObj.Open();
// Run a raw SQL query against that connection; results come back as a DataSet
DataSet Results = ConnectionObj.ExecuteQuery(new QueryParameters("select * from SomeTable", null, QueryTypeEnum.SQLQuery));
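From there, the DataSet can be consumed like any other; a hypothetical follow-up (the column name is illustrative):

// Requires System.Data; iterate the rows returned by the query above
foreach (DataRow Row in Results.Tables[0].Rows)
{
    // Map the columns onto your pseudo page type fields here
    string SomeValue = Row["SomeColumn"].ToString();
}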
In my Angular app, I want to display a table which contains the following
a) URL
b) Social share counts divided by different social networks
Using Sails.js, I already have the API created for the URLs; when the results show up, I can display the URL. Now I'm confused about how to get the appropriate social counts showing right beside it.
Here's the API I'm using: https://docs.sharedcount.com/
By itself, I can see the JSON it produces.
But here are my questions:
Should I create a new API (model/controller) for the social count data, or include it in the model where I have the 'url' action defined?
If I create a new API, or include social_counts as an action in the current one, what would my JSON query look like? To retrieve the URLs, I'm using the default API blueprint that Sails provides, so:
http://www.example.com/url/find?where={"title":{"contains":"mark"}}
I'm struggling a bit with the thought process, so it would be great to get input on this.
It depends on your app: will it store that data, or just consume it? If it needs to store it, then of course you need the API, for modifying or aggregating the data, for example.
No, you can't do that. That shortcut only works if you have the data in your own database and let the Sails Waterline ORM and Blueprint API serve it.
Perhaps, if you only need to consume the data from the SharedCount API, you don't need Sails as a backend in this context at all; just use Angular as a client of that API. The exception is if you need to modify the data first and store it in your own database, in which case Sails will help with its Waterline ORM and Blueprint API.