Can Azure Data Catalog's REST API be used to upload the data to be analyzed? - azure-data-catalog

When I check the official documentation, I understand that I need to prepare metadata about a data source and then upload this information to the Data Catalog using the REST API. When I use the web interface, however, I can upload documents to be analyzed by Azure Data Catalog itself. Is it possible to use the REST API in the same manner?

No. The REST API doesn't do any analysis; I believe only the client tool does that.

Related

ASP.NET Core API times out while posting huge JSON

We have created an Azure Function to load a JSON file from Azure Blob Storage, and we tried to POST the data to an API that inserts it into a SQL database. Since the file is huge, the POST request times out. I need suggestions for an alternative approach to insert the JSON data into the SQL database quickly.
Without any more detail than what you've provided, and assuming you control both ends of this API equation, it sounds like you need to:
write logic that can open some sort of transaction on the target API
rewrite the Azure Function to segment the JSON blob into multiple smaller API calls
then tell the target that all calls have completed (committing the transaction you started)
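The steps above can be sketched in Node.js (the runtime Azure Functions commonly uses). Everything here is hypothetical scaffolding: the /transactions and /records endpoints and the x-transaction-id header are stand-ins for whatever transaction mechanism you add to your own target API.

```javascript
// Sketch: split a large JSON array into smaller POSTs wrapped in a
// logical transaction. Endpoint paths and header names are hypothetical.
// Uses the global fetch available in Node 18+.

// Pure helper: split an array of records into fixed-size chunks.
function chunk(records, size) {
  const chunks = [];
  for (let i = 0; i < records.length; i += size) {
    chunks.push(records.slice(i, i + size));
  }
  return chunks;
}

// Post each chunk, then commit. Assumes the target API exposes
// /transactions, /records, and /transactions/{id}/commit endpoints.
async function uploadInChunks(baseUrl, records, size = 1000) {
  // Open a transaction on the target so partial uploads can be rolled back.
  const start = await fetch(`${baseUrl}/transactions`, { method: 'POST' });
  const { transactionId } = await start.json();

  for (const part of chunk(records, size)) {
    await fetch(`${baseUrl}/records`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'x-transaction-id': transactionId, // hypothetical header
      },
      body: JSON.stringify(part),
    });
  }

  // Tell the target that all parts arrived, committing the transaction.
  await fetch(`${baseUrl}/transactions/${transactionId}/commit`, {
    method: 'POST',
  });
}
```

Tuning the chunk size so each request stays well under the API's timeout is the main knob here.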

Azure Data Lake Store exposed via OData

Is it possible to expose an Azure Data Lake Store via OData? The main goal is to consume this service in Salesforce (via Salesforce Connect).
If so, should it take place through Azure Data Factory?
Update
A little bit more of context:
We have historical data stored in Azure Data Lake Storage (ADLS) that we want to expose via OData (to be visualised in Salesforce via Salesforce Connect / external objects). After digging into the issue and potential solutions, we don't think ADLS is the right service for this particular case. Instead, we may need to configure a Data Factory pipeline to copy the data we are interested in to a SQL database, and then read it from there via a simple ASP.NET application using an Entity Data Model and the WCF Data Services Entity Framework Provider (we got some insights from this website).
I don't think OData has a connector for ADLS. However, given that OData is basically a REST API, you could probably build an OData API over the existing ADLS REST APIs if they don't provide what you need. I am not sure how ADF would come into the picture.
Maybe it would be useful if you told us what you want to achieve?

How to implement a REST API using Node.js for group/multiple/batch insert into an Azure Storage table?

I went through the documentation of the REST batch-save reference, but it doesn't give any example or steps as the insert/update/delete/fetch pages do, so I am not able to understand how to design the REST call for a batch save. I am done with insert, update, delete, and fetch of entities in an Azure Table Storage table, but got stuck on batch/group/multiple insert.
Here is an example implementation in the Microsoft Azure Storage Node.js client library; please look into:
tablebatch.js
batchresult.js
the executeBatch function in tableservice.js
Would you also share your concerns about directly using the azure-storage Node.js package?
Best Wishes
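At the raw REST level, a "batch save" against the Table service is an Entity Group Transaction: one POST of a multipart/mixed body to the $batch endpoint. A sketch of building that body in Node.js follows; the boundary strings, account, and table names are placeholders, and the Authorization header (Shared Key or SAS) that a real request needs is omitted.

```javascript
// Sketch: build the multipart/mixed body for an Azure Table service
// Entity Group Transaction. All entities in one changeset must share
// the same PartitionKey, and a batch holds at most 100 operations.
function buildBatchBody(tableUrl, entities) {
  if (entities.length > 100) {
    throw new Error('A batch supports at most 100 operations');
  }
  const batchBoundary = 'batch_abc123';        // normally a fresh GUID
  const changesetBoundary = 'changeset_def456'; // normally a fresh GUID

  // One inner HTTP request per entity insert.
  const ops = entities
    .map(
      (e) =>
        `--${changesetBoundary}\r\n` +
        'Content-Type: application/http\r\n' +
        'Content-Transfer-Encoding: binary\r\n\r\n' +
        `POST ${tableUrl} HTTP/1.1\r\n` +
        'Content-Type: application/json\r\n' +
        'Accept: application/json;odata=nometadata\r\n\r\n' +
        `${JSON.stringify(e)}\r\n`
    )
    .join('');

  return (
    `--${batchBoundary}\r\n` +
    `Content-Type: multipart/mixed; boundary=${changesetBoundary}\r\n\r\n` +
    ops +
    `--${changesetBoundary}--\r\n` +
    `--${batchBoundary}--\r\n`
  );
}

// The body is then POSTed to https://<account>.table.core.windows.net/$batch
// with Content-Type: multipart/mixed; boundary=batch_abc123.
```

This is exactly what the library's TableBatch/executeBatch code assembles for you, which is why using the azure-storage package directly is usually the easier path.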

How to do programmatically what Microsoft Azure Storage Explorer does?

There is a tutorial in the Microsoft docs that you can see here:
Tutorial: Build your first pipeline to transform data using Hadoop cluster
In the "Prerequisites" section, in step 6, they wrote "Use tools such as Microsoft Azure Storage Explorer".
The question is: can I use some other tools? In particular, is it possible to use a scripting language like Python directly?
I need to perform all seven steps dynamically, using something like Azure Function Apps. Do you know whether that is possible and, if so, where I should start?
The short answer is YES. But again, you have not shared details on what functionality you are looking for specifically.
What you can do is call the REST API endpoints for the corresponding service.
Depending on whether you are using Blobs, Tables, or Queues, there are specific APIs that you can call.
Azure Storage Services REST API Reference
Blob Service REST API
Queue Service REST API
Table Service REST API
File Service REST API
Taking blobs as an example, there are APIs to upload content using the PUT method. See this for more details: Put Blob
Similarly, there are APIs for reading containers, listing containers, etc.
There are also samples on working with Azure Storage in Python on GitHub. See this: Azure-Samples/storage-python-getting-started
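As a concrete example, the Put Blob operation is a single PUT request. A minimal sketch in Node.js follows, assuming you already hold a SAS URL for the destination blob (signing a Shared Key request is more involved and not shown); it uses the global fetch of Node 18+.

```javascript
// Pure helper: the request options the Put Blob REST operation needs.
function putBlobRequest(content) {
  return {
    method: 'PUT',
    headers: {
      'x-ms-blob-type': 'BlockBlob', // required header for Put Blob
      'x-ms-version': '2020-10-02',  // any currently supported version
      'Content-Type': 'text/plain',
    },
    body: content,
  };
}

// Upload the content to a blob addressed by a SAS URL.
async function putBlob(blobSasUrl, content) {
  const res = await fetch(blobSasUrl, putBlobRequest(content));
  if (res.status !== 201) { // Put Blob answers 201 Created on success
    throw new Error(`Upload failed with status ${res.status}`);
  }
}
```

The same pattern (build headers, issue one HTTP call) applies to listing containers, reading blobs, and the Queue/Table operations linked above.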
HTH

How can I connect to DocumentDB from Azure ML?

Is there a data source URL for DocumentDB for the Azure ML Reader module?
When I try the following URL
https://DBName.documents.azure.com:443/
it asks for authorization.
Has anyone tried giving a DocumentDB URL in the Azure ML portal (Reader module)?
Thanks in advance
DocumentDB is not currently supported out of the box by Azure Machine Learning.
I think the easiest workaround would be creating a web endpoint that provides all of the data. If the data is very large, perhaps an endpoint that can serve subsets of a size your web server can support. This works as a workaround because Azure ML does support querying data from web endpoints, and you can add query support to your .NET endpoints.
