Unable to add/connect Cosmos DB in Azure Analysis Services

I have created an Azure Analysis Services instance and a Cosmos DB account in the same region (West India). But when I try to add a new model in Azure Analysis Services, I am able to find only sample data (Adventure Works DW) in the data source list. I am unable to find the Cosmos DB name in the drop-down list of data sources.
Microsoft has mentioned that we can connect to Cosmos DB from Azure Analysis Services for an in-memory model, but I can't even find Cosmos DB in the list.
https://learn.microsoft.com/en-in/azure/analysis-services/analysis-services-datasource

If you followed the official tutorial, Add a sample model from the portal, you can only work with the sample model, which is a completed version of the Adventure Works Internet Sales (1200) sample data model. A sample model is useful for testing model management, connecting with tools and client applications, and querying model data.
But based on the statements in the supported data sources document, Cosmos DB requires Tabular 1400 and higher models.
So, please follow the Adventure Works tutorial to create a tabular model project. Then you can create a connection to your Cosmos DB account in Tabular Model Explorer: right-click Data Sources > Import from Data Source.
Enter your database information (the account endpoint and key).
Also, you could refer to a related case: https://social.msdn.microsoft.com/Forums/en-US/9394a10b-f085-4a68-9951-5000a6f799ef/cosmos-db-data-source-how-to-configure-the-key-in-azure-analysis-services?forum=AzureAnalysisServices
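Before pointing the 1400 model at the account, it can help to confirm the endpoint and key actually work. A minimal sketch using the azure-cosmos Python SDK; the account, database, and collection names below are placeholders:

```python
# Quick connectivity check against the Cosmos DB account before wiring it
# into the Tabular 1400 model. All names below are placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient("https://myaccount.documents.azure.com:443/",
                      credential="<account-key>")
container = client.get_database_client("mydb").get_container_client("mycollection")

# Read a few documents to confirm the endpoint, key, and data shape.
for item in container.query_items(query="SELECT TOP 3 * FROM c",
                                  enable_cross_partition_query=True):
    print(item["id"])
```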

Related

Unable to get the data in Azure ML from SQL Server

My question: is there a way to retrieve data from a SQL Server into an Azure ML pipeline or into the datastore?
I currently have a pipeline in Azure Machine Learning that takes in data from the datastore and then trains a model on it.
Go to Azure ML Studio, click on Data -> Datastore, and provide the required details.
Next, create a dataset: set the name and type for your data asset -> Query the database.
Go to the Designer and visualize the data in the workflow.
For more information, refer to this link.
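The same steps can be done in code. A minimal sketch with the azureml-core SDK; the server, database, credentials, and query are placeholders:

```python
# Register the SQL database as a datastore and query it into a tabular
# dataset. Server, database, credentials, and the query are placeholders.
from azureml.core import Workspace, Datastore, Dataset

ws = Workspace.from_config()  # loads the workspace from config.json

# One-time setup: register the SQL database as a datastore.
datastore = Datastore.register_azure_sql_database(
    workspace=ws,
    datastore_name="my_sql_datastore",
    server_name="my-sql-server",   # logical server name, no domain suffix
    database_name="my_database",
    username="my_user",
    password="my_password",
)

# Create a tabular dataset from a query against that datastore,
# then materialize it for training.
dataset = Dataset.Tabular.from_sql_query(
    (datastore, "SELECT * FROM dbo.training_data"))
df = dataset.to_pandas_dataframe()
```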

Migrate data from Azure Cosmos DB US region to European region using Data Factory

I want to migrate records whose country column is one of the EU countries from a US-region Cosmos DB to a Cosmos DB in West Europe.
I have multiple collections, and I want to dynamically iterate through these collections, run queries such as where country in ('abc'), and migrate/copy the data to the sink (the Cosmos DB in the EU region).
How can I design a pipeline using Azure Data Factory?
I tried following this:
https://learn.microsoft.com/en-us/azure/data-factory/tutorial-bulk-copy-portal
I am not able to find dynamic content for all the collections from Cosmos DB as the source.
Thanks,
Rihuk
A Cosmos DB account should be created in the European region and set as the sink in a copy activity in Azure Data Factory.
To do this, you need to create two linked services: one for the US-region account, which is the source database, and one for the sink database in the European region.
When creating each linked service, you configure the service details, test the connection, and then create the linked service.
Here is the Microsoft document regarding Copy and transform data in Azure Cosmos DB.
Also, you can refer to the Azure Cosmos DB container link if you are using the Open API.
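To make the iteration concrete, here is a hedged Python sketch of the logic the pipeline performs (loop over collections, filtered query, write to the EU sink), using the azure-cosmos SDK; the account URLs, keys, collection names, and country list are all placeholders:

```python
# The copy logic the pipeline performs: iterate over collections, select the
# EU-country records, and upsert them into the West Europe account.
from azure.cosmos import CosmosClient

src = CosmosClient("https://us-account.documents.azure.com:443/", credential="<src-key>")
dst = CosmosClient("https://eu-account.documents.azure.com:443/", credential="<dst-key>")
src_db = src.get_database_client("mydb")
dst_db = dst.get_database_client("mydb")

collections = ["orders", "customers"]                            # collections to iterate
query = "SELECT * FROM c WHERE c.country IN ('DE', 'FR', 'NL')"  # placeholder EU filter

for name in collections:
    dst_container = dst_db.get_container_client(name)
    for item in src_db.get_container_client(name).query_items(
            query=query, enable_cross_partition_query=True):
        # Drop Cosmos system properties before writing to the sink.
        item = {k: v for k, v in item.items() if not k.startswith("_")}
        dst_container.upsert_item(item)
```

In ADF itself, the equivalent shape is a list of collection names fed into a ForEach activity that runs a parameterized copy activity, following the pattern in the bulk-copy tutorial linked above.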

Azure Data Factory and GraphDb

In the Azure environment, I have an Azure SQL DB and a Cosmos DB graph. Using Azure Data Factory, I need to insert/update data from the SQL DB into the graph DB.
My thinking is that I need to first transform the data to JSON and from there insert it into the graph DB.
Is this the way to go? Are there any other ways?
Thank you.
1. Based on the ADF copy activity connectors and the thread How can we create Azure's Data Factory pipeline with Cosmos DB (with Graph API) as data sink? mentioned by @silent, the Cosmos DB Graph API connector is not supported in ADF so far. You could vote up this feature in this feedback link, which was updated on April 12, 2019.
2. The Cosmos DB migration tool isn't a supported import tool for Gremlin API accounts at this time. Please see this link: https://learn.microsoft.com/en-us/azure/cosmos-db/import-data
3. You could take a look at the graph bulk executor .NET library. This is the sample application: git clone https://github.com/Azure-Samples/azure-cosmosdb-graph-bulkexecutor-dotnet-getting-started.git
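As a rough illustration of the question's "transform to JSON, then insert" idea, a minimal Python sketch using pyodbc and the azure-cosmos SDK; the connection strings, table, and column names are placeholders:

```python
# SQL rows -> JSON documents -> Cosmos, since ADF cannot write to the
# Graph API directly. All connection details and names are placeholders.
import pyodbc
from azure.cosmos import CosmosClient

sql = pyodbc.connect("Driver={ODBC Driver 17 for SQL Server};Server=myserver;"
                     "Database=mydb;UID=my_user;PWD=my_password")
cosmos = CosmosClient("https://my-graph.documents.azure.com:443/", credential="<key>")
container = cosmos.get_database_client("graphdb").get_container_client("people")

for row in sql.cursor().execute("SELECT Id, Name, City FROM dbo.Person"):
    # Transform each row into a plain JSON document, then upsert it.
    doc = {"id": str(row.Id), "name": row.Name, "city": row.City}
    container.upsert_item(doc)
```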

Azure Data Discovery and Classification

With the recent preview release of 'Data discovery & classification' for Azure SQL databases, has anybody found where this data is stored and whether it can be queried directly from the Azure database? I know that for on-premises databases, if you right-click on a database and choose 'Tasks - Classify Data...', anything you enter into that interface is stored as extended properties on the table/column. However, after entering the same data via the interface in the Azure portal, there are no extended property values that I can find in my Azure SQL database. I would really like to be able to query this classification data directly so I can incorporate other metadata about the column, such as data type, sample value, collation, etc.
For Azure SQL DB, this metadata is stored in new attributes that have been introduced into the SQL Engine to support tagging column sensitivity, which are currently not exposed. We plan to expose them via REST/PowerShell/T-SQL as the feature continues rolling out.
Please follow our announcements and the online feature documentation for updates.
Thanks,
Gilad (MSFT)
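For the on-premises case the question describes, the classification entered through SSMS can be read back from extended properties. A hedged sketch with pyodbc; the property names used here (sys_information_type_name, sys_sensitivity_label_name) are the ones SSMS is commonly observed to write, not a documented contract:

```python
# Read on-premises classification metadata back out of extended properties.
import pyodbc

conn = pyodbc.connect("Driver={ODBC Driver 17 for SQL Server};"
                      "Server=myserver;Database=mydb;Trusted_Connection=yes")

query = """
SELECT s.name AS schema_name, t.name AS table_name, c.name AS column_name,
       ep.name AS property_name, CAST(ep.value AS nvarchar(256)) AS property_value
FROM sys.extended_properties AS ep
JOIN sys.tables  AS t ON ep.major_id = t.object_id
JOIN sys.schemas AS s ON t.schema_id = s.schema_id
JOIN sys.columns AS c ON ep.major_id = c.object_id AND ep.minor_id = c.column_id
WHERE ep.name IN ('sys_information_type_name', 'sys_sensitivity_label_name')
"""

for row in conn.cursor().execute(query):
    print(row.schema_name, row.table_name, row.column_name,
          row.property_name, row.property_value)
```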

How can we create an Azure Data Factory pipeline with Cosmos DB (with Graph API) as the data sink?

How can we create an Azure Data Factory pipeline with Cosmos DB (with Graph API) as the data sink? (The data source is also Cosmos DB, with DocumentDB as the API.)
One option available to you is to simply continue using the Document API for the graph-enabled Cosmos DB sink. If you transform and write your documents into the destination in GraphSON format as regular documents, they will be automatically usable as vertices and edges in future graph traversals.
The ability to use both the DocumentDB SQL and Gremlin APIs against the same collection is one of the most exciting and powerful features of Cosmos DB, IMO (and the team plans to support more APIs interacting with the same dataset in the future).
Not only is this possible, but I've personally observed significant improvements in throughput when importing large datasets into a graph-enabled Cosmos collection using the Document APIs instead of Gremlin. I plan to release a blog post describing this process in more detail in the near future.
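To illustrate writing a vertex through the Document API so Gremlin can traverse it, a hedged Python sketch; the internal layout shown (a label, properties stored as arrays of {id, _value} objects, and a partition key field) is observed behavior of graph-enabled accounts, not a documented contract, and all names are placeholders:

```python
# Write a vertex through the Document (SQL) API into a graph-enabled
# collection. The property layout below is observed behavior, not a contract.
import uuid
from azure.cosmos import CosmosClient

client = CosmosClient("https://my-graph.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("graphdb").get_container_client("people")

vertex = {
    "id": "person-1",
    "label": "person",      # Gremlin vertex label
    "pk": "person-1",       # whatever the collection's partition key path is
    # Each vertex property is an array of {id, _value} entries.
    "name": [{"id": str(uuid.uuid4()), "_value": "Alice"}],
    "city": [{"id": str(uuid.uuid4()), "_value": "Seattle"}],
}
container.upsert_item(vertex)
```

A quick Gremlin traversal afterwards (e.g. g.V('person-1')) confirms the document is visible as a vertex.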
The Cosmos DB Graph API is not supported yet; we will add it to our product backlog.
