I would like to analyze some metrics from a team in my Azure DevOps organization using Kibana, along with some other data. Therefore I need to import all commits into the Elasticsearch database. However, according to Elastic's integrations page https://www.elastic.co/de/integrations?solution=all-solutions they only support acquiring data from Azure's infrastructure portfolio, not the Azure DevOps product.
This is why I thought about gathering the data via the Azure DevOps REST API, which I would call on a daily basis.
My question: is there a better way? Has someone already done something similar?
Thank you in advance.
You may check Content Sources:
Workplace Search can ingest data from many different content sources. A content source is usually a third-party service like GitHub, Google Drive, or Dropbox. You can also build your own connectors using Custom API sources, which allows you to create unique content repositories on the platform and send any data to Workplace Search via uniquely identifiable endpoints.
Since Workplace Search doesn't support Azure DevOps, you could check Connecting custom sources and work with the Commits - Get Commits API to retrieve Git commits for a project:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits?api-version=6.0
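For the daily pull, a minimal PowerShell sketch along these lines could work. The organization, project, repository and Elasticsearch URL below are placeholders, and the PAT is assumed to have Code (Read) scope:

# Placeholders - replace with your own values
$organization = "my-org"
$project      = "my-project"
$repositoryId = "my-repo"
$pat          = $env:AZDO_PAT   # personal access token with Code (Read) scope

# Azure DevOps expects Basic auth with an empty user name and the PAT as the password
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Only fetch commits from the last day, since the job runs daily
$since = (Get-Date).AddDays(-1).ToUniversalTime().ToString("s")
$url = "https://dev.azure.com/$organization/$project/_apis/git/repositories/$repositoryId/commits?searchCriteria.fromDate=$since&api-version=6.0"
$commits = (Invoke-RestMethod -Uri $url -Headers $headers).value

# Index each commit into Elasticsearch (URL and index name are placeholders)
foreach ($commit in $commits) {
    Invoke-RestMethod -Method Put -ContentType "application/json" `
        -Uri "http://localhost:9200/azdo-commits/_doc/$($commit.commitId)" `
        -Body ($commit | ConvertTo-Json -Depth 10)
}

Scheduling this as a daily pipeline or cron job would give you the incremental import described above.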
Since Azure Data Catalog (ADC) is provided by Microsoft as SaaS to customers, is Microsoft taking backups of the dataset and business glossary? If yes, how often, and how can a customer get access to the backups for recovery purposes?
Unfortunately, there is no explicit backup/restore feature available for catalogs.
I would suggest voting up an idea submitted by another Azure customer:
https://feedback.azure.com/forums/906052-data-catalog/suggestions/33125845-azure-data-catalog-backup-feature
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
The closest way to achieve this with current functionality is to use the Azure Data Catalog REST API to extract all assets and persist them locally (and re-import them manually later).
There is a sample application available that demonstrates this technique: Data Catalog Import/Export sample tool.
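As a rough illustration of that approach, a PowerShell sketch like the following could page through the catalog's search API and dump each page of assets to disk. The catalog name is a placeholder, the bearer token has to come from Azure AD, and the endpoint, api-version and response field names shown here are assumptions based on the Data Catalog REST API rather than a verified recipe:

# Assumptions: catalog name, AAD bearer token, api-version and response shape
$catalog = "DefaultCatalog"
$token   = "<AAD access token for https://api.azuredatacatalog.com>"
$headers = @{ Authorization = "Bearer $token" }

$startPage = 1
do {
    # Search for everything ("*") one page at a time and persist the raw JSON locally
    $uri = "https://api.azuredatacatalog.com/catalogs/$catalog/search/search" +
           "?searchTerms=*&count=100&startPage=$startPage&api-version=2016-03-30"
    $page = Invoke-RestMethod -Uri $uri -Headers $headers
    $page.results | ConvertTo-Json -Depth 20 |
        Out-File -FilePath ("adc-assets-page-{0}.json" -f $startPage) -Encoding utf8
    $startPage++
} while ($page.results.Count -gt 0)

The Import/Export sample tool linked above does essentially this (plus the re-import), so it is probably the better starting point.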
I've been working on migrating all of the work items from one Azure DevOps (Services) project to another project in the same Organization.
I used the nkdAgility azure-devops-migration-tools to successfully copy the majority of existing work items across, but it did not grab our Shared Queries.
I played around with the Azure REST API in PowerShell to list the queries. I also looked at the Azure CLI to see if there was a way to list them. I was able to find a couple at the root level, but not the entire list of Shared Queries.
Is it possible to accomplish this through either of the above methods?
My Google-fu was strong today! Here's a link to a script that does almost exactly what I want.
Migrate Azure DevOps work items queries to a new organization
The only difference is that I am staying within my organization, so I am making modifications accordingly. Also, the Azure REST API has probably evolved a bit since the original script was written, so I am updating the requests to handle that.
Thanks Josh Kewley!
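For anyone who just needs to enumerate the Shared Queries (including nested folders) before migrating them, a small PowerShell sketch against the Work Item Query API could look like the following. The organization, project and PAT are placeholders, and the service caps $depth at 2, so very deeply nested folders need extra calls per subfolder:

# Placeholders: organization, project, and a PAT with Work Items (Read) scope
$organization = "my-org"
$project      = "my-project"
$pat          = $env:AZDO_PAT
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# $depth=2 expands two levels of folders under "Shared Queries"; $expand=wiql includes the query text
$uri = "https://dev.azure.com/$organization/$project/_apis/wit/queries/Shared%20Queries?`$depth=2&`$expand=wiql&api-version=6.0"
$root = Invoke-RestMethod -Uri $uri -Headers $headers

# Walk the folder tree and print the path of every query found
function Show-QueryTree($node) {
    if (-not $node.isFolder) { Write-Output $node.path }
    foreach ($child in $node.children) { Show-QueryTree $child }
}
Show-QueryTree $root

A POST to the same queries endpoint (under the target folder path) can then recreate each query, which is essentially what the linked script automates.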
I connected to Azure DevOps Boards using a personal access token to fetch work items by referring to this link: https://learn.microsoft.com/en-us/azure/devops/report/powerbi/data-connector-connect?view=azure-devops
I was able to connect to only one organization and one project under that at a time.
I have a requirement in which I need to connect to multiple organizations and projects and fetch all work items under that.
Please advise how I can go about accomplishing this.
I need to connect to multiple organizations and projects and fetch all work items under that.
You can try combining OData and Manage Parameters in Power BI to achieve what you want. This is a new feature we provided last month. Just refer to and follow this blog's description.
The blog provides very detailed steps. In a nutshell, the feature uses parameters to automatically build a filter for the report, which then loads a data model from Azure DevOps; the Azure DevOps OData feed provides that data model. The parameters let users generate a report by supplying values for them.
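To give an idea of the data model behind that OData feed, here is a minimal PowerShell sketch that loops over several organizations/projects and pulls work items from the Analytics OData endpoint. The organization and project names are placeholders, the PAT needs Analytics read permission, and the v3.0-preview OData version is an assumption (use whatever version your organization exposes); in Power BI the same URL goes into the OData feed connector, with the organization/project supplied via Manage Parameters:

# Placeholders: the organizations/projects to query and a PAT with Analytics (Read) scope
$targets = @(
    @{ Organization = "org-one"; Project = "ProjectA" },
    @{ Organization = "org-two"; Project = "ProjectB" }
)
$pat = $env:AZDO_PAT
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

$allWorkItems = foreach ($t in $targets) {
    # Analytics OData feed for work items (large result sets are paged via @odata.nextLink, omitted here)
    $uri = "https://analytics.dev.azure.com/$($t.Organization)/$($t.Project)/_odata/v3.0-preview/WorkItems?" +
           '$select=WorkItemId,Title,State,WorkItemType'
    (Invoke-RestMethod -Uri $uri -Headers $headers).value
}
$allWorkItems | Format-Table WorkItemId, Title, State, WorkItemType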
Hope this blog helps.
Is there any way to automate the creation of an Azure Data Explorer data connection?
I want to create it as part of an automated deployment so either ARM or through C#. The Data Connection source is an EventHub and needs to include the properties specifying the table, consumer group, mapping name and data format.
I have tried creating the resource manually and exporting the template, but it doesn't work. I have also looked through the Microsoft online documentation and cannot find a working example.
This example is all I have found.
Please take a look at this good example, which shows how to create the control plane resources (cluster, database, data connection) using ARM templates, and how to use the data plane Python API for the data plane resources (table, mapping).
In addition, for C# please see the docs here and the following C# example of how to create an Event Hub data connection:
// Creates an Event Hub data connection on the target database; the EventHubDataConnection
// can also take the table name, mapping rule name and data format as optional arguments.
var dataConnection = managementClient.DataConnections.CreateOrUpdate(resourceGroup, clusterName, databaseName, dataConnectionName,
    new EventHubDataConnection(eventHubResourceId, consumerGroup, location: location));
I've actually just finished building and pushing an Azure Sample that does this (see the deployment template and script in the repo).
Unfortunately, as I elected not to use the Azure CLI (and stick with pure Azure PowerShell), I wasn't able to fully automate this, but you can at least see the approach I took.
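For the ARM deployment step itself, a minimal Azure PowerShell sketch looks something like the following; the template file name and parameter names are placeholders for whatever the sample's template actually expects:

# Deploy an ARM template containing the cluster/database/data connection resources
# (file and parameter names below are placeholders for the sample's own template)
New-AzResourceGroupDeployment `
    -ResourceGroupName "my-adx-rg" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterObject @{
        clusterName  = "myadxcluster"
        databaseName = "mydatabase"
        eventHubName = "myeventhub"
    }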
I've filed feedback with the product group here on UserVoice.
According to the link, for Azure Search to work, the data needs to be uploaded to the search service. If I have a NoSQL database in Azure such as DocumentDB, can the search service be configured to access the data directly from the database, rather than uploading the data to the service?
I cannot comment below the current thread, so I will add a new reply.
I am a Program Manager with Azure Search and I can confirm Daron's comments about this being a top request. There is also a fair amount of voting for it from our UserVoice page (http://feedback.azure.com/forums/263029-azure-search/suggestions/6328680-auto-indexing-of-docdb). As a result, we have been investigating tighter integration of these technologies.
DocumentDB has POST triggers. You might be able to use them for an integration.
From my understanding, a built-in integration is one of the top requests in the Azure Search and DocDB community. We had a lot of discussions around this with DocDB / Azure Search insiders, and I remember a lot of people asking for it.