Azure DevOps: how to get the blob IDs

I am trying to download a blobbed zip for a repository on the Azure DevOps Server 2019 API using the following documentation.
https://learn.microsoft.com/en-us/rest/api/azure/devops/git/blobs/get%20blobs%20zip?view=azure-devops-server-rest-5.0
The request body is supposed to contain:
Request body:
body (string[]) - Blob IDs (SHA1 hashes) to be returned in the zip file.
How can I obtain the blob ids?

How can I obtain the blob ids?
You can use the Items - List API to obtain the blob IDs:
GET https://{instance}/{collection}/{project}/_apis/git/repositories/{repositoryId}/items?recursionLevel=Full&api-version=5.0
In the response, each item with gitObjectType "blob" carries its SHA-1 hash in the objectId field; those are the blob IDs the zip endpoint expects.
Also, if you're trying to get the IDs programmatically, you can use the GitHttpClientBase.GetItemsAsync method.
PS: As Daniel commented above, if you simply want the whole repository, using git directly is the recommended approach, so you could also call git from your code if you want to do that programmatically. There are many discussions online about this topic. (Since your original question is about blob IDs, I won't go into detail here.)
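Putting the two calls together, a rough Python sketch might look like this (the server URL, collection, project, repository and PAT are all placeholders; substitute your own):

```python
import base64
import json
import urllib.request

# Placeholders - substitute your own server, collection, project, repo and PAT.
BASE = "https://tfs.example.com/DefaultCollection/MyProject"
REPO = "MyRepo"
PAT = "your-personal-access-token"


def basic_auth_header(pat):
    """Azure DevOps accepts a PAT as the password of a Basic auth header."""
    token = base64.b64encode(f":{pat}".encode()).decode()
    return {"Authorization": f"Basic {token}"}


def extract_blob_ids(items):
    """Keep only blob entries; their 'objectId' is the SHA-1 the zip API wants."""
    return [i["objectId"] for i in items if i.get("gitObjectType") == "blob"]


def download_blobs_zip(out_path="blobs.zip"):
    headers = basic_auth_header(PAT)
    # 1. List every item in the repository (trees and blobs).
    url = (f"{BASE}/_apis/git/repositories/{REPO}/items"
           "?recursionLevel=Full&api-version=5.0")
    with urllib.request.urlopen(urllib.request.Request(url, headers=headers)) as r:
        items = json.load(r)["value"]
    blob_ids = extract_blob_ids(items)
    # 2. POST the IDs to the Get Blobs Zip endpoint and save the archive.
    zip_url = f"{BASE}/_apis/git/repositories/{REPO}/blobs?api-version=5.0"
    req = urllib.request.Request(
        zip_url,
        data=json.dumps(blob_ids).encode(),
        headers={**headers, "Content-Type": "application/json",
                 "Accept": "application/zip"},
        method="POST",
    )
    with urllib.request.urlopen(req) as r, open(out_path, "wb") as f:
        f.write(r.read())
```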

Related

How to pull Azure Cognitive Search Quota and usage?

I am trying to pull the current usage and quota figures for Azure Cognitive Search.
Can anyone help me with how I can get that information into a CSV using a service principal?
I found one link to pull this kind of data:
https://learn.microsoft.com/en-us/rest/api/searchservice/get-service-statistics
But I am not sure how to use this API to get that information.
The link you posted is the correct way to do it; it returns all the information you need.
It looks like this endpoint does not support OAuth2 & RBAC, so instead of using a service principal directly, you need to provide the admin api-key in the request header.
You can check here how to access the api-key. If you'll be doing that from a PowerShell script, you can authenticate with your service principal and fetch the key using Get-AzSearchAdminKeyPair, then use this key to make an HTTP request for the statistics and finally convert them to CSV format.
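As a sketch of that flow in Python (the service name, admin key and api-version are placeholders; the key itself would come from Get-AzSearchAdminKeyPair or the portal):

```python
import csv
import json
import urllib.request

# Placeholders - substitute your own service name, admin key and api-version.
SERVICE = "my-search-service"
ADMIN_KEY = "your-admin-api-key"
API_VERSION = "2020-06-30"


def flatten_counters(stats):
    """Turn the 'counters' object of the servicestats response into CSV rows."""
    return [{"resource": name,
             "usage": counter.get("usage"),
             "quota": counter.get("quota")}
            for name, counter in stats.get("counters", {}).items()]


def stats_to_csv(out_path="search-usage.csv"):
    url = (f"https://{SERVICE}.search.windows.net/servicestats"
           f"?api-version={API_VERSION}")
    # The admin key goes in the api-key request header, not an OAuth token.
    req = urllib.request.Request(url, headers={"api-key": ADMIN_KEY})
    with urllib.request.urlopen(req) as r:
        stats = json.load(r)
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["resource", "usage", "quota"])
        writer.writeheader()
        writer.writerows(flatten_counters(stats))
```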

How to list the files with content that are in azure repo using REST API

I am referring to this link.
https://learn.microsoft.com/en-us/rest/api/azure/devops/git/items/get?view=azure-devops-rest-6.0
I require assistance in forming the api link for the below scenario.
List all the files (with content) that are in a particular repo with .yml extension.
Please assist
You can get the list of files in an Azure Repos repository with a REST API call, for example from PowerShell.
List All Repos
Retrieves the repositories in the project:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories?api-version=5.1
# With Optional Parameter
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories?includeLinks={includeLinks}&includeAllUrls={includeAllUrls}&includeHidden={includeHidden}&api-version=5.1
Get Files
Gets the content of a single item. The download parameter controls whether the content is returned as a download.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path={path}&api-version=5.1
# With Optional Parameter
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path={path}&scopePath={scopePath}&recursionLevel={recursionLevel}&includeContentMetadata={includeContentMetadata}&latestProcessedChange={latestProcessedChange}&download={download}&$format={$format}&versionDescriptor.versionOptions={versionDescriptor.versionOptions}&versionDescriptor.version={versionDescriptor.version}&versionDescriptor.versionType={versionDescriptor.versionType}&includeContent={includeContent}&resolveLfs={resolveLfs}&api-version=5.1
You can call the above APIs from PowerShell accordingly; refer here for an example implementation.
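The same two calls can also be combined in Python: list the repository recursively, keep only the .yml paths, then fetch each file's raw content. A rough sketch (organization, project, repository and PAT are placeholders):

```python
import base64
import json
import urllib.parse
import urllib.request

# Placeholders - substitute your own organization, project, repo and PAT.
ORG = "myorg"
PROJECT = "MyProject"
REPO = "MyRepo"
PAT = "your-personal-access-token"
BASE = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/git/repositories/{REPO}"
AUTH = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}


def yaml_paths(items):
    """Filter the recursive items listing down to .yml file paths."""
    return [i["path"] for i in items
            if i.get("gitObjectType") == "blob" and i["path"].endswith(".yml")]


def list_yaml_files_with_content():
    # 1. Recursively list every item in the repository.
    url = f"{BASE}/items?recursionLevel=Full&api-version=5.1"
    with urllib.request.urlopen(urllib.request.Request(url, headers=AUTH)) as r:
        items = json.load(r)["value"]
    # 2. Fetch the raw content of each .yml file (Accept: text/plain
    #    returns the file body instead of JSON metadata).
    files = {}
    for path in yaml_paths(items):
        item_url = f"{BASE}/items?path={urllib.parse.quote(path)}&api-version=5.1"
        req = urllib.request.Request(item_url,
                                     headers={**AUTH, "Accept": "text/plain"})
        with urllib.request.urlopen(req) as r:
            files[path] = r.read().decode("utf-8")
    return files
```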

Find the type of azure storage using SAS URI of blob

I was wondering how to tell the type of an Azure Storage blob from its SAS URI, or more specifically, how to know whether it is a PageBlob or a BlockBlob.
There is a REST API that returns the type of a blob: do a HEAD request against the SAS URI, and if the blob exists, the x-ms-blob-type response header indicates its type. However, if the blob doesn't exist the request returns 404. When we get a 404 we could upload a dummy file as a BlockBlob and, if that fails, conclude it's a PageBlob. But I am wondering: is there a better, more straightforward way?
Example of SAS URI:
var sasUriStr = "https://storageaccountname.blob.core.windows.net/containername/file?sp=r&st=2021-08-10T00:34:00Z&se=2021-08-15T08:34:00Z&spr=https&sv=2020-08-04&sr=c&sig=ABCDEFGH/YJKLMNOP=";
There's a way to find that information however it requires you to bring in your logic and it requires a different kind of SAS token.
What you have to do is create an Account SAS (currently you're using a Service SAS) and then invoke the Get Account Information REST API using that token. Next, extract the x-ms-sku-name and x-ms-account-kind response headers. Based on these values you can work out which blob types the account supports. For example:
If the value of x-ms-account-kind is BlobStorage, the account only supports Block Blobs and Append Blobs.
If the value of x-ms-account-kind is not BlobStorage or BlockBlobStorage and the value of x-ms-sku-name is Premium_LRS, the account only supports Page Blobs.
I wrote a blog post some time ago where I created a matrix of features supported by account kinds and skus. You can read that blog post here: https://www.ais.com/how-to-choose-the-right-kind-of-azure-storage-account/
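For completeness, a small Python sketch of both approaches: a HEAD request against the SAS URI to read x-ms-blob-type, and a rough (deliberately non-exhaustive) mapping from the account-information headers to supported blob types, following the logic above:

```python
import urllib.error
import urllib.request


def blob_type(sas_uri):
    """HEAD the blob's SAS URI; if the blob exists, the x-ms-blob-type
    response header says BlockBlob, PageBlob or AppendBlob."""
    req = urllib.request.Request(sas_uri, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.headers.get("x-ms-blob-type")
    except urllib.error.HTTPError as e:
        if e.code == 404:
            return None  # blob does not exist; type cannot be read this way
        raise


def supported_blob_types(account_kind, sku_name):
    """Rough mapping from the Get Account Information headers
    (x-ms-account-kind, x-ms-sku-name) to supported blob types,
    following the logic above - not an exhaustive matrix."""
    if account_kind in ("BlobStorage", "BlockBlobStorage"):
        return {"BlockBlob", "AppendBlob"}
    if sku_name == "Premium_LRS":
        return {"PageBlob"}
    return {"BlockBlob", "AppendBlob", "PageBlob"}
```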

Azure LogicApp for files migration

I am trying to figure out if Azure LogicApp can be used for files/documents migration from Azure Blob Storage to a custom service, where we have REST API. Here is the shortlist of requirements I have right now:
Files/documents must be uploaded into Azure Storage weekly or daily, which means that we need to migrate only new items. The amount of files/documents per week is about hundreds of thousands
The custom service REST API is secured and any interaction with endpoints should have JWT passed in the headers
I did an exercise following the tutorials and everything seems fine, but the following 2 requirements make me worry:
Getting only new files and not migrating those that have already been moved
Getting JWT to pass security checks in REST
For the first point, I think that I can introduce a DB instance (for example Azure Table Storage) to track files that have been already moved, and for the second one I have an idea to use Azure Function instead of HTTP Action. But everything looks quite complicated and I believe that there might be better and easier options.
Could you please advise what else I can use for my case?
For the first point, you can use the "When a blob is added or modified" trigger as the logic app's trigger. Then it will only operate on the new or changed blob.
For the second point, just provide some steps for your reference:
1. First, use an HTTP action to request the token (this is how I have requested tokens in a logic app in the past).
2. Then use "Parse JSON" action to parse the response body from the "HTTP" action above.
3. After that, you can call your REST API with the access token from the "Parse JSON" action above.
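If you do move this into an Azure Function as you suggested, the three steps above can be sketched in Python as follows (all endpoints and credentials are placeholders for an OAuth2 client-credentials flow; adapt them to however your custom service issues JWTs):

```python
import json
import urllib.parse
import urllib.request

# Placeholders - adapt to your token endpoint, API and credentials.
TOKEN_URL = "https://login.example.com/oauth2/token"
API_URL = "https://api.example.com/documents"
CLIENT_ID = "client-id"
CLIENT_SECRET = "client-secret"


def bearer_header(token):
    """Build the Authorization header the custom REST API expects."""
    return {"Authorization": f"Bearer {token}"}


def get_token():
    # Step 1: request the token (mirrors the HTTP action in the logic app).
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }).encode()
    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data=body)) as r:
        # Step 2: parse the JSON response (the "Parse JSON" action).
        return json.load(r)["access_token"]


def upload_document(content: bytes, name: str):
    # Step 3: call the REST API with the token in the Authorization header.
    req = urllib.request.Request(
        f"{API_URL}/{name}",
        data=content,
        headers={**bearer_header(get_token()),
                 "Content-Type": "application/octet-stream"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as r:
        return r.status
```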

Download Task attachment during a pipeline run in Azure Devops

I have created a Task work item in my Azure DevOps project. This work item has an Excel file as an attachment. The Excel file contains the list of users to be created along with their permission sets. My pipeline has code that reads the Excel file and automates the user creation in the Salesforce org. I am currently stuck at a point where my pipeline expects this attachment to be in the workspace during execution.
Is there a way to fetch the attachment of a work item in the pipeline via Python? I did come across the below API to fetch it:
Attachments - Get
However, I am not able to access this via Python or Postman. It keeps responding with "Could not get any response".
Is there an easier way to feed the pipeline with the excel file that is present in the task?
You first need to get a personal access token (PAT); you can create one for yourself following this documentation.
Then use the token as the password with Basic authentication (select type = Basic Auth; the username can be left blank).
You also need to get your attachment ID. You can use this URL (put your work item ID after workitems):
https://dev.azure.com/<Your organization>/<Your project>/_apis/wit/workitems/<Work item id>?$expand=all&api-version=5.0
You will find your attachments in the relations collection.
You may also find this question valuable.
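Putting it together in Python (organization, project, work item ID and PAT are placeholders): fetch the work item with $expand=all, pull the attachment URLs out of the relations collection, then download each one with the same PAT.

```python
import base64
import json
import urllib.request

# Placeholders - substitute your own organization, project, work item and PAT.
ORG = "myorg"
PROJECT = "MyProject"
WORK_ITEM_ID = 123
PAT = "your-personal-access-token"
AUTH = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}


def attachment_urls(work_item):
    """Pull attachment download URLs out of the work item's 'relations' list."""
    return [rel["url"] for rel in work_item.get("relations", [])
            if rel.get("rel") == "AttachedFile"]


def download_attachments():
    url = (f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workitems/"
           f"{WORK_ITEM_ID}?$expand=all&api-version=5.0")
    with urllib.request.urlopen(urllib.request.Request(url, headers=AUTH)) as r:
        work_item = json.load(r)
    # Download each attachment next to the script; adjust names as needed.
    for i, att_url in enumerate(attachment_urls(work_item)):
        req = urllib.request.Request(att_url, headers=AUTH)
        with urllib.request.urlopen(req) as r, \
                open(f"attachment_{i}.xlsx", "wb") as f:
            f.write(r.read())
```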
