Azure DevOps Extension Process on Delete

I'm developing an Azure DevOps extension. When a client installs the extension, they can register in the Azure hub, and their account is added to my database.
When a client uninstalls the extension, their account should also be removed from my database.
How can I add a process to an Azure DevOps extension that is triggered when the extension is uninstalled/removed?

I don't think there is a supported API for this, but you can see uninstalls in the Marketplace portal. You could poll that page, or figure out the underlying API it uses, but any integration against these APIs is unsupported:
https://marketplace.visualstudio.com/manage/publishers/{PublisherID}/extensions/{ExtensionID}/hub?_a=uninstall
Also, remember that people uninstall and reinstall extensions for troubleshooting purposes, and they may need to reinstall as part of migration/upgrade scenarios; in both cases their assumption will likely be that no data is lost in the process.
It's probably best to ask for contact details upon registration, monitor usage, and warn that data will be removed after X days of no usage.
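If you go the retention route, the cleanup can be a small scheduled job. Below is a minimal sketch, assuming a hypothetical accounts table with a last_seen timestamp in a SQLite database; the schema, database, and 90-day window are placeholders, not anything prescribed by Azure DevOps.

```python
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # the "X days of no usage" you warned users about


def purge_stale_accounts(db_path: str) -> int:
    """Delete accounts whose last recorded activity is older than the cutoff."""
    cutoff = datetime.utcnow() - timedelta(days=RETENTION_DAYS)
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM accounts WHERE last_seen < ?",
            (cutoff.isoformat(),),
        )
        return cur.rowcount  # number of accounts removed


# Run from a daily scheduler (cron, an Azure Functions timer trigger, etc.).
if __name__ == "__main__":
    print(f"Removed {purge_stale_accounts('extension.db')} stale accounts")
```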

Related

Programmatically Set Up Azure Cloud Shell

Is there a way to programmatically configure Azure Cloud Shell without having to launch it and have it create a new storage account and file share (or go manually configure it to use existing resources)?
I'd like an Azure CLI script (or Terraform) to create the storage account and file share for me, and have Azure Cloud Shell detect and use these resources, so that when the Cloud Shell button is clicked, no configuration is needed.
This is needed whenever a new account is created or the resource group that contains the Cloud Shell storage account is deleted.
I believe the ability to programmatically configure Cloud Shell to detect and select a particular storage account and file share is currently not supported, so I would recommend raising this as a feature request in the UserVoice/feedback forum.
However, if the actual use case and requirement is "to access Cloud Shell (https://shell.azure.com/) without any configuration, or at least by configuring it only once per subscription (which may be done manually)", then a feature request for exactly that has already been raised in the UserVoice/feedback forum. If you are interested, I recommend upvoting it and any other features of interest.
In general, the Azure feature or product team will check the feasibility of a feature request, triage it, prioritize it against the existing feature backlog, add it to the roadmap as appropriate, and announce and/or update the related Azure documentation once the feature request is addressed.
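For what it's worth, the resource-creation half of the question is easy to script; it's only the detection half that isn't supported. Here is a minimal sketch with the Python management SDK (azure-mgmt-storage), using placeholder names throughout; note that even tagging the account the way Cloud Shell tags its own accounts does not make Cloud Shell auto-detect it:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "cloud-shell-rg"      # placeholder

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create the storage account Cloud Shell would otherwise create on first launch.
account = client.storage_accounts.begin_create(
    RESOURCE_GROUP,
    "cloudshellstore123",  # placeholder; storage account names are globally unique
    {
        "location": "eastus",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},
        # Tag Cloud Shell puts on accounts it creates itself; setting it here
        # does NOT make Cloud Shell detect and select this account.
        "tags": {"ms-resource-usage": "azure-cloud-shell"},
    },
).result()

# Create the file share Cloud Shell would mount as its clouddrive.
client.file_shares.create(RESOURCE_GROUP, account.name, "cloudshellshare", {})
```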

Azure Update Management

I am having an issue with my Azure Update Management not pushing the updates to the Windows VM.
This was working before I domain joined to Azure Active Directory Domain Services.
From what I have read on the Microsoft site, this looks like a known issue, but I wanted to ask here to see if anyone has found a workaround. It looks like it is not selecting the updates, even though the schedule is set up to apply all updates.
Things I have tried:
Recreated the service account.
Deleted and recreated the Automation account.
Deleted and set up the Update Management configuration again.
Verified the updates do work manually.
I am relatively new to Azure, so I apologize for the lack of knowledge.
I have since found the answer: the worker agent from Azure has to be reinstalled if it was installed prior to the domain join.

Azure DevOps access log

I am looking for access logs for Azure DevOps to:
1) List the time and date at which authorized users accessed the code repository
2) List the changes made across the repository, and by whom
3) Assuage audit fears of unauthorized users downloading the code
It looks like auditing capabilities are slotted in the roadmap, but I need something now. I tried using the Azure portal's activity logs, but I get zero results for Azure DevOps events.
Note: we do not use Active Directory integration yet.
Any help is greatly appreciated.
For auditing repo changes: every write operation in source control is part of its history, so the repository itself records what changed and who changed it.
For limiting read access: you already know the solution, because you said you aren't using it yet: Azure AD. Limit access to members of your organization.
For auditing access: as you said, there is no solution yet; it's on the backlog because there is currently no way to do it.
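For point 2 specifically, you can pull "who changed what, and when" out of the repository today via the Azure DevOps REST API. A minimal sketch using the Git pushes endpoint and a personal access token; the organization, project, and repository names are placeholders:

```python
import base64
import requests

# Placeholders - substitute your own organization, project, repository, and PAT.
ORG, PROJECT, REPO = "my-org", "my-project", "my-repo"
PAT = "<personal-access-token>"

url = (
    f"https://dev.azure.com/{ORG}/{PROJECT}"
    f"/_apis/git/repositories/{REPO}/pushes?api-version=6.0"
)
# Azure DevOps PATs are sent as HTTP basic auth with an empty user name.
auth = base64.b64encode(f":{PAT}".encode()).decode()
resp = requests.get(url, headers={"Authorization": f"Basic {auth}"})
resp.raise_for_status()

# Each push records who pushed and when.
for push in resp.json()["value"]:
    print(push["date"], push["pushedBy"]["displayName"], f'push {push["pushId"]}')
```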

Unable to save Policy in Azure APIM

I had been able to work in Azure APIM with no problems until yesterday. Another member of my team can edit and save with no problems, but my save to an Inbound processing rule always fails with:
Could not save policy for "Access API 1.2" API. Please try again later.
Thoughts?
Of note:
Our company's security access team verified that I am a Contributor on APIM.
I log in to Azure through the company's two-factor authentication system.
Same results on Edge and Chrome.
I can update individual endpoint API policies.
Our company opened a Microsoft Support ticket on this, and their response was:
You are running into a known issue with APIM integration with ARM. The dev team is working on a fix for this issue now and we are told it will get deployed by this evening.
The following day it was working for me:
The APIM dev team fixed the issue late yesterday and you should now see the ability to update policies for the API scope too.
Note to anyone running into this situation in the future: the secondary advice given revolved around the browser, which was:
Make sure you're not pulling down cached files. Try loading an in-private session or press CTRL+F5 to refresh the page and pull down new files.

What is the best practice for updating an already existing web app deployment using ARM?

My company developed an Azure Resource Manager-based solution that deploys a set of resources (essentially a Storage, SQL DB and Web App), and it is already implemented as our provisioning process for new customers.
However, we are now studying the best way to perform updates, and one of the hypotheses we are considering is having a specific template that updates the binaries of this application.
The idea is to have a separate template that contains only the web app, an app host, and an MSDeploy resource that gets the latest version of our package and re-uploads it to that web app.
The only problem I see with this solution is handling any configuration changes required by a newer version of the binaries. We do not want users to have to re-enter the parameters they provided for the original deployment (done via a Deploy to Azure button), so any configuration will have to be performed within the application; the plan is for it to use the Microsoft.WindowsAzure.Management.WebSites library.
The major limitation of using Microsoft.WindowsAzure.Management.WebSites is that you are restricted to authenticating with either a certificate or a service principal. Ideally, we would like the updates to require no authentication other than the one you provide when deploying the update.
Is there any recommendation of best practices to follow for this kind of scenario?
Thank you.
Link to the equivalent discussion on TechNet
It is possible to update via ARM templates alone.
For example, connection strings can be added to the application settings automatically, even while creating the dependent resources themselves (e.g., a storage account connection string).
Only the first-time creation of your web sites will take a bit more time, something like 30 seconds.
ARM will not destroy your web apps if they already exist; it will only update them.
If there are no changes, the deployment is very fast.
If your changes require a new app settings parameter, you can add it to the ARM template, check it in to your repository, and the next deployment will pick it up and update the web app.
So there is no need for anyone to log in and update manually.
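To make the "redeploy and let ARM reconcile" flow concrete, here is a minimal sketch that redeploys a template in incremental mode with the Python SDK (azure-mgmt-resource); the subscription, resource group, template file, and parameter names are placeholders:

```python
import json
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "customer-rg"         # placeholder

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# The update template checked into the repository (placeholder file name).
with open("azuredeploy.json") as f:
    template = json.load(f)

# Incremental mode leaves existing resources in place and only applies the
# differences, so an unchanged web app is not recreated and any new app
# settings in the template are picked up on the next deployment.
poller = client.deployments.begin_create_or_update(
    RESOURCE_GROUP,
    "webapp-update",
    {
        "properties": {
            "mode": "Incremental",
            "template": template,
            "parameters": {"webAppName": {"value": "contoso-app"}},  # placeholder
        }
    },
)
print(poller.result().properties.provisioning_state)
```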
Our final decision was to give up on using ARM exclusively. The service principal solution, through the SDK, would allow us to use a WebJob or a site extension to perform (automatic or prompted) updates that include configuration changes. However, it would require "too many" privileges; why would a customer trust an application that can, at will, create new resources or update existing ones and increase their Azure bill?
The decision was made to use PowerShell only for updates: if the customer can see the scripts and authenticate themselves, this is not a concern. Sadly, this increases update complexity, but we found it to be a necessary evil.