Publishing a database project with Always Encrypted and Azure Key Vault

I have an existing Azure SQL database. We are developing a new feature that needs column-level encryption, and I am researching how to set it up.
We currently push changes to our local environments and to production via a database project: a simple right-click > Publish.
When I do this in my local environment after setting up the column encryption with the key stored in Azure Key Vault, I receive the following error.
Cannot proceed as Key Vault support is not present in the current
application. For Key Vault support during deployment, install
DacFramework.msi and run SqlPackage.exe from its install location.
I've done that and still receive the same error.
Is it possible to keep the right-click > Publish functionality with column-level encryption, or will I need to create an external script to handle this from now on?

It is possible that the newly updated hosted agent has a newer version of the DAC framework installed, which is taking precedence over the full version that was installed separately, or that there is some other discrepancy between the two and your system is trying to use the older version of the framework. It is also possible that some of your VMs have the framework and others do not, and that this is causing the error.
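One way to rule out an older copy shadowing the newer one is to invoke the DacFramework-installed SqlPackage.exe by its full path. A minimal sketch, assuming the typical default install path; the server, database, and file names are placeholders for your own values:

```shell
:: Call the DacFramework copy of SqlPackage.exe explicitly so that an older
:: version earlier on PATH cannot shadow it. All names are placeholders.
"C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe" ^
  /Action:Publish ^
  /SourceFile:MyDatabase.dacpac ^
  /TargetServerName:myserver.database.windows.net ^
  /TargetDatabaseName:MyDatabase ^
  /TargetUser:myadmin /TargetPassword:%SQL_PASSWORD%
```

If that succeeds where right-click > Publish fails, the problem is which copy of the framework Visual Studio is loading, not the Key Vault configuration itself.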
There is a known issue with that error message that the product team is working on; see also the UserVoice item.
Can you provide any of your error logs for further troubleshooting?
I would also recommend sending an email with your subscription ID to AzCommunity@microsoft.com so that I can open a support case for you. It will be easier to diagnose with more insight into your environment.

Related

Signing files with Azure Key Vault + AzureSignTool

I have a program (an .msi file) that is built on a remote machine I have no physical access to. The program is distributed to users on Windows machines; however, to prevent the Windows SmartScreen popup, I need to sign the installer with an EV code-signing certificate.
To implement code signing on the remote machine I am using right now, I was thinking about using Azure Key Vault along with AzureSignTool.exe; however, I am not sure whether this setup is reasonable. Has anybody had experience with such a setup who could let me know how well it works?
I have also seen that Azure Key Vault has a Managed HSM Pool option, but I couldn't find much information on what that is. Am I ever going to need it, or can I simply ignore it?
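For reference, an AzureSignTool invocation against a standard Key Vault looks roughly like the sketch below. The vault URL, service-principal credentials, certificate name, timestamp server, and file name are all placeholders, and this assumes the signing certificate lives in a regular vault rather than a Managed HSM:

```shell
# Sign an MSI using a code-signing certificate stored in Azure Key Vault.
# Every identifier below is a placeholder for your own values.
azuresigntool sign \
  --azure-key-vault-url "https://my-vault.vault.azure.net" \
  --azure-key-vault-tenant-id "<tenant-id>" \
  --azure-key-vault-client-id "<sp-app-id>" \
  --azure-key-vault-client-secret "<sp-secret>" \
  --azure-key-vault-certificate "my-ev-cert" \
  --timestamp-rfc3161 "http://timestamp.digicert.com" \
  --file-digest sha256 \
  MyInstaller.msi
```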

Azure LinkedAuthorizationFailed on Active Directory Account / Key Vault Authentication when running DevOps Server Deployment Template

I am trying to deploy an on-premises instance of Azure DevOps Server to a VM in an Azure Government subscription (which, by its nature, does not appear to support the standard Azure DevOps service).
This template is referenced within support material directly from Microsoft:
https://github.com/usri/deploy_DevOps_Server_AzureSQL
All the referenced resources were created from scratch for the purpose of getting this server running.
This requires an AAD account with the associated password stored in a Key Vault. However, every attempt I make to run the template returns the following error on the 'Write VirtualMachines' step (when all other components pass):
The client has permission to perform action 'Microsoft.Compute/images/read' on scope '(MY_SUBSCRIPTION)\(MY_RESOURCEGROUP)\(VM)', however the current tenant '(MY_KEYVAULT)' is not authorized to access linked subscription '(ID in the template with the deployment files)'
This looks to me like the password cannot be retrieved from Key Vault. Is it a formatting issue with the secret, or an access-control issue somewhere? I've tried many combinations of both. Hopefully this is just a trivial issue.
I am the original author of the code in that repo. I went ahead and merged a pull request into that repo which should address your issue. I did the following:
Updated the ReadMe file to include information on creating the image
Updated the azuredeploy.json with parameters for Key Vault & image references
Updated the .ps1 file to eliminate hard-coded Key Vault references (a particularly bad oversight on my part; my apologies).
Updated and tested everything for the latest version of Azure DevOps Server 2020
This should fix your issue and several other related ones. I retested the entire deployment from scratch and it worked as designed. A couple of other quick notes:
The USRI and all of its repositories, including the one used here, are not official Microsoft repositories. They represent an open-source Azure community dedicated to regulated-entity customers. The members who contribute there are mostly Microsoft employees, and the repos themselves just represent interesting and sometimes niche templates that might be of interest.
This particular repo shows a manner in which Azure templates could be used to deploy services when no internet connection is available or permitted. I just used Azure DevOps Server because it was interesting and regulated industry customers use it.
All the best

Provisioning Mode fails for existing app but works on a newly created one

I am trying to edit the provisioning mode of an Azure enterprise app through SCIM but keep getting the error below:
Funnily enough, I created another app to test the same credentials, and they work there. I am wondering what could be wrong with the current app. Below you can see that I am not even able to change the provisioning mode to manual; it's stuck on automatic:
Anyone familiar with this in Azure deployment?
That error shows when you have customized your mappings/schema and set something invalid, such as including Id as a target attribute, or you have changed the matching (primary-key) attributes in the advanced settings.
Select the check box to restore the default attribute mappings, then clear the current state, restart provisioning, and save.
Got it; that design needs to be improved. You can use Microsoft Graph to delete the job: sign in to Graph Explorer as a global admin, delete the job, and you will be able to authorize access again and get the default configuration on the old app. https://learn.microsoft.com/en-us/graph/api/synchronization-synchronizationjob-delete?view=graph-rest-beta&tabs=http
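If the command line is more convenient than Graph Explorer, the same call can be made with az rest, which obtains the Graph token for you. A sketch, where both IDs are placeholders: the object ID of the enterprise app's service principal, and the synchronization job's ID, which you can list first:

```shell
# List the synchronization jobs on the enterprise app's service principal,
# then delete the stuck one. Both IDs below are placeholders.
az rest --method GET \
  --url "https://graph.microsoft.com/beta/servicePrincipals/<sp-object-id>/synchronization/jobs"

az rest --method DELETE \
  --url "https://graph.microsoft.com/beta/servicePrincipals/<sp-object-id>/synchronization/jobs/<job-id>"
```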

No applications found - Azure Blockchain workbench

Yesterday I created a new blockchain workbench guided by the docs on: https://learn.microsoft.com/en-us/azure/blockchain/workbench/deploy
The main issue is that when I go to the web app I can't see anything; it only shows: "No applications found. Contact the Workbench administrator to get access to an application."
The funny thing is that I added a user to work along with me (we are both owners and admins). He can access the resource and create apps, and he can see me as a member (I only see myself). Another thing to add: he created a contract, which appears for me in Workbench and lets me take actions on any instance of the contract (as permitted by the role I've been given).
How did I lose my privileges/permissions (if I did), given that I created the workbench?
I assume you have already checked the access level for your own ID and confirmed that no one else deleted or downgraded your access by mistake.
The only other option is to raise a ticket with Azure support, or to recreate the scenario and track where, and at which step, it fails.

What is the best practice for updating an already existing web app deployment using ARM?

My company developed an Azure Resource Manager-based solution that deploys a set of resources (essentially a Storage, SQL DB and Web App), and it is already implemented as our provisioning process for new customers.
However, we are now studying the best way to perform updates, and one of the hypotheses we are considering is having a specific template that updates the binaries of this application.
The idea is to have a separate template, that only has the web app, an app host and a MSDeploy resource that gets the latest version of our package and reuploads it to that web app.
The only problem I see with this solution is handling any configuration changes needed by a newer version of the binaries. We do not want users to have to re-enter the parameters they provided for the original deployment (done via a Deploy to Azure button), so any configuration will have to be performed within the application; the plan is for it to use the Microsoft.WindowsAzure.Management.WebSites library.
The major limitation with using Microsoft.WindowsAzure.Management.WebSites is that you are restricted to authenticating with either a certificate or a service principal. Ideally we would like to find a way for the updates to not need any authentication other than the one you provide when you are deploying the update.
Is there any recommendation of best practices to follow for this kind of scenario?
Thank you.
Link to the equivalent discussion on TechNet
It is possible to update via ARM templates alone.
For example, connection strings can be added to the application settings automatically, even while the dependent resources themselves are being created (e.g. a storage account connection string).
Only the first-time creation of your web sites takes a bit more time, something like 30 seconds. ARM will not destroy your web apps if they already exist; it only updates them. If there are no changes, the deployment is very fast.
If your changes require a new app-settings parameter, you can add it to the ARM template and check it in to your repository; the next deployment will pick it up and update the web app, so there is no need for anyone to log in and update it manually.
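That workflow amounts to re-running the same deployment in incremental mode (the default), which updates existing resources in place instead of recreating them. A sketch, assuming the current az CLI rather than the older AzureRM cmdlets; the resource group, template, and parameter file names are placeholders:

```shell
# Redeploy the same template after checking in a change (e.g. a new app
# setting). Incremental mode leaves resources not in the template alone
# and updates the existing web app in place. All names are placeholders.
az deployment group create \
  --resource-group my-app-rg \
  --template-file azuredeploy.json \
  --parameters @azuredeploy.parameters.json \
  --mode Incremental
```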
Our final decision was to give up on using ARM exclusively. The service-principal solution, through the SDK, would allow us to use a WebJob or a site extension to perform (automatic or prompted) updates that included configuration changes. However, it would require too many privileges: why would a customer trust an application that can, at will, create new resources or update existing ones and increase their Azure bill?
The decision was made to use PowerShell only for updates; if the customer can see the scripts and authenticates himself, this is not a concern. Sadly, this increases update complexity, but we found it to be a necessary evil.
