Failed to create on-demand Hadoop cluster in Azure Data Factory: additionalProperties is not active - azure

This is my first time trying out Azure Data Factory, so I hope this is not a bad question to ask.
I'm using the Azure portal to create an on-demand Hadoop cluster as one of the linked services in Azure Data Factory, following the steps in the tutorial. But whenever I click Create, the following error message pops up:
Failed to save HDinisghtLinkedService. Error: An additional property 'subnetName' has been specified but additionalProperties is not active. The relevant property is 'HDInsightOnDemandLinkedServiceTypeProperties'. The error occurred at the location 'body/properties/typeProperties' in the request.; An additional property 'virtualNetworkId' has been specified but additionalProperties is not active. The relevant property is 'HDInsightOnDemandLinkedServiceTypeProperties'. The error occurred at the location 'body/properties/typeProperties' in the request.
I couldn't understand why it requires 'subnetName' and 'virtualNetworkId', but I tried putting in values under Advanced Properties -> Choose VNet and Subnet -> From Azure subscription, entering the ID of an existing virtual network and a subnet name. The problem is still present, and the same error message shows up.
Other background information:
For the tutorial I posted above, I did not use its PowerShell code. I have an existing resource group and created a new storage account in the Azure portal.
I also created a new app registration in Azure Active Directory and retrieved the service principal application ID and authentication key following this link.
Some parameters:
Type: On-demand HDInsight
Azure Storage Linked Service: the one listed in the connection
Cluster size: 1 (for testing)
Service principal id/service principal key: described above
Version: 3.6
...
Any thoughts or anything I might be doing wrong?

From the error message, it clearly states that 'subnetName' is not an active property, which means it has not been created at all.
Note: If you want to create the on-demand cluster within your VNet, first create the VNet and subnet and then pass the following values.
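For reference, an on-demand HDInsight linked service definition with the VNet settings under typeProperties typically looks like the JSON below. This is a sketch based on the Data Factory documentation, not the exact payload from your factory, and every ID and name in it is a placeholder:

{
    "name": "HDInsightOnDemandLinkedService",
    "properties": {
        "type": "HDInsightOnDemand",
        "typeProperties": {
            "clusterType": "hadoop",
            "clusterSize": 1,
            "timeToLive": "00:15:00",
            "version": "3.6",
            "hostSubscriptionId": "<subscription id>",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service principal key>"
            },
            "tenant": "<tenant id>",
            "clusterResourceGroup": "<resource group name>",
            "linkedServiceName": {
                "referenceName": "<storage linked service name>",
                "type": "LinkedServiceReference"
            },
            "virtualNetworkId": "/subscriptions/<subscription id>/resourceGroups/<resource group name>/providers/Microsoft.Network/virtualNetworks/<vnet name>",
            "subnetName": "<subnet name>"
        }
    }
}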
Advanced properties are not mandatory for creating an on-demand cluster. Have you tried creating the on-demand cluster without passing the VNet and subnet?
Hope this helps. Do let us know if you have any further queries.

Related

Azure ADF using Azure Batch throws Shared Access Signature generation error

I am working on a simple Azure Data Factory pipeline where I have added a Batch Service activity and specified the Batch Service account (which I created through a linked service and whose connection test passes). In the command I am just running a simple "ls" command, and when I do a debug run I get this error: "Cannot create Shared Access Signature unless Account Key credentials are used." I have the following linked services: "Azure Batch", "Azure Blob Storage", and Key Vault (where we store the access key). All linked service connections are working properly.
Any help on how to fix this error: "Cannot create Shared Access Signature unless Account Key credentials are used."?
(Screenshots: Azure Batch linked service, Azure Storage linked service, Azure Data Factory pipeline.)
The issue happens because you used "Managed Identity" to connect ADF to the storage account. The connection test on the linked service will report success, but when this storage account is used by a Batch activity, the linked service needs the "Account Key" authentication type (see here).
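A storage linked service that uses account-key authentication can be defined with a connection string, for example as in the sketch below (the account name and key are placeholders; if you keep the key in Key Vault, it can be referenced from there instead of being inlined):

{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<storage account name>;AccountKey=<storage account key>;EndpointSuffix=core.windows.net"
        }
    }
}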

Azure DevOps: Service connection is not being recognized

I can't seem to authorize access to my Azure subscription in Azure DevOps to run a build whenever a commit is pushed to master. I keep getting the below error:
Also, when I click Authorize resources, it says the authorization was successful, but the next time I run the pipeline, I get the same exact error. I verified in Project settings -> Service connections that I have an active connection to the subscription.
How can I get around this issue? When I go to Deployment Center in Azure Functions and wire up the connection there, it creates a task-based pipeline, but I want to use yaml.
The above indicates that the azureSubscription you specified in your Azure Function deployment task does not exist, or that you do not have permission to use it.
If the service connection is already set up correctly but you still encounter the above error, you can follow the steps below to troubleshoot the issue.
1. Check your YAML pipeline.
The Azure subscription is validated at compile time. If you use variables to reference the Azure subscription in your YAML pipeline, you need to make sure the variables can be resolved at compile time.
You can check out this thread.
2. Check the service connection security settings.
Go to Project settings --> Service connections under Pipelines --> select your Azure service connection --> More settings (the 3 dots) --> Security --> try adding your pipeline to the Pipeline permissions list.
If the Azure subscription service connection is not set up yet, you need to create a service connection of the Azure Resource Manager type to connect to your Azure subscription. See the steps below:
1. Go to Project settings --> Service connections under Pipelines --> New service connection --> select Azure Resource Manager --> Next.
2. Then select the authentication method. If your Azure DevOps organization is connected to Azure AD, you can select Service principal (automatic) as the authentication method. This will automatically create a service principal in your Azure AD.
3. If you want to create the service principal yourself, you can select Service principal (manual). See the documents below on how to create a service principal in Azure:
Use the portal to create an Azure Active Directory application and a service principal that can access resources
Use Azure PowerShell to create an Azure service principal with a certificate
Then enter the related information in the service connection configuration page.
After your Azure subscription service connection is created, you can use it in your YAML pipeline tasks by specifying the service connection name. See the example below:
- task: AzureFunctionApp@1
  displayName: Azure Function App Deploy
  inputs:
    azureSubscription: myAzureSubscription # name of the service connection
    appName: myFunctionApp # example function app name
    package: $(System.DefaultWorkingDirectory)/**/*.zip # example package path
Note: You need to add the correct role assignment for the above service principal to enable it to deploy to your Azure resources.
You must create a new connection from the task itself (you may need to use the advanced options to add an existing service principal).
under "Azure subscription" click the name of the subscription you wish to use
Click the drop down next to "Authorize" and open advanced options
Click " use the full version of the service connection dialog."
Enter all your credentials and hit save
I spent a while trying to figure out why I was getting the same problem. I compared my YAML to another YAML file I had worked on previously and couldn't spot any problems, and I also verified the service connections.
But as @Levi Lu-MSFT mentions, verifying the YAML led me to find what caused my issue, so I thought I'd share it here even though it's not 100% related:
My variables weren't indented correctly. I was a bit tired and thought DevOps was just goofing with me. So verify that your YAML is set up properly; sometimes it's really small things that cause these issues.

Azure Data Explorer error when creating cluster: subscription '' is not registered

While working on this official tutorial, Create an Azure Data Explorer cluster and database, I am getting the following error when creating a cluster. Question: What may I be missing, and how can the issue be resolved?
Remarks:
I'm using Visual Studio Enterprise Subscription - MPN
My online search shows a similar error here, but the context seems different, since those error messages are about a subscription not being registered to use a namespace. I'm not sure whether that is relevant to my error.
{"code":"DeploymentFailed","message":"At least one resource deployment operation failed. Please list deployment operations for details. Please see https://aka.ms/DeployOperations for usage details.","details":[{"code":"SubscriptionNotRegistered","message":"The subscription 'a86d7e9f-210d-48e8-8f5e-528015d1c998' is not registered."}]}
Using the link provided in the error, I got the following:
When I click on the 'write cluster resource' link from the above screen:
The error occurs because you did not register the Kusto resource provider, as described here (for example, with the Az PowerShell cmdlet Register-AzResourceProvider -ProviderNamespace Microsoft.Kusto).
However, the first time you create a cluster on a given subscription and it fails because the provider is not registered, Kusto tries to register the provider for you. So if you try again, it should just work; if not, please follow the process in the link.

Want to create a VDI but getting the error below

I want to create a VDI in Azure, but I'm facing the following issue while creating a Windows host pool:
Deployment template validation failed: 'The provided value 'Microsoft.WindowsAzure.ResourceStack.Frontdoor.Common.Entities.TemplateGenericProperty`1[Newtonsoft.Json.Linq.JToken]' for the template parameter 'newOrExistingVnet' at line '152' and column '24' is not valid. The parameter value is not part of the allowed value(s): 'existing'.'.
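The message means the wizard passed a value that is not among the parameter's allowed values. The template presumably declares the parameter with something like the sketch below (not the actual template), where "existing" is the only value it will accept:

{
    "parameters": {
        "newOrExistingVnet": {
            "type": "string",
            "allowedValues": [
                "existing"
            ]
        }
    }
}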
This workaround assumes that you're getting this error in the portal:
First manually create the Resource Group and VNet.
Then within the VNet, add a service endpoint to e.g. Azure AD.
Next, rerun the wizard, this time choosing the Resource Group and VNet you previously created. The template should validate successfully.

Azure Databricks move Log Analytics

The Databricks VMs are pointing to the default Log Analytics workspace, but I want to point them to another one.
If I try to move the VMs to another workspace, it tells me that the scope is locked:
Error: cannot perform delete operation because following scope(s) are locked
Unfortunately, you are not allowed to move Log Analytics for the managed resource group created by Azure Databricks using the Azure portal.
Reason: By default, you cannot perform any write operation on the managed resource group created by Azure Databricks.
If you try to modify anything in the managed resource group, you will see this error message:
{"details":[{"code":"ScopeLocked","message":"The scope '/subscriptions/xxxxxxxxxxxxxxxx/resourceGroups/databricks-rg-chepra-d7ensl75cgiki' cannot perform write operation because following scope(s) are locked: '/subscriptions/xxxxxxxxxxxxxxxxxxxx/resourceGroups/databricks-rg-chepra-d7ensl75cgiki'. Please remove the lock and try again."}]}
Possible way: You can specify tags as key-value pairs while creating or modifying clusters, and Azure Databricks will apply these tags to the underlying cloud resources.
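For example, in a cluster definition JSON, tags go under custom_tags. This is a sketch; the cluster settings, tag key, and tag value below are all placeholders:

{
    "cluster_name": "example-cluster",
    "spark_version": "7.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "custom_tags": {
        "CostCenter": "analytics"
    }
}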
Possible way: Configure your Azure Databricks cluster to use the monitoring library.
This article shows how to send application logs and metrics from Azure Databricks to a Log Analytics workspace. It uses the Azure Databricks Monitoring Library.
Hope this helps.
