I need to find out who started a pipeline (triggered manually). In the Pipeline runs section there is no information about the user, only about the parent pipeline if applicable (the Triggered by column).
Am I missing something, or is this information not accessible?
EDIT:
More specifically, I would like to know who launched a pipeline run whose "Triggered by" column shows "Manual Trigger".
Yes, the process you are following is correct for checking who ran a pipeline in Azure Synapse, but because of a Synapse RBAC permission issue you do not have the required access.
Please follow the steps below to solve the permission issue:
Open Synapse Studio for your workspace, expand the Security section on the left, select Access control, and then select Add a Synapse role assignment.
Then check whether you can see your pipeline runs in Azure Synapse, as in the sketch below.
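If you prefer to script this instead of using Synapse Studio, here is a minimal sketch with the Az.Synapse PowerShell module; the workspace name, role, and user below are placeholders, not values from your environment:

Connect-AzAccount

# Grant a Synapse RBAC role that allows reading pipeline run details.
New-AzSynapseRoleAssignment -WorkspaceName "myworkspace" `
    -RoleDefinitionName "Synapse Monitoring Operator" `
    -SignInName "user@contoso.com"

# Verify the assignment was created.
Get-AzSynapseRoleAssignment -WorkspaceName "myworkspace"

"Synapse Monitoring Operator" is just one role that can read run details; pick whichever Synapse RBAC role fits your scenario (see the roles reference below).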
Reference:
https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/synapse-analytics/security/how-to-manage-synapse-rbac-role-assignments.md
https://learn.microsoft.com/en-us/azure/data-factory/monitor-visually
https://learn.microsoft.com/en-us/azure/synapse-analytics/security/synapse-workspace-synapse-rbac-roles
Related
Inside the Synapse Studio Access Control page, the screen shows:
Failed to load
Failed to load role assignments due to server error, error code 500. Refresh or troubleshoot the issue.
As far as I can tell, no one set up the Synapse Administrator role or any other roles within the studio itself.
I cannot add the Synapse Administrator role. When I attempt to, I get another 500 error:
"body": "{"error":{"code":"HttpWrapOperationAsyncFailed","message":"System.Exception : No RoleAssignment for workspace :
I've also tried it through the CLI and got a similar error there as well.
I'm afraid that something went wrong with the deployment of the resources related to this Synapse workspace, but I don't see anything wrong within the portal. Help?
I can't seem to authorize access to my Azure subscription in Azure DevOps to run a build whenever a commit is pushed to master. I keep getting the below error:
Also, when I click Authorize resources, it says the authorization was successful, but the next time I run the pipeline, I get the same exact error. I verified in Project settings -> Service connections that I have an active connection to the subscription.
How can I get around this issue? When I go to Deployment Center in Azure Functions and wire up the connection there, it creates a task-based pipeline, but I want to use yaml.
The above error indicates that the azureSubscription you specified in your Azure Function deployment task does not exist, or that you do not have permission to use it.
If the service connection is already set up correctly but you still encounter the above error, you can follow the steps below to troubleshoot the issue.
1. Check your YAML pipeline.
The Azure subscription is validated at compile time. If you use variables to reference the Azure subscription in your YAML pipeline, you need to make sure the variables can be resolved at compile time.
You can check out this thread.
2. Check the service connection security settings.
Go to Project settings --> Service connections under Pipelines --> select your Azure service connection --> More settings (the 3 dots) --> Security --> try adding your pipeline to the Pipeline permissions list.
If the Azure subscription service connection is not set up yet, you need to create a service connection of the Azure Resource Manager type to connect to your Azure subscription. See the steps below:
1. Go to Project settings --> Service connections under Pipelines --> New service connection --> select Azure Resource Manager --> Next.
2. Then select the authentication method. If your Azure DevOps organization is connected to Azure AD, you can select Service principal (automatic) as the authentication method. This will automatically create a service principal in your Azure AD.
3. If you want to create the service principal yourself, you can select Service principal (manual). See the documents below for creating a service principal in Azure:
Use the portal to create an Azure Active Directory application and a service principal that can access resources
Use Azure PowerShell to create an Azure service principal with a certificate
Then enter the related information on the service connection configuration page.
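If you go the manual route, here is a quick sketch with Az PowerShell; the display name is a placeholder, and the output property names are from recent Az.Resources versions, so check your module version:

# Create the AAD application and service principal in one step.
$sp = New-AzADServicePrincipal -DisplayName "my-devops-connection"

# Values to paste into the Service principal (manual) dialog.
$sp.AppId                            # Service Principal Id (client id)
$sp.PasswordCredentials.SecretText   # generated secret (recent Az versions)
(Get-AzContext).Tenant.Id            # Tenant Id
(Get-AzContext).Subscription.Id      # Subscription Id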
After your Azure subscription service connection is created, you can use it in your YAML pipeline task by specifying the service connection name. See the example below:
- task: AzureFunctionApp@1
  displayName: Azure Function App Deploy
  inputs:
    azureSubscription: myAzureSubscription  # name of the service connection
    appName: myFunctionApp                  # target function app
    package: $(System.DefaultWorkingDirectory)/**/*.zip
Note: You need to add the correct role assignment for the above service principal so that it can deploy to your Azure resources.
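For example, a hedged sketch that grants the service principal Contributor on the resource group holding the function app (the names are placeholders):

# Requires the Az module; scope the role as narrowly as you can.
New-AzRoleAssignment -ApplicationId "<service-principal-app-id>" `
    -RoleDefinitionName "Contributor" `
    -ResourceGroupName "my-function-rg"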
You must create a new connection from the task itself (you may need to use the advanced options to add an existing service principal).
under "Azure subscription" click the name of the subscription you wish to use
Click the drop down next to "Authorize" and open advanced options
Click " use the full version of the service connection dialog."
Enter all your credentials and hit save
I spent a while trying to figure out why I was getting the same problem. I compared my YAML to another YAML file I had worked on previously and couldn't spot any problems, and I also verified the service connections.
But as @Levi Lu-MSFT mentions, verifying the YAML led me to what caused my issue, so I thought I'd share it here even though it's not 100% related:
My variables weren't indented correctly. I was a bit tired and thought DevOps was just goofing with me. So verify that your YAML is properly set up; sometimes it's really small things that cause these issues.
Databricks VMs are pointing to the default Log Analytics workspace, but I want to point them to another one.
If I try to move the VMs to another workspace, it tells me that it's locked:
Error: cannot perform delete operation because following scope(s) are locked
Unfortunately, you are not allowed to move the Log Analytics configuration for the managed resource group created by Azure Databricks using the Azure portal.
Reason: By default, you cannot perform any write operation on the managed resource group created by Azure Databricks.
If you try to modify anything in the managed resource group, you will see this error message:
{"details":[{"code":"ScopeLocked","message":"The scope '/subscriptions/xxxxxxxxxxxxxxxx/resourceGroups/databricks-rg-chepra-d7ensl75cgiki' cannot perform write operation because following scope(s) are locked: '/subscriptions/xxxxxxxxxxxxxxxxxxxx/resourceGroups/databricks-rg-chepra-d7ensl75cgiki'. Please remove the lock and try again."}]}
Possible way: You can specify tags as key-value pairs while creating/modifying clusters, and Azure Databricks will apply these tags to the cloud resources.
Possible way: Configure your Azure Databricks cluster to use the monitoring library.
This article shows how to send application logs and metrics from Azure Databricks to a Log Analytics workspace. It uses the Azure Databricks Monitoring Library.
Hope this helps.
I tried following Quickstart: Run a Spark job on Azure Databricks using the Azure portal, as described at https://learn.microsoft.com/en-us/azure/azure-databricks/quickstart-create-databricks-workspace-portal
But when I later tried to delete the resource group for that Databricks resource, I got the following two errors:
Delete resource group databricks-rg-mydatabricksws-5mlo3dio7wef2 failed: The resource group databricks-rg-mydatabricksws-5mlo3dio7wef2 is locked and can't be deleted. Click here to manage locks for this resource group.
UnauthorizedApplicationId: "The management lock ... is owned by system application" See: https://aka.ms/arm-lock
Lock Deletion Failure: The lock named mydatabricksws was unable to be deleted for the following reasons: {"errorThrown":"Unavailable in batch","jqXHR":{"responseJSON":{"error":{"code":"UnauthorizedApplicationId","message":"The management lock 'mydatabricksws' is owned by system application(s) 'd9327919-6775-4843-9037-3fb0fb0473cb'.
I also encountered the same problem before, and I got the answer from this link.
Log into your Azure Databricks workspace as the account owner (the user who created the service), and click the user profile Account icon at the top right.
Select Manage Account.
In the Azure Databricks service, click Azure Delete and then OK.
You could also get an Azure Databricks code demo from this document.
Whenever I try to delete any pipeline from ADF, I get the following error message:
The scope '/subscriptions/<subscription_id>/resourcegroups/<RGName>/providers/Microsoft.DataFactory/datafactories/<ADF_Name>/datapipelines/HivePipe' cannot perform delete operation because following scope(s) are locked: '/subscriptions/<subscription_id>/resourceGroups/<Name of Resource Group>'. Please remove the lock and try again.
Can anyone guide me on how to delete unwanted pipelines using Azure portal?
Open your Azure Data Factory Blade
Click "Author and Deploy"
Expand Pipelines node
Right click and "Delete" the pipeline
Hope this helps.
It's much easier and faster to delete them using PowerShell. If you have dozens of pipelines, it takes a lot of time to do it from the UI:
PowerShell remove pipeline reference
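As a hedged sketch with the Az module (the resource group, factory, lock, and pipeline names are placeholders; the error above shows a v1 data factory, so the v1 cmdlet applies):

# List the locks on the resource group, then remove the one blocking you.
Get-AzResourceLock -ResourceGroupName "myResourceGroup"
Remove-AzResourceLock -LockName "myLock" -ResourceGroupName "myResourceGroup"

# Delete the pipeline from the (v1) data factory.
Remove-AzDataFactoryPipeline -ResourceGroupName "myResourceGroup" `
    -DataFactoryName "myDataFactory" -Name "HivePipe"

# For a Data Factory v2 instance, the cmdlet is Remove-AzDataFactoryV2Pipeline.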