A Synapse trigger created via the Azure CLI is missing in the Synapse UI

EDIT:
This is a Workspace that is connected to a git repository.
If I add triggers to a workspace that is in "Live" mode, the triggers will show. How can I add triggers to a workspace that is set up with a git repo?
Using Azure Synapse, I had some issues when publishing the workspace after creating and deleting some triggers.
I decided to delete all triggers in the UI, and then publishing worked fine.
To avoid having to recreate the triggers in the UI, I wanted to create them through the Azure CLI, but the triggers I create do not show up in the Synapse UI.
I am creating a Synapse trigger through the Azure CLI:
az synapse trigger create --workspace-name wsname --name triggername --file @"path/trigger.json"
Listing the triggers through the cli shows the newly created trigger:
$ az synapse trigger list --workspace-name wsname
[
  {
    "etag": "sometag",
    "id": "/subscriptions/subscription/resourceGroups/rg/providers/Microsoft.Synapse/workspaces/wsname/triggers/triggername",
    "name": "triggername",
    "properties": {
      "additionalProperties": null,
      "annotations": [],
      "delay": "00:00:00",
      "dependsOn": [],
      "description": null,
      "endTime": "2022-08-17T08:32:00+00:00",
      "frequency": "Hour",
      "interval": 24,
      "maxConcurrency": 50,
      "pipeline": {
        "parameters": null,
        "pipelineReference": {
          "name": null,
          "referenceName": "Pipelinename",
          "type": "PipelineReference"
        }
      },
      "retryPolicy": {
        "count": null,
        "intervalInSeconds": 30
      },
      "runtimeState": "Stopped",
      "startTime": "2022-08-17T06:32:00+00:00",
      "type": "TumblingWindowTrigger"
    },
    "resourceGroup": "rg",
    "type": "Microsoft.Synapse/workspaces/triggers"
  }
]
If I look in the Synapse UI, under Synapse > Manage > Triggers, it shows "No triggers to display..."

The answer was that the Synapse workspace was set up using a Git repository.
To add the triggers, it was as simple as adding them to the repository and pushing the changes.
If the workspace is set up in Live mode, the Azure CLI commands work fine.
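In my case, that meant committing the trigger definition as a JSON file under the trigger/ folder of the collaboration branch (the folder layout Synapse uses for git-connected workspaces), for example trigger/triggername.json. A rough sketch, reconstructed from the CLI listing above; the exact schema may differ slightly:
{
  "name": "triggername",
  "properties": {
    "type": "TumblingWindowTrigger",
    "typeProperties": {
      "frequency": "Hour",
      "interval": 24,
      "startTime": "2022-08-17T06:32:00Z",
      "endTime": "2022-08-17T08:32:00Z",
      "delay": "00:00:00",
      "maxConcurrency": 50,
      "retryPolicy": {
        "intervalInSeconds": 30
      }
    },
    "pipeline": {
      "pipelineReference": {
        "referenceName": "Pipelinename",
        "type": "PipelineReference"
      }
    }
  }
}
After pushing, the trigger shows up in the Synapse UI on that branch and can be published as usual.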

Related

ARM template - bad request failed when deploying a linked service

I am trying to deploy a Synapse instance via an ARM template, and the deployment is successful via the Azure DevOps portal. But when I try to deploy the same template with an Azure Key Vault linked service, I encounter the following error:
##[error]At least one resource deployment operation failed. Please list deployment
operations for details. Please see https://aka.ms/DeployOperations for usage details.
##[error]Details:
##[error]BadRequest:
After inspecting the activity logs from the Synapse instance I found out the following:
"resourceGroupName": "platform-test-group",
"resourceProviderName": {
"value": "Microsoft.Synapse",
"localizedValue": "Microsoft.Synapse"
},
"resourceType": {
"value": "Microsoft.Synapse/workspaces/linkedservices",
"localizedValue": "Microsoft.Synapse/workspaces/linkedservices"
},
"resourceId": "/subscriptions/xxxx-xxxx-xxxx-xxxx/resourcegroups/platform-test-group/providers/Microsoft.Synapse/workspaces/synapsedataapp/linkedservices/AzureKeyVault",
"status": {
"value": "Failed",
"localizedValue": "Failed"
},
"subStatus": {
"value": "NotFound",
"localizedValue": "Not Found (HTTP Status Code: 404)"
},
"submissionTimestamp": "2022-02-01T02:30:31.1471914Z",
"subscriptionId": "xxxx-xxxx-xxxx-xxxx",
"tenantId": "0f44c5d4-xxxx-xxxx-xxxxx",
"properties": {
"statusCode": "NotFound",
"serviceRequestId": null,
"statusMessage": "{\"error\":{\"code\":\"BadRequest\",\"message\":\"\"}}",
"eventCategory": "Administrative",
"entity": "/subscriptions/xxxx-xxxx-xxxx-xxxx/resourcegroups/platform-test-group/providers/Microsoft.Synapse/workspaces/synapsedataapp/linkedservices/AzureKeyVault",
"message": "Microsoft.Synapse/workspaces/linkedservices/write",
"hierarchy": "xxxx-xxxx-xxxx-xxxx/Enterprise/Group/Group-Test/xxxx-xxxx-xxxx-xxxx"
},
"relatedEvents": []
}
As you can see, the 404 error appears when the template tries to deploy to the tenant ID, which is not found; however, when I deploy the Key Vault linked service via the Synapse UI, I encounter no error.
Below is the code snippet that I use in my ARM template to deploy the Key Vault linked service to the Synapse instance:
{
  "name": "[concat(variables('workspaceName'), '/AzureKeyVault')]",
  "type": "Microsoft.Synapse/workspaces/linkedservices",
  "apiVersion": "2021-06-01-preview",
  "properties": {
    "annotations": [],
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": "https://data-test-kv.vault.azure.net/"
    }
  },
  "dependsOn": [
    "[variables('workspaceName')]"
  ]
}
Am I missing some kind of permission or connection that I need to enable? Why am I able to deploy successfully through the UI but not through the ARM template? Any comment or suggestion is greatly valued, so please feel free to comment or improve this question.
I had to contact Microsoft support and their reply was the following:
ARM templates cannot be used to create a linked service. This is because linked services are not ARM resources (unlike, for example, Synapse workspaces, storage accounts, or virtual networks); instead, a linked service is classified as an artifact. To still complete the task at hand, you will need to use the Synapse REST API or PowerShell. Below is the link that provides guidance on how to use the API: https://learn.microsoft.com/en-us/powershell/module/az.synapse/set-azsynapselinkedservice?view=azps-7.1.0
This limitation in ARM applies only to Synapse, and it might be fixed in the future.
Additional references:
https://feedback.azure.com/d365community/idea/05e41bf1-0925-ec11-b6e6-000d3a4f07b8
https://feedback.azure.com/d365community/idea/48f1bf78-2985-ec11-a81b-6045bd7956bb
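As a rough sketch of the PowerShell route (the Set-AzSynapseLinkedService cmdlet from the link above; the workspace name, linked service name, and definition file path are placeholders taken from this question):
# keyvault-linkedservice.json is a definition file containing the linked service
# payload, e.g. a "properties" block with type AzureKeyVault and its baseUrl
# (see the linked docs for the exact schema expected by the cmdlet).
Set-AzSynapseLinkedService `
    -WorkspaceName "synapsedataapp" `
    -Name "AzureKeyVault" `
    -DefinitionFile "keyvault-linkedservice.json"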
In Synapse, unlike ADF, linked services are not part of ARM templates. They are called artifacts, which comprise notebooks, Spark job definitions, linked services, pipelines, etc.
You can find the full article here: https://techcommunity.microsoft.com/t5/azure-synapse-analytics-blog/how-to-use-ci-cd-integration-to-automate-the-deploy-of-a-synapse/ba-p/2248060
In short: first deploy Synapse using ARM templates, and then set up the linked services:
- task: Synapse workspace deployment@1
  displayName: 'Setup:Synapse KeyVault Linked Service'
  inputs:
    TemplateFile: '$(Build.Repository.LocalPath)/TemplateForWorkspace.json'
    ParametersFile: '$(Build.Repository.LocalPath)/TemplateParametersForWorkspace.json'
    azureSubscription: '${{ parameters.environments.serviceConnectionId }}'
    ResourceGroupName: '$(computeResourceGroupName)'
    TargetWorkspaceName: '$(synapseWorkspaceName)'
    DeleteArtifactsNotInTemplate: true
    OverrideArmParameters: |
      synapseLinkedServiceKV: $(synapseLinkedServiceKV)
      workspaceName: $(synapseWorkspaceName)
    Environment: 'prod'
TemplateForWorkspace.json:
{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "workspaceName": {
      "type": "string"
    },
    "synapseLinkedServiceKV": {
      "type": "string"
    }
  },
  "variables": {
    "workspaceId": "[concat('Microsoft.Synapse/workspaces/', parameters('workspaceName'))]"
  },
  "resources": [
    {
      "name": "[concat(parameters('workspaceName'), '/' , parameters('synapseLinkedServiceKV'))]",
      "type": "Microsoft.Synapse/workspaces/linkedServices",
      "apiVersion": "2019-06-01-preview",
      "properties": {
        "type": "AzureKeyVault",
        "typeProperties": {
          "baseUrl": "[concat('https://', parameters('synapseLinkedServiceKV'), '.vault.azure.net/')]"
        },
        "annotations": [],
        "description": "Linked Service to Azure KeyVault. KeyVault is used to primarily fetch secrets"
      },
      "dependsOn": []
    }
  ]
}
TemplateParametersForWorkspace.json:
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "workspaceName": {
      "value": ""
    },
    "synapseLinkedServiceKV": {
      "value": ""
    }
  }
}
It deletes the existing artifacts and deploys the one above. You would first need to install the Synapse workspace deployment task extension in your Azure DevOps organization.
Note that the template above was auto-generated. In Synapse Studio, go to Git configuration and point it at your repo; it will commit the changes to the workspace_publish branch. You can copy and build on top of the specific artifact code.

Convert ADLS Gen1 to Gen2 at the time of ADF deployment

I am trying to do an ADF deployment from the production instance into the development instance in order to sync the recent changes from the PROD ADF.
In the production instance we have linked services connecting to both Gen1 ADLS and Gen2 ADLS.
But in DEV we only have Gen2 ADLS available.
Is there any way to convert/map the linked services pointing at Gen1 ADLS to Gen2 at the time of deployment, and vice versa?
Is there any other way to achieve the same, such as by replacing the linked service object in the ARM template?
I've run several tests, but it seems that we cannot change linked service attributes via ARM templates. You can refer to this post.
The Gen1 linked service is as follows:
{
  "name": "[concat(parameters('factoryName'), '/AzureBlobStorage1')]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "annotations": [],
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "[parameters('AzureBlobStorage1_connectionString')]"
    }
  },
  "dependsOn": []
}
The Gen2 linked service is as follows:
{
  "name": "[concat(parameters('factoryName'), '/AzureDataLakeStorage1')]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "annotations": [],
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "[parameters('AzureDataLakeStorage1_properties_typeProperties_url')]",
      "accountKey": {
        "type": "SecureString",
        "value": "[parameters('AzureDataLakeStorage1_accountKey')]"
      }
    }
  },
  "dependsOn": []
}
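What can be overridden at deployment time are the parameter values the template already exposes, for example pointing the Gen2 linked service at a different storage account per environment via the parameters file (illustrative values only; the account URL below is a placeholder):
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "AzureDataLakeStorage1_properties_typeProperties_url": {
      "value": "https://devdatalakeaccount.dfs.core.windows.net/"
    }
  }
}
The linked service type itself (AzureBlobStorage vs. AzureBlobFS) is not exposed as a parameter, which is why the Gen1-to-Gen2 conversion cannot be done this way.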

Azure Data Factory CI/CD for Schedule Triggers Not Working

For triggers, it looks like only the pipelines, pipeline, and typeProperties blocks can be overridden, based on the documentation.
What I want to achieve, using my CI/CD process and the override parameters functionality, is to have a schedule trigger disabled in the target ADF, unlike in my source ADF.
If I inspect the JSON of a trigger, it looks like the following field could do the trick: "runtimeState": "Started".
{
  "name": "name_daily",
  "properties": {
    "description": " ",
    "annotations": [],
    "runtimeState": "Started",
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "name",
          "type": "PipelineReference"
        }
      }
    ],
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2020-05-05T13:01:00.000Z",
        "timeZone": "UTC",
        "schedule": {
          "minutes": [
            1
          ],
          "hours": [
            13
          ]
        }
      }
    }
  }
}
But if I attempt to add it in the JSON file like this:
"Microsoft.DataFactory/factories/triggers": {
"properties": {
"runtimeState": "-",
"typeProperties": {
"recurrence": {
"interval": "=",
"frequency": "="
}
}
}
}
it never shows up in the Override section in Azure Pipeline Releases.
Does this ADF CI/CD functionality exist for triggers? How can I achieve my target here?
It turns out runtimeState for triggers is not honored in arm-template-parameters-definition.json.
The path is clearer after some more research: I can achieve what I want either by editing the PowerShell script Microsoft has provided or by using an ADF custom task from the Azure DevOps marketplace.
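For example, a post-deployment step along these lines (a sketch using the Az.DataFactory module; the resource group and factory names are placeholders, the trigger name is taken from the JSON above) leaves the trigger stopped in the target factory:
# Stop (disable) the schedule trigger in the target data factory after the ARM deployment.
Stop-AzDataFactoryV2Trigger `
    -ResourceGroupName "target-rg" `
    -DataFactoryName "target-adf" `
    -Name "name_daily" `
    -Force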

Executing custom activity on Azure Data Factory Pipeline

I am creating a simple pipeline in Data Factory that should only run a custom activity. The deployment template for the pipeline looks like this:
{
  "type": "pipelines",
  "name": "MyCustomActivityPipeline",
  "dependsOn": [
    "DataFactoryName",
    "AzureBatchLinkedService"
  ],
  "apiVersion": "[variables('api-version')]",
  "properties": {
    "description": "Custom activity sample",
    "activities": [
      {
        "type": "Custom",
        "name": "MyCustomActivity",
        "linkedServiceName": {
          "referenceName": "AzureBatchLinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "command": "cmd /c echo hello world"
        }
      }
    ]
  }
}
Additionally, I have created all the resources needed: the Batch account with pools and the storage account. All the resources are in the same resource group and subscription. I try to trigger the pipeline using the console command
Invoke-AzureRmDataFactoryV2Pipeline -DataFactory "DataFactory" -PipelineName "PipelineName" -ResourceGroupName "ResourceGroupName"
I am getting this error:
Activity MyCustomActivity failed: Can not access user batch account, please check batch account setiings.
Has anyone ever experienced such an error from ADF execution of a pipeline? The weird part is that all the resources have access to each other and are within the same resource group and subscription.
Please check the settings for the storage linked service used by the Batch linked service. Make sure the connection string type is SecureString.
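For reference, a storage linked service with an explicit SecureString connection string looks roughly like this (the account name and key are placeholders):
{
  "name": "AzureStorageLinkedService",
  "properties": {
    "type": "AzureStorage",
    "typeProperties": {
      "connectionString": {
        "type": "SecureString",
        "value": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
      }
    }
  }
}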

Execute U-SQL script in ADL storage from Data Factory in Azure

I have a U-SQL script stored in my ADL store and I am trying to execute it. The script file is quite big, about 250 MB.
So far I have a Data Factory, I have created a linked service, and I am trying to create a Data Lake Analytics U-SQL activity.
The code for my U-SQL activity looks like this:
{
  "name": "RunUSQLScript1",
  "properties": {
    "description": "Runs the USQL Script",
    "activities": [
      {
        "name": "DataLakeAnalyticsUSqlActivityTemplate",
        "type": "DataLakeAnalyticsU-SQL",
        "linkedServiceName": "AzureDataLakeStoreLinkedService",
        "typeProperties": {
          "scriptPath": "/Output/dynamic.usql",
          "scriptLinkedService": "AzureDataLakeStoreLinkedService",
          "degreeOfParallelism": 3,
          "priority": 1000
        },
        "policy": {
          "concurrency": 1,
          "executionPriorityOrder": "OldestFirst",
          "retry": 3,
          "timeout": "01:00:00"
        },
        "scheduler": {
          "frequency": "Day",
          "interval": 1
        }
      }
    ],
    "start": "2017-05-02T00:00:00Z",
    "end": "2017-05-02T00:00:00Z"
  }
}
However, I get the following error:
Error
Activity 'DataLakeAnalyticsUSqlActivityTemplate' from pipeline 'RunUSQLScript1' has no output(s) and no schedule. Please add an output dataset or define activity schedule.
What I would like is to have this activity run on demand, i.e. I do not want it scheduled at all, and I also do not understand what inputs and outputs would be in my case. The U-SQL script I am trying to run operates on millions of files in my ADL storage and saves them after some modification of the content.
Currently, ADF does not support running a U-SQL script stored in ADLS for a U-SQL activity; i.e., the "scriptLinkedService" under "typeProperties" has to be an Azure Blob Storage linked service. We will update the documentation for the U-SQL activity to make this clearer.
Support for running U-SQL scripts stored in ADLS is on our product backlog, but we don't have a committed date for this yet.
Shirley Wang
Currently ADF does not support executing the activity on demand, and it needs to be configured with a schedule. You will need at least one output to drive the scheduled execution of the activity. The output can be a dummy Azure Storage dataset that doesn't actually write any data out; ADF leverages its availability properties to drive the scheduled execution. For example:
{
  "name": "OutputDataset",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "AzureStorageLinkedService",
    "typeProperties": {
      "fileName": "dummyoutput.txt",
      "folderPath": "adf/output",
      "format": {
        "type": "TextFormat",
        "columnDelimiter": "\t"
      }
    },
    "availability": {
      "frequency": "Day",
      "interval": 1
    }
  }
}
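The U-SQL activity then references that dummy dataset in its outputs block, roughly like this (the dataset name matches the one above):
"outputs": [
  {
    "name": "OutputDataset"
  }
]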
