I am migrating an Azure DevOps organization to another organization, and I ran into a problem where I cannot create a pipeline that was created the classic way (NOT YAML).
This is the JSON
{
"name": "PP_NAME",
"folder": "",
"configuration": {
"designerJson": {
"options": [
{
"enabled": false,
"definition": {
"id": "DEF_ID"
},
"inputs": {
"branchFilters": "[\"+refs/heads/*\"]",
"additionalFields": "{}"
}
},
{
"enabled": false,
"definition": {
"id": "DEF_ID"
},
"inputs": {
"workItemType": "Task",
"assignToRequestor": "true",
"additionalFields": "{}"
}
}
],
"variables": {
"system.debug": {
"value": "false",
"allowOverride": true
}
},
"tags": [],
"process": {
"phases": [
{
"name": "Agent job 1",
"refName": "Job_1",
"condition": "succeeded()",
"target": {
"executionOptions": {
"type": 0
},
"allowScriptsAuthAccessOption": false,
"type": 1
},
"jobAuthorizationScope": "project"
}
],
"target": {
"agentSpecification": {
"identifier": "windows-2019"
}
},
"type": 1
},
"quality": "definition",
"path": "\\",
"repository": {
"id": "REPOSITORY_ID",
"name": "test 1",
"type": "TfsGit"
}
},
"path": "\\",
"type": "designerJson"
}
}
The output:
{
"$id": "1",
"innerException": null,
"message": "This API does not support creating pipelines of configuration type DesignerJson.",
"typeName": "Microsoft.Azure.Pipelines.WebApi.UnsupportedConfigurationTypeException, Microsoft.Azure.Pipelines.WebApi",
"typeKey": "UnsupportedConfigurationTypeException",
"errorCode": 0,
"eventId": 3000
}
According to the Azure DevOps documentation, it is possible to create a pipeline the classic way.
Thanks!
To create a classic pipeline, you can use the REST API Definitions - Create.
If you are not sure about the request body, you can use the REST API Definitions - Get to retrieve the definition of an existing classic pipeline as a reference.
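For example, a rough sketch of that flow (reading an existing classic definition from the source organization and re-creating it in the target) could look like the Python below; the organization URLs, project name, definition ID 42, the PAT, and the exact set of fields to strip before re-creating are assumptions you would need to adapt:
import base64
import requests

# Placeholders - replace with your own values.
SOURCE_ORG = "https://dev.azure.com/source-org"
TARGET_ORG = "https://dev.azure.com/target-org"
PROJECT = "MyProject"
PAT = "<personal access token>"

headers = {
    "Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode(),
    "Content-Type": "application/json",
}

# Definitions - Get: read an existing classic definition to use as a reference body.
definition = requests.get(
    f"{SOURCE_ORG}/{PROJECT}/_apis/build/definitions/42?api-version=6.0",
    headers=headers,
).json()

# Drop server-generated fields; the repository and agent queue in the body must also
# be changed to resources that exist in the target organization before creating.
for key in ("id", "url", "uri", "revision", "createdDate", "authoredBy"):
    definition.pop(key, None)

# Definitions - Create: post the classic (designer) definition into the target project.
resp = requests.post(
    f"{TARGET_ORG}/{PROJECT}/_apis/build/definitions?api-version=6.0",
    headers=headers,
    json=definition,
)
print(resp.status_code, resp.json().get("id"))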
Related
I am able to create the "message route" in the Azure portal and route messages to a Service Bus queue when the query matches. I want to create the message route using the REST API instead of the portal, but I have looked at many documents and have been unable to find the right one. Is creating the message route using the REST API possible or not? If yes, how can I achieve this, and please provide the relevant links to refer to?
I haven't tried this through the REST API, but as Roman suggested,
you can check IotHubResource_CreateOrUpdate, which will help you understand how to create or update the metadata of an IoT hub. The usual pattern to modify a property is to retrieve the IoT hub metadata and security metadata, and then combine them with the modified values in a new body to update the IoT hub.
Sample Request:
PUT https://management.azure.com/subscriptions/91d12660-3dec-467a-be2a-213b5544ddc0/resourceGroups/myResourceGroup/providers/Microsoft.Devices/IotHubs/testHub?api-version=2018-04-01
Request Body:
{
"name": "iot-dps-cit-hub-1",
"type": "Microsoft.Devices/IotHubs",
"location": "centraluseuap",
"tags": {},
"etag": "AAAAAAFD6M4=",
"properties": {
"operationsMonitoringProperties": {
"events": {
"None": "None",
"Connections": "None",
"DeviceTelemetry": "None",
"C2DCommands": "None",
"DeviceIdentityOperations": "None",
"FileUploadOperations": "None",
"Routes": "None"
}
},
"state": "Active",
"provisioningState": "Succeeded",
"ipFilterRules": [],
"hostName": "iot-dps-cit-hub-1.azure-devices.net",
"eventHubEndpoints": {
"events": {
"retentionTimeInDays": 1,
"partitionCount": 2,
"partitionIds": [
"0",
"1"
],
"path": "iot-dps-cit-hub-1",
"endpoint": "sb://iothub-ns-iot-dps-ci-245306-76aca8e13b.servicebus.windows.net/"
},
"operationsMonitoringEvents": {
"retentionTimeInDays": 1,
"partitionCount": 2,
"partitionIds": [
"0",
"1"
],
"path": "iot-dps-cit-hub-1-operationmonitoring",
"endpoint": "sb://iothub-ns-iot-dps-ci-245306-76aca8e13b.servicebus.windows.net/"
}
},
"routing": {
"endpoints": {
"serviceBusQueues": [],
"serviceBusTopics": [],
"eventHubs": [],
"storageContainers": []
},
"routes": [],
"fallbackRoute": {
"name": "$fallback",
"source": "DeviceMessages",
"condition": "true",
"endpointNames": [
"events"
],
"isEnabled": true
}
},
"storageEndpoints": {
"$default": {
"sasTtlAsIso8601": "PT1H",
"connectionString": "",
"containerName": ""
}
},
"messagingEndpoints": {
"fileNotifications": {
"lockDurationAsIso8601": "PT1M",
"ttlAsIso8601": "PT1H",
"maxDeliveryCount": 10
}
},
"enableFileUploadNotifications": false,
"cloudToDevice": {
"maxDeliveryCount": 10,
"defaultTtlAsIso8601": "PT1H",
"feedback": {
"lockDurationAsIso8601": "PT1M",
"ttlAsIso8601": "PT1H",
"maxDeliveryCount": 10
}
},
"features": "None"
},
"sku": {
"name": "S1",
"tier": "Standard",
"capacity": 1
}
}
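Following that pattern, a minimal sketch of adding a route to a Service Bus queue could look like this; the subscription, resource group, hub name, bearer token, and the exact field names on the Service Bus queue endpoint are assumptions based on the response body above and the ARM schema, so verify them against the IotHubResource_CreateOrUpdate reference:
import requests

# Placeholders - replace with your own values.
SUBSCRIPTION = "91d12660-3dec-467a-be2a-213b5544ddc0"
RESOURCE_GROUP = "myResourceGroup"
HUB_NAME = "testHub"
TOKEN = "<Azure AD bearer token for https://management.azure.com>"

url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Devices/IotHubs/{HUB_NAME}"
    "?api-version=2018-04-01"
)
headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# 1. Retrieve the current IoT hub metadata (IotHubResource_Get).
hub = requests.get(url, headers=headers).json()

# 2. Add a Service Bus queue endpoint and a route that targets it.
#    The route fields mirror the fallbackRoute above; the endpoint fields are assumptions.
routing = hub["properties"]["routing"]
routing["endpoints"]["serviceBusQueues"].append({
    "name": "my-queue-endpoint",
    "connectionString": "<Service Bus queue connection string>",
})
routing["routes"].append({
    "name": "route-to-queue",
    "source": "DeviceMessages",
    "condition": "true",  # your routing query goes here
    "endpointNames": ["my-queue-endpoint"],
    "isEnabled": True,
})

# 3. Write the combined body back (IotHubResource_CreateOrUpdate).
resp = requests.put(url, headers=headers, json=hub)
print(resp.status_code)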
I have a data factory that copies data from a RESTful web service into an Azure Data Warehouse. I have tested and previewed all connections and datasets.
I'm receiving the following error message:
{
"errorCode": "2200",
"message": "ErrorCode=InvalidParameter,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The value of the property 'typeName' is invalid: 'Value cannot be null.\r\nParameter name: typeName'.,Source=,''Type=System.ArgumentNullException,Message=Value cannot be null.\r\nParameter name: typeName,Source=Microsoft.DataTransfer.Common,'",
"failureType": "UserError",
"target": "ImportLegs"
}
Pipeline source
{
"name": "Import Trip Data",
"properties": {
"activities": [
{
"name": "ImportLegs",
"type": "Copy",
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [
{
"name": "Source",
"value": "flightleg?StartDate=01/01/2018&EndDate=02/01/2018"
},
{
"name": "Destination",
"value": "[Trip].[Leg]"
}
],
"typeProperties": {
"source": {
"type": "RestSource",
"httpRequestTimeout": "00:01:40",
"requestInterval": "00.00:00:00.010"
},
"sink": {
"type": "SqlDWSink",
"allowPolyBase": false,
"writeBatchSize": 10000
},
"enableStaging": false,
"enableSkipIncompatibleRow": true,
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": {
"path": "id"
},
"sink": {
"name": "Origin"
}
},
{
"source": {
"path": "actualArrivalDateLocal"
},
"sink": {
"name": "Destination"
}
},
{
"source": {
"path": "actualArrivalDateUTC"
},
"sink": {
"name": "FlightLogDistance"
}
},
{
"source": {
"path": "actualBlockTime"
},
"sink": {
"name": "FlightLogFlightTime"
}
},
{
"source": {
"path": "actualDepartureDateLocal"
},
"sink": {
"name": "Aircraft"
}
},
{
"source": {
"path": "actualDepartureDateUTC"
},
"sink": {
"name": "ScheduledDepartDate"
}
}
]
}
},
"inputs": [
{
"referenceName": "FlightLeg",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "TripLegDW",
"type": "DatasetReference"
}
]
}
]
},
"type": "Microsoft.DataFactory/factories/pipelines"
}
I also had the same error, and after some research I found a solution. Posting it here to help someone further.
Error:
{
"errorCode": "2200",
"message": "ErrorCode=UserErrorFailedS3FileReadOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The operation on file part-00000-.csv.gz under directory test-bucket/abc_backfill/abc_visits is failed due to exception. ,Source=Microsoft.DataTransfer.ClientLibrary.MultipartBinaryConnector,''Type=Amazon.S3.AmazonS3Exception,Message=Error making request with Error Code Forbidden and Http Status Code Forbidden. No further error information was returned by the service.,Source=AWSSDK.Core,''Type=Amazon.Runtime.Internal.HttpErrorResponseException,Message=The remote server returned an error: (403) Forbidden.,Source=AWSSDK.Core,''Type=System.Net.WebException,Message=The remote server returned an error: (403) Forbidden.,Source=System,'",
"failureType": "UserError",
"target": "Copy data1",
"details": []
}
Solution - I found this was an access issue. Even though testing the connection reported success, executing the pipeline failed with the message above. I found I was accessing the wrong directory: abc_backfill above is a root/shared directory containing abc_visits, and that was causing the issue.
I found I had another shared directory, abc_testbackfill, pointing to the same abc_visits. Once I changed the dataset to abc_testbackfill, it started working.
So I believe I did not have access to the abc_backfill share, which is why I was getting the error.
I am trying to make an ARM template for a Custom Connector which needs the on-premises Data Gateway enabled. I am getting everything right except this last requirement: the template creates a fully configured Custom Connector, except that the check-mark for using the on-premises Data Gateway is not active.
Any idea which setting/element I need to add or change to get this done?
This is my template so far:
{
"type": "Microsoft.Web/customApis",
"name": "[parameters('Connection_Name')]",
"apiVersion": "2016-06-01",
"location": "centralus",
"scale": null,
"properties": {
"connectionParameters": {
"authType": {
"type": "string",
"allowedValues": [
{
"value": "none"
}
],
"uiDefinition": {
"displayName": "Authentication Type",
"description": "Authentication type to connect to your API",
"tooltip": "Authentication type to connect to your API",
"constraints": {
"tabIndex": 1,
"required": "true",
"allowedValues": [
{
"text": "none",
"value": "anonymous"
}
],
"capability": [
"gateway"
]
}
}
},
"gateway": {
"type": "gatewaySetting",
"gatewaySettings": {
"dataSourceType": "CustomConnector",
"connectionDetails": []
},
"uiDefinition": {
"constraints": {
"tabIndex": 4,
"required": "true",
"capability": [
"gateway"
]
}
}
}
},
"backendService": {
"serviceUrl": "[parameters('ServiceUrl')]"
},
"apiType": "Soap",
"wsdlDefinition": {
"importMethod": "SoapPassThrough"
},
"swagger": {
"swagger": "2.0",
"info": {
"title": "SOAP pass-through",
"description": "Custom Connector for SOAP Operation",
"version": "1.0"
},
"host": "xxxxxxxxxxxxx",
"basePath": "/xxxxxxxx/xxxxxxxxxx",
"consumes": [],
"produces": [],
"paths": {
"/": {
"post": {
"responses": {
"default": {
"description": "default",
"schema": {
"type": "string",
"title": "",
"x-ms-visibility": "important"
},
"headers": {
"Content-Type": {
"description": "Content-Type",
"type": "string"
}
}
}
},
"summary": "GetOrigins",
"description": "GetOrigins",
"operationId": "GetOrigins",
"parameters": [
{
"name": "Content-Type",
"in": "header",
"required": false,
"type": "string"
},
{
"name": "body",
"in": "body",
"required": false,
"schema": {
"type": "string"
}
}
]
}
}
},
"definitions": {},
"parameters": {},
"responses": {},
"securityDefinitions": {},
"security": [],
"tags": [],
"schemes": [
"http"
]
},
"description": "[concat('Custom Connector for SOAP', parameters('Connection_Name'),' Operation')]",
"displayName": "[parameters('Connection_Name')]",
"iconUri": "/Content/retail/assets/default-connection-icon.6296199fc1d74aa45ca68e1253762025.2.svg"
},
"dependsOn": []
}
To enable the 'On-Premises Data Gateway' option, you need to add "gateway" to the capabilities array under properties, like so:
"properties": {
"capabilities": [
"gateway"
],
...
}
The On-Premises Data Gateway option is not available from ARM templates; you have to deploy your template and then manually enable the check box in the Azure portal.
Thanks.
I have created a REST API with a Swagger/OpenAPI specification which I would like to consume through an Azure API Management tenant in a Logic App.
When I download the specification, it looks like this:
{
"swagger": "2.0",
"info": {
"title": "Leasing",
"version": "1.0"
},
"host": "ENDPOINT.azure-api.net",
"basePath": "/leasing",
"schemes": [
"http",
"https"
],
"securityDefinitions": {
"apiKeyHeader": {
"type": "apiKey",
"name": "Ocp-Apim-Subscription-Key",
"in": "header"
},
"apiKeyQuery": {
"type": "apiKey",
"name": "subscription-key",
"in": "query"
}
},
"security": [
{
"apiKeyHeader": []
},
{
"apiKeyQuery": []
}
],
"paths": {
"/{Brand}/groups": {
"get": {
"description": "Get a list of leasing groups on a brand",
"operationId": "GetGroups",
"parameters": [
{
"name": "Brand",
"in": "path",
"description": "Selection of possible brands",
"required": true,
"type": "string",
"enum": [
"Volkswagen",
"Audi",
"Seat",
"Skoda",
"VolkswagenErhverv",
"Porsche",
"Ducati"
]
}
],
"responses": {
"200": {
"description": "Returns a list of leasing groups",
"schema": {
"$ref": "#/definitions/GroupArray"
}
},
"400": {
"description": "If the brand is not valid",
"schema": {
"$ref": "#/definitions/Error"
}
}
},
"produces": [
"application/json"
]
}
}
},
"definitions": {
"Group": {
"type": "object",
"properties": {
"id": {
"format": "int32",
"type": "integer"
},
"name": {
"type": "string"
},
"description": {
"type": "string"
},
"leasingModelCount": {
"format": "int32",
"type": "integer"
},
"lowestMonthlyFee": {
"format": "int32",
"type": "integer"
}
}
},
"Error": {
"type": "object",
"properties": {
"code": {
"enum": [
"NotValidBrand",
"NotValidGroupId"
],
"type": "string",
"x-ms-enum": {
"name": "ErrorCode",
"modelAsString": true
}
},
"message": {
"type": "string"
}
}
},
"GroupArray": {
"type": "array",
"items": {
"$ref": "#/definitions/Group"
}
}
}
}
When I add this in a Logic App with the HTTP + Swagger connector, I only get to define the {Brand} path input, but not the various ways of supplying the subscription key (header or query) as defined in securityDefinitions.
The whole securityDefinitions and security sections are automatically generated by the Azure API Management service, but they are not recognized in the Logic App.
See the image of the missing subscription key field:
What am I doing wrong?
Update
I have tried the following:
Using the 'Authentication' field (but this field is limited to certain types of auth flows: https://learn.microsoft.com/en-us/azure/connectors/connectors-native-http#authentication)
Changing the Logic App 'HTTP + Swagger' action in code view to add the header parameter, but this converts the action to a plain 'HTTP' action and therefore loses the automatic schema generation from Swagger.
I think you need to specify this in the Authentication field in JSON format. Something like:
{
"apiKeyHeader" : "your Ocp-Apim-Subscription-Key",
"apiKeyQuery" : "your subscription key"
}
Just tried the fabrikam-build-extension sample on TFS 2017 and VSTS. I can see the custom tasks modifying a build, but I'm unable to use the Template1 template to create a build definition; Template1 isn't listed.
Does anybody have a clue?
Yes, I can reproduce the issue. The two tasks can be added successfully, but Template1 can't be added as a build template in VSTS.
However, if I use the same content as template.json to create a build template through the REST API, I can find it among the build templates.
PUT https://marinaliu.visualstudio.com/DefaultCollection/Git2/_apis/build/definitions/templates/myCustomTemplate?api-version=2.0
Application/json:
{
"id": "android",
"name": "My Custom Andriod Template",
"category": "Build",
"iconTaskId": "DF857559-8715-46EB-A74E-AC98B9178AA0",
"description": "Build your Android projects, run tests, sign and align Android App Package files. This template requires the Android SDK to be installed on the build agent.",
"template": {
"buildNumberFormat": "$(date:yyyyMMdd)$(rev:.r)",
"build": [{
"enabled": true,
"inputs": {
"wrapperScript": "$(Parameters.wrapperScript)",
"tasks": "$(Parameters.tasks)"
},
"task": {
"id": "8D8EEBD8-2B94-4C97-85AF-839254CC6DA4",
"versionSpec": "1.*"
}
},
{
"enabled": true,
"inputs": {
"files": "**/*.apk",
"jarsign": "false",
"zipalign": "false"
},
"task": {
"id": "80F3F6A0-82A6-4A22-BA7A-E5B8C541B9B9",
"versionSpec": "1.*"
}
},
{
"enabled": true,
"alwaysRun": true,
"inputs": {
"SourceFolder": "$(build.sourcesdirectory)",
"Contents": "**/*.apk",
"TargetFolder": "$(build.artifactstagingdirectory)"
},
"task": {
"id": "5bfb729a-a7c8-4a78-a7c3-8d717bb7c13c",
"versionSpec": "2.*"
}
},
{
"enabled": true,
"alwaysRun": true,
"inputs": {
"PathtoPublish": "$(build.artifactstagingdirectory)",
"ArtifactName": "drop",
"ArtifactType": "Container"
},
"task": {
"id": "2ff763a7-ce83-4e1f-bc89-0ae63477cebe",
"versionSpec": "1.*"
}
}
],
"options": [{
"definition": {
"id": "5D58CC01-7C75-450C-BE18-A388DDB129EC"
},
"enabled": true,
"inputs": {}
}],
"variables": {
"system.debug": {
"value": "false",
"allowOverride": true
}
},
"triggers": [],
"processParameters": {
"inputs": [{
"name": "wrapperScript",
"label": "{GradleWrapper}",
"defaultValue": "gradlew",
"required": true,
"type": "filePath"
},
{
"name": "tasks",
"label": "{GradleTasks}",
"defaultValue": "build",
"required": true,
"type": "string"
}
]
}
}
}
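For reference, a minimal sketch of sending that PUT from Python with a personal access token might look like this (the account URL, project, template id, and PAT are placeholders, and template.json is assumed to hold the body above):
import base64
import json
import requests

# Placeholders - replace with your own values.
ACCOUNT = "https://marinaliu.visualstudio.com/DefaultCollection"
PROJECT = "Git2"
TEMPLATE_ID = "myCustomTemplate"
PAT = "<personal access token>"

# Load the template body shown above from a local file.
with open("template.json") as f:
    body = json.load(f)

resp = requests.put(
    f"{ACCOUNT}/{PROJECT}/_apis/build/definitions/templates/{TEMPLATE_ID}?api-version=2.0",
    headers={
        "Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode(),
        "Content-Type": "application/json",
    },
    json=body,
)
print(resp.status_code)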
I have also created an issue here, which you can follow up on.