VSTS fabrikam-build-extension sample not working (Template) - azure-pipelines-build-task

Just tried the fabrikam-build-extension sample on TFS 2017 and VSTS. I can see the custom tasks when modifying a build, but I'm unable to use the Template1 template to create a build definition; Template1 isn't listed.
Anybody have a clue?

Yes, I can reproduce the issue. The two tasks can be added successfully, but Template1 doesn't show up in the VSTS build template list. However, if I use the same content as template.json to create a build template via the REST API, I can find it in the build template list:
PUT https://marinaliu.visualstudio.com/DefaultCollection/Git2/_apis/build/definitions/templates/myCustomTemplate?api-version=2.0
Request body (application/json):
{
  "id": "android",
  "name": "My Custom Android Template",
  "category": "Build",
  "iconTaskId": "DF857559-8715-46EB-A74E-AC98B9178AA0",
  "description": "Build your Android projects, run tests, sign and align Android App Package files. This template requires the Android SDK to be installed on the build agent.",
  "template": {
    "buildNumberFormat": "$(date:yyyyMMdd)$(rev:.r)",
    "build": [
      {
        "enabled": true,
        "inputs": {
          "wrapperScript": "$(Parameters.wrapperScript)",
          "tasks": "$(Parameters.tasks)"
        },
        "task": {
          "id": "8D8EEBD8-2B94-4C97-85AF-839254CC6DA4",
          "versionSpec": "1.*"
        }
      },
      {
        "enabled": true,
        "inputs": {
          "files": "**/*.apk",
          "jarsign": "false",
          "zipalign": "false"
        },
        "task": {
          "id": "80F3F6A0-82A6-4A22-BA7A-E5B8C541B9B9",
          "versionSpec": "1.*"
        }
      },
      {
        "enabled": true,
        "alwaysRun": true,
        "inputs": {
          "SourceFolder": "$(build.sourcesdirectory)",
          "Contents": "**/*.apk",
          "TargetFolder": "$(build.artifactstagingdirectory)"
        },
        "task": {
          "id": "5bfb729a-a7c8-4a78-a7c3-8d717bb7c13c",
          "versionSpec": "2.*"
        }
      },
      {
        "enabled": true,
        "alwaysRun": true,
        "inputs": {
          "PathtoPublish": "$(build.artifactstagingdirectory)",
          "ArtifactName": "drop",
          "ArtifactType": "Container"
        },
        "task": {
          "id": "2ff763a7-ce83-4e1f-bc89-0ae63477cebe",
          "versionSpec": "1.*"
        }
      }
    ],
    "options": [
      {
        "definition": {
          "id": "5D58CC01-7C75-450C-BE18-A388DDB129EC"
        },
        "enabled": true,
        "inputs": {}
      }
    ],
    "variables": {
      "system.debug": {
        "value": "false",
        "allowOverride": true
      }
    },
    "triggers": [],
    "processParameters": {
      "inputs": [
        {
          "name": "wrapperScript",
          "label": "{GradleWrapper}",
          "defaultValue": "gradlew",
          "required": true,
          "type": "filePath"
        },
        {
          "name": "tasks",
          "label": "{GradleTasks}",
          "defaultValue": "build",
          "required": true,
          "type": "string"
        }
      ]
    }
  }
}
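To verify the template was registered, listing the project's build templates should now include it. A sketch against the same placeholder collection (Templates - List is part of the same build REST API):
GET https://marinaliu.visualstudio.com/DefaultCollection/Git2/_apis/build/definitions/templates?api-version=2.0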
And I have created an issue here that you can follow up on.

Related

Azure DevOps Create Pipeline (Classic way) REST API

I am migrating an Azure DevOps organization to another organization, and I ran into a problem where I cannot create a pipeline that was created the classic way (NOT YAML).
This is the JSON:
{
  "name": "PP_NAME",
  "folder": "",
  "configuration": {
    "designerJson": {
      "options": [
        {
          "enabled": false,
          "definition": {
            "id": "DEF_ID"
          },
          "inputs": {
            "branchFilters": "[\"+refs/heads/*\"]",
            "additionalFields": "{}"
          }
        },
        {
          "enabled": false,
          "definition": {
            "id": "DEF_ID"
          },
          "inputs": {
            "workItemType": "Task",
            "assignToRequestor": "true",
            "additionalFields": "{}"
          }
        }
      ],
      "variables": {
        "system.debug": {
          "value": "false",
          "allowOverride": true
        }
      },
      "tags": [],
      "process": {
        "phases": [
          {
            "name": "Agent job 1",
            "refName": "Job_1",
            "condition": "succeeded()",
            "target": {
              "executionOptions": {
                "type": 0
              },
              "allowScriptsAuthAccessOption": false,
              "type": 1
            },
            "jobAuthorizationScope": "project"
          }
        ],
        "target": {
          "agentSpecification": {
            "identifier": "windows-2019"
          }
        },
        "type": 1
      },
      "quality": "definition",
      "path": "\\",
      "repository": {
        "id": "REPOSITORY_ID",
        "name": "test 1",
        "type": "TfsGit"
      }
    },
    "path": "\\",
    "type": "designerJson"
  }
}
The output:
{
  "$id": "1",
  "innerException": null,
  "message": "This API does not support creating pipelines of configuration type DesignerJson.",
  "typeName": "Microsoft.Azure.Pipelines.WebApi.UnsupportedConfigurationTypeException, Microsoft.Azure.Pipelines.WebApi",
  "typeKey": "UnsupportedConfigurationTypeException",
  "errorCode": 0,
  "eventId": 3000
}
According to the Azure DevOps documentation, it should be possible to create a pipeline the classic way.
Thanks!
To create a classic pipeline, you can use the Definitions - Create REST API.
If you are not sure about the request body, you can use the Definitions - Get REST API to fetch the definition of an existing classic pipeline as a reference.
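For example, the two calls would look like the sketch below (organization, project, and the definition ID are placeholders; server-assigned fields such as id, revision, uri, and url should be stripped from the fetched body before posting it):
GET https://dev.azure.com/{source-org}/{source-project}/_apis/build/definitions/{definitionId}?api-version=6.0
POST https://dev.azure.com/{target-org}/{target-project}/_apis/build/definitions?api-version=6.0
Request body (application/json): the cleaned definition JSON returned by the GET call.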

Microsoft.DataTransfer.Common.Shared.HybridDeliveryException TypeName cannot be null

I have a data factory that copies data from a RESTful web service into an Azure SQL Data Warehouse. I have tested and previewed all connections and datasets.
I'm receiving the following error message:
{
  "errorCode": "2200",
  "message": "ErrorCode=InvalidParameter,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The value of the property 'typeName' is invalid: 'Value cannot be null.\r\nParameter name: typeName'.,Source=,''Type=System.ArgumentNullException,Message=Value cannot be null.\r\nParameter name: typeName,Source=Microsoft.DataTransfer.Common,'",
  "failureType": "UserError",
  "target": "ImportLegs"
}
Pipeline source:
{
  "name": "Import Trip Data",
  "properties": {
    "activities": [
      {
        "name": "ImportLegs",
        "type": "Copy",
        "policy": {
          "timeout": "7.00:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [
          {
            "name": "Source",
            "value": "flightleg?StartDate=01/01/2018&EndDate=02/01/2018"
          },
          {
            "name": "Destination",
            "value": "[Trip].[Leg]"
          }
        ],
        "typeProperties": {
          "source": {
            "type": "RestSource",
            "httpRequestTimeout": "00:01:40",
            "requestInterval": "00.00:00:00.010"
          },
          "sink": {
            "type": "SqlDWSink",
            "allowPolyBase": false,
            "writeBatchSize": 10000
          },
          "enableStaging": false,
          "enableSkipIncompatibleRow": true,
          "translator": {
            "type": "TabularTranslator",
            "mappings": [
              {
                "source": {
                  "path": "id"
                },
                "sink": {
                  "name": "Origin"
                }
              },
              {
                "source": {
                  "path": "actualArrivalDateLocal"
                },
                "sink": {
                  "name": "Destination"
                }
              },
              {
                "source": {
                  "path": "actualArrivalDateUTC"
                },
                "sink": {
                  "name": "FlightLogDistance"
                }
              },
              {
                "source": {
                  "path": "actualBlockTime"
                },
                "sink": {
                  "name": "FlightLogFlightTime"
                }
              },
              {
                "source": {
                  "path": "actualDepartureDateLocal"
                },
                "sink": {
                  "name": "Aircraft"
                }
              },
              {
                "source": {
                  "path": "actualDepartureDateUTC"
                },
                "sink": {
                  "name": "ScheduledDepartDate"
                }
              }
            ]
          }
        },
        "inputs": [
          {
            "referenceName": "FlightLeg",
            "type": "DatasetReference"
          }
        ],
        "outputs": [
          {
            "referenceName": "TripLegDW",
            "type": "DatasetReference"
          }
        ]
      }
    ]
  },
  "type": "Microsoft.DataFactory/factories/pipelines"
}
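For context, the typeName the error complains about is the type discriminator Data Factory expects on typed objects such as sources, sinks, linked services, and datasets. The two datasets referenced above (FlightLeg and TripLegDW) are not shown; if one of them were missing its type, an error like this could plausibly surface. As a hedged sketch, a REST dataset declares its type like so (the relativeUrl value and the linked service name are placeholders following the ADF REST connector schema):
{
  "name": "FlightLeg",
  "properties": {
    "type": "RestResource",
    "linkedServiceName": {
      "referenceName": "RestService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "relativeUrl": "flightleg"
    }
  }
}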
I also had the same error, and after some research I found a solution. Posting it here in case it helps someone.
The error:
{
  "errorCode": "2200",
  "message": "ErrorCode=UserErrorFailedS3FileReadOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The operation on file part-00000-.csv.gz under directory test-bucket/abc_backfill/abc_visits is failed due to exception. ,Source=Microsoft.DataTransfer.ClientLibrary.MultipartBinaryConnector,''Type=Amazon.S3.AmazonS3Exception,Message=Error making request with Error Code Forbidden and Http Status Code Forbidden. No further error information was returned by the service.,Source=AWSSDK.Core,''Type=Amazon.Runtime.Internal.HttpErrorResponseException,Message=The remote server returned an error: (403) Forbidden.,Source=AWSSDK.Core,''Type=System.Net.WebException,Message=The remote server returned an error: (403) Forbidden.,Source=System,'",
  "failureType": "UserError",
  "target": "Copy data1",
  "details": []
}
Solution: I found this was an access issue. Testing the connection succeeded, but executing the pipeline failed with the message above. I discovered I was accessing the wrong directory: abc_backfill above is a root/shared directory containing abc_visits, and that was what caused the issue.
I found I had another shared directory, abc_testbackfill, pointing to the same abc_visits; once I switched the dataset to abc_testbackfill, it started working.
So I believe I didn't have access on the abc_backfill share, and that's why I was getting the error.
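For illustration, after the fix the location fragment of the Amazon S3 dataset would look roughly like the sketch below (property names follow the ADF Amazon S3 dataset schema; bucket and folder names are taken from the error message and description above):
{
  "typeProperties": {
    "location": {
      "type": "AmazonS3Location",
      "bucketName": "test-bucket",
      "folderPath": "abc_testbackfill/abc_visits"
    }
  }
}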

ARM Template for Azure Custom Connector using On-Premises Data Gateway

I am trying to write an ARM template for a Custom Connector that needs the on-premises data gateway enabled. I am getting everything right except for this last requirement: the template creates a fully configured Custom Connector, but the check-mark for using the on-premises data gateway is not set.
Any idea which setting/element I need to add or change to get this done?
This is my template so far:
{
  "type": "Microsoft.Web/customApis",
  "name": "[parameters('Connection_Name')]",
  "apiVersion": "2016-06-01",
  "location": "centralus",
  "scale": null,
  "properties": {
    "connectionParameters": {
      "authType": {
        "type": "string",
        "allowedValues": [
          {
            "value": "none"
          }
        ],
        "uiDefinition": {
          "displayName": "Authentication Type",
          "description": "Authentication type to connect to your API",
          "tooltip": "Authentication type to connect to your API",
          "constraints": {
            "tabIndex": 1,
            "required": "true",
            "allowedValues": [
              {
                "text": "none",
                "value": "anonymous"
              }
            ],
            "capability": [
              "gateway"
            ]
          }
        }
      },
      "gateway": {
        "type": "gatewaySetting",
        "gatewaySettings": {
          "dataSourceType": "CustomConnector",
          "connectionDetails": []
        },
        "uiDefinition": {
          "constraints": {
            "tabIndex": 4,
            "required": "true",
            "capability": [
              "gateway"
            ]
          }
        }
      }
    },
    "backendService": {
      "serviceUrl": "[parameters('ServiceUrl')]"
    },
    "apiType": "Soap",
    "wsdlDefinition": {
      "importMethod": "SoapPassThrough"
    },
    "swagger": {
      "swagger": "2.0",
      "info": {
        "title": "SOAP pass-through",
        "description": "Custom Connector for SOAP Operation",
        "version": "1.0"
      },
      "host": "xxxxxxxxxxxxx",
      "basePath": "/xxxxxxxx/xxxxxxxxxx",
      "consumes": [],
      "produces": [],
      "paths": {
        "/": {
          "post": {
            "responses": {
              "default": {
                "description": "default",
                "schema": {
                  "type": "string",
                  "title": "",
                  "x-ms-visibility": "important"
                },
                "headers": {
                  "Content-Type": {
                    "description": "Content-Type",
                    "type": "string"
                  }
                }
              }
            },
            "summary": "GetOrigins",
            "description": "GetOrigins",
            "operationId": "GetOrigins",
            "parameters": [
              {
                "name": "Content-Type",
                "in": "header",
                "required": false,
                "type": "string"
              },
              {
                "name": "body",
                "in": "body",
                "required": false,
                "schema": {
                  "type": "string"
                }
              }
            ]
          }
        }
      },
      "definitions": {},
      "parameters": {},
      "responses": {},
      "securityDefinitions": {},
      "security": [],
      "tags": [],
      "schemes": [
        "http"
      ]
    },
    "description": "[concat('Custom Connector for SOAP', parameters('Connection_Name'),' Operation')]",
    "displayName": "[parameters('Connection_Name')]",
    "iconUri": "/Content/retail/assets/default-connection-icon.6296199fc1d74aa45ca68e1253762025.2.svg"
  },
  "dependsOn": []
}
To enable the 'On-Premises Data Gateway' option, you need to add gateway to the capabilities array, like so:
"properties": {
  "capabilities": [
    "gateway"
  ]
}
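In the template above, that places capabilities directly under the customApis resource's properties, as a sibling of connectionParameters and backendService. A trimmed sketch showing only the lines relevant to the gateway option:
{
  "type": "Microsoft.Web/customApis",
  "name": "[parameters('Connection_Name')]",
  "apiVersion": "2016-06-01",
  "properties": {
    "capabilities": [
      "gateway"
    ]
  }
}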
The On-Premises Data Gateway option is not available from ARM templates; you must deploy your template and then manually tick the check box in the Azure portal.
Thanks.

Error creating a customContent on a confluence addon

Today I was trying to create a Confluence add-on for my company, following the Atlassian documentation.
My problem comes when trying to run the Express app after adding a new customContent module to atlassian-connect.json; after running npm start I get the following error:
Failed to register with host https://admin:xxx#xxx.atlassian.net/wiki (200)
{"type":"INSTALL","pingAfter":300,"status":{"done":true,"statusCode":200,"contentType":"application/vnd.atl.plugins.task.install.err+json","subCode":"upm.pluginInstall.error.descriptor.not.from.marketplace","source":"https://1a0adc8f.ngrok.io/atlassian-connect.json","name":"https://1a0adc8f.ngrok.io/atlassian-connect.json"},"links":{"self":"/wiki/rest/plugins/1.0/pending/b88594d3-c3c2-4760-b687-c8d860c0a377","alternate":"/wiki/rest/plugins/1.0/tasks/b88594d3-c3c2-4760-b687-c8d860c0a377"},"timestamp":1502272147602,"userKey":"xxx","id":"xxx"}
Add-on not registered; no compatible hosts detected
This is my atlassian-connect.json file:
{
  "key": "my-add-on",
  "name": "Ping Pong",
  "description": "My very first add-on",
  "vendor": {
    "name": "Angry Nerds",
    "url": "https://www.atlassian.com/angrynerds"
  },
  "baseUrl": "{{localBaseUrl}}",
  "links": {
    "self": "{{localBaseUrl}}/atlassian-connect.json",
    "homepage": "{{localBaseUrl}}/atlassian-connect.json"
  },
  "authentication": {
    "type": "jwt"
  },
  "lifecycle": {
    "installed": "/installed"
  },
  "scopes": [
    "READ"
  ],
  "modules": {
    "generalPages": [
      {
        "key": "hello-world-page-jira",
        "location": "system.top.navigation.bar",
        "name": {
          "value": "Hello World"
        },
        "url": "/hello-world",
        "conditions": [
          {
            "condition": "user_is_logged_in"
          }
        ]
      },
      {
        "key": "customersViewer",
        "location": "system.header/left",
        "name": {
          "value": "Hello World"
        },
        "url": "/hello-world",
        "conditions": [
          {
            "condition": "user_is_logged_in"
          }
        ]
      }
    ],
    "customContent": [
      {
        "key": "customer",
        "name": {
          "value": "Customers"
        },
        "uiSupport": {
          "contentViewComponent": {
            "moduleKey": "customersViewer"
          },
          "listViewComponent": {
            "moduleKey": "customerList"
          },
          "icons": {
            "item": {
              "url": "/images/customers.png"
            }
          }
        },
        "apiSupport": {
          "supportedContainerTypes": ["space"]
        }
      }
    ]
  }
}
Does anybody have an idea of what's going on?
The contentViewComponent can't find the generalPage it references in moduleKey.
From the docs:
In the snippet above, the moduleKey "customersViewer" maps to a generalPage module we have defined in our add-on. This generalPage is passed the context parameters we specify, and visualizes our content accordingly.
If you change the generalPage with the key hello-world-page-confluence to customersViewer, you will be able to install the add-on and get up and running.
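Note that the listViewComponent above also points at a moduleKey ("customerList") that has no matching module in the descriptor; presumably it needs its own generalPage as well. A hedged sketch (the url and display name are placeholders):
{
  "key": "customerList",
  "location": "system.header/left",
  "name": {
    "value": "Customer List"
  },
  "url": "/customer-list"
}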

New Storage Plugin for csv , json

I want to add a new storage plugin (called onetest).
When I add it in the web UI, onetest appears, but the directory and files that I have inside "onetest" don't appear in my Drill Explorer.
Note that in montest I placed two CSVs.
The JSON I put into the web UI:
{
  "type": "file",
  "enabled": true,
  "connection": "maprfs:///",
  "workspaces": {
    "root": {
      "location": "/patrick",
      "writable": false,
      "defaultInputFormat": null
    },
    "montest": {
      "location": "/patrick/test",
      "writable": true,
      "defaultInputFormat": null
    },
    "tmp": {
      "location": "/tmp",
      "writable": true,
      "defaultInputFormat": null
    }
  },
  "formats": {
    "psv": {
      "type": "text",
      "extensions": [
        "tbl"
      ],
      "delimiter": "|"
    },
    "csv": {
      "type": "text",
      "extensions": [
        "csv"
      ],
      "delimiter": ","
    },
    "tsv": {
      "type": "text",
      "extensions": [
        "tsv"
      ],
      "delimiter": "\t"
    },
    "parquet": {
      "type": "parquet"
    },
    "json": {
      "type": "json"
    },
    "maprdb": {
      "type": "maprdb"
    }
  }
}
Output in Drill Explorer: (screenshot omitted)
The directories have read and write access: (screenshot omitted)
Do you have any idea?
Best
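As a sanity check, you could try querying the workspace directly from sqlline rather than Drill Explorer; a sketch, where yourfile.csv stands in for one of the two CSVs (SHOW FILES is standard Drill syntax):
SHOW FILES IN onetest.montest;
SELECT * FROM onetest.montest.`yourfile.csv` LIMIT 10;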
