New Storage Plugin for CSV, JSON - Sandbox

I want to add a new storage plugin (called onetest).
When I add it in the web UI, onetest appears, but the directory and files that I had inside "onetest" don't appear in my Drill Explorer.
Note that in montest I placed two CSV files.
The JSON I entered in the web UI:
{
"type": "file",
"enabled": true,
"connection": "maprfs:///",
"workspaces": {
"root": {
"location": "/patrick",
"writable": false,
"defaultInputFormat": null
},
"montest": {
"location": "/patrick/test",
"writable": true,
"defaultInputFormat": null
},
"tmp": {
"location": "/tmp",
"writable": true,
"defaultInputFormat": null
}
},
"formats": {
"psv": {
"type": "text",
"extensions": [
"tbl"
],
"delimiter": "|"
},
"csv": {
"type": "text",
"extensions": [
"csv"
],
"delimiter": ","
},
"tsv": {
"type": "text",
"extensions": [
"tsv"
],
"delimiter": "\t"
},
"parquet": {
"type": "parquet"
},
"json": {
"type": "json"
},
"maprdb": {
"type": "maprdb"
}
}
}
Output in Drill Explorer:
The directories have read and write access:
Do you have any idea?
Best

Related

How to get the modified date as a table column while ingesting all files from Year/Month/Day directories of a storage account?

I have some JSON files in an ADLS account. The files are ingested into a Year/Month/Day directory structure. I want to copy all the files from ADLS to an Azure SQL DB using an Azure data flow.
I am able to ingest the data using the data flow, but I want to include the file path and the file ingestion date, along with the file name, in three separate columns, and I do not know how to get these values.
Please note that each Day directory has more than one file, as follows:
container_name/Dataset/Year/Month/Day/file1.json, file2.json, file3.json
Could anyone help me with how to ingest the modified date column into the table along with the data of each file?
I tried using Get Metadata to copy each file one by one, and also a derived column in the data flow for the last modified date.
I have reproduced the above and was able to get the desired file by using a combination of the additional columns option in the Copy activity, a Lookup activity, and the Get Metadata activity.
These are the datasets I have used in the various activities, with dataset parameters.
Source_files_wild_card_path:
temporary_filepaths:
Each_file:
intermediate:
target_folder:
AFAIK, in ADF we can get the last modified date of files either through REST APIs or the Get Metadata activity. But Get Metadata won't work with dynamic file paths in a folder structure like yours.
Also, we can get the file path of a blob file only from triggers or from the additional columns option of the Copy activity. Here, as there are no triggers involved, I have used the second method.
So, first I used a Copy activity with a wildcard path over all the source files, added $$FILEPATH as an additional column, and copied the data to a temporary file temp1.csv with Merge files as the copy behavior.
Then I used a Lookup activity on temp1.csv to get the file contents as an array of objects, from which I can get the list of file paths.
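For illustration only, the Lookup's output.value is an array of objects, one per row of temp1.csv. Assuming a hypothetical data column named id, it would look roughly like this, with filepath coming from the $$FILEPATH additional column added in the previous step:
[
  { "id": "1", "filepath": "Dataset/2022/12/01/file1.json" },
  { "id": "2", "filepath": "Dataset/2022/12/01/file1.json" },
  { "id": "3", "filepath": "Dataset/2022/12/02/file2.json" }
]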
Here I have created two variables of array type.
Since the Lookup output is an array of objects, to get only the file paths, use a ForEach loop and append @item().filepath to the path_list array.
Then use the expression below to get the unique list of all file paths in the unique_path_list array.
@union(variables('path_list'),variables('path_list'))
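Taking the union of the array with itself works as a deduplication step because union() drops duplicate items from its result. As a quick illustrative check (not part of the original pipeline):
@union(createArray('a','a','b'), createArray('a','a','b'))
returns ["a","b"], so each file path appears only once in unique_path_list.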
Now, use this array in a ForEach, and inside the ForEach use a Get Metadata activity with the Each_file dataset and @item() as the filename, adding Item name and Last modified to the field list.
Then use a Copy activity inside the ForEach with the same dataset. Here, add the additional columns file name, file path, and last modified, and assign their values.
In the sink of this Copy activity, use another temporary staging folder (the intermediate dataset) and give a random file name using a date function.
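For the random file name, the expression used later in the pipeline JSON is enough; for example (output is illustrative):
@concat(utcNow(),'.csv')
produces a name like 2023-01-27T12:40:51.1234567Z.csv, which is unique per iteration.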
After the ForEach, use another Copy activity with the intermediate dataset as the source (use the wildcard path *.csv and pass an empty string to the dataset parameter) and the target_folder dataset as the sink to produce the result file using Merge files.
My pipeline JSON:
{
"name": "last_modifed_pipeline_copy1",
"properties": {
"activities": [
{
"name": "for_paths_columns",
"type": "Copy",
"dependsOn": [],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"additionalColumns": [
{
"name": "filepath",
"value": "$$FILEPATH"
}
],
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"wildcardFolderPath": "*/*/*",
"wildcardFileName": "*.csv",
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
},
"sink": {
"type": "DelimitedTextSink",
"storeSettings": {
"type": "AzureBlobFSWriteSettings",
"copyBehavior": "MergeFiles"
},
"formatSettings": {
"type": "DelimitedTextWriteSettings",
"quoteAllText": true,
"fileExtension": ".txt"
}
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"typeConversion": true,
"typeConversionSettings": {
"allowDataTruncation": true,
"treatBooleanAsNumber": false
}
}
},
"inputs": [
{
"referenceName": "Source_files_wild_card_path",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "temporary_filepaths",
"type": "DatasetReference"
}
]
},
{
"name": "Lookup1",
"type": "Lookup",
"dependsOn": [
{
"activity": "for_paths_columns",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
},
"dataset": {
"referenceName": "temporary_filepaths",
"type": "DatasetReference"
},
"firstRowOnly": false
}
},
{
"name": "append filepaths array",
"type": "ForEach",
"dependsOn": [
{
"activity": "Lookup1",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"items": {
"value": "#activity('Lookup1').output.value",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "Append variable1",
"type": "AppendVariable",
"dependsOn": [],
"userProperties": [],
"typeProperties": {
"variableName": "path_list",
"value": {
"value": "#item().filepath",
"type": "Expression"
}
}
}
]
}
},
{
"name": "get_unique_paths array",
"type": "SetVariable",
"dependsOn": [
{
"activity": "append filepaths array",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"variableName": "unique_path_list",
"value": {
"value": "#union(variables('path_list'),variables('path_list'))",
"type": "Expression"
}
}
},
{
"name": "adds_last modifed column",
"type": "ForEach",
"dependsOn": [
{
"activity": "get_unique_paths array",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"items": {
"value": "#variables('unique_path_list')",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "Get Metadata1",
"type": "GetMetadata",
"dependsOn": [],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"dataset": {
"referenceName": "Each_file",
"type": "DatasetReference",
"parameters": {
"filename": {
"value": "#item()",
"type": "Expression"
}
}
},
"fieldList": [
"itemName",
"lastModified"
],
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
}
},
{
"name": "Copy data2",
"type": "Copy",
"dependsOn": [
{
"activity": "Get Metadata1",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"additionalColumns": [
{
"name": "file_path",
"value": "$$FILEPATH"
},
{
"name": "file_name",
"value": {
"value": "#activity('Get Metadata1').output.itemName",
"type": "Expression"
}
},
{
"name": "last_modifed",
"value": {
"value": "#activity('Get Metadata1').output.lastModified",
"type": "Expression"
}
}
],
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
},
"sink": {
"type": "DelimitedTextSink",
"storeSettings": {
"type": "AzureBlobFSWriteSettings"
},
"formatSettings": {
"type": "DelimitedTextWriteSettings",
"quoteAllText": true,
"fileExtension": ".txt"
}
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"typeConversion": true,
"typeConversionSettings": {
"allowDataTruncation": true,
"treatBooleanAsNumber": false
}
}
},
"inputs": [
{
"referenceName": "Each_file",
"type": "DatasetReference",
"parameters": {
"filename": {
"value": "#item()",
"type": "Expression"
}
}
}
],
"outputs": [
{
"referenceName": "intermediate",
"type": "DatasetReference",
"parameters": {
"file_name": {
"value": "#concat(utcNow(),'.csv')",
"type": "Expression"
}
}
}
]
}
]
}
},
{
"name": "Copy data3",
"type": "Copy",
"dependsOn": [
{
"activity": "adds_last modifed column",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"wildcardFileName": "*.csv",
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
},
"sink": {
"type": "DelimitedTextSink",
"storeSettings": {
"type": "AzureBlobFSWriteSettings",
"copyBehavior": "MergeFiles"
},
"formatSettings": {
"type": "DelimitedTextWriteSettings",
"quoteAllText": true,
"fileExtension": ".txt"
}
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"typeConversion": true,
"typeConversionSettings": {
"allowDataTruncation": true,
"treatBooleanAsNumber": false
}
}
},
"inputs": [
{
"referenceName": "intermediate",
"type": "DatasetReference",
"parameters": {
"file_name": "No value"
}
}
],
"outputs": [
{
"referenceName": "target_folder",
"type": "DatasetReference"
}
]
}
],
"variables": {
"path_list": {
"type": "Array"
},
"unique_path_list": {
"type": "Array"
}
},
"annotations": [],
"lastPublishTime": "2023-01-27T12:40:51Z"
},
"type": "Microsoft.DataFactory/factories/pipelines"
}
My pipeline:
Result file:
NOTE:
If you want to run this on a regular basis, use a storage event trigger, which gives you trigger parameters like @triggerBody().folderPath and @triggerBody().fileName. You can give these to Get Metadata to get the last modified time, and then pass it to a Copy activity or data flow to add as an additional column, as per your requirement.
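As a minimal sketch (assuming you first add matching path and filename parameters to the pipeline; these names are placeholders, not from the original post), the trigger's pipeline reference could pass the values like this:
"pipelines": [
  {
    "pipelineReference": {
      "referenceName": "last_modifed_pipeline_copy1",
      "type": "PipelineReference"
    },
    "parameters": {
      "path": "@triggerBody().folderPath",
      "filename": "@triggerBody().fileName"
    }
  }
]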

Copy Files from a folder to multiple folders based on the file name in Azure Data Factory

I have a parent folder in ADLS Gen2 called Source which has a number of subfolders, and these subfolders contain the actual data files, as shown in the example below...
Source:
Folder Name: 20221212
A_20221212.txt B_20221212.txt C_20221212.txt
Folder Name: 20221219
A_20221219.txt B_20221219.txt C_20221219.txt
Folder Name: 20221226
A_20221226.txt B_20221226.txt C_20221226.txt
How can I copy files from the subfolders to name-specific folders (creating a new folder if it does not exist) using Azure Data Factory? Please see the example below...
Target:
Folder Name: A
A_20221212.txt A_20221219.txt A_20221226.txt
Folder Name: B
B_20221212.txt B_20221219.txt B_20221226.txt
Folder Name: C
C_20221212.txt C_20221219.txt C_20221226.txt
I really appreciate your help.
I have reproduced the above and got the results below.
You can follow the procedure below using the Get Metadata activity if you have the folder directories at the same level.
This is my source folder structure.
data
20221212
A_20221212.txt
B_20221212.txt
C_20221212.txt
20221219
A_20221219.txt
B_20221219.txt
C_20221219.txt
20221226
A_20221226.txt
B_20221226.txt
C_20221226.txt
Source dataset:
Give this to the Get Metadata activity and use childItems.
Then give the childItems array from the Get Metadata activity to a ForEach activity. Inside the ForEach I have used a Set variable activity to store the folder name.
@split(item().name,'_')[0]
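For example, for a child item named A_20221212.txt, @split(item().name,'_')[0] evaluates to A, which becomes the target folder name.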
Now, use a Copy activity and in the source use a wildcard path like below.
For the sink, create dataset parameters and supply them in the Copy activity sink like below.
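As a rough sketch, the sink dataset (targettxts) with those parameters could look like the following; the linked service reference and container name are placeholders, not from the original post:
{
  "name": "targettxts",
  "properties": {
    "linkedServiceName": {
      "referenceName": "<your ADLS Gen2 linked service>",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "folder_name": { "type": "string" },
      "file_name": { "type": "string" }
    },
    "type": "DelimitedText",
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "<your container>",
        "folderPath": { "value": "@dataset().folder_name", "type": "Expression" },
        "fileName": { "value": "@dataset().file_name", "type": "Expression" }
      }
    },
    "schema": []
  }
}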
My pipeline JSON:
{
"name": "pipeline1",
"properties": {
"activities": [
{
"name": "Get Metadata1",
"type": "GetMetadata",
"dependsOn": [],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"dataset": {
"referenceName": "sourcetxt",
"type": "DatasetReference"
},
"fieldList": [
"childItems"
],
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
}
},
{
"name": "ForEach1",
"type": "ForEach",
"dependsOn": [
{
"activity": "Get Metadata1",
"dependencyConditions": [
"Succeeded"
]
}
],
"userProperties": [],
"typeProperties": {
"items": {
"value": "#activity('Get Metadata1').output.childItems",
"type": "Expression"
},
"isSequential": true,
"activities": [
{
"name": "Copy data1",
"type": "Copy",
"dependsOn": [
{
"activity": "Set variable1",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"wildcardFolderPath": "*",
"wildcardFileName": {
"value": "#item().name",
"type": "Expression"
},
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
},
"sink": {
"type": "DelimitedTextSink",
"storeSettings": {
"type": "AzureBlobFSWriteSettings"
},
"formatSettings": {
"type": "DelimitedTextWriteSettings",
"quoteAllText": true,
"fileExtension": ".txt"
}
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"typeConversion": true,
"typeConversionSettings": {
"allowDataTruncation": true,
"treatBooleanAsNumber": false
}
}
},
"inputs": [
{
"referenceName": "sourcetxt",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "targettxts",
"type": "DatasetReference",
"parameters": {
"folder_name": {
"value": "#variables('folder_name')",
"type": "Expression"
},
"file_name": {
"value": "#item().name",
"type": "Expression"
}
}
}
]
},
{
"name": "Set variable1",
"type": "SetVariable",
"dependsOn": [],
"userProperties": [],
"typeProperties": {
"variableName": "folder_name",
"value": {
"value": "#split(item().name,'_')[0]",
"type": "Expression"
}
}
}
]
}
}
],
"variables": {
"folder_name": {
"type": "String"
}
},
"annotations": []
}
}
Result:

Move Files from Multiple Folders to Multiple Folders in Azure Blob

I have a folder structure in Azure Blob like this
Container/app_archive/app1/app1.csv
Container/app_archive/app2/app2.csv
Container/app_archive/app3/app3.csv
Container/app_archive/app4/app4.csv
Container/app_archive/app5/app5.csv
....
Container/app_archive/app150/app150.csv
These need to be moved to Container/app_archive/app1/YYYY/MM/DD/app1.csv
Container/app_archive/app2/YYYY/MM/DD/app2.csv
.....
Container/app_archive/app150/YYYY/MM/DD/app150.csv
Whenever a file is placed in any folder, it has to trigger and copy the files accordingly. I also need to capture this information in an audit table, with columns such as Source File Name, Source File Path, Destination File Path, etc. How can I achieve this?
You can use storage event triggers with dataset parameters for this, as shown below.
First, in the storage event trigger, give the root container and set "Blob path ends with" to .csv.
Create two pipeline parameters and assign the trigger values to them while creating the trigger.
Now, create dataset parameters for the folder name and file name in both the source and sink datasets.
Source:
Sink:
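For instance, a minimal sketch of the sink dataset (target1) could look like the following; the linked service and container are placeholders, and prefixing app_archive/ in the folder path is an assumption based on the target layout described in the question:
{
  "name": "target1",
  "properties": {
    "linkedServiceName": {
      "referenceName": "<your storage linked service>",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "sinkpath": { "type": "string" },
      "sinkfilename": { "type": "string" }
    },
    "type": "DelimitedText",
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "<your container>",
        "folderPath": { "value": "@concat('app_archive/', dataset().sinkpath)", "type": "Expression" },
        "fileName": { "value": "@dataset().sinkfilename", "type": "Expression" }
      }
    },
    "schema": []
  }
}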
My pipeline JSON:
{
"name": "pipeline1",
"properties": {
"activities": [
{
"name": "Copy data1",
"type": "Copy",
"dependsOn": [
{
"activity": "Set variable1",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "0.12:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"storeSettings": {
"type": "AzureBlobFSReadSettings",
"recursive": true,
"enablePartitionDiscovery": false
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
},
"sink": {
"type": "DelimitedTextSink",
"storeSettings": {
"type": "AzureBlobFSWriteSettings"
},
"formatSettings": {
"type": "DelimitedTextWriteSettings",
"quoteAllText": true,
"fileExtension": ".txt"
}
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"typeConversion": true,
"typeConversionSettings": {
"allowDataTruncation": true,
"treatBooleanAsNumber": false
}
}
},
"inputs": [
{
"referenceName": "Source1",
"type": "DatasetReference",
"parameters": {
"filename": {
"value": "#pipeline().parameters.filename",
"type": "Expression"
},
"folderpath": {
"value": "#pipeline().parameters.path",
"type": "Expression"
}
}
}
],
"outputs": [
{
"referenceName": "target1",
"type": "DatasetReference",
"parameters": {
"sinkpath": {
"value": "#variables('var_path')",
"type": "Expression"
},
"sinkfilename": {
"value": "#pipeline().parameters.filename",
"type": "Expression"
}
}
}
]
},
{
"name": "Set variable1",
"type": "SetVariable",
"dependsOn": [],
"userProperties": [],
"typeProperties": {
"variableName": "var_path",
"value": {
"value": "#concat(split(pipeline().parameters.path,'/')[2],'/',formatDateTime(utcNow(),'yyyy/MM/dd'),'/')",
"type": "Expression"
}
}
}
],
"parameters": {
"path": {
"type": "string"
},
"filename": {
"type": "string"
}
},
"variables": {
"var_path": {
"type": "String"
},
"var1": {
"type": "String"
}
},
"annotations": []
}
}
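To make the Set variable expression concrete: assuming the trigger passes the folder path in the form Container/app_archive/app1 and the pipeline runs on 2023-01-27, then
@concat(split(pipeline().parameters.path,'/')[2],'/',formatDateTime(utcNow(),'yyyy/MM/dd'),'/')
evaluates to app1/2023/01/27/, which the sink dataset uses as the destination folder path.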
Result when a file is uploaded to the app1 folder:

Azure Data Factory - Export data to sub container/blob

Hi, I have an ADF pipeline that copies (exports) Azure SQL data to CSV files in a blob container.
How can I direct the files' destination to a 'sub' container?
I have a blob container named 'SQLdata', and I want the files to be created in a sub-container/blob called Customers:
SQLdata/Customers
SQLdata/Customers/Cust1.csv
SQLdata/Customers/Cust2.csv
I have tried
"destination": {
"fileName": "Customers//Cust1.csv"
What is wrong with the following?
"activities": [
{
"name": "Export",
"type": "Copy",
"dependsOn": [],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [
{
"name": "Source",
"value": "dbo.#{item().source.table}"
},
{
"name": "Destination",
"value": "#{item().destination.fileName}"
}
],
"parameters": {
"cw_items": {
"type": "Array",
"defaultValue": [
{
"source": {
"table": "Cust1"
},
"destination": {
"fileName": "Cust1.csv"
}
},
{
"source": {
"table": "Cust2"
},
"destination": {
"fileName": "Cust2.csv"
}
},
I tried the same export and it works well; all the CSV files are stored in containerleon/csv:
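The sub-folder is set in the sink dataset's location rather than in the file name. A minimal sketch of what DestinationDataset_fls could look like (the linked service name is a placeholder); for the question above you would use container SQLdata and folder path Customers:
{
  "name": "DestinationDataset_fls",
  "properties": {
    "linkedServiceName": {
      "referenceName": "<your blob storage linked service>",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "cw_fileName": { "type": "String" }
    },
    "type": "DelimitedText",
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "fileName": { "value": "@dataset().cw_fileName", "type": "Expression" },
        "folderPath": "csv",
        "container": "containerleon"
      }
    },
    "schema": []
  }
}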
JSON code reference:
{
"name": "CopyPipeline_fls",
"properties": {
"activities": [
{
"name": "ForEach_fls",
"type": "ForEach",
"dependsOn": [],
"userProperties": [],
"typeProperties": {
"items": {
"value": "#pipeline().parameters.cw_items",
"type": "Expression"
},
"activities": [
{
"name": "Copy_fls",
"type": "Copy",
"dependsOn": [],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [
{
"name": "Source",
"value": "dbo.#{item().source.table}"
},
{
"name": "Destination",
"value": "containerleon/csv/#{item().destination.fileName}"
}
],
"typeProperties": {
"source": {
"type": "AzureSqlSource"
},
"sink": {
"type": "DelimitedTextSink",
"storeSettings": {
"type": "AzureBlobStorageWriteSettings"
},
"formatSettings": {
"type": "DelimitedTextWriteSettings",
"quoteAllText": true,
"fileExtension": ".txt"
}
},
"enableStaging": false
},
"inputs": [
{
"referenceName": "SourceDataset_fls",
"type": "DatasetReference",
"parameters": {
"cw_table": "#item().source.table"
}
}
],
"outputs": [
{
"referenceName": "DestinationDataset_fls",
"type": "DatasetReference",
"parameters": {
"cw_fileName": "#item().destination.fileName"
}
}
]
}
]
}
}
],
"parameters": {
"cw_items": {
"type": "Array",
"defaultValue": [
{
"source": {
"table": "test"
},
"destination": {
"fileName": "dbotest.csv"
}
},
{
"source": {
"table": "test3"
},
"destination": {
"fileName": "dbotest3.csv"
}
}
]
}
},
"annotations": []
},
"type": "Microsoft.DataFactory/factories/pipelines"
}
Storage preview:
Hope this helps.

VSTS fabrikam-build-extension sample not working (Template)

I just tried the fabrikam-build-extension sample on TFS 2017 and VSTS. I can see the custom tasks when modifying a build, but I'm unable to use the Template1 template to create a build definition. Template1 isn't listed.
Anybody have a clue?
Yes, I can reproduce the issue. The two tasks can be added successfully, but Template1 can't be added as a VSTS build template.
But if I use the same content as template.json to create a build template via the REST API, I can find it in the build templates.
PUT https://marinaliu.visualstudio.com/DefaultCollection/Git2/_apis/build/definitions/templates/myCustomTemplate?api-version=2.0
Request body (application/json):
{
"id": "android",
"name": "My Custom Andriod Template",
"category": "Build",
"iconTaskId": "DF857559-8715-46EB-A74E-AC98B9178AA0",
"description": "Build your Android projects, run tests, sign and align Android App Package files. This template requires the Android SDK to be installed on the build agent.",
"template": {
"buildNumberFormat": "$(date:yyyyMMdd)$(rev:.r)",
"build": [{
"enabled": true,
"inputs": {
"wrapperScript": "$(Parameters.wrapperScript)",
"tasks": "$(Parameters.tasks)"
},
"task": {
"id": "8D8EEBD8-2B94-4C97-85AF-839254CC6DA4",
"versionSpec": "1.*"
}
},
{
"enabled": true,
"inputs": {
"files": "**/*.apk",
"jarsign": "false",
"zipalign": "false"
},
"task": {
"id": "80F3F6A0-82A6-4A22-BA7A-E5B8C541B9B9",
"versionSpec": "1.*"
}
},
{
"enabled": true,
"alwaysRun": true,
"inputs": {
"SourceFolder": "$(build.sourcesdirectory)",
"Contents": "**/*.apk",
"TargetFolder": "$(build.artifactstagingdirectory)"
},
"task": {
"id": "5bfb729a-a7c8-4a78-a7c3-8d717bb7c13c",
"versionSpec": "2.*"
}
},
{
"enabled": true,
"alwaysRun": true,
"inputs": {
"PathtoPublish": "$(build.artifactstagingdirectory)",
"ArtifactName": "drop",
"ArtifactType": "Container"
},
"task": {
"id": "2ff763a7-ce83-4e1f-bc89-0ae63477cebe",
"versionSpec": "1.*"
}
}
],
"options": [{
"definition": {
"id": "5D58CC01-7C75-450C-BE18-A388DDB129EC"
},
"enabled": true,
"inputs": {}
}],
"variables": {
"system.debug": {
"value": "false",
"allowOverride": true
}
},
"triggers": [],
"processParameters": {
"inputs": [{
"name": "wrapperScript",
"label": "{GradleWrapper}",
"defaultValue": "gradlew",
"required": true,
"type": "filePath"
},
{
"name": "tasks",
"label": "{GradleTasks}",
"defaultValue": "build",
"required": true,
"type": "string"
}
]
}
}
}
And I have created an issue here that you can follow up on.
