When I try to create a table in an Azure Data Explorer database from an Azure Pipeline, I get the error below.
Error
##[error]Service returned an error: Error: getaddrinfo ENOTFOUND testdbprod.kusto.windows.net testdbprod.kusto.windows.net:443 server: testdbprod.kusto.windows.net database: testprod-prod command: .create-merge table Notifications (Timestamp: datetime, NoteType: string, homename: string, Value: string)
Pipeline.yaml
- task: ADXQuery@1
  inputs:
    targetType: 'inline'
    script: '.create-merge table Notifications (Timestamp: datetime, NoteType: string, homename: string, Value: string)'
    kustoUrls: 'https://testdbprod.kusto.windows.net:443?DatabaseName=testprod-prod'
    customAuth: false
    ResourceURI: 'https://testdbprod.kusto.windows.net'
    aadClientId: 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXx'
    aadClientSecret: 'XXXXXXXXXXXXXXXXXXXXXX'
    tenantId: 'XXXXXXXXXXXXXXXXXXXXXXXX'
    minThreshold: '0'
    maxThreshold: '0'
getaddrinfo ENOTFOUND is a DNS lookup failure, so it seems that the cluster URL you provided is invalid: the host name testdbprod.kusto.windows.net cannot be resolved.
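As a hedged illustration only: Azure Data Explorer cluster URIs normally include a region segment (https://<cluster>.<region>.kusto.windows.net), which is missing here. Assuming the real URI is copied from the cluster's overview blade in the Azure portal, the task inputs might look like this (the region westeurope below is a placeholder, not a value from the question):
# Sketch only - replace cluster name and region with the URI shown in the Azure portal
- task: ADXQuery@1
  inputs:
    targetType: 'inline'
    script: '.create-merge table Notifications (Timestamp: datetime, NoteType: string, homename: string, Value: string)'
    kustoUrls: 'https://testdbprod.westeurope.kusto.windows.net:443?DatabaseName=testprod-prod'
    ResourceURI: 'https://testdbprod.westeurope.kusto.windows.net'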
I'm facing quite a big problem. I have a function app that I deploy with Azure Bicep in the following fashion:
param environmentType string
param location string
param storageAccountSku string
param vnetIntegrationSubnetId string
param kvName string
/*
This module contains the IaC for deploying the Premium function app
*/
/// Just a single minimum instance to start with and max scaling of 3 for dev, 5 for prd ///
var minimumElasticSize = 1
var maximumElasticSize = ((environmentType == 'prd') ? 5 : 3)
var name = 'nlp'
var functionAppName = 'function-app-${name}-${environmentType}'
/// Storage account for service ///
resource functionAppStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
  name: 'st4functionapp${name}${environmentType}'
  location: location
  kind: 'StorageV2'
  sku: {
    name: storageAccountSku
  }
  properties: {
    allowBlobPublicAccess: false
    accessTier: 'Hot'
    supportsHttpsTrafficOnly: true
    minimumTlsVersion: 'TLS1_2'
  }
}
/// Premium app plan for the service ///
resource servicePlanfunctionApp 'Microsoft.Web/serverfarms@2021-03-01' = {
  name: 'plan-${name}-function-app-${environmentType}'
  location: location
  kind: 'linux'
  sku: {
    name: 'EP1'
    tier: 'ElasticPremium'
    family: 'EP'
  }
  properties: {
    reserved: true
    targetWorkerCount: minimumElasticSize
    maximumElasticWorkerCount: maximumElasticSize
    elasticScaleEnabled: true
    isSpot: false
    zoneRedundant: ((environmentType == 'prd') ? true : false)
  }
}
// Create log analytics workspace
resource logAnalyticsWorkspacefunctionApp 'Microsoft.OperationalInsights/workspaces@2021-06-01' = {
  name: '${name}-functionapp-loganalytics-workspace-${environmentType}'
  location: location
  properties: {
    sku: {
      name: 'PerGB2018' // Standard
    }
  }
}
/// Log analytics workspace insights ///
resource applicationInsightsfunctionApp 'Microsoft.Insights/components@2020-02-02' = {
  name: 'application-insights-${name}-function-${environmentType}'
  location: location
  kind: 'web'
  properties: {
    Application_Type: 'web'
    Flow_Type: 'Bluefield'
    publicNetworkAccessForIngestion: 'Enabled'
    publicNetworkAccessForQuery: 'Enabled'
    Request_Source: 'rest'
    RetentionInDays: 30
    WorkspaceResourceId: logAnalyticsWorkspacefunctionApp.id
  }
}
/// App service containing the workflow runtime ///
resource sitefunctionApp 'Microsoft.Web/sites@2021-03-01' = {
  name: functionAppName
  location: location
  kind: 'functionapp,linux'
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    clientAffinityEnabled: false
    httpsOnly: true
    serverFarmId: servicePlanfunctionApp.id
    siteConfig: {
      linuxFxVersion: 'python|3.9'
      minTlsVersion: '1.2'
      pythonVersion: '3.9'
      use32BitWorkerProcess: true
      appSettings: [
        {
          name: 'FUNCTIONS_EXTENSION_VERSION'
          value: '~4'
        }
        {
          name: 'FUNCTIONS_WORKER_RUNTIME'
          value: 'python'
        }
        {
          name: 'AzureWebJobsStorage'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTAZUREFILECONNECTIONSTRING'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTSHARE'
          value: 'app-${toLower(name)}-functionservice-${toLower(environmentType)}a6e9'
        }
        {
          name: 'APPINSIGHTS_INSTRUMENTATIONKEY'
          value: applicationInsightsfunctionApp.properties.InstrumentationKey
        }
        {
          name: 'ApplicationInsightsAgent_EXTENSION_VERSION'
          value: '~2'
        }
        {
          name: 'APPLICATIONINSIGHTS_CONNECTION_STRING'
          value: applicationInsightsfunctionApp.properties.ConnectionString
        }
        {
          name: 'ENV'
          value: toUpper(environmentType)
        }
      ]
    }
  }

  /// VNET integration so flows can access storage and queue accounts ///
  resource vnetIntegration 'networkConfig@2022-03-01' = {
    name: 'virtualNetwork'
    properties: {
      subnetResourceId: vnetIntegrationSubnetId
      swiftSupported: true
    }
  }
}
/// Outputs for creating access policies ///
output functionAppName string = sitefunctionApp.name
output functionAppManagedIdentityId string = sitefunctionApp.identity.principalId
The outputs are used to grant permissions on the blob/queue storage and some Key Vault access. This code is a single module called from a main.bicep file and deployed via an Azure DevOps pipeline.
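For context, a minimal sketch of how such a module might be consumed from main.bicep; the module path, symbolic name, and parameter values below are assumptions for illustration, not taken from the actual setup:
// Hypothetical main.bicep excerpt - path and values are illustrative only
module functionApp 'modules/function-app.bicep' = {
  name: 'functionAppDeployment'
  params: {
    environmentType: environmentType
    location: location
    storageAccountSku: 'Standard_LRS'
    vnetIntegrationSubnetId: vnetIntegrationSubnetId
    kvName: kvName
  }
}
// The module outputs can then drive role assignments / Key Vault access policies:
output functionAppPrincipalId string = functionApp.outputs.functionAppManagedIdentityId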
I have a second repository in which I have some functions, and which I also deploy via Azure Pipelines. This one contains three .yaml files for deploying: 2 templates (CI and CD) and 1 main pipeline called azure-pipelines.yml that pulls it all together:
functions-ci.yml:
parameters:
  - name: environment
    type: string
jobs:
  - job:
    displayName: 'Publish the function as .zip'
    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(pythonVersion)'
        displayName: 'Use Python $(pythonVersion)'
      - task: CopyFiles@2
        displayName: 'Create project folder'
        inputs:
          SourceFolder: '$(System.DefaultWorkingDirectory)'
          Contents: |
            **
          TargetFolder: '$(Build.ArtifactStagingDirectory)'
      - task: Bash@3
        displayName: 'Install requirements for running function'
        inputs:
          targetType: 'inline'
          script: |
            python3 -m pip install --upgrade pip
            pip install setup
            pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
          workingDirectory: '$(Build.ArtifactStagingDirectory)'
      - task: ArchiveFiles@2
        displayName: 'Create project zip'
        inputs:
          rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
          includeRootFolder: false
          archiveType: 'zip'
          archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
          replaceExistingArchive: true
      - task: PublishPipelineArtifact@1
        displayName: 'Publish project zip artifact'
        inputs:
          targetPath: '$(Build.ArtifactStagingDirectory)'
          artifactName: 'functions$(environment)'
          publishLocation: 'pipeline'
functions-cd.yml:
parameters:
  - name: environment
    type: string
  - name: azureServiceConnection
    type: string
jobs:
  - job: worfklowsDeploy
    displayName: 'Deploy the functions'
    steps:
      # Download created artifacts, containing the zipped function codes
      - task: DownloadPipelineArtifact@2
        inputs:
          buildType: 'current'
          artifactName: 'functions$(environment)'
          targetPath: '$(Build.ArtifactStagingDirectory)'
      # Zip deploy the functions code
      - task: AzureFunctionApp@1
        inputs:
          azureSubscription: $(azureServiceConnection)
          appType: functionAppLinux
          appName: function-app-nlp-$(environment)
          package: $(Build.ArtifactStagingDirectory)/**/*.zip
          deploymentMethod: 'zipDeploy'
They are pulled together in azure-pipelines.yml:
trigger:
  branches:
    include:
      - develop
      - main
pool:
  name: "Hosted Ubuntu 1804"
variables:
  ${{ if notIn(variables['Build.SourceBranchName'], 'main') }}:
    environment: dev
    azureServiceConnection: SC-NLPDT
  ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    environment: prd
    azureServiceConnection: SC-NLPPRD
  pythonVersion: '3.9'
stages:
  # Builds the functions as .zip
  - stage: functions_ci
    displayName: 'Functions CI'
    jobs:
      - template: ./templates/functions-ci.yml
        parameters:
          environment: $(environment)
  # Deploys .zip workflows
  - stage: functions_cd
    displayName: 'Functions CD'
    jobs:
      - template: ./templates/functions-cd.yml
        parameters:
          environment: $(environment)
          azureServiceConnection: $(azureServiceConnection)
So this successfully deploys my function app the first time around when I have also deployed the infra code. The imports are done well, the right function app is deployed, and the code runs when I trigger it.
But when I redeploy the infra (Bicep) code, all of a sudden the newest version of the functions is gone and has been replaced by a previous version.
Also, running this previous version doesn't work anymore, since all the requirements that were installed in the pipeline (CI part) via pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt suddenly cannot be found anymore, giving import errors (e.g. Result: Failure Exception: ModuleNotFoundError: No module named 'azure.identity'). Mind you, this version previously worked just fine.
This is a big problem for me, since I need to be able to update some infra stuff (like adding an APP_SETTING) without breaking the current deployment of the functions.
I had thought about just redeploying the functions automatically after every infra update, but then I would still lose the previous invocations, which I need to be able to see.
Am I missing something in the above code? I cannot figure out what is going wrong here that causes my functions to change on an infra deployment.
Looking at the documentation:
To enable your function app to run from a package, add a WEBSITE_RUN_FROM_PACKAGE setting to your function app settings.
1 Indicates that the function app runs from a local package file deployed in the d:\home\data\SitePackages (Windows) or /home/data/SitePackages (Linux) folder of your function app.
In your case, when you deploy your function app code using AzureFunctionApp@1 and zipDeploy, this automatically adds this app setting to your function app. When you then redeploy your infrastructure, the setting is removed and the function app host no longer knows where to find the code.
If you add this app setting in your Bicep file, it should work:
{
  name: 'WEBSITE_RUN_FROM_PACKAGE'
  value: '1'
}
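To make the placement concrete, a sketch of where this entry would sit in the existing sitefunctionApp resource (the surrounding lines are abbreviated from the question; only the WEBSITE_RUN_FROM_PACKAGE entry is new):
siteConfig: {
  appSettings: [
    // ... existing settings from the question ...
    {
      name: 'WEBSITE_RUN_FROM_PACKAGE'
      value: '1'
    }
  ]
}
That way the setting is declared as part of the infrastructure and survives a Bicep redeployment, instead of existing only as a side effect of the zip deploy.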
I want to use templates in my local Backstage instance and have been trying to publish a create-react-app to my GitLab instance, but I keep getting the following error:
2022-10-21T22:36:25.000Z HTTPError: Response code 400 (Bad Request)
I have added the integration within my app-config.yaml file. There is some connection happening, but it seems that I have missed a required attribute for this request, for example 'the title for a merge request was not given'. I'm wondering what other required attribute I am missing when publishing to GitLab.
apiVersion: scaffolder.backstage.io/v1beta3
kind: Template
metadata:
  name: create-react-app-template
  title: Create React App Template
  description: Create a new React website project
  tags:
    - react
    - cra
spec:
  owner: web@example.com
  type: website
  parameters:
    - title: Provide some simple information
      required:
        - component_id
        - owner
      properties:
        component_id:
          title: Name
          type: string
          description: Unique name of the component
          ui:field: EntityNamePicker
        description:
          title: Description
          type: string
          description: Help others understand what this website is for.
        owner:
          title: Owner
          type: string
          description: Owner of the component
          ui:field: OwnerPicker
          ui:options:
            allowedKinds:
              - Group
    - title: Choose a location
      required:
        - repoUrl
      properties:
        repoUrl:
          title: Repository Location
          type: string
          ui:field: RepoUrlPicker
          ui:options:
            allowedHosts:
              - gitlab.example.com
  steps:
    - id: template
      name: Fetch Skeleton + Template
      action: fetch:template
      input:
        url: ./skeleton
        copyWithoutRender:
          - .github/workflows/*
        values:
          component_id: ${{ parameters.component_id }}
          description: ${{ parameters.description }}
          destination: ${{ parameters.repoUrl | parseRepoUrl }}
          owner: ${{ parameters.owner }}
    - id: publish
      name: Publish
      action: publish:gitlab
      input:
        allowedHosts:
          - gitlab.example.com
        description: This is ${{ parameters.component_id }}
        repoUrl: ${{ parameters.repoUrl }}
        title: Creating catalog-info.yaml ${{ parameters.name }} for backstage
    - id: register
      name: Register
      action: catalog:register
      input:
        repoContentsUrl: ${{ steps.publish.output.repoContentsUrl }}
        catalogInfoPath: "/catalog-info.yaml"
  output:
    links:
      - title: Repository
        url: ${{ steps.publish.output.remoteUrl }}
      - title: Open in catalog
        icon: catalog
        entityRef: ${{ steps.register.output.entityRef }}
When I go through the Create React App template, it creates a
Repo Url -> gitlab.example.com?owner=maxbojorquez&repo=template
I would like to add a server to an autoscaling group using an SSM document: if the group has n instances running, I want to have (n+1).
Since this stack is managed by CloudFormation, I just need to increase the 'DesiredCapacity' parameter and update the stack, so I created a document with 2 steps:
get the current value of 'DesiredCapacity'
update the stack with the value of 'DesiredCapacity' + 1
I didn't find a way to express this simple operation; I guess I'm doing something wrong...
SSM Document:
schemaVersion: '0.3'
parameters:
  cfnStack:
    description: 'The cloudformation stack to be updated'
    type: String
mainSteps:
  - name: GetDesiredCount
    action: 'aws:executeAwsApi'
    inputs:
      Service: cloudformation
      Api: DescribeStacks
      StackName: '{{ cfnStack }}'
    outputs:
      - Selector: '$.Stacks[0].Outputs.DesiredCapacity'
        Type: String
        Name: DesiredCapacity
  - name: UpdateCloudFormationStack
    action: 'aws:executeAwsApi'
    inputs:
      Service: cloudformation
      Api: UpdateStack
      StackName: '{{ cfnStack }}'
      UsePreviousTemplate: true
      Parameters:
        - ParameterKey: WebServerCapacity
          ParameterValue: 'GetDesiredCount.DesiredCapacity' + 1 ### ERROR
          # ParameterValue: '{{GetDesiredCount.DesiredCapacity}}' + 1 ### ERROR (trying to concat STR to INT)
          # ParameterValue: '{{ GetDesiredCount.DesiredCapacity + 1}}' ### ERROR
There is a way to do the calculation inside an SSM document using the Python runtime (an aws:executeScript step).
The additional Python step does the following:
The Python runtime gets its variables via the 'InputPayload' property.
The 'current' (str) key is added to the events object.
The Python function script_handler is called.
The 'current' value is extracted using events['current'].
The string is converted to an int and 1 is added.
A dictionary is returned with the 'desired_capacity' key and its value as a string.
The output is exposed ($.Payload.desired_capacity refers to the 'desired_capacity' key of the returned dictionary).
schemaVersion: '0.3'
parameters:
  cfnStack:
    description: 'The cloudformation stack to be updated'
    type: String
mainSteps:
  - name: GetDesiredCount
    action: 'aws:executeAwsApi'
    inputs:
      Service: cloudformation
      Api: DescribeStacks
      StackName: '{{ cfnStack }}'
    outputs:
      - Selector: '$.Stacks[0].Outputs.DesiredCapacity'
        Type: String
        Name: DesiredCapacity
  - name: Calculate
    action: 'aws:executeScript'
    inputs:
      Runtime: python3.6
      Handler: script_handler
      Script: |-
        def script_handler(events, context):
            desired_capacity = int(events['current']) + 1
            return {'desired_capacity': str(desired_capacity)}
      InputPayload:
        current: '{{ GetDesiredCount.DesiredCapacity }}'
    outputs:
      - Selector: $.Payload.desired_capacity
        Type: String
        Name: NewDesiredCapacity
  - name: UpdateCloudFormationStack
    action: 'aws:executeAwsApi'
    inputs:
      Service: cloudformation
      Api: UpdateStack
      StackName: '{{ cfnStack }}'
      UsePreviousTemplate: true
      Parameters:
        - ParameterKey: WebServerCapacity
          ParameterValue: '{{ Calculate.NewDesiredCapacity}}'
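For completeness, a minimal sketch of starting this automation with boto3; the document name 'IncreaseAsgCapacity' and the stack name 'my-web-stack' are placeholders, not values from the question:
import boto3

# Hypothetical usage: run the automation document defined above.
ssm = boto3.client('ssm')
response = ssm.start_automation_execution(
    DocumentName='IncreaseAsgCapacity',          # placeholder document name
    Parameters={'cfnStack': ['my-web-stack']},   # SSM parameter values are lists of strings
)
print(response['AutomationExecutionId'])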
Below is the piece of code I am trying to use in an each loop in a YAML pipeline.
Can anyone help?
parameters:
  - name: parameter1
    type: object
    default:
      IN:
        Test1:
          folderPath: abc/myFolder1
        Test2:
          folderPath: abc/myFolder2
        Test3:
          folderPath: abc/myFolder3
      US:
        Test4:
          folderPath: xyz/myFolder4
      CA:
        Test5:
          folderPath: lmn/myFolder5
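In case it helps, a minimal sketch of how a nested ${{ each }} expansion over this object could look, assuming the goal is to generate one step per TestN entry; the echo script and display name are only placeholders:
steps:
- ${{ each region in parameters.parameter1 }}:   # region.key is IN / US / CA
  - ${{ each test in region.value }}:            # test.key is Test1..Test5
    - script: echo "${{ region.key }} - ${{ test.key }}: ${{ test.value.folderPath }}"
      displayName: 'Process ${{ test.key }}'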
I have an issue with the embedded API management within a Cloud Foundry application (Node.js) on Bluemix. There is a certain path in the YAML which is not working via the gateway; please see the relevant path from the YAML below:
/socket.io/:
  get:
    produces:
      - text/plain; charset=utf-8
    parameters: []
    responses:
      default:
        description: Definition generated from Swagger Inspector
I get 404, Not Found.
The URL works fine when I don't go via the gateway.
The URL is https://[masked api mgd hostname]/socket.io/?EIO=3&transport=polling&t=MC0pE73
Please help.
The complete YAML is attached below:
swagger: "2.0"
info:
  description: defaultDescription
  version: "0.1"
  title: defaultTitle
host: masked.actualEndpoint
schemes:
  - https
basePath: "/"
paths:
  /socket.io/:
    get:
      parameters:
        - name: t
          in: query
          required: false
          type: string
          x-example: MC0pE73
        - name: EIO
          in: query
          required: false
          type: string
          x-example: "3"
        - name: transport
          in: query
          required: false
          type: string
          x-example: polling
      responses:
        default:
          description: Definition generated from Swagger Inspector
definitions: {}
I am accessing the URL using https://[masked api mgd hostname]/socket.io/?EIO=3&transport=polling&t=MCvtHJT
I believe it's the / at the end of the path (/socket.io/) which is causing the gateway to fail. Any comments?