How to test an Azure Data Factory linked service using the API - azure

I am trying to create and use Azure Data Factory via the REST API. The linked service was created successfully, but when I tested the connection it failed. Is there any way to test the connection via the API or a PowerShell command?

There is no such method in the Microsoft documentation. You can track this feature request here.
But there is a blog post about testing a linked service with PowerShell. Here is the script on GitHub.
Hope this can help you.

Now you can use one of the cmdlets in the azure.datafactory.tools PowerShell module:
Test connection of Linked Service (preview)
# Example 1
$LinkedServiceName = 'AzureSqlDatabase1'
Test-AdfLinkedService @params -LinkedServiceName $LinkedServiceName
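For completeness, a minimal end-to-end sketch; the resource group and factory names below are placeholders, and the parameter names are assumptions to verify with Get-Help Test-AdfLinkedService:
Install-Module -Name azure.datafactory.tools -Scope CurrentUser
$params = @{
    ResourceGroupName = 'myResourceGroup'   # hypothetical resource group
    DataFactoryName   = 'myDataFactory'     # hypothetical data factory
}
Test-AdfLinkedService @params -LinkedServiceName 'AzureSqlDatabase1'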
Alternatively, if you prefer, you can run such a test as part of your CI/CD process in Azure DevOps by installing the @adftools extension, which uses the same PowerShell module behind the scenes.
More: https://sqlplayer.net/adftools
Disclaimer: I'm the author of the tool.

Related

Create Azure DevOps environment from script

I would like to create an Azure DevOps Pipeline Environment from PowerShell.
However, I cannot find any information on this for the Azure CLI or the Azure REST API.
There are some notions about environments in releases, but that's not what I need.
When using the portal, the following URL is called: "/_apis/distributedtask/environments", but I can't find any documentation about this REST API endpoint.
Does anyone know how to automate this?
You're right. If I check the network section when I create a new environment, I can see it uses this API:
https://dev.azure.com/{org}/{project}/_apis/distributedtask/environments
With this JSON body:
{
"description":"",
"name":"test"
}
I don't see it documented, but it should work :)
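A minimal PowerShell sketch of that call, assuming a PAT with sufficient scope; since the endpoint is undocumented, the api-version below is an assumption:
$org     = 'myOrg'         # hypothetical organization
$project = 'myProject'     # hypothetical project
$pat     = $env:AZDO_PAT   # personal access token
$headers = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$body    = @{ name = 'test'; description = '' } | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $headers -ContentType 'application/json' -Body $body `
    -Uri "https://dev.azure.com/$org/$project/_apis/distributedtask/environments?api-version=6.0-preview.1"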

Azure DevOps API: get all testcase steps

Is there a way to get all the test case steps data via the Azure DevOps API?
I was able to find the solution:
GET https://{host}/{collection}/{project}/_apis/testplan/Plans/{plan id}/Suites/{suite id}/TestCase/{testcase id}
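A hedged PowerShell sketch of that call; the api-version, the IDs, and the response shape are assumptions, so inspect the raw response to confirm:
$org = 'myOrg'; $project = 'myProject'   # hypothetical names
$pat = $env:AZDO_PAT
$headers = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$uri = "https://dev.azure.com/$org/$project/_apis/testplan/Plans/1/Suites/2/TestCase/3?api-version=5.1-preview.3"
$response = Invoke-RestMethod -Method Get -Uri $uri -Headers $headers
# The step definitions are stored as XML in the work item field 'Microsoft.VSTS.TCM.Steps'.
$response.value.workItem.workItemFields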
You need to use this particular endpoint:
POST https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/results?api-version=5.0
and access TestActionResultModel
The Azure DevOps API has endpoints for returning test runs, results and cases. The API is wrapped in a PowerShell module called AzurePipelinesPS that has functions like:
Get-APTestSuiteList
Get-APTestRunList
Get-APTestRun
Get-APTestRunStatistic
Get-APTestResultList
Get-APTestResult
Get-APTestPlanList
The following command will return a list of all the test suites:
Get-APTestSuiteList -Session 'yourSessionName'
Be sure to set up a session locally with New-APSession when first installing the module. You can find the session documentation here.
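A rough setup sketch; the parameter names below are assumptions taken from memory of the module's help, so verify them with Get-Help New-APSession:
Install-Module -Name AzurePipelinesPS -Scope CurrentUser
$sessionParams = @{
    SessionName         = 'yourSessionName'
    Instance            = 'https://dev.azure.com/'
    Collection          = 'myOrg'       # hypothetical organization
    Project             = 'myProject'   # hypothetical project
    PersonalAccessToken = $env:AZDO_PAT
    Version             = 'vNext'       # assumed parameter and value; check the module docs
}
New-APSession @sessionParams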

How to run a remote command (PowerShell/Bash) against an existing Azure VM in Azure Data Factory V2?

I've been trying to find a way to run a simple command against one of my existing Azure VMs using Azure Data Factory V2.
Options so far:
Custom Activity/Azure Batch won't let me add existing VMs to the pool.
Azure Functions - I have not played with this, but I have not found any documentation on doing it with Azure Functions.
Azure Cloud Shell - I've tried this using the browser UI and it works; however, I cannot find a way of doing this via ADF V2.
The use case is the following:
There are a few tasks running locally (on an Azure VM) in Task Scheduler that I'd like to orchestrate using ADF, as everything else is in ADF. These tasks are usually Python applications that restore a SQL backup and/or purge some folders, e.g.:
sqldb-restore -r myDatabase
where sqldb-restore is a command that is recognized locally after installing my local Python library. Unfortunately, the Python app needs to live locally on the VM.
Any suggestions? Thanks.
Thanks to @martin-esteban-zurita; his answer helped me get to what I needed, and this was a beautiful and fun experiment.
It is important to understand that Azure Automation is used for many things regarding resource orchestration in Azure (VMs, services, DevOps); this automation can be done with PowerShell and/or Python.
In this particular case I did not need to modify/maintain/orchestrate any Azure resource; I needed to actually run a Bash/PowerShell command remotely on one of my existing VMs, where I have multiple PowerShell/Bash commands running recurrently in Task Scheduler.
Task Scheduler was adding unnecessary overhead to my data pipelines because it was unable to talk to ADF.
In addition, Azure Automation natively runs PowerShell/Python commands only in the Azure cloud, which is very useful for orchestrating resources (turning Azure VMs on/off, adding/removing permissions from other Azure services, running maintenance or purge processes, etc.), but I was still unable to run commands locally on an existing VM. This is where the Hybrid Runbook Worker came into the picture: a Hybrid Worker group registers your own machines so that runbooks can run directly on them.
These are the steps to accomplish this use case.
1. Create an Azure Automation account.
2. Install the Windows Hybrid Worker on the existing VM. In my case it was tricky because my proxy was giving me some errors; I ended up downloading the NuGet package and installing it manually.
.\New-OnPremiseHybridWorker.ps1 -AutomationAccountName <NameofAutomationAccount> -AAResourceGroupName <NameofResourceGroup> `
    -OMSResourceGroupName <NameofOMSResourceGroup> -HybridGroupName <NameofHRWGroup> `
    -SubscriptionId <AzureSubscriptionId> -WorkspaceName <NameOfLogAnalyticsWorkspace>
Keep in mind that in the above code you will need to supply your own parameter values. The only parameter that does not have to exist beforehand is HybridGroupName, which defines the name of the Hybrid Worker group that will be created.
3. Create a PowerShell runbook:
[CmdletBinding()]
Param(
    # This parameter must be named WebhookData, otherwise the webhook does not work as expected.
    [object]$WebhookData
)
$VerbosePreference = 'Continue'

#region Verify that the runbook was started from a webhook.
# If the runbook was called from a webhook, WebhookData will not be null.
if ($WebHookData) {
    # Collect properties of WebhookData.
    $WebhookName = $WebHookData.WebhookName
    # $WebhookHeaders = $WebHookData.RequestHeader
    $WebhookBody = $WebHookData.RequestBody

    # Convert the JSON request body into an object.
    $Input = (ConvertFrom-Json -InputObject $WebhookBody)
    # Write-Verbose "WebhookBody: $Input"
    # Write-Output -InputObject ('Runbook started from webhook {0} by {1}.' -f $WebhookName, $From)
}
else {
    Write-Error -Message 'Runbook was not started from a webhook' -ErrorAction Stop
}
#endregion

# This is where I run the commands that were in Task Scheduler.

# Calling back the URI that ADF sends in the request body is extremely important:
# it tells the ADF webhook activity that the runbook has finished.
$callBackUri = $Input.callBackUri
Invoke-WebRequest -Uri $callBackUri -Method POST
4. Create a runbook webhook pointing to the Hybrid Worker's VM (see the sketch after this list).
5. Create a webhook activity in ADF where the above PowerShell runbook is called via a POST method.
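For step 4, a hedged sketch using the Az.Automation module; all names are placeholders, and -RunOn is what targets the Hybrid Worker group from step 2 instead of the Azure sandbox:
$webhook = New-AzAutomationWebhook -Name 'AdfWebhook' `
    -RunbookName 'MyHybridRunbook' `
    -ResourceGroupName 'myResourceGroup' `
    -AutomationAccountName 'myAutomationAccount' `
    -RunOn 'NameofHRWGroup' `
    -IsEnabled $true `
    -ExpiryTime (Get-Date).AddYears(1) `
    -Force
# Save the URI now; Azure only returns it at creation time.
$webhook.WebhookURI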
Important note: when I created the webhook activity it was timing out after 10 minutes (the default). I then noticed in the Azure Automation account that I was actually getting INPUT data (WEBHOOKDATA) containing a JSON structure with the following elements:
WebhookName
RequestBody (this one contains whatever you add in the body, plus a default element called callBackUri)
All I had to do was invoke the callBackUri from Azure Automation. This is why I added Invoke-WebRequest -Uri $callBackUri -Method POST to the PowerShell runbook code. With this, ADF succeeds/fails instead of timing out.
There are many other details that I struggled with when installing the Hybrid Worker on my VM, but those are more specific to your environment/company.
This looks like a use case that is supported with Azure Automation, using a hybrid worker. Try reading here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
You can call runbooks with webhooks in ADFv2, using the web activity.
Hope this helped!

How to add an Azure Function's JavaScript/C# code into Terraform scripts?

I am working on a project that will be deployed to my client's Microsoft Azure, so I am currently testing Terraform to assist me when the time comes.
The goal is to create an Azure Function with Terraform that will trigger on blob storage input data.
My question is: how do I add the Azure Function's JavaScript/C# code into the Terraform script so it will be deployed automatically?
I checked the Terraform docs, but they weren't of much help:
https://www.terraform.io/docs/providers/azurerm/r/function_app.html
Any ideas?
Terraform doesn't handle pushing code to Azure resources; that's usually done in a subsequent step in the pipeline (e.g. 1. run Terraform, 2. deploy code).
However, the Azure Function App does have the ability to connect directly to your repo, and Terraform's azurerm_function_app resource exposes the source_control property.
Terraform's azurerm_function_app documentation
So with Terraform you can configure the function app to pull the code directly from the repo when a change is detected.
Microsoft's Azure Function Continuous Deployment documentation
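A hedged HCL sketch of that configuration; the source_control block and its field names should be verified against your azurerm provider version, and all names below are placeholders:
resource "azurerm_function_app" "example" {
  name                       = "example-function-app"
  location                   = azurerm_resource_group.example.location
  resource_group_name        = azurerm_resource_group.example.name
  app_service_plan_id        = azurerm_app_service_plan.example.id
  storage_account_name       = azurerm_storage_account.example.name
  storage_account_access_key = azurerm_storage_account.example.primary_access_key

  source_control {
    repo_url           = "https://github.com/your-org/your-functions-repo"
    branch             = "main"
    manual_integration = false
  }
}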

Azure Functions local development - cannot register EventHub-triggered function

I want to develop my Azure Function App locally and later publish it to the Azure portal.
I am using the Azure Functions Core Tools command line, and all my functions are in Node.js.
Currently, I have managed to download my functions locally and fetch their settings with the command:
func azure functionapp fetch-app-settings
So after that my local.settings.json has the correct settings values. When I make any changes I am also able to publish them successfully to the Azure portal.
The problem now is that I have two functions in my app: one is HTTP-triggered and the second is EventHub-triggered.
When I try to run the host locally with:
func host start
I get the following output in the console:
[10.12.2017 13:03:47] Found the following functions:
[10.12.2017 13:03:47] Host.Functions.HttpTriggerJS1
[10.12.2017 13:03:47]
[10.12.2017 13:03:47] Job host started
[10.12.2017 13:03:47] The following 1 functions are in error:
[10.12.2017 13:03:47] EventHubTriggerJS1: The binding type 'eventHubTrigger' is not registered. Please ensure the type is correct and the binding extension is installed.
And when I try to run this EventHubTriggerJS1 function locally with curl:
curl --request POST -H "Content-Type:application/json" --data '{"input":"sample queue data"}' http://localhost:7071/admin/functions/EventHubTriggerJS1
then nothing happens, so I guess the problem is the trigger registration.
The HttpTriggerJS1 function runs perfectly; I can access it under
http://localhost:7071/api/HttpTriggerJS1
So, do you have any idea where the configuration problem might be? BTW, is it possible to run the function locally but connect to the remote Event Hub in the portal?
I was unable to reproduce your error on the version 1.0 runtime.
I reproduced the error in 2.0. I believe 2.0 does not support Event Hubs yet; see the known functional gaps:
https://github.com/Azure/azure-webjobs-sdk-script/wiki/Azure-Functions-runtime-2.0-known-issues#functional-gaps
Try installing the extension:
func extensions install --package Microsoft.Azure.WebJobs.Extensions.EventHubs -v 3.0.0-beta4
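If the extension is installed but the binding is still not registered, it is also worth checking the function's function.json; a minimal eventHubTrigger binding looks roughly like this (all values below are placeholders):
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "eventHubMessages",
      "eventHubName": "myeventhub",
      "connection": "MyEventHubConnectionAppSetting",
      "consumerGroup": "$Default"
    }
  ]
}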
Can you provide more detail about your functions and the steps you took to create them?
Was HttpTriggerJS1 created locally and then published to the portal following the steps outlined in https://learn.microsoft.com/en-us/azure/azure-functions/functions-run-local?
Was EventHubTriggerJS1 created in the portal? In the same Function App?
Do not mix local development with portal development in the same function app. When you create and publish functions from a local project, you should not try to maintain or modify project code in the portal.
