Microsoft Azure Storage Tool - AzCopy: batch command for scheduling

How can we create a batch command for AzCopy? I want to schedule this command; please guide me on how to do that. Currently I am running the command manually in the 'Microsoft Azure Storage Tool' and I want to automate it.

Schedule a PowerShell job to run azcopy.exe. See PowerShell scheduled jobs.
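For example, here is a minimal sketch using the PSScheduledJob module; the AzCopy path, source folder, and destination URL (AzCopy v10 syntax) are placeholders, not values from the question:
# Run AzCopy every night at 2:00 AM as a PowerShell scheduled job.
$trigger = New-JobTrigger -Daily -At '2:00 AM'
Register-ScheduledJob -Name 'AzCopyNightly' -Trigger $trigger -ScriptBlock {
    & 'C:\azcopy\azcopy.exe' copy 'C:\data' 'https://<account>.blob.core.windows.net/<container>?<SAS>' --recursive
}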

As far as I know, a PowerShell script containing Azure PowerShell cmdlets cannot simply be scheduled with Windows Task Scheduler; the scheduled session lacks the authenticated Azure context needed to run those cmdlets.
NOTE:
Even if you schedule it, Task Scheduler will execute the script and show its status as completed, but the script won't actually do its job.
I would recommend Azure runbooks for this task instead. You can also use the Start-AzureStorageBlobCopy cmdlet for the same purpose. Thanks.
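For reference, a minimal sketch of a server-side blob copy with that cmdlet; the account name, key, and container/blob names are placeholders:
# Build a storage context and start an asynchronous server-side copy.
$ctx = New-AzureStorageContext -StorageAccountName '<account>' -StorageAccountKey '<key>'
Start-AzureStorageBlobCopy -SrcContainer '<source-container>' -SrcBlob '<blob>' -DestContainer '<dest-container>' -Context $ctx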

Alternatively, you can use Windows Task Scheduler to start a PowerShell script which in turn starts your batch file.
The PowerShell script contains only one line, e.g.:
Start-Process C:\foobar\azcopy.bat
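Registering that task can itself be scripted; a minimal sketch, assuming the one-liner above is saved as C:\foobar\run-azcopy.ps1 (the script name and schedule are assumptions):
# Create a daily Task Scheduler task that launches the wrapper script.
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy RemoteSigned -File C:\foobar\run-azcopy.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'AzCopyBatch' -Action $action -Trigger $trigger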

Related

Can we simulate the execution of a TFX command (Azure DevOps)? Something like the -WhatIf parameter (PowerShell)

We need to simulate the following tfx command (Azure Devops):
tfx build tasks upload --task-path C:/folderX
I know that in PowerShell we can use -WhatIf as a parameter (see the short example below), but I believe it is native to PowerShell cmdlets; it does not apply to TFX.
PowerShell -WhatIf parameter reference:
https://techcommunity.microsoft.com/t5/itops-talk-blog/powershell-basics-don-t-fear-hitting-enter-with-whatif/ba-p/353579
Would someone have an idea how we could simulate tfx command?
Thanks
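For context, this is what -WhatIf does on a native cmdlet (a sketch; the path is a placeholder):
# -WhatIf prints what the cmdlet would do without actually doing it,
# e.g. 'What if: Performing the operation "Remove File" on target ...'.
Remove-Item 'C:\temp\*.log' -WhatIf
As the question notes, tfx has no equivalent built in, because -WhatIf support comes from PowerShell's cmdlet infrastructure.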

Azure Batch Service cannot find Az.DataFactory cmdlets when run as a Custom Activity

I am attempting to automate the activation of SSIS Integration Runtimes by running a pipeline containing a Custom Activity in Azure Data Factory.
I have set the Batch Service up with a linked storage account and have successfully started to run a .ps1 file from the linked storage account. I know it finds the file OK because I can see a node running and I get an adfjob set of logs in my storage account.
The PowerShell script is a simple one-liner:
Start-AzDataFactoryV2IntegrationRuntime -Name SSIS -ResourceGroupName <RG Name> -DataFactoryName <ADF Name> -Force
However, the output log file says that it cannot find the cmdlet:
The term 'Start-AzDataFactoryV2IntegrationRuntime' is not recognized
as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify
that the path is correct and try again.
So I take it from the log that PowerShell is available on the node but the Az module is not, which I find surprising given that it's an Azure Batch Service node. I've tried adding an Install-Module Az ... to the start of the script; the result is that it appears to hang, and I don't know how to tell whether it is doing anything. In any case I cancelled after 8 minutes, because I'm pretty sure it would have installed by then.
So I am therefore wondering where the Az module should be installed and how to go about doing so?
You could install the Az module with your Batch Start task in order for your task to use it.
By associating a start task with a pool, you can prepare the operating environment of its nodes. For example, you can perform actions such as installing the applications that your tasks run, or starting background processes.
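A minimal sketch of such a start task command line, assuming Windows pool nodes; the module scope and flags are assumptions, not from the original answer:
powershell -NoProfile -ExecutionPolicy Bypass -Command "Install-PackageProvider -Name NuGet -Force; Install-Module -Name Az.DataFactory -Force -Scope AllUsers"
Note that installing with -Scope AllUsers typically requires the start task to run under an elevated (pool admin) identity.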

How to run a remote command (powershell/bash) against an existing Azure VM in Azure Data Factory V2?

I've been trying to find a way to run a simple command against one of my existing Azure VMs using Azure Data Factory V2.
Options so far:
Custom Activity/Azure Batch won't let me add existing VMs to the pool
Azure Functions - I have not played with this, and I have not found any documentation on doing it with Azure Functions.
Azure Cloud Shell - I've tried this using the browser UI and it works; however, I cannot find a way of doing this via ADF V2
The use case is the following:
There are a few tasks running locally (on an Azure VM) in Task Scheduler that I'd like to orchestrate using ADF, as everything else is in ADF. These tasks are usually Python applications that restore a SQL backup and/or purge some folders, e.g.:
sqldb-restore -r myDatabase
where sqldb-restore is a command that is recognized locally after installing my local Python library. Unfortunately the Python app needs to live locally on the VM.
Any suggestions? Thanks.
Thanks to @martin-esteban-zurita; his answer helped me get to what I needed, and this was a beautiful and fun experiment.
It is important to understand that Azure Automation is used for many things regarding resource orchestration in Azure (VMs, services, DevOps), and this automation can be done with PowerShell and/or Python.
In this particular case I did not need to modify/maintain/orchestrate any Azure resource; I needed to run a Bash/PowerShell command remotely on one of my existing VMs, where I have multiple PowerShell/Bash commands running recurrently in "Task Scheduler".
"Task Scheduler" was adding unnecessary overhead to my data pipelines because it was unable to talk to ADF.
In addition, Azure Automation natively runs PowerShell/Python runbooks only in Azure-hosted sandboxes. That is very useful for orchestrating resources (turning Azure VMs on/off, adding/removing permissions on other Azure services, running maintenance or purge processes, etc.), but I was still unable to run commands locally on an existing VM. This is where the Hybrid Runbook Worker came into the picture.
These are the steps to accomplish this use case.
1. Create an Azure Automation Account
2. Install the Windows Hybrid Worker on my existing VM. In my case it was tricky because my proxy was giving me errors; I ended up downloading the NuGet package and installing it manually.
.\New-OnPremiseHybridWorker.ps1 -AutomationAccountName <NameofAutomationAccount> -AAResourceGroupName <NameofResourceGroup> `
    -OMSResourceGroupName <NameofOResourceGroup> -HybridGroupName <NameofHRWGroup> `
    -SubscriptionId <AzureSubscriptionId> -WorkspaceName <NameOfLogAnalyticsWorkspace>
Keep in mind that in the above command you will need to supply your own parameter values. The only parameter that does not have to exist beforehand is HybridGroupName, which will be created and defines the name of the Hybrid Worker group.
3. Create a PowerShell Runbook
[CmdletBinding()]
Param (
    # This parameter must be named WebhookData, otherwise the webhook does not work as expected.
    [object]$WebhookData
)

$VerbosePreference = 'continue'

#region Verify if Runbook is started from Webhook.
# If the runbook was called from a webhook, WebhookData will not be null.
if ($WebhookData) {
    # Collect properties of WebhookData
    $WebhookName = $WebhookData.WebhookName
    # $WebhookHeaders = $WebhookData.RequestHeader
    $WebhookBody = $WebhookData.RequestBody

    # Collect individual properties. Input converted from JSON.
    $Input = (ConvertFrom-Json -InputObject $WebhookBody)
    # Write-Verbose "WebhookBody: $Input"
    # Write-Output -InputObject ('Runbook started from webhook {0}.' -f $WebhookName)
}
else {
    Write-Error -Message 'Runbook was not started from Webhook' -ErrorAction Stop
}
#endregion

# This is where I run the commands that were in Task Scheduler

# Call back to ADF so the Webhook activity completes. This is extremely important for ADF.
$callBackUri = $Input.callBackUri
Invoke-WebRequest -Uri $callBackUri -Method POST
4. Create a Runbook Webhook pointing to the Hybrid Worker's VM
5. Create a Webhook activity in ADF where the above PowerShell runbook will be called via a POST method
Important Note: When I created the Webhook activity it was timing out after 10 minutes (the default). I then noticed in the Azure Automation account that I was actually getting INPUT data (WEBHOOKDATA) containing a JSON structure with the following elements:
WebhookName
RequestBody (This one contains whatever you add in the Body plus a default element called callBackUri)
All I had to do was invoke the callBackUri from Azure Automation, and this is why the PowerShell runbook code ends with Invoke-WebRequest -Uri $callBackUri -Method POST. With this, ADF was succeeding/failing instead of timing out.
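To sanity-check the webhook outside ADF, you can post to it yourself; a minimal sketch, where the webhook URL and body are placeholders (a manual test will not include callBackUri, since ADF adds that to the body automatically):
# Post a test payload to the Automation webhook.
$webhookUri = 'https://<region>.webhook.azure-automation.net/webhooks?token=<token>'
$body = @{ message = 'manual test' } | ConvertTo-Json
Invoke-WebRequest -Uri $webhookUri -Method POST -Body $body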
There are many other details that I struggled with when installing the hybrid worker in my VM but those are more specific to your environment/company.
This looks like a use case that is supported by Azure Automation, using a Hybrid Runbook Worker. Try reading here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
You can call runbooks with webhooks in ADFv2, using the web activity.
Hope this helped!

Azure PowerShell runbook doesn't show any output

I have a simple Azure PowerShell runbook script
workflow CheckIdentityColumns
{
    Write-Output "Test Output"
}
When I try to test it, I don't see any output.
Why?
This is because your runbook is a PowerShell runbook, not a PowerShell Workflow runbook.
In a PowerShell runbook you don't need the
workflow CheckIdentityColumns {}
declaration. This is the main reason why it doesn't work: a plain PowerShell runbook executes the script body directly, so the workflow gets defined but never invoked, and nothing is written to the output.
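Stripped of the wrapper, a minimal working PowerShell runbook body is just:
# A PowerShell runbook runs this directly; no workflow wrapper needed.
Write-Output "Test Output"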
I tried your scenario and it worked for me.
This happened to me in a different scenario. The answer I got from Microsoft support was that the runbook in the cache was still the older one, i.e. an empty runbook in your case. All you need to do is:
Publish the runbook, then edit and test again
If it still doesn't work, clear the cache and restart your browser

Can I automatically start and stop an azure website on a schedule?

Even asking this question, I'm wondering how something so simple can be so difficult for me. All I want to do is automate the stopping and starting of an Azure Website on a schedule. At first, I looked at WebJobs, and created a PowerShell script that would stop the website using the Stop-AzureWebsite cmdlet:
stop-azurewebsite [my site]
I then created a .cmd file that uses powershell.exe to execute the PowerShell file:
PowerShell.exe -ExecutionPolicy RemoteSigned -File stopit.ps1
I created a WebJob to run the PowerShell command to stop the site, but it errored with a message saying:
stop-azurewebsite : The term 'stop-azurewebsite' is not recognized as the name
of a cmdlet, function, script file, or operable program. Check the spelling of
the name, or if a path was included, verify that the path is correct and try
again.
So, I figured I'd go the REST API route. I created a POST request using Fiddler and the proper management certificate to make a call to:
https://management.core.windows.net/[my subscription id]/services/WebSpaces/[webspace]/sites/[website name]/stop
Turns out, there is no 'stop' command. There's 'restart', but that's obviously of no use in this situation. So, all that said, what is a reasonable way to automate the stopping and subsequent (but much later) starting of an Azure WebSite on a specific time schedule?
UPDATE:
I've figured out how to issue the HTTP request to stop the site using a PUT to
https://management.core.windows.net/[my subscription id]/services/WebSpaces/[webspace]/sites/[website name] with a body of {"State":"Stopped"}, but I still don't have an obvious way of PUT-ing to this URL in a WebJob.
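For reference, a sketch of issuing that PUT from PowerShell; the certificate thumbprint and the x-ms-version header value are assumptions, and the bracketed URL segments are the same placeholders as above:
# Authenticate with the management certificate and set the site state to Stopped.
$cert = Get-Item 'Cert:\CurrentUser\My\<management certificate thumbprint>'
$uri = 'https://management.core.windows.net/[my subscription id]/services/WebSpaces/[webspace]/sites/[website name]'
Invoke-RestMethod -Uri $uri -Method Put -Certificate $cert -ContentType 'application/json' -Headers @{ 'x-ms-version' = '2013-08-01' } -Body '{"State":"Stopped"}'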
Use Azure Automation: create a runbook (a PowerShell Workflow runbook, since the code below uses foreach -parallel) and add the code below to get all 'Running' websites/web apps and shut them down.
# Get websites/web apps that are currently running
$websites = Get-AzureWebsite | Where-Object -FilterScript { $_.State -eq 'Running' }
# Stop each website in parallel
foreach -parallel ($website in $websites)
{
    $result = Stop-AzureWebsite $website.Name
    if ($result)
    {
        Write-Output "- $($website.Name) did not shutdown successfully"
    }
    else
    {
        Write-Output "+ $($website.Name) shutdown successfully"
    }
}
Publish the runbook and you should be able to schedule it straight from the Azure portal. Make sure your automation user is authenticated and your subscription is selected in the runbook, and there should be no problems.
Similarly, for starting up all websites/web apps, just change 'Running' to 'Stopped' and 'Stop-AzureWebsite' to 'Start-AzureWebsite'.
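That start-up variant would look like this (same PowerShell Workflow caveat as above):
# Start every currently stopped website/web app in parallel.
$websites = Get-AzureWebsite | Where-Object -FilterScript { $_.State -eq 'Stopped' }
foreach -parallel ($website in $websites)
{
    Start-AzureWebsite $website.Name
}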
Take a look at Azure Automation. It lets you run Powershell scripts on a schedule and is a much easier way to run automation scripts that will start/stop/modify your sites.
It's even easier now to do things on schedules using Azure Scheduler, especially in this situation: stopping a website is simply a matter of making that API call from a scheduled job. Take a look at https://azure.microsoft.com/en-us/services/scheduler/
