I needed to download a file using WebClient in PowerShell 2.0, and I wanted to show download progress, so I did it this way:
$activity = "Downloading"
$client = New-Object System.Net.WebClient
$urlAsUri = New-Object System.Uri($url)
$event = New-Object System.Threading.ManualResetEvent($false)
$downloadProgress = [System.Net.DownloadProgressChangedEventHandler] {
    $progress = [int]((100.0 * $_.BytesReceived) / $_.TotalBytesToReceive)
    Write-Progress -Activity $activity -Status "${progress}% done" -PercentComplete $progress
}
$downloadComplete = [System.ComponentModel.AsyncCompletedEventHandler] {
    Write-Progress -Activity $activity -Completed
    $event.Set()
}
$client.add_DownloadFileCompleted($downloadComplete)
$client.add_DownloadProgressChanged($downloadProgress)
Write-Progress -Activity $activity -Status "0% done" -PercentComplete 0
$client.DownloadFileAsync($urlAsUri, $file)
$event.WaitOne()
I am getting the error There is no Runspace available to run scripts in this thread. for the code in the $downloadProgress handler, which is logical. However, how do I provide a Runspace for that thread, which (probably) belongs to the ThreadPool?
UPDATE:
Note that both answers to this question are worth reading, and I would accept both if I could.
Thanks stej for the nod.
Andrey, PowerShell has its own thread pool, and each service thread keeps a thread-static pointer to a runspace (the System.Management.Automation.Runspaces.Runspace.DefaultRunspace static member exposes this, and it would be a null reference in your callbacks). Ultimately this means it's difficult - especially in script - to use a different thread pool (such as the one .NET provides for async methods) to execute script blocks.
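For illustration, a quick probe along those lines that you can run yourself:

# On the pipeline thread this returns the runspace the session is bound to:
[System.Management.Automation.Runspaces.Runspace]::DefaultRunspace

# On a .NET ThreadPool thread (for example inside an async callback) the same
# static property is $null, so there is nothing to execute a script block against.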
PowerShell 2.0
Regardless, there is no need to play with this, as PowerShell v2 has full support for eventing:
$client = New-Object System.Net.WebClient
$url = [uri]("http://download.microsoft.com/download/6/2/F/" +
    "62F70029-A592-4158-BB51-E102812CBD4F/IE9-Windows7-x64-enu.exe")
try {
    Register-ObjectEvent $client DownloadProgressChanged -Action {
        Write-Progress -Activity "Downloading" -Status `
            ("{0} of {1}" -f $eventargs.BytesReceived, $eventargs.TotalBytesToReceive) `
            -PercentComplete $eventargs.ProgressPercentage
    }
    Register-ObjectEvent $client DownloadFileCompleted -SourceIdentifier Finished
    $file = "c:\temp\ie9-beta.exe"
    $client.DownloadFileAsync($url, $file)
    # optionally wait, but you can break out and it will still write progress
    Wait-Event -SourceIdentifier Finished
} finally {
    $client.Dispose()
}
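One thing to keep in mind: the Register-ObjectEvent calls leave their subscriptions (and, for the -Action one, an event job) behind in the session, so if you run this repeatedly a cleanup along these lines might be useful afterwards:

# remove the subscriptions tied to this WebClient and the action job they spawned
Get-EventSubscriber | Where-Object { $_.SourceObject -eq $client } | ForEach-Object {
    $_ | Unregister-Event
    if ($_.Action) { $_.Action | Remove-Job }
}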
PowerShell v1.0
If you're stuck on v1 (this is not specifically for you, as you mention v2 in the question), you can use my PowerShell 1.0 eventing snap-in at http://pseventing.codeplex.com/
Async Callbacks
Another tricky area in .NET is async callbacks. There is nothing directly in v1 or v2 of PowerShell that can help you here, but you can convert an async callback to an event with some simple plumbing and then deal with that event using regular eventing. I posted a script for this (New-ScriptBlockCallback) at http://poshcode.org/1382
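The rough idea behind that plumbing (a simplified sketch only, not the actual New-ScriptBlockCallback implementation) is to raise a real .NET event from inside the callback and let PowerShell's eventing engine run the script block where a runspace is available:

# helper type: turns an AsyncCallback invocation into a .NET event
Add-Type -TypeDefinition @"
using System;
public class CallbackBridge
{
    public event EventHandler Completed;
    public void Callback(IAsyncResult ar)
    {
        var handler = Completed;
        if (handler != null) { handler(this, EventArgs.Empty); }
    }
}
"@
$bridge = New-Object CallbackBridge
Register-ObjectEvent $bridge Completed -Action { Write-Host "async operation finished" } | Out-Null
# an AsyncCallback delegate bound to the bridge, ready to pass to some BeginXxx(...) call
$callback = [System.Delegate]::CreateDelegate([System.AsyncCallback], $bridge, "Callback")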
Hope this helps,
-Oisin
I see that you use an async call so that you can show the progress. You can use the BitsTransfer module for that; it shows progress by default:
Import-Module BitsTransfer
Start-BitsTransfer -Source $url -dest d:\temp\yourfile.zip
If you would like to transfer the file in the background, you could use something like this:
Import-Module BitsTransfer
$timer = New-Object Timers.Timer
$timer.Interval = 300
Register-ObjectEvent -InputObject $timer -EventName Elapsed -Action {
    if ($transfer.JobState -ne 'Transferring') {
        $timer.Enabled = 0
        Write-Progress -Completed -Activity Downloading -Status done
        return
    }
    $progress = [int](100 * $transfer.BytesTransferred / $transfer.BytesTotal)
    Write-Progress -Activity Downloading -Status "$progress% done" -PercentComplete $progress
} -SourceIdentifier mytransfer
$transfer = Start-BitsTransfer -Source $url -dest d:\temp\yourfile.zip -async
$timer.Enabled = 1
# after that
Unregister-Event -SourceIdentifier mytransfer
$timer.Dispose()
The key parameter is -Asynchronous (abbreviated to -async above). It starts the transfer in the background. I haven't found any event triggered by the transfer itself, so I poll the job via a Timers.Timer object (every 300 ms here) to report the state.
However, with this solution you need to unregister the event and dispose of the timer. Some time ago I had problems with unregistering inside the script block passed as -Action (it could otherwise go in the if branch), so I unregister the event in a separate command.
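Also note that with -Asynchronous the downloaded file is not finalized automatically: the BITS job stays around in the Transferred state and the file keeps a temporary name until the job is completed. So once the transfer has finished you will probably also want something like:

# finalize the BITS job so the temporary file is renamed to the real file name
if ($transfer.JobState -eq 'Transferred') {
    Complete-BitsTransfer -BitsJob $transfer
}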
I think @oising (x0n) has a solution for this on his blog. Hopefully he'll post it here, and that would be the answer to your question.
Good afternoon,
I've been working on registering an event that fires when all jobs are completed. I'm able to successfully register one, but I'd like to get a message pop-up once all background jobs are completed. Is anyone familiar with how to do so?
I attempted the following, but it errors out saying $jobs is null:
1..10 | ForEach-Object -Process {
    Start-Job { Start-Sleep $_ } -Name "$_" | Out-Null} -OutVariable $jobs
Register-ObjectEvent $jobs StateChanged -Action {
    [System.Windows.MessageBox]::Show('Done')
    $eventSubscriber | Unregister-Event
    $eventSubscriber.Action | Remove-Job
} | Out-Null
I feel like a Do{}Until() loop could do it, but I'm not sure how to register that so it keeps checking until the jobs have completed. I also tried to follow along with how other people have done it in different languages, but I can't pick it up.
I don't want to post everything I've tried so this post doesn't bore anyone. I searched on Google as well, but I couldn't find much on registering an event for multiple jobs.
EDIT
Here's what does work:
$job = Start-Job -Name GetLogFiles { Start-Sleep 10 }
Register-ObjectEvent $job StateChanged -Action {
    [System.Windows.MessageBox]::Show('Done')
    $eventSubscriber | Unregister-Event
    $eventSubscriber.Action | Remove-Job
} | Out-Null
This is what I'd like to happen, but evaluating all jobs, not just one.
This is what I personally use when monitoring running jobs:
$jobs = 1..10 | ForEach-Object -Process {
    Start-Job { Start-Sleep $using:_ ; "job {0} done" -f $using:_ } -Name "$_"
}
do {
    $i = (Get-Job -State Completed).Count
    $progress = @{
        Activity        = 'Jobs completed'
        Status          = "$i of {0}" -f $jobs.Count
        PercentComplete = $i / $jobs.Count * 100
    }
    Write-Progress @progress
    Start-Sleep -Milliseconds 10
} until ($i -eq $jobs.Count)
$result = Get-Job | Receive-Job
$jobs | Remove-Job
Of course, in scenarios where I know some jobs might fail, I change the until(...) condition to something different, and the do {...} block contains the logic for restarting failed jobs, along the lines of the sketch below.
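A rough sketch of that variation (assuming, as in the example above, that the job name carries enough information to re-create the work):

# inside the do {} loop: re-queue anything that failed
Get-Job -State Failed | ForEach-Object {
    $n = [int]$_.Name
    $_ | Remove-Job
    Start-Job -Name "$n" { Start-Sleep $using:n; "job {0} done" -f $using:n }
}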
Edit 1:
It's worth mentioning that Start-Job is not worth your time if you're interested in multithreading; it has been proven to be slower than a linear loop in many scenarios. You should be looking at the ThreadJob module instead.
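For example, something like this (a quick sketch; the ThreadJob module ships with PowerShell 7 and can be installed from the PowerShell Gallery on Windows PowerShell 5.1):

# same pattern as above, but the jobs run on in-process threads instead of child processes
$jobs = 1..10 | ForEach-Object -Process {
    Start-ThreadJob -Name "$_" -ScriptBlock { param($n) Start-Sleep $n; "job $n done" } -ArgumentList $_
}
$jobs | Wait-Job | Receive-Job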
Edit 2:
After some testing, this worked for me:
# Clear the Event queue
Get-EventSubscriber | Unregister-Event
# Clear the Job queue
Get-Job | Remove-Job
1..10 | ForEach-Object -Process {
    $job = Start-Job { Start-Sleep -Seconds (1..20 | Get-Random) } -Name "$_"
    Register-ObjectEvent -InputObject $job -EventName StateChanged -Action {
        $eventSubscriber | Unregister-Event
        $eventSubscriber.Action | Remove-Job
        if (-not (Get-EventSubscriber)) {
            [System.Windows.MessageBox]::Show('Done')
        }
    } | Out-Null
}
At first I didn't even know this was possible so thanks for pointing this out. Great question :)
I'm actually doing some automation for my ADF (Azure Data Factory). As part of that, I'm trying to delete all the ADF V2 pipelines. The problem is that my pipelines have many references to other pipelines.
$ADFPipeline = Get-AzDataFactoryV2Pipeline -DataFactoryName $(datafactory-name) -ResourceGroupName $(rg)
$ADFPipeline | ForEach-Object { Remove-AzDataFactoryV2Pipeline -ResourceGroupName $(rg) -DataFactoryName $(datafactory-name) -Name $_.name -Force }
And most of the time I get an error like
The document cannot be deleted since it is referenced by "blabla"
I understand the error: it is saying there are references, so the pipeline cannot be deleted. However, when I tried the same deletion in the Azure portal, I was able to delete the pipeline regardless of the references. So I want to find out whether it is possible to tell PowerShell to delete a pipeline forcefully even though it has references.
Any other input is much appreciated!
I ran into the same issue and found out that it's rather complicated to build the whole dependency graph out of the pipelines' Activities property.
As a working solution (PowerShell):
function Remove-Pipelines {
    param (
        [Parameter(Mandatory=$true)]
        [AllowEmptyCollection()]
        [AllowNull()]
        [System.Collections.ArrayList]$pipelines
    )
    if ($pipelines.Count -gt 0) {
        [System.Collections.ArrayList]$plsToProcess = New-Object System.Collections.ArrayList($null)
        foreach ($pipeline in $pipelines) {
            try {
                $removeAzDFCommand = "Remove-AzDataFactoryV2Pipeline -dataFactoryName '$DataFactoryName' -resourceGroupName '$ResourceGroupName' -Name '$($pipeline.Name)' -Force -ErrorAction Stop"
                Write-Host $removeAzDFCommand
                Invoke-Expression $removeAzDFCommand
            }
            catch {
                if ($_ -match '.*The document cannot be deleted since it is referenced by.*') {
                    Write-Host $_
                    # keep the pipeline for the next pass; [void] suppresses the index Add() returns
                    [void]$plsToProcess.Add($pipeline)
                } else {
                    throw $_
                }
            }
        }
        # recurse over the pipelines that could not be deleted yet because of references
        Remove-Pipelines $plsToProcess
    }
}
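It can be called, for example, like this (assuming $ResourceGroupName and $DataFactoryName are already defined in the enclosing script, since the function reads them from the outer scope):

# gather all pipelines of the factory and hand them to the recursive function
$pipelines = Get-AzDataFactoryV2Pipeline -ResourceGroupName $ResourceGroupName -DataFactoryName $DataFactoryName
Remove-Pipelines ([System.Collections.ArrayList]@($pipelines))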
Here is the complete solution for clearing the whole DF: "trigger","pipeline","dataflow","dataset","linkedService"
Param(
    [Parameter(Mandatory=$true)][string] $ResourceGroupName,
    [Parameter(Mandatory=$true)][string] $DataFactoryName
)
$artfTypes = "trigger","pipeline","dataflow","dataset","linkedService"
function Remove-Artifacts {
    param (
        [Parameter(Mandatory=$true)][AllowEmptyCollection()][AllowNull()][System.Collections.ArrayList]$artifacts,
        [Parameter(Mandatory=$true)][string]$artfType
    )
    if ($artifacts.Count -gt 0) {
        [System.Collections.ArrayList]$artToProcess = New-Object System.Collections.ArrayList($null)
        foreach ($artifact in $artifacts) {
            try {
                $removeAzDFCommand = "Remove-AzDataFactoryV2$($artfType) -dataFactoryName '$DataFactoryName' -resourceGroupName '$ResourceGroupName' -Name '$($artifact.Name)' -Force -ErrorAction Stop"
                Write-Host $removeAzDFCommand
                Invoke-Expression $removeAzDFCommand
            }
            catch {
                if ($_ -match '.*The document cannot be deleted since it is referenced by.*') {
                    Write-Host $_
                    [void]$artToProcess.Add($artifact)
                } else {
                    throw $_
                }
            }
        }
        Remove-Artifacts $artToProcess $artfType
    }
}
foreach ($artfType in $artfTypes) {
    $getAzDFCommand = "Get-AzDataFactoryV2$($artfType) -dataFactoryName '$DataFactoryName' -resourceGroupName '$ResourceGroupName'"
    Write-Output $getAzDFCommand
    $artifacts = Invoke-Expression $getAzDFCommand
    Write-Output $artifacts.Name
    Remove-Artifacts $artifacts $artfType
}
The same approach can be adapted for the "Set-AzDataFactoryV2..." commands as well.
It is worth mentioning that, along with the dependency tracking, the Remove/Set sequence of artifact types has to be right (because of cross-artifact dependencies).
For Set - "linkedService","dataset","dataflow","pipeline","trigger"
For Remove - "trigger","pipeline","dataflow","dataset","linkedService"
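For example, one convenient way to keep both orders in sync from a single list (assuming the same $artfTypes as above):

$removeOrder = "trigger","pipeline","dataflow","dataset","linkedService"
$setOrder    = $removeOrder[($removeOrder.Count - 1)..0]   # reversed: linkedService ... trigger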
Hello and thank you for the question. According to the Remove-AzDataFactoryV2Pipeline doc, the -Force flag simply skips the confirmation prompt. It does not actually 'Force' the deletion in spite of errors.
Since you are already doing automation, might I suggest leveraging the error message to recursively attempt to delete the referencing pipeline. $error[0] gets the most recent error.
(Pseudocode)
try_recurse_delete( pipeline_name )
    do_delete( pipeline_name )
    if not $error[0].contains("referenced by " + pipeline_name)
        then return true
    else
        try_recurse_delete( get_referencer_name($error[0]) )
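In PowerShell that could look roughly like this (an untested sketch: $rg, $df and the regex that pulls the referencing pipeline name out of the error text are assumptions you would adapt to the real message format):

function Remove-PipelineRecursive {
    param([string]$Name)
    try {
        Remove-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $df -Name $Name -Force -ErrorAction Stop
    }
    catch {
        # assumed message shape: "... cannot be deleted since it is referenced by <other-pipeline> ..."
        if ($_.Exception.Message -match 'referenced by\s+(\S+)') {
            Remove-PipelineRecursive -Name $Matches[1]   # delete the referencer first
            Remove-PipelineRecursive -Name $Name         # then retry this one
        }
        else {
            throw
        }
    }
}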
Given that pipeline dependencies can be a many-to-many relationship, subsequent pipelines in your for-each loop might already be deleted by the recursion. You will have to adapt your code to react to 'pipeline not found' type errors.
I have many failed operations in my Azure Logic App.
I see that if you click on a single operation in the Azure portal you can resubmit (re-run) it:
Is it possible to select ALL of these failed operations and re-run them all together?
Thanks a lot guys
If you want to resubmit one or more logic app runs that failed, succeeded, or are still running, you could bulk resubmit Logic Apps from the Runs Dashboard.
For how to use this feature, you could refer to this article: Monitor logic apps with Azure Monitor logs. Under the tile View logic app run information, you can find the Resubmit description.
As an alternative to bulk resubmitting Logic Apps from the Runs Dashboard, you can use PowerShell commands. Take a look at the script below, which automates listing failed logic app runs, identifies the triggers and actions responsible, and restarts the apps for a given resource group name. You can change some of those bits as per your needs (for example, skip the interactive prompt and just restart the apps); I have written it this way for understanding.
It uses the Get-AzLogicApp, Get-AzLogicAppRunHistory, Get-AzLogicAppRunAction, Get-AzLogicAppTrigger and Start-AzLogicApp cmdlets.
Script using Az PowerShell 6.2, module Az.LogicApp. Copy the code below into a file, say restart.ps1, and run it. Make sure you assign $rg your actual resource group name.
$rg = "MyResourceGrp"
#get logic apps
$logicapps = Get-AzLogicApp -ResourceGroupName $rg
Write-Host "logicapps:" -ForegroundColor "Yellow"
write-output $logicapps.Name
#list all logicapp runs failed
$failedruns = #(foreach($name in $logicapps.Name){
Get-AzLogicAppRunHistory -ResourceGroupName $rg -Name $name | Where {$_.Status -eq 'Failed'}
})
Write-Host "failedruns:" -ForegroundColor "Yellow"
Write-Output $failedruns.Name | select -Unique
Write-Host "failedruns: LogicAppNames" -ForegroundColor "Yellow"
Write-Output $failedruns.Workflow.Name | select -Unique
#list all logicappRunsActions failed
foreach($i in $logicapps){
foreach($y in $failedruns){
if ($i.Version -eq $y.Workflow.Name) {
$resultsB = Get-AzLogicAppRunAction -ResourceGroupName $rg -Name $i.Name -RunName $y.Name -FollowNextPageLink | Where {$_.Status -eq 'Failed'}
}
}
}
foreach($item in $resultsB){
Write-Host "Action:" $item.Name " " -NoNewline "Time:" $item.EndTime
Write-Output " "
}
#get logicapp triggers
foreach($ii in $logicapps){
foreach($yy in $failedruns){
if ($ii.Version -eq $yy.Workflow.Name) {
$triggers = Get-AzLogicAppTrigger -ResourceGroupName $rg -Name $ii.Name
}
}
}
Write-Host "triggers:" -ForegroundColor "Yellow"
Write-Output $triggers.Name
#start logic apps with triggers
Write-Host "Starting logic apps....." -ForegroundColor "green"
foreach($p in $logicapps){
foreach($tri in $triggers){
if ($p.Version -eq $triggers.Workflow.Name) {
Start-AzLogicApp -ResourceGroupName $rg -Name $p.Name -TriggerName $tri.Name
}
}
}
$verify = Read-Host "Verify ruuning app? y or n"
if ($verify -eq 'y') {
$running = #(foreach($name2 in $logicapps.Name){
Get-AzLogicAppRunHistory -ResourceGroupName $rg -Name $name2 | Where {$_.Status -eq 'Running' -or $_.Status -eq 'Waiting'}
})
Write-Host $running
}
else {
Write-Host "Bye!"
}
Although my Logic App failed again, you can see it was triggered in time by the script.
Note: if your logic app trigger expects inputs or actions (unlike a recurrence or scheduled trigger), please edit or make changes accordingly for the Start-AzLogicApp command to execute successfully.
Here I am assuming all logic apps are enabled; use the -State Enabled parameter on the Get-AzLogicApp command if you want to run this only on currently enabled apps.
Example: Get-AzLogicApp -ResourceGroupName "rg" | where {$_.State -eq 'Enabled'}
You can also try the advanced settings for triggers in the workflow, such as the retry policy.
You can configure it to retry at custom intervals in case of failures due to intermittent issues.
You can also submit feedback or upvote similar existing feedback: ability-to-continue-from-a-particular-point-as-in
Refer: help topics for the Logic Apps cmdlets
I want to create a queue of ffmpeg commands and then let N threads consume the queue, each launching an instance of ffmpeg with the dequeued parameters.
Here is some code:
#looping on $items to prepare the params
foreach ($input in $items) {
    {...}
    #adding a param in queue
    $global:jobsQueue.Enqueue($ffmpegParam)
}

#block code to be executed in different thread
$block = {
    Param($queue, $ffmpegDir)
    while ($true) {
        if ($queue.TryDequeue($params)) {
            $pinfo = New-Object System.Diagnostics.ProcessStartInfo($ffmpegDir, $params)
            $pinfoMap.$input.UseShellExecute = $false
            $pinfoMap.$input.CreateNoWindow = $true
            $p = New-Object System.Diagnostics.Process
            $p.StartInfo = $pinfo
            $p.Start()
            $p.WaitForExit()
        } else {
            break
        }
    }
}

#miserably failing to start the previous block code
for ($i = 0; $i -lt 1; $i++) {
    Start-Job -Name "process $i" -ScriptBlock $block -ArgumentList $global:jobsQueue, $ffmpegDir
}
A job is actually started, but it does nothing and I don't get why. I read these pages but I wasn't able to come up with a solution:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/start-job?view=powershell-5.1
How do I Start a job of a function i just defined?
https://social.technet.microsoft.com/Forums/scriptcenter/en-US/b68c1c68-e0f0-47b7-ba9f-749d06621a2c/calling-a-function-using-startjob
I have three components in my PowerShell script, and I'm using a runspace for each: core, GUI and timer.
Core executes some actions, GUI shows whether an action in core is completed, and the timer is needed for one action in core.
I'm not sure it's a clean/good solution, because I need to write some information with Write-Host and exchange some info between these runspaces. If it isn't, could you tell me a better way?
Function StartRunSpaceForGui {
    #-RunSpace for GUI---------------
    $global:RunSpaceForGui=[runspacefactory]::CreateRunspace()
    $global:RunSpaceForGui.ApartmentState="STA"
    $global:RunSpaceForGui.ThreadOptions="ReuseThread"
    $global:RunSpaceForGui.Open()
    $global:HashForGui=[hashtable]::Synchronized(@{})
    $global:RunSpaceForGui.SessionStateProxy.SetVariable("HashForGui",$HashForGui)
    $global:ScriptForGui=[PowerShell]::Create().AddScript("$global:ProductPath\PS\Gui.ps1")
    $global:ScriptForGui.RunSpace=$RunSpaceForGui
    $global:HandleForGui=$ScriptForGui.BeginInvoke()
}

Function StartRunSpaceForEngine {
    #-RunSpace for Engine---------------------#
    $global:RunSpaceForEngine=[runspacefactory]::CreateRunspace()
    $global:RunSpaceForEngine.Open()
    $global:ScriptForEngine=[PowerShell]::Create().AddScript("$global:ProductPath\PS\Engine.ps1")
    $global:ScriptForEngine.RunSpace=$RunSpaceForEngine
    $global:HandleForEngine=$ScriptForEngine.BeginInvoke()
}

Function StartRunSpaceForTimer {
    #-RunSpace for timer
    $global:RunSpaceForTimer=[runspacefactory]::CreateRunspace()
    $global:RunSpaceForTimer.Open()
    $global:ScriptForTimer=[PowerShell]::Create().AddScript("$global:ProductPath\PS\Timer.ps1")
    $global:ScriptForTimer.RunSpace=$RunSpaceForTimer
    $global:HandleForTimer=$ScriptForTimer.BeginInvoke()
}
EDIT :
GUI thread :
Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Drawing
#--Main Form Creation.
$HashForGui.MainForm=New-Object System.Windows.Forms.Form
$HashForGui.MainForm.Size=New-Object System.Drawing.Size($Width,$Height)
$HashForGui.MainForm.StartPosition="CenterScreen"
$HashForGui.MainForm.FormBorderStyle=[System.Windows.Forms.FormBorderStyle]::FixedDialog
$HashForGui.MainForm.MaximizeBox=$False
$HashForGui.MainForm.MinimizeBox=$False
$HashForGui.MainForm.ShowInTaskbar=$True
$HashForGui.MainForm.WindowState="Normal"
$HashForGui.MainForm.Text=$ProductName
$Icon=New-Object system.drawing.icon($ProductIconFilePath)
$HashForGui.MainForm.Icon=$Icon
#---WebBrowser Creation.
$HashForGui.MainWebBrowser=New-Object System.Windows.Forms.WebBrowser
$HashForGui.MainWebBrowser.Size=New-Object System.Drawing.Size($Width,$Height)
$HashForGui.MainWebBrowser.Location=New-Object System.Drawing.Size(0,0)
$HashForGui.MainWebBrowser.ScrollBarsEnabled=$True
$HashForGui.MainForm.Controls.Add($HashForGui.MainWebBrowser)
#---WebBrowser info section load.
$HashForGui.MainWebBrowser.DocumentText=get-content $HtmlLoadingFilePath
$HashForGui.MainForm.ShowDialog()
exit
Timer thread
Remove-ItemProperty -Path "Registry::$ProductRegistryPath" -Name "Time Out" -Force -ErrorAction SilentlyContinue
[Int32]$Timer=0
do {
    Start-Sleep -s 1
    $Timer=$Timer+1
    if ($Debug -eq 1) {Write-Host -Foreground Blue "Timer : $Timer/$MaxTimer"}
} until ($Timer -ge $MaxTimer)
New-ItemProperty -Path "Registry::$ProductRegistryPath" -Name "Time Out" -PropertyType String -value "1" -Force
exit