PowerShell script stops working when run through Task Scheduler - Excel

I am running the script below as a scheduled task with the user logged on to the server. It converts an .xls file to .csv using the Excel.Application COM object. The conversion works, but eventually it breaks and I don't know why.
I have the task run the following command, which should in theory allow it to run constantly:
powershell.exe -noexit -file "filename.ps1"
Any thoughts on what to try?
$server = "\\server"
$xls = "\path\XLS\"
$csv = "\path\CSV\"
$folder = $server + $xls
$destination = $server + $csv
$filter = "*.xls" # <-- set this according to your requirements
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
    IncludeSubdirectories = $true # <-- set this according to your requirements
    NotifyFilter = [IO.NotifyFilters]"FileName, LastWrite"
}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    $path = $Event.SourceEventArgs.FullPath
    $name = $Event.SourceEventArgs.Name
    $changeType = $Event.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated
    $excelFile = $folder + $name
    $E = New-Object -ComObject Excel.Application
    $E.Visible = $false
    $E.DisplayAlerts = $false
    $wb = $E.Workbooks.Open($excelFile)
    foreach ($ws in $wb.Worksheets) {
        $n = "output_" + ($name -replace '\.xls$')
        $ws.SaveAs($destination + $n + ".csv", 6)
    }
    $E.Quit()
}

I was doing something similar with Word. I couldn't use Quit alone; I think Quit on its own just hides Excel. Try releasing the COM object as well:
$E.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($E)
Remove-Variable E
I don't know whether you are opening a workbook inside the Excel application, but if you are, you can also close it with:
$wb.close($false)
Let me know if it works...
Ref: https://technet.microsoft.com/en-us/library/ff730962.aspx?f=255&MSPPError=-2147217396
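For reference, here is a minimal sketch of the event action with all of the above combined: close the workbook without saving, Quit, release the COM object, then force a garbage collection. The variable names ($folder, $destination) follow the original script, and the try/finally arrangement is a suggestion rather than tested code:
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    $name = $Event.SourceEventArgs.Name
    $E = New-Object -ComObject Excel.Application
    $E.Visible = $false
    $E.DisplayAlerts = $false
    try {
        $wb = $E.Workbooks.Open($folder + $name)
        foreach ($ws in $wb.Worksheets) {
            # 6 = xlCSV
            $ws.SaveAs($destination + "output_" + ($name -replace '\.xls$') + ".csv", 6)
        }
    }
    finally {
        # close the workbook without saving, quit Excel, then drop the COM reference
        if ($wb) { $wb.Close($false) }
        $E.Quit()
        [System.Runtime.InteropServices.Marshal]::ReleaseComObject($E) | Out-Null
        Remove-Variable E
        [GC]::Collect()
        [GC]::WaitForPendingFinalizers()
    }
}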

Related

PowerShell, close Excel

I have the following code and it works perfectly, except it's not closing Excel properly; it leaves an Excel process running.
Is there a way to close Excel properly without killing the process?
Since I'm using other Excel files while running this script, I cannot kill all active Excel processes.
I think I've tried everything I found online.
$WorkDir = "D:\Test\QR_ES\RG_Temp"
$BGDir = "D:\Test\QR_ES\3_BG"
$File = "D:\Test\QR_ES\4_Adr_Excel\KD_eMail.xlsx"
$SentDir = "D:\Test\QR_ES\RG_Temp\Sent\Dunning"
chdir $WorkDir
$firstPageList = Get-ChildItem "$WorkDir\1*.pdf" -File -Name
ForEach ($firstPage in $firstPageList)
{
    $secondPage = "$BGDir\BG_RG.pdf"
    $output = "Dunn-$firstPage"
    invoke-command {pdftk $firstPage background $secondPage output $output}
}
del 1*.pdf
gci $WorkDir\Dunn-*.pdf | rename-item -newname {$_.Name.Substring(5)} -Force
$Excel = New-Object -ComObject Excel.Application
$Excel.visible = $false
$Workbook = $Excel.workbooks.open($file)
$DunnList = Get-ChildItem "$WorkDir\1*.pdf" -File -Name
ForEach ($Dunn in $DunnList)
{
$Worksheets = $Workbook.worksheets
$Worksheet = $Workbook.Worksheets.Item("KD_eMail")
$Range = $Worksheet.Range("A1").EntireColumn
$DunnSearch = $Dunn.Substring(0,5)
$SearchString = $DunnSearch
$Search = $Range.find($SearchString)
$Recipient = $Worksheet.Cells.Item($Search.Row, $Search.Column + 1)
$Msg = "<span style='font-family:Calibri;font-size:12pt;'>Test</span>"
$Outlook = New-Object -ComObject Outlook.Application
$namespace = $Outlook.GetNameSpace("MAPI")
$namespace.Logon($null, $null, $false, $true)
$EmailFrom = ('test@test.com')
$account = $outlook.Session.Accounts.Item($EmailFrom)
$Mail = $Outlook.CreateItem(0)
$Mail.HTMLBody = $Msg
$Mail.Subject = "OP - $SearchString"
$Mail.To = $Recipient
function Invoke-SetProperty {
param(
[__ComObject] $Object,
[String] $Property,
$Value
)
[Void] $Object.GetType().InvokeMember($Property,"SetProperty",$NULL,$Object,$Value)
}
Invoke-SetProperty -Object $mail -Property "SendUsingAccount" -Value $account
$Mail.Attachments.Add("$WorkDir\$Dunn")
$Mail.Save()
$Mail.close(1)
$Mail.Send()}}
$workbook.close($false)
$Excel.Quit()
chdir $WorkDir
del 1*.pdf
See this post:
https://stackoverflow.com/a/35955339/5329137
which is not the accepted answer, but I believe it is the full, correct way to close Excel.
This is what did it for me:
$FilePID = (Get-Process -name Excel | Where-Object { $_.MainWindowTitle -like 'FileName.xlsx*' }).Id
$Workbook.Save()
$Workbook.close($false)
Stop-Process $FilePID
Elaborating on @ASD's answer: since the MainWindowTitle doesn't (always) include the file suffix (.xlsx), you may have to strip it from the filename before comparing. I'm using -replace with a regex that drops everything from the last dot onward.
$excelPID = (Get-Process -name Excel | Where-Object { $_.MainWindowTitle -eq ($fileName -replace '\.[^.]*$', '') }).Id
$workbook.Close()
Stop-Process $excelPID

Excel spawned from PowerShell script not quitting [duplicate]

This question already has answers here:
Excel, save and close after run
(6 answers)
Closed 4 years ago.
We're migrating databases, so I'm using PowerShell to modify the hundreds of Excel files that reference the old DB instance so that they point to the new one. This all works as intended. The problem is that the Excel application will not exit when I'm done; the process just hangs around as a background process and I need to go into Task Manager to kill it. Not a huge deal, but it is annoying. Here is my script. I'm on PowerShell v5 and Office 2016.
param(
[string]$search_root=$(throw "missing search root parameter"),
[boolean]$test=$true
)
echo $search_root
$NAMEPOSTFIX = '-Updated'
$OLDCONN = 'Data Source=abc;'
$NEWCONN = 'Data Source=xyz;'
$filelist = Get-ChildItem -Path $search_root *.xls* -Recurse -Exclude '*Updated.*'
$Excel = New-Object -Com Excel.Application
$Excel.DisplayAlerts = $False
function update_con_xls {
    param($file)
    $Workbook = $Excel.Workbooks.Open($file)
    foreach ($con in $Workbook.Connections) {
        if ($con.OLEDBConnection -ne $null) {
            $con.OLEDBConnection.Connection = $con.OLEDBConnection.Connection.Replace($OLDCONN,$NEWCONN)
        }
        if ($con.ODBCConnection -ne $null) {
            $con.ODBCConnection.Connection = $con.ODBCConnection.Connection.Replace($OLDCONN,$NEWCONN)
        }
    }
    $Workbook.Save()
    $Workbook.saved = $true
    $Excel.Workbooks.Close()
}
foreach ($file in $filelist) {
    echo $file
    if ($file.Extension -eq '.xls' -or $file.Extension -eq '.xlsx') {
        $newfile = ($file.DirectoryName + '\' + $file.BaseName + $NAMEPOSTFIX + $file.Extension)
        if ($test) {
            echo $test
            $newfile = ('C:\test\' + $file.Name) # for test runs, copy locally
        }
        Copy-Item $file.FullName -Destination $newfile
        update_con_xls($newfile)
    }
}
$Excel.Quit()
$Excel = $null
Maybe change:
$Excel.Workbooks.Close()
to:
$Excel= New-Object -ComObject Excel.Application;
$Workbook = $Excel.Workbooks.Open($file);
$Workbook.Save();
$Workbook.Close(); # <--- try
$Excel.Quit();
Remove-Variable -Name Excel;
I also do not see anything here that closes your OLEDB and ODBC connections. Try adding:
$con.OLEDBConnection.Connection.Close();
$con.ODBCConnection.Connection.Close();
or
$con.OLEDBConnection.Close();
$con.ODBCConnection.Close();
after your work is finished.
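For clarity, here is the question's function with the workbook-level Close applied; just a sketch, untested (the connection Close() suggestions above are left out, since it is not clear which of them applies):
function update_con_xls {
    param($file)
    $Workbook = $Excel.Workbooks.Open($file)
    foreach ($con in $Workbook.Connections) {
        if ($con.OLEDBConnection -ne $null) {
            $con.OLEDBConnection.Connection = $con.OLEDBConnection.Connection.Replace($OLDCONN,$NEWCONN)
        }
        if ($con.ODBCConnection -ne $null) {
            $con.ODBCConnection.Connection = $con.ODBCConnection.Connection.Replace($OLDCONN,$NEWCONN)
        }
    }
    $Workbook.Save()
    # close this workbook rather than calling Close() on the whole Workbooks collection
    $Workbook.Close()
}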

PowerShell script using Excel running slow

So I have this script that I wrote on my laptop, and it works just fine; its job is to combine two .csv files into one .xls file.
Running the script with two .csv files containing a couple of thousand rows takes a few seconds at most.
But when I try to run it on the server where it should live, it takes... hours. I haven't done a full run, but writing one line to the .xls file takes maybe 2-3 seconds.
So what I'm wondering is: what is causing the huge increase in runtime? I'm monitoring the CPU load while the script is running, and it's at 50-60%.
The server has plenty of RAM and two CPU cores.
How can I speed this up?
The script looks like this:
$path = "C:\test\*"
$path2 = "C:\test"
$date = Get-Date -Format d
$csvs = Get-ChildItem $path -Include *.csv | Sort-Object LastAccessTime -Descending | Select-Object -First 2
$y = $csvs.Count
Write-Host "Detected the following CSV files: ($y)"
foreach ($csv in $csvs) {
    Write-Host " "$csv.Name
}
$outputfilename = "regSCI " + $date
Write-Host Creating: $outputfilename
$excelapp = New-Object -ComObject Excel.Application
$excelapp.sheetsInNewWorkbook = $csvs.Count
$xlsx = $excelapp.Workbooks.Add()
$sheet = 1
$xlleft = -4131
foreach ($csv in $csvs) {
    $row = 1
    $column = 1
    $worksheet = $xlsx.Worksheets.Item($sheet)
    $worksheet.Name = $csv.Name
    $worksheet.Rows.HorizontalAlignment = $xlleft
    $file = (Get-Content $csv)
    Write-Host Worksheet created: $worksheet.Name
    foreach ($line in $file) {
        Write-Host Writing Line
        $linecontents = $line -split ',(?!\s*\w+")'
        foreach ($cell in $linecontents) {
            Write-Host Writing Cell
            $cell1 = $cell.Trim('"')
            $worksheet.Cells.Item($row, $column) = $cell1
            $column++
        }
        $column = 1
        $row++
        $WorkSheet.UsedRange.Columns.Autofit() | Out-Null
    }
    $sheet++
    $headerRange = $worksheet.Range("a1", "q1")
    $headerRange.AutoFilter() | Out-Null
}
$output = $path2 + "\" + $outputfilename
Write-Host $output
$xlsx.SaveAs($output)
$excelapp.Quit()
To speed up your existing code, add these lines just after creating the Excel object:
$excelapp.ScreenUpdating = $false
$excelapp.DisplayStatusBar = $false
$excelapp.EnableEvents = $false
$excelapp.Visible = $false
And these just before SaveAs:
$excelapp.ScreenUpdating = $true
$excelapp.DisplayStatusBar = $true
$excelapp.EnableEvents = $true
This stops Excel from rendering the worksheet in real time and firing events every time you change the contents. DisplayStatusBar and ScreenUpdating most probably don't matter once the application is invisible, but I included them just in case.
Also, you're running Autofit() after every line, which certainly doesn't help performance.
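For example, a sketch of the inner loop with Autofit moved out so it runs once per worksheet instead of once per line (untested; the per-line and per-cell Write-Host calls are also dropped here, since console output inside a tight loop is slow as well):
foreach ($line in $file) {
    $linecontents = $line -split ',(?!\s*\w+")'
    foreach ($cell in $linecontents) {
        $worksheet.Cells.Item($row, $column) = $cell.Trim('"')
        $column++
    }
    $column = 1
    $row++
}
# autofit once per worksheet, after all cells are written
$worksheet.UsedRange.Columns.Autofit() | Out-Null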

PowerShell grinds to a halt part way through processing

I have the following PowerShell (4.0) script. It opens an Excel (2013) workbook, gets a URL from a cell, and downloads that web page to a .htm file. It iterates through each row in the worksheet (~3000 rows).
The problem I am having is that the first ~500 files are done in about 3 minutes, but then it slows down considerably, to the point where only one file is created every 2-3 minutes. I have checked my available RAM and CPU usage and both are fine (RAM 2.93 GB used of 8 GB, CPU at 35%).
Is there anything I can do to get around this and speed it up?
cls
$output_folder = "c:\temp"
$OUTPUT_FILENAME=""
# comment following line to add a timestamp to each file gets created
if($OUTPUT_FILENAME.length -eq 0) {$OUTPUT_FILENAME=(get-date).tostring().replace(" ","").replace("/","").replace(":","")}
$filepath = "C:\Temp\MeteringHistory\Meters for Maximo Upload"
$xl = New-Object -COM "Excel.Application"
$xl.Visible = $false
$wb = $xl.Workbooks.Open($filepath)
$ws = $wb.Worksheets.Item("WOhistory")
$maxRow = ($ws.UsedRange.rows).count
$minRow = 1
for (; $minRow -le $maxRow; $minRow++)
{
    $website = $ws.cells.item($minRow, 1).text
    $fileName = $ws.cells.item($minRow, 5).text + " - " + $ws.cells.item($minRow, 4).text
    $fileName = $fileName -replace '/', '_'
    $wc = new-object system.net.webclient
    $wc.UseDefaultCredentials = $true
    $wc.downloadfile($website, "c:\temp\MeteringHistory\Files\$filename.htm")
    $wc.Dispose()
}
I would export the Excel sheet to CSV first and ditch Excel afterwards; you can even do that from PowerShell itself. Then use Invoke-WebRequest instead of all that WebClient boilerplate and be done with it. It can be done in only a couple of lines of code, and it reduces the problem to plain PowerShell; a rough sketch follows.
To speed the process up further, you could then run the downloads in parallel via background jobs.
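Something along these lines; this is only a sketch and assumes the WOhistory sheet has been exported to a CSV whose columns (Url, Id, Name here) correspond to the cells used in the original script:
# hypothetical CSV export of the WOhistory sheet; adjust the path and column names
$rows = Import-Csv "C:\Temp\MeteringHistory\WOhistory.csv"
foreach ($row in $rows) {
    $fileName = "$($row.Id) - $($row.Name)" -replace '/', '_'
    Invoke-WebRequest -Uri $row.Url -UseDefaultCredentials -OutFile "c:\temp\MeteringHistory\Files\$fileName.htm"
}
# the loop body could also be wrapped in Start-Job to run downloads in parallel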
Hard to know what's wrong without an error. I would try one thing, though: avoid creating and disposing hundreds of WebClient objects when you can reuse the first one. Try:
cls
$output_folder = "c:\temp"
$OUTPUT_FILENAME=""
# comment following line to add a timestamp to each file gets created
if($OUTPUT_FILENAME.length -eq 0) {$OUTPUT_FILENAME=(get-date).tostring().replace(" ","").replace("/","").replace(":","")}
$filepath = "C:\Temp\MeteringHistory\Meters for Maximo Upload"
$xl = New-Object -COM "Excel.Application"
$xl.Visible = $false
$wb = $xl.Workbooks.Open($filepath)
$ws = $wb.Worksheets.Item("WOhistory")
$maxRow = ($ws.UsedRange.rows).count
$minRow = 1
$wc=new-object system.net.webclient
$wc.UseDefaultCredentials = $true
for (; $minRow -le $maxRow; $minRow++) {
    $website = $ws.cells.item($minRow, 1).text
    $fileName = $ws.cells.item($minRow, 5).text + " - " + $ws.cells.item($minRow, 4).text
    $fileName = $fileName -replace '/', '_'
    $wc.DownloadFile($website, "c:\temp\MeteringHistory\Files\$filename.htm")
}
$wc.Dispose()

Excel, save and close after run

How can I save the workbook created by the script below after it has run?
Script is from: Powershell Disk Usage Report
$erroractionpreference = "SilentlyContinue"
$a = New-Object -comobject Excel.Application
$a.visible = $True
$b = $a.Workbooks.Add()
$c = $b.Worksheets.Item(1)
$c.Cells.Item(1,1) = "Server Name"
$c.Cells.Item(1,2) = "Drive"
$c.Cells.Item(1,3) = "Total Size (GB)"
$c.Cells.Item(1,4) = "Free Space (GB)"
$c.Cells.Item(1,5) = "Free Space (%)"
$d = $c.UsedRange
$d.Interior.ColorIndex = 19
$d.Font.ColorIndex = 11
$d.Font.Bold = $True
$intRow = 2
$colComputers = get-content "c:\servers.txt"
foreach ($strComputer in $colComputers)
{
    $colDisks = get-wmiobject Win32_LogicalDisk -computername $strComputer -Filter "DriveType = 3"
    foreach ($objdisk in $colDisks)
    {
        $c.Cells.Item($intRow, 1) = $strComputer.ToUpper()
        $c.Cells.Item($intRow, 2) = $objDisk.DeviceID
        $c.Cells.Item($intRow, 3) = "{0:N0}" -f ($objDisk.Size/1GB)
        $c.Cells.Item($intRow, 4) = "{0:N0}" -f ($objDisk.FreeSpace/1GB)
        $c.Cells.Item($intRow, 5) = "{0:P0}" -f ([double]$objDisk.FreeSpace/[double]$objDisk.Size)
        $intRow = $intRow + 1
    }
}
According to https://social.technet.microsoft.com/Forums/windowsserver/en-US/919459dc-3bce-4242-bf6b-fdf37de9ae18/powershell-will-not-save-excel-file, this should work, but I have not been able to get it to:
Add-Type -AssemblyName Microsoft.Office.Interop.Excel
$xlFixedFormat = [Microsoft.Office.Interop.Excel.XlFileFormat]::xlWorkbookDefault
$Excel = New-Object -comobject Excel.Application
$Excel.Visible = $true
################
$Excel.workbooks.OpenText($file,437,1,1,1,$True,$True,$False,$False,$True,$False)
$Excel.ActiveWorkbook.SaveAs($env:tmp + "\myfile.xls", $xlFixedFormat)
$Excel.Workbooks.Close()
$Excel.Quit()
This worked for me:
$workbook.Close($false)
$excel.Quit()
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workSheet)
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel)
Remove-Variable -Name excel
To properly and completely close Excel, you also need to release the COM references. In my own testing I have found that removing the variable for Excel also ensures no remaining references exist that would keep Excel.exe open (for example, if you are debugging in the ISE).
Without doing the above, if you look in Task Manager you may see Excel still running... in some cases, many copies.
This has to do with how the COM object is wrapped in a "runtime callable wrapper".
Here is the skeleton code that should be used:
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $true
$workbook = $excel.Workbooks.Add()
# or $workbook = $excel.Workbooks.Open($xlsxPath)
# do work with Excel...
$workbook.SaveAs($xlsxPath)
$excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel)
# no $ needed on variable name in Remove-Variable call
Remove-Variable excel
Got it working! Special thanks to @Matt.
Complete script that is working:
$erroractionpreference = "SilentlyContinue"
$a = New-Object -comobject Excel.Application
$a.visible = $True
Add-Type -AssemblyName Microsoft.Office.Interop.Excel
$xlFixedFormat = [Microsoft.Office.Interop.Excel.XlFileFormat]::xlWorkbookDefault
$a.Visible = $true
$b = $a.Workbooks.Add()
$c = $b.Worksheets.Item(1)
$c.Cells.Item(1,1) = "Server Name"
$c.Cells.Item(1,2) = "Drive"
$c.Cells.Item(1,3) = "Total Size (GB)"
$c.Cells.Item(1,4) = "Free Space (GB)"
$c.Cells.Item(1,5) = "Free Space (%)"
$d = $c.UsedRange
$d.Interior.ColorIndex = 19
$d.Font.ColorIndex = 11
$d.Font.Bold = $True
$intRow = 2
$colComputers = get-content "c:\servers.txt"
foreach ($strComputer in $colComputers)
{
    $colDisks = get-wmiobject Win32_LogicalDisk -computername $strComputer -Filter "DriveType = 3"
    foreach ($objdisk in $colDisks)
    {
        $c.Cells.Item($intRow, 1) = $strComputer.ToUpper()
        $c.Cells.Item($intRow, 2) = $objDisk.DeviceID
        $c.Cells.Item($intRow, 3) = "{0:N0}" -f ($objDisk.Size/1GB)
        $c.Cells.Item($intRow, 4) = "{0:N0}" -f ($objDisk.FreeSpace/1GB)
        $c.Cells.Item($intRow, 5) = "{0:P0}" -f ([double]$objDisk.FreeSpace/[double]$objDisk.Size)
        $intRow = $intRow + 1
    }
}
$a.workbooks.OpenText($file,437,1,1,1,$True,$True,$False,$False,$True,$False)
$a.ActiveWorkbook.SaveAs("C:\Users\Username\Desktop\myfile.xls", $xlFixedFormat)
$a.Workbooks.Close()
$a.Quit()
As mentioned in the MSDN documentation, the ReleaseComObject call only decrements the reference count of the COM object by 1. If your script holds multiple references to the same COM object, it will not release it.
The documentation recommends using the FinalReleaseComObject method to completely release the COM object and close the Excel process once and for all.
Just be sure to call this method only when you are completely done with the COM reference, as calling it too early can lead to bugs that are hard to track down.
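A minimal sketch of that cleanup (untested; $workbook and $excel are placeholders for your own variables):
$workbook.Close($false)
$excel.Quit()
# FinalReleaseComObject drops the runtime callable wrapper's reference count to zero in one call
[System.Runtime.InteropServices.Marshal]::FinalReleaseComObject($workbook) | Out-Null
[System.Runtime.InteropServices.Marshal]::FinalReleaseComObject($excel) | Out-Null
Remove-Variable workbook, excel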
Creating the Excel file:
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $True
......
Closing down Excel:
$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel)
spps -n Excel
This solved my issue where $excel.Quit() did not actually quit and OneDrive would not upload the file. In my case I just need some automation, and once the job is done it is perfectly fine that all Excel processes get killed.
$excel.Quit()
# Check and you will see an Excel process still exists after quitting
# Remove the Excel process by piping it to Stop-Process
# Warning: this stops ALL Excel processes
Get-Process excel | Stop-Process -Force
