PowerShell script cannot access a file when run as a Scheduled Task - Excel

My PowerShell (2.0) script contains the following code snippet:
$fileName = "c:\reports\1.xlsx"
$xl = new-object -comobject excel.application
$xlFormat = [Microsoft.Office.Interop.excel.XlFileFormat]::xlWorkbookDefault
$xl.displayalerts = $false
$workbook = $xl.workbooks.open($fileName)
#Code to manipulate a worksheet
$workbook.SaveAs($fileName, $xlformat)
$xl.quit()
$error | out-file c:\reports\error.txt
I can run this script from the PowerShell command prompt with no issues: the spreadsheet gets updated, and error.txt is empty. However, when I run it as a task in Task Scheduler, the Workbooks.Open call fails:
Exception calling "Open" with "1" argument(s): "Microsoft Office Excel cannot access the file 'C:\reports\1.xlsx'. There are several possible reasons:
The file name or path does not exist.
The file is being used by another program.
The workbook you are trying to save has the same name as a currently open workbook.
I run the task with the same credentials I use to run the script in the Powershell command prompt. When I run the script manually, it can open, update, and save the spreadsheet with no issues. When I run it in Task Scheduler, it can't access the spreadsheet.
The file in question is readable/writeable for all users. I've verified I can open the file in Excel with the same credentials. If I make a new spreadsheet and put its name in as the $filename, I get the same results. I've verified that there are no instances of Excel.exe in Task Manager.
Oddly, if I use Get-Content, I don't have any problems. And if I create a new workbook instead of opening the existing one, it works:
$fileName = "c:\reports\1.xlsx"
$content = get-content $fileName   # reading the file directly works fine
$xl = new-object -comobject excel.application
$xlFormat = [Microsoft.Office.Interop.excel.XlFileFormat]::xlWorkbookDefault
$xl.displayalerts = $false
# Commented out: $workbook = $xl.workbooks.open($fileName)
$workbook = $xl.workbooks.add()
#Code to manipulate a worksheet
$workbook.SaveAs($fileName, $xlformat)
$xl.quit()
$error | out-file c:\reports\error.txt
That works fine. So Get-Content can read the file with no issue, and the COM object can open the file when I run the script manually, but not when it runs as a scheduled task.
I'm at a loss. Any ideas?

I think you've hit a bug in Excel. You have to create a folder (or two, on 64-bit Windows):
(32-bit, always)
C:\Windows\System32\config\systemprofile\Desktop
(64-bit)
C:\Windows\SysWOW64\config\systemprofile\Desktop
I had the same problem, and this was the only solution I found.
From the TechNet forums (via "PowerShell and Excel Issue when Automating").

The solutions above didn't work in my SCSM 2012 Scorch environment, so I used PSExcel (https://github.com/RamblingCookieMonster/PSExcel) instead, which has no dependency on having Excel installed or on the COM object.
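For reference, here is a minimal sketch of what that approach can look like with PSExcel, which reads the workbook into plain objects instead of driving Excel over COM. This is only an illustration: it assumes the module is installed, and the file names and the Total column are placeholders rather than anything taken from the scripts above.

Import-Module PSExcel

# read the existing worksheet into PowerShell objects
$rows = Import-XLSX -Path 'c:\reports\1.xlsx'

# manipulate the data as plain objects (Total is a hypothetical column)
$rows | ForEach-Object { $_.Total = [double]$_.Total * 1.1 }

# write the result to a new workbook; no Excel process is involved at any point
$rows | Export-XLSX -Path 'c:\reports\1-updated.xlsx'

Because no COM object is created, this also sidesteps the systemprofile\Desktop issue entirely when run from Task Scheduler.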

To extend what @TessellatingHeckler provided, you can run the following commands in PowerShell (as Admin/elevated) to create the folders before opening Excel; in my script this fixed the issue:
New-Item -ItemType Directory -Force -Path C:\Windows\System32\config\systemprofile\Desktop
if ([Environment]::Is64BitProcess -ne [Environment]::Is64BitOperatingSystem)
{
New-Item -ItemType Directory -Force -Path C:\Windows\SysWOW64\config\systemprofile\Desktop
}

I had to set my Scheduled Task to run 'only when user is logged on' (log on to the server as the service account that runs the task, then disconnect the session), as it seems to be a limitation of Task Scheduler and Excel. It's a pretty lame workaround, but it works.

To reiterate what TessellatingHeckler said: I had to resolve this issue on a 64-bit system, so I used the following command, which finally made the PowerShell script work via Task Scheduler:
New-Item -ItemType Directory -Force -Path C:\Windows\SysWOW64\config\systemprofile\Desktop

Related

Timeout while executing PowerShell script

I don't have much experience with PowerShell yet, and I'm looking for a solution for my script.
I work with a process automation platform called "Firestart", which can run PowerShell scripts.
The purpose of the script is to read a single cell from an .xlsx file.
Right now, when I run the script through this application, it keeps running for 10 minutes, so I automatically get a timeout error, which is logical.
If I run the same script on the same server in PowerShell, I get the desired output within a second.
It might be a problem with the application itself, but maybe someone can review my script.
Script:
$objExcel = New-Object -ComObject Excel.Application
$ExcelFile = '#.xlsx file'
$WorkBook = $objExcel.Workbooks.Open($ExcelFile)
$WorkBook.Sheets.Item(1).Activate()
$WorkbookTotal = $WorkBook.Worksheets.Item(1)
$value = $WorkbookTotal.Cells.Item(10, 2)
return $value.Text
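One thing worth noting (not something the poster confirmed): the script never closes the workbook or quits Excel, so every run can leave an EXCEL.EXE process behind, and orphaned Excel processes are a common reason automation hosts appear to hang. A rough sketch of the same read with explicit cleanup, keeping the original placeholder path and cell address:

$objExcel = New-Object -ComObject Excel.Application
$objExcel.DisplayAlerts = $false
try {
    $workbook = $objExcel.Workbooks.Open('#.xlsx file')
    $value = $workbook.Worksheets.Item(1).Cells.Item(10, 2).Text
}
finally {
    # close without saving and make sure the Excel process actually exits
    if ($workbook) { $workbook.Close($false) }
    $objExcel.Quit()
    [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($objExcel)
}
return $value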

How to run parallel Excel macros with PowerShell?

I would like to run multiple Excel macros in parallel by opening multiple instances of Excel from a shell (PowerShell) script.
The examples of parallel processing in shell scripts that I have found aren't intuitive to me (I'm new to shell scripting).
# start excel
$excel = New-Object -comobject Excel.Application
# get files
$files = Get-ChildItem "C:\User\test"
# loop through all files in the directory
ForEach ($file in $files){
    # open the file
    $workbook = $excel.Workbooks.Open($file.FullName)
    # make file visible
    $excel.Visible = $true
    # run macro
    $app = $excel.Application
    $app.run("Macro1")
}
The code provided performs the task I want, but it does so sequentially rather than in parallel.
$app.run("application.ontime vba.now() + vba.timeserial(0,0,5),""Macro1"" ")
This will run your macro asynchronously, but I still don't know if that is what you want.
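If the goal is genuinely parallel execution, one option (not taken from the answer above, just a sketch) is to start a separate Excel instance per workbook inside background jobs. This assumes every workbook in the folder contains a Macro1, that the files are macro-enabled .xlsm workbooks, and that Excel COM behaves well in your environment when started this way (the Desktop-folder caveat from the first question can apply here too):

$files = Get-ChildItem "C:\User\test" -Filter *.xlsm

$jobs = foreach ($file in $files) {
    Start-Job -ArgumentList $file.FullName -ScriptBlock {
        param($path)
        # each job gets its own Excel instance, so the macros run side by side
        $excel = New-Object -ComObject Excel.Application
        $excel.DisplayAlerts = $false
        try {
            $workbook = $excel.Workbooks.Open($path)
            $excel.Run("Macro1")
            $workbook.Save()
        }
        finally {
            if ($workbook) { $workbook.Close($false) }
            $excel.Quit()
            [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
        }
    }
}

# wait for all macros to finish and surface any output or errors
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job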

Running PowerShell Script from Task Scheduler when User is Logged Off [duplicate]

This question already has an answer here: Task Scheduler doesn't execute batch file properly (1 answer)
Closed 6 years ago.
I have a PowerShell script that I'm intending to run from a remote server. The script does the following:
Copies an Excel file from a mapped drive to the remote server
Opens the Excel file and runs a macro; the macro copies an Access table that's on the remote server into the Excel file, then does some manipulation of the data
Saves the Excel file, closes it, and then copies it back to the mapped drive
Right now, I'm testing it on my local machine, so it's copying the Excel file from the mapped drive to my C drive, then grabbing the Access table from a location on my local machine. It runs perfectly when I run it from PowerShell. Here is the code:
# collect excel process ids before and after new excel background process is
# started
$priorExcelProcesses = Get-Process -name "*Excel*" | % { $_.Id }
$Excel = New-Object -ComObject Excel.Application
$postExcelProcesses = Get-Process -name "*Excel*" | % { $_.Id }
#run program
$folderPath = "my folder goes here"
$filePath = "my folder gooes here\filename.xlsm"
$tempPath = "C:\Users\Public"
$tempFile = "C:\Users\Public\filename.xlsm"
#copy file from I drive to remote desktop
Copy-Item -Path $filePath -Destination $tempPath
#create Excel variables
$excel = new-object -comobject excel.application
$excel.visible = $False
$excelFiles = Get-ChildItem -Path $tempPath -Include *.xls, *.xlsm -Recurse
#open excel application and run routine
Foreach($file in $excelFiles)
{
    $workbook = $excel.workbooks.open($tempFile)
    $worksheet = $workbook.worksheets.item(1)
    $excel.Run("getAccessData")
    $workbook.save()
    $workbook.close()
}
#copy file from remote desktop back onto I drive
Copy-Item -Path $tempFile -Destination $folderPath
# try to gently quit
$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel)
# otherwise allow stop-process to clean up
$postExcelProcesses | ? { $priorExcelProcesses -eq $null -or $priorExcelProcesses -notcontains $_ } | % { Stop-Process -Id $_ }
I need to have the script run once per day in the middle of the night, so I have been working on making it a scheduled task. I first set up the 'Action' with the following information:
Program/Script:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments (optional):
-NoProfile -ExecutionPolicy Bypass -file "C:\Users\nelsonth\emsUpdateDesktop.ps1"
Now, if I run this task while the security option is set to "Run only when user is logged on" and the "Hidden" checkbox is selected, the task runs perfectly.
Since this is going to be running from a remote desktop in the middle of the night, I need the script to run while I am logged off. But when I select "Run whether user is logged on or not" and "Run with highest privileges", the script no longer runs.
I need this to work, so can anyone help me troubleshoot it?
When run as "Run whether user is logged on or not", your PowerShell script will most likely have to map the drives manually or access the file via a UNC path.
Try adding something like the following to the top of your script, checking for the drive letter, then mapping it if not found:
if (-Not (Test-Path 'X:\')) {New-PSDrive -Name "X" -PSProvider FileSystem -Root "\\MyServer\MyShare"}
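The other route the answer mentions is to skip drive letters entirely and point the script's paths straight at the share; a trivial sketch, with the server and share names as placeholders rather than values from the question:

$folderPath = "\\MyServer\MyShare"
$filePath = "\\MyServer\MyShare\filename.xlsm"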

Find all existing connections in multiple Excel files in a directory

Due to server changes, I have been asked to find the workbook connections used by all spreadsheet reports (200+) stored within a folder directory. Is there a way of using PowerShell to find these?
I had tried the following PowerShell command, but I think I am going about it incorrectly:
Get-ChildItem "C:\path" -recurse | Select-String -pattern "find me" | group path | select name
Any help would be very much appreciated! Thanks
Try this:
$Excel = New-Object -ComObject Excel.Application
$Excel.DisplayAlerts = $false
foreach ($file in (Get-ChildItem "C:\path_to_reports\" -Recurse))
{
    $ExcelWorkbook = $Excel.Workbooks.Open($file.fullname)
    Write-Host "Connections of $($file.fullname):"
    $ExcelWorkbook.Connections | ForEach-Object { $_.Name }
    # close the workbook without saving; the Application object has no Close() method
    $ExcelWorkbook.Close($false)
}
$Excel.Quit()
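If opening 200+ workbooks in Excel turns out to be too slow, an alternative is to read the connection definitions straight out of the files: .xlsx/.xlsm workbooks are Open XML packages, and any workbook connections are stored in xl/connections.xml inside the package. This sketch is not from the answer above, and legacy .xls files would still need Excel or another approach:

Add-Type -AssemblyName System.IO.Compression.FileSystem

Get-ChildItem 'C:\path_to_reports' -Recurse -Include *.xlsx, *.xlsm | ForEach-Object {
    $zip = [System.IO.Compression.ZipFile]::OpenRead($_.FullName)
    try {
        # xl/connections.xml only exists if the workbook actually has connections
        $entry = $zip.Entries | Where-Object { $_.FullName -eq 'xl/connections.xml' }
        if ($entry) {
            $reader = New-Object System.IO.StreamReader($entry.Open())
            [xml]$connXml = $reader.ReadToEnd()
            $reader.Dispose()
            Write-Host "Connections of $($_.FullName):"
            $connXml.connections.connection | ForEach-Object { $_.name }
        }
    }
    finally {
        $zip.Dispose()
    }
}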

PowerShell Excel Add-in Deployment Issue

I have this script, which I got from http://poshcode.org/1811 and modified slightly for my own purposes. The problem I'm having is that the user who asked me to deploy the add-in has informed me that only the first person who logs on to use the add-in can see it installed. This is an Active Directory environment, and the script is set to run as each user logs on.
Here is the script; I will also get the code for the add-in itself later from the person in question.
$Addinfilepath = '\\network\path\to\Excel Add-ins\'
$Addinfilename = Get-ChildItem $Addinfilepath -Name
ForEach ($CustomAddin in $Addinfilename) {
    $Excel = New-Object -ComObject excel.application
    $ExcelWorkbook = $excel.Workbooks.Add()
    if (($ExcelWorkbook.Application.AddIns | Where-Object {$_.name -eq $CustomAddin}) -eq $null) {
        $ExcelAddin = $ExcelWorkbook.Application.AddIns.Add("$Addinfilepath$CustomAddin", $True)
        $ExcelAddin.Installed = "True"
        Write-Host "$CustomAddin added"}
    #else {Write-Host "$Addinfilename already added"}
    $Excel.Quit()
}
I'm not quite sure what's wrong with the script, but I believe it's installing the add-in to the first person's profile on the PC, and then the script may be treating it as installed, period, and no longer installs it for the next user who logs on.
Thanks for any advice or help you can give.
Not sure if you noticed it, but you are missing a $ sign in front of CustomAddin when you are trying to match the name.
Where-Object {$_.name -eq CustomAddin})
should be
Where-Object {$_.name -eq $CustomAddin})
