I have this script, which I got from http://poshcode.org/1811 and modified slightly for my own purposes. The problem I'm having is that the user who asked me to deploy the add-in has informed me that only the first person who logs on and uses the add-in can see it installed. This is an Active Directory environment, and the script is set to run as the user logs on.
Here is the script; I will also get the code for the add-in itself from the person in question later.
$Addinfilepath = '\\network\path\to\Excel Add-ins\'
$Addinfilename = Get-ChildItem $Addinfilepath -Name

ForEach ($CustomAddin in $Addinfilename) {
    $Excel = New-Object -ComObject excel.application
    $ExcelWorkbook = $excel.Workbooks.Add()

    if (($ExcelWorkbook.Application.AddIns | Where-Object {$_.name -eq $CustomAddin}) -eq $null) {
        $ExcelAddin = $ExcelWorkbook.Application.AddIns.Add("$Addinfilepath$CustomAddin", $True)
        $ExcelAddin.Installed = "True"
        Write-Host "$CustomAddin added"
    }
    # else { Write-Host "$Addinfilename already added" }

    $Excel.Quit()
}
I'm not quite sure what's wrong with the script, but I believe it is installing the add-in to the first person's profile on the PC, and then the script sees it as installed altogether and no longer installs it for the next user who logs on.
Thanks for any advice or help you can give.
Not sure if you noticed it, but you are missing the $ sign in front of CustomAddin when you are trying to match the name.
Where-Object {$_.name -eq CustomAddin})
should be
Where-Object {$_.name -eq $CustomAddin})
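For reference, a minimal sketch of the corrected check in context (assuming the same variable names and share path as the original script):

if (($ExcelWorkbook.Application.AddIns | Where-Object {$_.Name -eq $CustomAddin}) -eq $null) {
    # The comparison now uses the loop variable, so each add-in file is checked individually
    $ExcelAddin = $ExcelWorkbook.Application.AddIns.Add("$Addinfilepath$CustomAddin", $True)
    $ExcelAddin.Installed = $True
    Write-Host "$CustomAddin added"
}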
I have a simple script to get the last sign-in details for each user in Azure. When running the script from Visual Studio, it all runs fine with no errors.
After uploading the script to an Azure Automation Account, I am getting the error "Object reference not set to an instance of an object".
I have checked, and the command 'Get-AzureADAuditSigninLogs' is returning $null.
$users = Get-AzureADUser -All $true
foreach ( $user in $users ) {
    $userLogs = Get-AzureADAuditSigninLogs -Filter "startsWith(userPrincipalName, '$( $user.UserPrincipalName )')" -All $true
}
Any ideas on what could be causing this to occur in the Automation Account but not in Visual Studio?
As per this issue, the -All $true parameter is not working as expected for the Get-AzureADAuditSignInLogs cmdlet.
To resolve it, you can try upgrading to AzureADPreview v2.0.2.89.
Alternatively, you can also try what psignoret suggested:
Format string with -f or [String]::Format():
Write-Host ("startsWith(userPrincipalName ,'{0}')" -f $user.userPrincipalName)
Write-Host ([String]::Format("startsWith(userPrincipalName ,'{0}')", $user.userPrincipalName))
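Putting that together with the loop from the question, a minimal sketch using the -f formatted filter plus a guard against a $null result (same cmdlets and variable names as above) might look like this:

$users = Get-AzureADUser -All $true
foreach ($user in $users) {
    # Build the filter with -f so the UPN is expanded predictably
    $filter = "startsWith(userPrincipalName, '{0}')" -f $user.UserPrincipalName
    $userLogs = Get-AzureADAuditSignInLogs -Filter $filter -All $true
    if ($null -ne $userLogs) {
        # process $userLogs here
    }
}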
I have a PowerShell script that I intend to run from a remote server. The purpose of the script is to do the following:
Copy an Excel file from a mapped drive to the remote server
Open the Excel file and run a macro; the macro copies an Access table that is on the remote server, pastes it into the Excel file, and then does some manipulation of the data
Save the Excel file, close it, and then copy it back to the mapped drive
Right now, I'm testing it on my local machine, so it's copying the Excel file from the mapped drive to my C drive and grabbing the Access table from a location on my local machine. It runs perfectly when I run it from PowerShell. Here is the code:
# collect excel process ids before and after new excel background process is
# started
$priorExcelProcesses = Get-Process -name "*Excel*" | % { $_.Id }
$Excel = New-Object -ComObject Excel.Application
$postExcelProcesses = Get-Process -name "*Excel*" | % { $_.Id }
#run program
$folderPath = "my folder goes here"
$filePath = "my folder gooes here\filename.xlsm"
$tempPath = "C:\Users\Public"
$tempFile = "C:\Users\Public\filename.xlsm"
#copy file from I drive to remote desktop
Copy-Item -Path $filePath -Destination $tempPath
#create Excel variables
$excel = new-object -comobject excel.application
$excel.visible = $False
$excelFiles = Get-ChildItem -Path $tempPath -Include *.xls, *.xlsm -Recurse
#open excel application and run routine
Foreach($file in $excelFiles)
{
    $workbook = $excel.workbooks.open($tempFile)
    $worksheet = $workbook.worksheets.item(1)
    $excel.Run("getAccessData")
    $workbook.save()
    $workbook.close()
}
#copy file from remote desktop back onto I drive
Copy-Item -Path $tempFile -Destination $folderPath
# try to gently quit
$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel)
# otherwise allow stop-process to clean up
$postExcelProcesses | ? { $priorExcelProcesses -eq $null -or $priorExcelProcesses -notcontains $_ } | % { Stop-Process -Id $_ }
I need to have the script run once per day in the middle of the night, so I have been working on making it a scheduled task. I first set up the 'Action' with the following information:
Program/Script:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments (optional):
-NoProfile -ExecutionPolicy Bypass -file "C:\Users\nelsonth\emsUpdateDesktop.ps1"
Now, if I run this task while the security option is set to "Run only when user is logged on" and the "Hidden" checkbox selected, then the task runs perfectly.
Since this is going to be running from a remote desktop in the middle of the night, I need the script to run while I am logged off. But when I selected "Run whether user is logged on or not" and "Run with highest privileges" the script no longer runs.
I need this to work, so can anyone help me troubleshoot it?
When the task is set to "Run whether user is logged on or not", your PowerShell script will most likely have to map the drives itself or access the files via a UNC path.
Try adding something like the following to the top of your script, checking for the drive letter, then mapping it if not found:
if (-Not (Test-Path 'X:\')) {New-PSDrive -Name "X" -PSProvider FileSystem -Root "\\MyServer\MyShare"}
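Alternatively, a minimal sketch of the same idea using UNC paths directly, so the script does not depend on a drive letter at all (the server and share names here are only placeholders):

# Point the path variables at the UNC path instead of a mapped drive letter
$folderPath = '\\MyServer\MyShare\Reports'
$filePath = Join-Path $folderPath 'filename.xlsm'
$tempPath = 'C:\Users\Public'
Copy-Item -Path $filePath -Destination $tempPath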
Due to server changes, I have been asked to find the Workbook Connections used for all spreadsheet reports (200+) stored within a folder directory. Is there a way of using PowerShell to find these?
I tried using the following PowerShell command, but I think I may be going about it incorrectly.
Get-ChildItem "C:\path" -recurse | Select-String -pattern "find me" | group path | select name
Any help would be very appreciated! Thanks
Try this:
foreach ($file in (Get-ChildItem "C:\path_to_reports\" -Recurse))
{
    $Excel = New-Object -ComObject Excel.Application
    $ExcelWorkbook = $Excel.Workbooks.Open($file.FullName)
    Write-Host "Connections of $($file.FullName):"
    $ExcelWorkbook.Connections
    # The Application object has no Close() method; close the workbook and quit Excel instead
    $ExcelWorkbook.Close($false)
    $Excel.Quit()
}
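If you also need the underlying connection strings (useful when auditing server names), a rough sketch along the same lines, assuming the standard WorkbookConnection properties (Name, Type, and OLEDBConnection.Connection, where Type value 1 means an OLE DB connection), might be:

$Excel = New-Object -ComObject Excel.Application
$Excel.DisplayAlerts = $false
foreach ($file in (Get-ChildItem "C:\path_to_reports\" -Recurse -Include *.xlsx, *.xlsm)) {
    $workbook = $Excel.Workbooks.Open($file.FullName)
    foreach ($conn in $workbook.Connections) {
        # Only OLE DB connections expose OLEDBConnection; other types would need their own handling
        $connString = if ($conn.Type -eq 1) { $conn.OLEDBConnection.Connection } else { '' }
        [PSCustomObject]@{ File = $file.FullName; Connection = $conn.Name; ConnectionString = $connString }
    }
    $workbook.Close($false)
}
$Excel.Quit()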
My PowerShell (2.0) script has the following code snippet:
$fileName = "c:\reports\1.xlsx"
$xl = new-object -comobject excel.application
$xlFormat = [Microsoft.Office.Interop.excel.XlFileFormat]::xlWorkbookDefault
$xl.displayalerts = $false
$workbook = $xl.workbooks.open($fileName)
#Code to manipulate a worksheet
$workbook.SaveAs($fileName, $xlformat)
$xl.quit()
$error | out-file c:\reports\error.txt
I can run this script in the Powershell command prompt with no issues. The spreadsheet gets updated, and error.txt is empty. However, when I run it as a task in Task Scheduler, I get errors with the first line.
Exception calling "Open" with "1" argument(s): "Microsoft Office Excel cannot access the file 'C:\reports\1.xlsx'. There are several possible reasons:
The file name or path does not exist.
The file is being used by another program.
The workbook you are trying to save has the same name as a currently open workbook.
I run the task with the same credentials I use to run the script in the Powershell command prompt. When I run the script manually, it can open, update, and save the spreadsheet with no issues. When I run it in Task Scheduler, it can't access the spreadsheet.
The file in question is readable/writeable for all users. I've verified I can open the file in Excel with the same credentials. If I make a new spreadsheet and put its name in as the $filename, I get the same results. I've verified that there are no instances of Excel.exe in Task Manager.
Oddly, if I use get-content, I don't have any problems. Also, if I make a new spreadsheet, I don't have any problem.
$fileName = "c:\reports\1.xlsx"
$xl = get-content $fileName
$xl = new-object -comobject excel.application
$xlFormat = [Microsoft.Office.Interop.excel.XlFileFormat]::xlWorkbookDefault
$xl.displayalerts = $false
# Commented out $workbook = $xl.workbooks.open($fileName)
$workbook = $xl.workbooks.add()
#Code to manipulate a worksheet
$workbook.SaveAs($fileName, $xlformat)
$xl.quit()
$error | out-file c:\reports\error.txt
That works fine. So Get-Content can read the file with no issue. The COM object can open the file if I run the script manually, but not if it's run as a task.
I'm at a loss. Any ideas?
I think you've hit a bug in Excel:
You have to create a folder (or two on 64-bit Windows):
(32Bit, always)
C:\Windows\System32\config\systemprofile\Desktop
(64Bit)
C:\Windows\SysWOW64\config\systemprofile\Desktop
I have had the same problem and this was the only solution I have found.
From TechNet Forums (via PowerShell and Excel Issue when Automating)
The solutions above didn't work in my SCSM 2012 Scorch environment; instead, I used PSExcel (https://github.com/RamblingCookieMonster/PSExcel), which has no dependency on having Excel installed or on the ComObject.
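As a rough sketch of that approach (assuming the module's Import-XLSX and Export-XLSX cmdlets; check the PSExcel readme for the exact parameters):

# Read a worksheet into objects and write results back out, no Excel installation or COM needed
Import-Module PSExcel
$data = Import-XLSX -Path 'C:\reports\1.xlsx'
# ...manipulate $data here...
$data | Export-XLSX -Path 'C:\reports\1-updated.xlsx'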
To extend what @TessellatingHeckler provided, you can run the following commands in PowerShell (as Admin/elevated) to create the folders before opening Excel; in my script this fixed the issue:
New-Item -ItemType Directory -Force -Path C:\Windows\System32\config\systemprofile\Desktop
if ([Environment]::Is64BitProcess -ne [Environment]::Is64BitOperatingSystem)
{
    New-Item -ItemType Directory -Force -Path C:\Windows\SysWOW64\config\systemprofile\Desktop
}
I had to set my Scheduled Task to run 'only when user is logged on' (log on to the server as the service account that runs the task, then disconnect the session), as it seems to be a limitation with the Task Scheduler and Excel. It's a pretty lame workaround, but it works.
To reiterate what TessellatingHeckler said: I had to resolve this issue on a 64-bit system, so I used the following command, which made the PowerShell script finally work via Task Scheduler:
New-Item -ItemType Directory -Force -Path C:\Windows\SysWOW64\config\systemprofile\Desktop
I have a folder containing several solutions for a SharePoint application, which I want to add and install. I want to iterate over the elements in the folder and then use Add-SPSolution. After that, I want to check whether the solutions are done deploying before using Install-SPSolution. Here is a snippet that I am currently working on:
# Get the location of the folder you are currently in
$dir = $(gl)
# Create a list with the .wsp solutions
$list = Get-ChildItem $dir | where {$_.extension -eq ".wsp"}
Write-Host 'DEPLOYING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Add-SPSolution -LiteralPath $my_file.FullName}
Write-Host 'SLEEP FOR 30 SECONDS'
Start-Sleep -s 30
Write-Host 'INSTALLING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment}
Is there a way to check whether the deployment is finished and it is ready to start installing the solutions?
You need to check the SPSolution.Deployed property value in a loop; a basic solution looks like this:
do { Start-Sleep 2 } while (!((Get-SPSolution $name).Deployed))
The Deploying SharePoint 2010 Solution Packages Using PowerShell article contains more details and this comment discusses a potential caveat.
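Putting that together with the snippet from the question, a rough sketch that replaces the fixed 30-second sleep with the polling loop (assuming each solution is identified by its .wsp file name, as in the question) might look like this:

Write-Host 'DEPLOYING SOLUTIONS...'
foreach ($my_file in $list) {
    # Add the package, kick off the deployment job, then poll until it finishes
    Add-SPSolution -LiteralPath $my_file.FullName
    Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment
    do { Start-Sleep -Seconds 2 } while (!((Get-SPSolution $my_file.Name).Deployed))
    Write-Host "$($my_file.Name) deployed"
}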