I'm trying to do a simple parallel operation in PowerShell. I am using PoshRSJob for multithreading, though I have also tried Invoke-Parallel with the same issue.
I need to call a couple of my own functions in the script block of the job, but this does not allow me to mock those functions for unit testing (they end up being the original, non-mocked functions). At this point, I'm just trying to assert that they have been called the correct number of times.
Here is the original script (the functionality of the imported modules is irrelevant; the actual implementations currently return test strings)...
Import-Module $PSScriptRoot\Convert-DataTable
Import-Module $PSScriptRoot\Get-History
Import-Module $PSScriptRoot\Get-Assets
Import-Module $PSScriptRoot\Write-DataTable
function MyStuff {
param(
[string]$serverInstance = "localhost\SQLEXPRESS",
[string]$database = "PTLPowerShell",
[string]$tableName = "Test"
)
$assets = Get-Assets
$full_dt = New-Object System.Data.DataTable
$assets | Start-RSJob -ModulesToImport $PSScriptRoot\Convert-DataTable, $PSScriptRoot\Get-History {
$history = Get-History $_
$history_dt = Convert-DataTable $history
return $history_dt.Rows
} | Wait-RSJob | Receive-RSJob | ForEach {
$full_dt.Rows.Add($_)
}
Write-DataTable $serverInstance $database $tableName $full_dt
}
Here is the Pester test...
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.', '.'
. "$here\$sut"
Describe "MyStuff" {
BeforeEach {
Mock Get-Assets { return "page1", "page2"}
Mock Get-History { return "history" }
Mock Convert-DataTable {
$historyDT = New-Object System.Data.Datatable;
$historyDT.TableName = 'Test'
return ,$historyDT
}
Mock Write-DataTable {}
}
It "should do something" {
{ MyStuff } | Should -Not -Throw;
}
It "should call Get-Assets" {
Assert-MockCalled Get-Assets 1
}
It "should call Get-History" {
Assert-MockCalled Get-History 2
}
It "should call Convert-DataTable" {
Assert-MockCalled Convert-DataTable 2
}
It "should call Write-DataTable" {
Assert-MockCalled Write-DataTable 1
}
}
Here is the Pester test's output currently...
Describing MyStuff
[+] should do something 1.71s
[+] should call Get-Assets 211ms
[-] should call Get-History 61ms
Expected Get-History to be called at least 2 times but was called 0 times
23: Assert-MockCalled Get-History 2
at <ScriptBlock>, myFile.Tests.ps1: line 23
[-] should call Convert-DataTable 110ms
Expected Convert-DataTable to be called at least 2 times but was called 0 times
26: Assert-MockCalled Convert-DataTable 2
at <ScriptBlock>, myFile.Tests.ps1: line 26
[+] should call Write-DataTable 91ms
So ultimately, I'm looking for a way to do parallel operations in PowerShell and still be able to mock and unit test them.
I don't consider this a full answer, and I don't work on the Pester project, but I would say that this is simply not a supported scenario for Pester. This might change when/if concurrent programming becomes part of PowerShell proper (or it may not).
If you're willing to change your implementation you might be able to write around this limitation to support some sort of testing.
For example, maybe your function doesn't use an RSJob when it only has 1 thing to do (which conveniently might be the case when testing).
Or maybe you implement a -Serial or -NoParallel or -SingleRunspace switch (or a -ConcurrencyFactor which you set to 1 in tests), wherein you don't use a runspace for those conditions.
Based on your example it's difficult to tell if that kind of test adequately tests what you want, but it seems like it does.
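To make that suggestion concrete, here is a hypothetical sketch of the question's function with a -NoParallel switch (names and parameters are taken from the question; the serial branch runs in the caller's session state, which is why Pester mocks would apply there):

```powershell
# Hypothetical sketch: a -NoParallel switch so tests can exercise the same
# logic serially, where Pester mocks are still honored.
function MyStuff {
    param(
        [string]$serverInstance = "localhost\SQLEXPRESS",
        [string]$database = "PTLPowerShell",
        [string]$tableName = "Test",
        [switch]$NoParallel
    )
    $assets = Get-Assets
    $full_dt = New-Object System.Data.DataTable
    $rows = if ($NoParallel) {
        # Serial path: runs in the current session, so mocked functions are called
        foreach ($asset in $assets) {
            (Convert-DataTable (Get-History $asset)).Rows
        }
    }
    else {
        # Parallel path: runspaces import the real modules, mocks do not apply
        $assets | Start-RSJob -ModulesToImport $PSScriptRoot\Convert-DataTable, $PSScriptRoot\Get-History {
            (Convert-DataTable (Get-History $_)).Rows
        } | Wait-RSJob | Receive-RSJob
    }
    $rows | ForEach-Object { $full_dt.Rows.Add($_) }
    Write-DataTable $serverInstance $database $tableName $full_dt
}
```

The tests would then call `MyStuff -NoParallel` and assert against the mocks, accepting that the parallel branch itself remains untested.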
I was able to sort of get it to work by injecting the mock into the thread; here's a proof of concept, but the fine details would need to be hammered out on a case-by-case basis.
#code.ps1
function ToTest{
start-job -Name OG -ScriptBlock {return (Get-Date '1/1/2000').ToString()}
}
And here is the Pester test...
#code.Tests.ps1
$DebugPreference = 'Continue'
write-debug 'Pester''ng: code.ps1'
#################################################################
. (join-path $PSScriptRoot 'code.ps1')
Describe 'Unit Tests' -Tag 'Unit' {
Mock start-job {
$NewSB = {
&{describe 'MockingJob:$JobName' {
Mock get-date {'got mocked'}
& {$ScriptBlock} | Export-Clixml '$JobName.xml'
}}
$out = Import-Clixml '$JobName.xml'
remove-item '$JobName.xml'
$out | write-output
}.ToString().Replace('$ScriptBlock',$ScriptBlock.ToString()).Replace('$JobName',$Name)
start-job -Name "Mock_$Name" -ScriptBlock ([ScriptBlock]::Create($NewSB))
} -ParameterFilter {$Name -NotMatch 'Mock'}
It 'uses the mocked commandlet' {
$job = ToTest
receive-job -Job $job -wait | should be 'got mocked'
remove-job -Job $job
}
}
$DebugPreference = 'SilentlyContinue'
Related
This self-answer intends to provide an easy and efficient parallelism alternative for those stuck with Windows PowerShell who are unable to install modules due to, for example, company policies.
In Windows PowerShell, the built-in alternatives for local parallel invocation are Start-Job and workflow, both known to be very slow and inefficient; one of them (workflow) is no longer recommended and is not even available in newer versions of PowerShell.
The other alternative is to rely on the PowerShell SDK and code our own parallel logic using what the System.Management.Automation.Runspaces namespace has to offer. This is definitely the most efficient approach, and it is what ForEach-Object -Parallel (in PowerShell Core) as well as Start-ThreadJob (preinstalled in PowerShell Core and available in Windows PowerShell through the PowerShell Gallery) use behind the scenes.
A simple example:
$throttlelimit = 3
$pool = [runspacefactory]::CreateRunspacePool(1, $throttlelimit)
$pool.Open()
$tasks = 0..10 | ForEach-Object {
$ps = [powershell]::Create().AddScript({
'hello world from {0}' -f [runspace]::DefaultRunspace.InstanceId
Start-Sleep 3
})
$ps.RunspacePool = $pool
@{ Instance = $ps; AsyncResult = $ps.BeginInvoke() }
}
$tasks | ForEach-Object {
$_.Instance.EndInvoke($_.AsyncResult)
}
$tasks.Instance, $pool | ForEach-Object Dispose
This is great but gets tedious and often complicated when the code has more complexity, which in consequence brings lots of questions.
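For comparison, where installing from the PowerShell Gallery is allowed, Start-ThreadJob already hides this boilerplate; a rough equivalent of the runspace-pool example above might look like this (assuming the ThreadJob module is available):

```powershell
# Rough equivalent of the runspace-pool example using Start-ThreadJob
# (requires the ThreadJob module in Windows PowerShell).
$jobs = foreach ($i in 0..10) {
    Start-ThreadJob -ThrottleLimit 3 -ScriptBlock {
        'hello world from {0}' -f [runspace]::DefaultRunspace.InstanceId
        Start-Sleep 3
    }
}
# Collect the output and clean up the jobs in one pass
$jobs | Receive-Job -Wait -AutoRemoveJob
```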
Is there an easier way to do it?
Since this is a topic that can be confusing and often brings questions to the site, I decided to create this function to simplify this tedious task and help those stuck with Windows PowerShell. The aim is to keep it as simple and friendly as possible; it should also be a function that can be copy-pasted into our $PROFILE and reused whenever needed, without requiring the installation of a module (as stated in the question).
This function has been greatly inspired by RamblingCookieMonster's Invoke-Parallel and Boe Prox's PoshRSJob and is merely a simplified take on those with a few improvements.
NOTE
Further updates to this function will be published to the official GitHub repo as well as to the PowerShell Gallery. The code in this answer will no longer be maintained.
Contributions are more than welcome, if you wish to contribute, fork the repo and submit a pull request with the changes.
DEFINITION
# The function must run in the scope of a Module.
# `New-Module` must be used for portability. Otherwise store the
# function in a `.psm1` and import it via `Import-Module`.
New-Module PSParallelPipeline -ScriptBlock {
function Invoke-Parallel {
[CmdletBinding(PositionalBinding = $false)]
[Alias('parallel', 'parallelpipeline')]
param(
[Parameter(Mandatory, ValueFromPipeline)]
[object] $InputObject,
[Parameter(Mandatory, Position = 0)]
[scriptblock] $ScriptBlock,
[Parameter()]
[int] $ThrottleLimit = 5,
[Parameter()]
[hashtable] $Variables,
[Parameter()]
[ArgumentCompleter({
param(
[string] $commandName,
[string] $parameterName,
[string] $wordToComplete
)
(Get-Command -CommandType Filter, Function).Name -like "$wordToComplete*"
})]
[string[]] $Functions,
[Parameter()]
[ValidateSet('ReuseThread', 'UseNewThread')]
[System.Management.Automation.Runspaces.PSThreadOptions] $ThreadOptions = [System.Management.Automation.Runspaces.PSThreadOptions]::ReuseThread
)
begin {
try {
$iss = [initialsessionstate]::CreateDefault2()
foreach($key in $Variables.PSBase.Keys) {
$iss.Variables.Add([System.Management.Automation.Runspaces.SessionStateVariableEntry]::new($key, $Variables[$key], ''))
}
foreach($function in $Functions) {
$def = (Get-Command $function).Definition
$iss.Commands.Add([System.Management.Automation.Runspaces.SessionStateFunctionEntry]::new($function, $def))
}
$usingParams = @{}
foreach($usingstatement in $ScriptBlock.Ast.FindAll({ $args[0] -is [System.Management.Automation.Language.UsingExpressionAst] }, $true)) {
$varText = $usingstatement.Extent.Text
$varPath = $usingstatement.SubExpression.VariablePath.UserPath
# Credits to mklement0 for catching a bug here. Thank you!
# https://github.com/mklement0
$key = [Convert]::ToBase64String([System.Text.Encoding]::Unicode.GetBytes($varText.ToLower()))
if(-not $usingParams.ContainsKey($key)) {
$usingParams.Add($key, $PSCmdlet.SessionState.PSVariable.GetValue($varPath))
}
}
$pool = [runspacefactory]::CreateRunspacePool(1, $ThrottleLimit, $iss, $Host)
$tasks = [System.Collections.Generic.List[hashtable]]::new()
$pool.ThreadOptions = $ThreadOptions
$pool.Open()
}
catch {
$PSCmdlet.ThrowTerminatingError($_)
}
}
process {
try {
# Thanks to Patrick Meinecke for his help here.
# https://github.com/SeeminglyScience/
$ps = [powershell]::Create().AddScript({
$args[0].InvokeWithContext($null, [psvariable]::new('_', $args[1]))
}).AddArgument($ScriptBlock.Ast.GetScriptBlock()).AddArgument($InputObject)
# This is how `Start-Job` does its magic. Credits to Jordan Borean for his help here.
# https://github.com/jborean93
# Reference in the source code:
# https://github.com/PowerShell/PowerShell/blob/7dc4587014bfa22919c933607bf564f0ba53db2e/src/System.Management.Automation/engine/ParameterBinderController.cs#L647-L653
if($usingParams.Count) {
$null = $ps.AddParameters(@{ '--%' = $usingParams })
}
$ps.RunspacePool = $pool
$tasks.Add(@{
Instance = $ps
AsyncResult = $ps.BeginInvoke()
})
}
catch {
$PSCmdlet.WriteError($_)
}
}
end {
try {
foreach($task in $tasks) {
$task['Instance'].EndInvoke($task['AsyncResult'])
if($task['Instance'].HadErrors) {
$task['Instance'].Streams.Error
}
}
}
catch {
$PSCmdlet.WriteError($_)
}
finally {
$tasks.Instance, $pool | ForEach-Object Dispose
}
}
}
} -Function Invoke-Parallel | Import-Module -Force
SYNTAX
Invoke-Parallel -InputObject <Object> [-ScriptBlock] <ScriptBlock> [-ThrottleLimit <Int32>]
[-Variables <Hashtable>] [-ThreadOptions <PSThreadOptions>] [-Functions <String[]>] [<CommonParameters>]
REQUIREMENTS
Compatible with Windows PowerShell 5.1 and PowerShell Core 7+.
INSTALLATION
If you wish to install this through the Gallery and have it available as a Module:
Install-Module PSParallelPipeline -Scope CurrentUser
EXAMPLES
EXAMPLE 1: Run slow script in parallel batches
$message = 'Hello world from {0}'
0..10 | Invoke-Parallel {
$using:message -f [runspace]::DefaultRunspace.InstanceId
Start-Sleep 3
} -ThrottleLimit 3
EXAMPLE 2: Same as previous example but with -Variables parameter
$message = 'Hello world from {0}'
0..10 | Invoke-Parallel {
$message -f [runspace]::DefaultRunspace.InstanceId
Start-Sleep 3
} -Variables @{ message = $message } -ThrottleLimit 3
EXAMPLE 3: Adding to a single thread safe instance
$sync = [hashtable]::Synchronized(@{})
Get-Process | Invoke-Parallel {
$sync = $using:sync
$sync[$_.Name] += @( $_ )
}
$sync
EXAMPLE 4: Same as the previous example, but using -Variables to pass the reference instance to the runspaces
This method is recommended when passing reference instances to the runspaces; $using: may fail in some situations.
$sync = [hashtable]::Synchronized(@{})
Get-Process | Invoke-Parallel {
$sync[$_.Name] += @( $_ )
} -Variables @{ sync = $sync }
$sync
EXAMPLE 5: Demonstrates how to pass a locally defined Function to the Runspace scope
function Greet { param($s) "$s hey there!" }
0..10 | Invoke-Parallel {
Greet $_
} -Functions Greet
PARAMETERS
-InputObject
Specifies the input objects to be processed in the ScriptBlock.
Note: This parameter is intended to be bound from pipeline.
Type: Object
Parameter Sets: (All)
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: True (ByValue)
Accept wildcard characters: False
-ScriptBlock
Specifies the operation that is performed on each input object.
This script block is run for every object in the pipeline.
Type: ScriptBlock
Parameter Sets: (All)
Aliases:
Required: True
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
-ThrottleLimit
Specifies the number of script blocks that are invoked in parallel.
Input objects are blocked until the running script block count falls below the ThrottleLimit.
The default value is 5.
Type: Int32
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: 5
Accept pipeline input: False
Accept wildcard characters: False
-Variables
Specifies a hash table of variables to have available in the Script Block (Runspaces).
The hash table Keys become the Variable Name inside the Script Block.
Type: Hashtable
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
-Functions
Existing functions in the Local Session to have available in the Script Block (Runspaces).
Type: String[]
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
-ThreadOptions
These options control whether a new thread is created when a command is executed within a Runspace.
This parameter is limited to ReuseThread and UseNewThread. Default value is ReuseThread.
See PSThreadOptions Enum for details.
Type: PSThreadOptions
Parameter Sets: (All)
Aliases:
Accepted values: Default, UseNewThread, ReuseThread, UseCurrentThread
Required: False
Position: Named
Default value: ReuseThread
Accept pipeline input: False
Accept wildcard characters: False
I have an issue with a DSC configuration I'm trying to use to install and run a Mongo service on an Azure VM.
When the DSC runs on the initial deployment of the VM, the secondary disk 'F' is attached and formatted successfully; however, I receive an error when trying to create directories on the new disk:
Error message: "DSC Configuration 'Main' completed with error(s).
Cannot find drive. A drive with the name 'F' does not exist.
The PowerShell DSC resource '[Script]SetUpDataDisk' with SourceInfo 'C:\Packages\Plugins\Microsoft.Powershell.DSC\2.73.0.0\DSCWork\MongoDSC.0\MongoDSC.ps1::51::2::Script' threw one or more non-terminating errors while running the Set-TargetResource functionality."
Here is my DSC script :
Configuration Main
{
Param ( [string] $nodeName )
Import-DscResource -ModuleName PSDesiredStateConfiguration
Import-DscResource -ModuleName xStorage
Node $nodeName
{
xWaitforDisk Disk2
{
DiskId = 2
RetryIntervalSec = 60
RetryCount = 60
}
xDisk FVolume
{
DiskId = 2
DriveLetter = 'F'
FSLabel = 'MongoData'
DependsOn = "[xWaitforDisk]Disk2"
}
Script SetUpDataDisk{
TestScript ={
return Test-Path "f:\mongoData\"
}
SetScript ={
#set up the directories for mongo
$retries = 0
Do{
$mountedDrive = Get-Volume | Where DriveLetter -eq 'F'
if($mountedDrive -eq $null)
{
Start-Sleep -Seconds 60
$retries = $retries + 1
}
}While(($mountedDrive -eq $null) -and ($retries -lt 60))
$dirName = "mongoData"
$dbDirName = "db"
$logDirName = "logs"
##! ERROR THROWN FROM THESE LINES
New-Item -Path "F:\$dirName" -ItemType Directory
New-Item -Path "F:\$dirName\$dbDirName" -ItemType Directory
New-Item -Path "F:\$dirName\$logDirName" -ItemType Directory
}
GetScript = { @{ Result = "SetUpDataDisk" } }
DependsOn = "[xDisk]FVolume"
}
}
}
The annoying thing is that if I run the deployment again, everything works with no errors. I have put in a loop to try to wait for the disk to be ready, but this still throws the error. I'm very new to DSC, so any pointers would be helpful.
It seems xDiskAccessPath can be used for that:
<#
.EXAMPLE
This configuration will wait for disk 2 to become available, and then make the disk available as
two new formatted volumes mounted to folders c:\SQLData and c:\SQLLog, with c:\SQLLog using all
available space after c:\SQLData has been created.
#>
Configuration Example
{
Import-DSCResource -ModuleName xStorage
Node localhost
{
xWaitforDisk Disk2
{
DiskId = 2
RetryIntervalSec = 60
RetryCount = 60
}
xDiskAccessPath DataVolume
{
DiskId = 2
AccessPath = 'c:\SQLData'
Size = 10GB
FSLabel = 'SQLData1'
DependsOn = '[xWaitForDisk]Disk2'
}
xDiskAccessPath LogVolume
{
DiskId = 2
AccessPath = 'c:\SQLLog'
FSLabel = 'SQLLog1'
DependsOn = '[xDiskAccessPath]DataVolume'
}
}
}
https://github.com/PowerShell/xStorage/blob/dev/Modules/xStorage/Examples/Resources/xDiskAccessPath/1-xDiskAccessPath_InitializeDataDiskWithAccessPath.ps1
I am attempting to use PowerShell v2.0 to create a scheduled task using the Schedule.Service COM object.
I have created a .ps1 file, but we get an error when we execute it, and the cause is eluding me.
Here is the code:
param(
[string]$xmlFilePath = $(throw "-xmlFilePath is required"),
[string]$server = "localhost",
[string]$taskFolderName = "\"
)
try {
$xmlContent = [xml] (Get-Content $xmlFilePath);
$taskScheduler = New-Object -ComObject Schedule.Service
$taskScheduler.Connect($server)
$taskFolder = $taskScheduler.GetFolder($taskFolderName);
$taskFolder.RegisterTask($xmlFilePathl, $xmlContent, 6, "<user name>", "<password>", 1);
}
catch {
$Exception = $_.Exception;
while ($Exception.Message -ne $null)
{
Write-Error $Exception.Message;
$Exception = $Exception.InnerException;
}
return;
}
Running this locally or remotely gives the same result.
The error is as follows:
C:\temp\CreateScheduledTaskFromXML.ps1 : Exception calling "RegisterTask" with "6" argument(s): "(1,2)::"
At line:1 char:33
+ .\CreateScheduledTaskFromXML.ps1 <<<< -server DEVBDAPP12 -xmlFilePath "C:\Temp\collectors\adcomputer\Discovery.Ser
vices.ActiveDirectory.Computer.Collector.xml"
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,CreateScheduledTaskFromXML.ps1
What does this mean "Exception calling "RegisterTask" with "6" argument(s): "(1,2)::""
The failure is occurring on the RegisterTask method, but the error does not make sense.
This usage is based on the following MSDN article:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa382575(v=vs.85).aspx
As a side note, we cannot update this machine to PowerShell 3.0 or use the PowerPack at this time, and we would like to avoid schtasks.exe, so those are not options.
If anyone has any insight it would be greatly appreciated.
If you just type:
$taskScheduler = New-Object -ComObject Schedule.Service
$taskScheduler.Connect("localhost")
$taskFolder = $taskScheduler.GetFolder("\")
$taskFolder.RegisterTask
You receive:
IRegisteredTask RegisterTask (string, string, int, Variant, Variant, _TASK_LOGON_TYPE, Variant)
There are 7 arguments, which means you are missing one. If you have a look at the Microsoft documentation, the call looks like this:
HRESULT RegisterTask(
[in] BSTR path,
[in] BSTR xmlText,
[in] LONG flags,
[in] VARIANT userId,
[in] VARIANT password,
[in] TASK_LOGON_TYPE logonType,
[in, optional] VARIANT sddl,
[out] IRegisteredTask **ppTask
);
So I would try adding $null as the last argument (the security descriptor):
$taskFolder.RegisterTask($xmlFilePathl, $xmlContent, 6, "<user name>", "<password>", 1, $null)
I was able to figure out this problem. First, the first argument to RegisterTask: I was interpreting it as the folder in Task Scheduler, but it is not the folder; it is the task name. Second, with help from some of the comments and the validation flag, I found that the second argument needs to be a string, not an xml type. Finally, I had to add a 7th argument of $null to fulfill the method signature. Thanks for all your help.
Here is the updated code that works:
param(
[string]$taskName = $(throw "-taskName is required"), #complete path for the scheduled task
[string]$xmlFilePath = $(throw "-xmlFilePath is required"),
[string]$server = "localhost", # Only works with Servers it can access. Use WinRM for cross domain request
[string]$taskFolderName = "\"
)
$value = $null;
try {
$xmlContent = [string](Get-Content $xmlFilePath);
$taskScheduler = New-Object -ComObject Schedule.Service
$taskScheduler.Connect($server)
$taskFolder = $taskScheduler.GetFolder($taskFolderName);
$value = $taskFolder.RegisterTask($taskName, $xmlContent, 6, "<username>", "<password>", 1, $null);
}
catch {
$Exception = $_.Exception;
while ($Exception.Message -ne $null)
{
Write-Error $Exception.Message;
$Exception = $Exception.InnerException;
}
return;
}
This is a query leading on from another that was very kindly worked on by @marek-grzenkowicz.
Issue 1) The script generates an error when it runs. It was mentioned that you can't modify an element of the collection that is being enumerated. Can you show me how to work around this? Despite changes made to avoid the problem, it is still happening.
An error occurred while enumerating through a collection: Collection was modified; enumeration operation may not execute. At C:\Users\quickdev1\Documents\LoopThroughAllLibrariesCreateView.ps1:7 char:10
+ foreach <<<< ($list in $web.Lists) {
+ CategoryInfo : InvalidOperation: (Microsoft.Share...on+SPEnumerator:SPEnumerator) [], RuntimeException
+ FullyQualifiedErrorId : BadEnumeration
Issue 2) I would like to put in some logic to check if an existing view is already there with the name "Detailed" and if so to skip that Library but I'm not sure how to achieve it.
If anyone could help it would be awesome.
Thanks,
Ashley
Full Script
Add-PSSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue
$siteURL = "http://sp14fdev01/"
$site = Get-SPSite($siteURL)
foreach($web in $site.AllWebs) {
foreach($list in $web.Lists) {
if($list.BaseType -eq "DocumentLibrary") {
# the variables $web and $list already reference the objects you need
#$site = New-Object Microsoft.SharePoint.SPSite($SiteURL) ;
#$web = $site.OpenWeb($SiteURL);
# a new instance of the list is necessary to avoid the error "Collection was modified"
$newList = $web.Lists.item($list.ID);
$viewfields = New-Object System.Collections.Specialized.StringCollection
$viewfields.Add("DocIcon")
$viewfields.Add("LinkFilename")
$viewfields.Add("_UIVersionString")
$viewfields.Add("FileSizeDisplay")
$viewfields.Add("Created")
$viewfields.Add("Modified")
$viewfields.Add("Editor")
[void]$newList.Views.Add("Detailed", $viewfields, "", 100, $true, $true)
$newList.Update();
# setting the default view
$view=$newList.Views["Detailed"]
$view.DefaultView = $true
$view.Update()
}
}
$web.Dispose();
}
$site.Dispose();
You will have to use a for loop instead of a foreach, since you are modifying the collection you are looping over.
Like this:
foreach($web in $site.AllWebs)
{
$listCounter = $web.Lists.Count
for($i=0;$i -lt $listCounter;$i++)
{
$list = $web.Lists[$i]
# etc.
}
}
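For Issue 2 in the question (skipping libraries that already have a "Detailed" view), one possibility, sketched here untested against SharePoint, is to probe the list's Views collection by title before creating the view:

```powershell
# Sketch (untested): skip document libraries that already have a "Detailed" view
for ($i = 0; $i -lt $web.Lists.Count; $i++) {
    $list = $web.Lists[$i]
    if ($list.BaseType -ne "DocumentLibrary") { continue }
    # Probing by title avoids relying on the indexer's behavior for missing names
    $existing = $list.Views | Where-Object { $_.Title -eq "Detailed" }
    if ($existing) { continue }  # the view already exists; skip this library
    # otherwise create the "Detailed" view as in the full script above
}
```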
I have a PowerShell script that sets a user's access to a specific folder of files to read.
The user's group gets assigned to the folder (which happens to have the same name).
I then created a new view, set it as the default, and told it to display all files without folders.
This script has been working perfectly for 4 months, but now some people want to use the mobile view and I am running into an issue. If a user does not have read access from the root directory down to the folder in question, SharePoint's mobile view will not show the folder.
For example the user has the following permissions set:
Limited Access on the root
Limited Access on the Alpha folder
Read access to the folder under Alpha
I need to make it so a user can view this in the mobile view.
Here is my code:
#region Start
# Create Connection to stopwatch diagnostics
[Void][System.Diagnostics.Stopwatch] $sw;
# New Stopwatch object
$sw = New-Object System.Diagnostics.StopWatch;
# Stop any watches that might be running
$sw.Stop();
$sw.Start();
clear
[int]$a = 0;
# Which folders to assign
[array]$sections = "Alpha","Bravo","Charlie","Delta";
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint");
#endregion
#region The meat and potatoes
foreach ($section in $sections) {
#region get the Directories
$pathtowd = "\\path\to\webdav\$section"; # UNC Path to the pivots
$dirs = Get-ChildItem $pathtowd | Where-Object { $_.Attributes -band [System.IO.FileAttributes]::Directory }
#endregion
#region Connect to SharePoint
$SPSite = New-Object Microsoft.SharePoint.SPSite("http://sharepoint"); # Connect to SharePoint
$OpenWeb = $SpSite.OpenWeb("/Downloads"); # Subsite of downloads
#endregion
[int]$i = 0; # Integer to increment
foreach ($dir in $dirs) {
$verify_groups = $OpenWeb.groups | ? {$_.Name -eq "$dir"; } # Verify the groups
if ($verify_groups -ne $null) {
if ($dir.ToString() -eq $verify_groups.ToString()) {
$i++; # Increment the groups
Write-Host "[", $sw.Elapsed.ToString(), "] -",$dir -F Green; # Output status
$path = "http://sharepoint/Downloads/Pivots/$section/" + $dir; # Set the Path
$spc = $OpenWeb.SiteGroups; # SharePoint connection
$group = $spc[$dir]; # Directory
$roleAssignment = New-Object Microsoft.SharePoint.SPRoleAssignment($group); # Role Assignment connection
$OpenWeb.GetFolder($path).Item.BreakRoleInheritance("true"); # Break inheritance
$roleAssignment.RoleDefinitionBindings.Add($OpenWeb.RoleDefinitions["Read"]);# Set permissions
$OpenWeb.GetFolder($path).Item.RoleAssignments.Add($roleAssignment); # Add the role
$OpenWeb.GetFolder($path).Item.Update();
}
else { Write-Host "[", $sw.Elapsed.ToString(), "] -", $verify_groups " is empty"; }
}
}
Write-Host '[' $sw.Elapsed.ToString() '] - found '$i' Folders' -f Red; # Output Status
$SPSite.Dispose(); # Dispose the connection
$OpenWeb.Dispose();
$a = $a+$i; # Total Folders
}
#endregion
$sw.Stop(); # Stop the timer
[string]$howlong = $sw.Elapsed.ToString(); # How long
write-host "Updated in Time: " $howlong -F Green; # Last message
Found it. It took 4 hours straight of trial and error, but it works. Hope this helps someone else out as well. Place this before $OpenWeb.GetFolder($path).Item.Update();
$returnGroups = $OpenWeb.GetFolder($path).Item.RoleAssignments | `
where {`
($_.RoleDefinitionBindings -eq $OpenWeb.RoleDefinitions["Limited Access"]) -and `
($_.RoleDefinitionBindings -notcontains $OpenWeb.RoleDefinitions["Read"])`
};
if ($returnGroups -ne $null)
{
foreach ($item in $returnGroups)
{
Write-Host "Removing: " $item.Member;
$OpenWeb.GetFolder($path).Item.RoleAssignments.Remove($spc[$item.Member]);
}
}