Automatic deployment of solutions with PowerShell - SharePoint

I have a folder containing several solutions for a SharePoint application, which I want to add and install. I want to iterate over the elements in the folder and use Add-SPSolution on each. After that, I want to check whether the solutions are done deploying before using Install-SPSolution. Here is a snippet that I am currently working on:
# Get the location of the folder you are currently in
$dir = $(gl)
# Create a list with the .wsp solutions
$list = Get-ChildItem $dir | where {$_.extension -eq ".wsp"}
Write-Host 'DEPLOYING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Add-SPSolution -LiteralPath $my_file.FullName}
Write-Host 'SLEEP FOR 30 SECONDS'
Start-Sleep -s 30
Write-Host 'INSTALLING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment}
Is there a way to check if the deployment is finished, and it is ready to start installing the solutions?

You need to check the SPSolution.Deployed property value in a loop. A basic solution looks like this:
do { Start-Sleep 2 } while (!((Get-SPSolution $name).Deployed))
The Deploying SharePoint 2010 Solution Packages Using PowerShell article contains more details and this comment discusses a potential caveat.
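Applied to the script in the question, a minimal sketch of the combined flow could look like this (assuming farm solutions, and that each solution's name equals its WSP file name, which is what Add-SPSolution uses by default):
# Collect the WSP packages in the current folder
$dir = Get-Location
$wspFiles = Get-ChildItem $dir -Filter *.wsp
Write-Host 'ADDING SOLUTIONS...'
foreach ($wsp in $wspFiles) {
    Add-SPSolution -LiteralPath $wsp.FullName
}
Write-Host 'INSTALLING SOLUTIONS...'
foreach ($wsp in $wspFiles) {
    Install-SPSolution -Identity $wsp.Name -AllWebApplications -GACDeployment
    # Poll until this solution's deployment timer job has finished
    # before moving on to the next one
    do { Start-Sleep -Seconds 2 } while (!((Get-SPSolution $wsp.Name).Deployed))
}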

Related

WinSCP - How to download only folders/files 1 day old while excluding empty folders/files? [duplicate]

I am limited to PuTTY and WinSCP only.
I am trying to download log directories with log files. For example, I want to grab all log_files 6 days old or newer. log_dir2 and log_dir3, including the folders, match the criteria, while log_dir1 and its files do not.
DIR/log_dir1/log_files % older than 6 days
DIR/log_dir2/log_files % meets criteria
DIR/log_dir3/log_files % meets criteria
My problem is that while the log_files of log_dir1 are not downloaded, the syntax I am currently using downloads the log_dir1 folder. Normally, not a big deal, but we are talking hundreds of log_dir folders (all empty as the files are older than 6 days). For reasons beyond my control, I cannot move or archive these old log directories with their log files.
My question is simply: how do I change my syntax to ignore folders that are older than 6 days, as well as the files?
get -filemask="*>6D" /DIR/* C:\temp
I have tried several different combinations of parameters and I have read the support page about Directory Masks and Path Masks. I cannot get any of them working (version issue?). Can anyone explain their syntax better than the help page? I will update tomorrow with the current version of WinSCP that I am using.
A time constraint in a WinSCP file mask cannot be used for directories.
But you can prevent WinSCP from creating the empty folders. Use the -rawtransfersettings switch with the ExcludeEmptyDirectories setting.
get -rawtransfersettings ExcludeEmptyDirectories=1 -filemask="*>6D" /DIR/* C:\temp
This is the original answer, before WinSCP supported ExcludeEmptyDirectories. It might still be useful as a basis for implementations that have even more specific constraints.
You can implement this custom logic easily in a PowerShell script using the WinSCP .NET assembly:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "example.com"
    UserName = "username"
    Password = "password"
    SshHostKeyFingerprint = "..."
}
$remotePath = "/remote/path"
$localPath = "C:\local\path"
$limit = (Get-Date).AddDays(-6)
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
# Enumerate files to download
$fileInfos =
    $session.EnumerateRemoteFiles(
        $remotePath, $Null, [WinSCP.EnumerationOptions]::AllDirectories) |
    Where-Object { $_.LastWriteTime -gt $limit }
foreach ($fileInfo in $fileInfos)
{
    $localFilePath =
        [WinSCP.RemotePath]::TranslateRemotePathToLocal(
            $fileInfo.FullName, $remotePath, $localPath)
    # If the corresponding local folder does not exist yet, create it
    $localFileDir = Split-Path -Parent $localFilePath
    if (!(Test-Path -Path $localFileDir))
    {
        Write-Host "Creating local directory $localFileDir..."
        New-Item $localFileDir -ItemType directory | Out-Null
    }
    Write-Host "Downloading file $($fileInfo.FullName)..."
    # Download file
    $sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
    $transferResult = $session.GetFiles($sourcePath, $localFilePath)
    # Did the download succeed?
    if (!$transferResult.IsSuccess)
    {
        # Print error (but continue with other files)
        Write-Host ("Error downloading file $($fileInfo.FullName): " +
            $transferResult.Failures[0].Message)
    }
}
$session.Dispose()
Write-Host "Done."
Run the script (download.ps1) like:
powershell.exe -ExecutionPolicy Unrestricted -File download.ps1

unable to append data to sharepoint file via Azure Automation

OK, I have asked a question like this before, but now I am trying to perform the task via Azure Automation. I can connect to the SharePoint site via Azure Automation (PowerShell) with the correct credentials. I can download the file and append data to it. But when I try to upload the file back to SharePoint, it adds the contents 3 times, and then Azure Automation suspends the runbook.
It runs perfectly if I upload this file under a different file name.
$siteurl="https://abc.sharepoint.com/sites/xxx/teamsites/os"
$credSP = Get-AutomationPSCredential -Name 'test'
$fileFolder = "$Env:temp"
Connect-PnPOnline -Url $siteurl -Credentials $credSP
Get-PnPFile -Url "/sites/xxx/teamsites/os/Directory and Operating Systems/test.csv" -Path $fileFolder -Filename test.csv -AsFile -Force
$test = "31-07-2019 -11:35"
Add-Content -Path $fileFolder\test.csv $test
Add-PnPFile -Path $fileFolder\test.csv -Approve -Folder "Directory and Operating Systems" #-ErrorAction Ignore
Here are the results
test test
31-07-2019 -11:35
31-07-2019 -11:35
31-07-2019 -11:35
As you can see, it added $test 3 times. But I don't have this issue if I upload it under a new file name.
OK, after a while I have fixed the issue.
After the Add-PnPFile call, you pipe it to | Out-Null.
That's it. The script stops after it uploads.
Happy days.
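In other words, the only change needed is to suppress the pipeline output of Add-PnPFile, e.g. a one-line sketch using the same path and folder name as in the question:
# Discard the file object returned by Add-PnPFile so it does not end up
# in the runbook's output stream
Add-PnPFile -Path $fileFolder\test.csv -Approve -Folder "Directory and Operating Systems" | Out-Null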

Stop IIS Express process based on site name

The solution I'm working on has 3 individual sites running on IIS.
When I make changes to one particular library, I need to restart the site that uses it. The way I do that now is by manually right-clicking the IIS Express icon in the system tray, clicking 'Stop site', and after that starting debugging again.
I would like to make that part automatic, so that whenever I start debugging it stops that particular site.
If I don't stop it, it will just reuse the currently running site, but if I stop it, it will restart.
Is it even possible? I know how to find the PID, but I don't get the name of the site behind the PID.
I put together this script in PowerShell:
$site = 'Webapplication' # replace this by your site name (not case sensitive)
$process = Get-CimInstance Win32_Process -Filter "Name = 'iisexpress.exe'" | ? {$_.CommandLine -like "*/site:`"$site`"*" }
if ($process -ne $null)
{
    Write-Host "Trying to stop $($process.CommandLine)"
    Stop-Process -Id $process.ProcessId
    Start-Sleep -Seconds 1 # Wait 1 second for good measure
    Write-Host "Process was stopped"
}
else
{
    Write-Host "Website was not running"
}
Modify the first line to replace the site name with yours. Save this file as stopiis.ps1 in your project folder (not the solution folder).
Now, in Solution Explorer, right-click the project and choose Properties.
On the left side, choose Build Events.
Put this in the 'Pre-build event command line' so it will run before compiling:
echo powershell -File "$(ProjectDir)stopiis.ps1"
powershell -File "$(ProjectDir)stopiis.ps1"
Note: you do not need to run Visual Studio in administrative mode, because iisexpress.exe runs under your account.

Have to build 2 solutions, one per project

We have to build two solutions centrally on a TFS server. One solution is a framework; the other contains services, which should be built separately per project in order to deploy them later via script.
In addition, the framework assemblies are copied to a (base) project within the framework solution. All projects of the second solution refer to this 'base' project.
My problem is that I have no idea how to configure the solutions, projects and builds to get the behaviour described above.
Please help.
Note: I don't want to put each service project into an MSI in order to install it. I just want to deploy the services out of a central drop folder on the TFS server.
Team Build can build multiple solutions in the Build Process Template. Just click the [...] button behind the Projects to Build setting and add both solutions.
TFS redirects the output directory of your projects, which will probably break your script that copies the output from A to the "base project" of B. In order to turn off this redirection, set the Output location to AsConfigured.
Now TFS won't know how to copy your output to the Binaries folder, which serves as the source for the copy-to-drop-location action. To solve that, you'll need to write a PowerShell script and configure it as a post-build script.
The process to create a drop script is clearly documented on MSDN and a sample script is available from CodePlex.
##-----------------------------------------------------------------------
## <copyright file="GatherItemsForDrop.ps1">(c) http://TfsBuildExtensions.codeplex.com/. This source is subject to the Microsoft Permissive License. See http://www.microsoft.com/resources/sharedsource/licensingbasics/sharedsourcelicenses.mspx. All other rights reserved.</copyright>
##-----------------------------------------------------------------------
# Copy the binaries to the bin directory
# so that the build server can drop them
# to the staging location specified on the Build Defaults tab
#
# See
# http://msdn.microsoft.com/en-us/library/bb778394(v=vs.120).aspx
# http://msdn.microsoft.com/en-us/library/dd647547(v=vs.120).aspx#scripts
# Enable -Verbose option
[CmdletBinding()]
# Disable parameter
# Convenience option so you can debug this script or disable it in
# your build definition without having to remove it from
# the 'Post-build script path' build process parameter.
param([switch]$Disable)
if ($PSBoundParameters.ContainsKey('Disable'))
{
    Write-Verbose "Script disabled; no actions will be taken on the files."
}
# This script copies the basic file types for managed code projects.
# You can change this list to meet your needs.
$FileTypes = $("*.exe","*.dll","*.exe.config","*.pdb")
# Specify the sub-folders to include
$SourceSubFolders = $("*bin*","*obj*")
# If this script is not running on a build server, remind user to
# set environment variables so that this script can be debugged
if (-not $Env:TF_BUILD -and -not ($Env:TF_BUILD_SOURCESDIRECTORY -and $Env:TF_BUILD_BINARIESDIRECTORY))
{
    Write-Error "You must set the following environment variables"
    Write-Error "to test this script interactively."
    Write-Host '$Env:TF_BUILD_SOURCESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:TF_BUILD_SOURCESDIRECTORY = "C:\code\FabrikamTFVC\HelloWorld"'
    Write-Host '$Env:TF_BUILD_BINARIESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:TF_BUILD_BINARIESDIRECTORY = "C:\code\bin"'
    exit 1
}
# Make sure path to source code directory is available
if (-not $Env:TF_BUILD_SOURCESDIRECTORY)
{
    Write-Error ("TF_BUILD_SOURCESDIRECTORY environment variable is missing.")
    exit 1
}
elseif (-not (Test-Path $Env:TF_BUILD_SOURCESDIRECTORY))
{
    Write-Error "TF_BUILD_SOURCESDIRECTORY does not exist: $Env:TF_BUILD_SOURCESDIRECTORY"
    exit 1
}
Write-Verbose "TF_BUILD_SOURCESDIRECTORY: $Env:TF_BUILD_SOURCESDIRECTORY"
# Make sure path to binary output directory is available
if (-not $Env:TF_BUILD_BINARIESDIRECTORY)
{
    Write-Error ("TF_BUILD_BINARIESDIRECTORY environment variable is missing.")
    exit 1
}
if ([IO.File]::Exists($Env:TF_BUILD_BINARIESDIRECTORY))
{
    Write-Error "Cannot create output directory."
    Write-Error "File with name $Env:TF_BUILD_BINARIESDIRECTORY already exists."
    exit 1
}
Write-Verbose "TF_BUILD_BINARIESDIRECTORY: $Env:TF_BUILD_BINARIESDIRECTORY"
# Tell user what script is about to do
Write-Verbose "Will look for and then gather "
Write-Verbose "$FileTypes files from"
Write-Verbose "$Env:TF_BUILD_SOURCESDIRECTORY and copy them to "
Write-Verbose $Env:TF_BUILD_BINARIESDIRECTORY
# Find the files
$files = gci $Env:TF_BUILD_SOURCESDIRECTORY -recurse -include $SourceSubFolders |
    ?{ $_.PSIsContainer } |
    foreach { gci -Path $_.FullName -Recurse -include $FileTypes }
if ($files)
{
    Write-Verbose "Found $($files.count) files:"
    foreach ($file in $files) {
        Write-Verbose $file.FullName
    }
}
else
{
    Write-Warning "Found no files."
}
# If binary output directory exists, make sure it is empty
# If it does not exist, create one
# (this happens when 'Clean workspace' build process parameter is set to True)
if ([IO.Directory]::Exists($Env:TF_BUILD_BINARIESDIRECTORY))
{
    $DeletePath = $Env:TF_BUILD_BINARIESDIRECTORY + "\*"
    Write-Verbose "$Env:TF_BUILD_BINARIESDIRECTORY exists."
    if (-not $Disable)
    {
        Write-Verbose "Ready to delete $DeletePath"
        Remove-Item $DeletePath -recurse
        Write-Verbose "Files deleted."
    }
}
else
{
    Write-Verbose "$Env:TF_BUILD_BINARIESDIRECTORY does not exist."
    if (-not $Disable)
    {
        Write-Verbose "Ready to create it."
        [IO.Directory]::CreateDirectory($Env:TF_BUILD_BINARIESDIRECTORY) | Out-Null
        Write-Verbose "Directory created."
    }
}
# Copy the binaries
Write-Verbose "Ready to copy files."
if (-not $Disable)
{
    foreach ($file in $files)
    {
        Copy $file $Env:TF_BUILD_BINARIESDIRECTORY
    }
    Write-Verbose "Files copied."
}
A better solution would probably be to have 2 separate builds where the first build publishes the dependencies of the second project as a NuGet package. The Microsoft ALM Rangers have delivered a guide that explains how to set that up.
One option is to host your own NuGet feed: http://docs.nuget.org/create/hosting-your-own-nuget-feeds
By hosting your own feed, you can execute custom build activities within your build process that update your feed.
See this documentation for custom tfs build activities: http://nakedalm.com/creating-a-custom-activity-for-team-foundation-build/
See this documentation for adding powershell to your build process: http://blogs.technet.com/b/heyscriptingguy/archive/2014/04/21/powershell-and-tfs-the-basics-and-beyond.aspx
By hosting your own NuGet feed, you can have your consuming solution leverage your private NuGet feed and packages to deal with dependency management and versions. By leveraging the custom build activities, you can update your NuGet feed via .NET or PowerShell. You can also automate the deployment via your PowerShell scripts.
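As a rough illustration, a post-build PowerShell step that packs the framework project and pushes the resulting package to a private feed might look like the sketch below. The nuget.exe path, project path, feed URL, and API key are placeholders, and it assumes nuget.exe is available on the build agent:
# Hypothetical post-build step: pack the framework project and push the
# package to a private NuGet feed for the services solution to consume
$nuget   = "C:\Tools\nuget.exe"                                         # placeholder
$project = "$Env:TF_BUILD_SOURCESDIRECTORY\Framework\Framework.csproj"  # placeholder
$outDir  = "$Env:TF_BUILD_BINARIESDIRECTORY\packages"
$feedUrl = "http://nugetserver/nuget"                                   # placeholder
$apiKey  = "your-api-key"                                               # placeholder
New-Item $outDir -ItemType Directory -Force | Out-Null
# Create the .nupkg from the project file
& $nuget pack $project -OutputDirectory $outDir -Properties Configuration=Release
# Push every package that was produced to the private feed
Get-ChildItem $outDir -Filter *.nupkg | ForEach-Object {
    & $nuget push $_.FullName -Source $feedUrl -ApiKey $apiKey
}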

Get dbname from multiple web.config files with powershell

I would like to issue a PowerShell command that returns the connection string (specifically, I am looking for the DB name value) for all the web sites on a web server...
So I would like to see something like
site1 dbname=Northwind
site2 dbname=Fitch
site3 dbname=DemoDB
I have tried using the IIS PowerShell snap-in... I thought I was close with this:
PS C:\Windows\system32> Get-WebApplication | Get-WebConfiguration -filter /connectionStrings/*
but... after looking at the results... my answer doesn't appear to be in there
I am very new to PowerShell, so excuse my ignorance and inexperience.
Any help appreciated!
thanks!
Hopefully, this will get you started. It assumes there is a web.config file at the web application's physical path; it does not recurse to find other web.config files in the web application. It also assumes your connection strings are in the connectionStrings configuration element.
Import-Module WebAdministration
Get-WebApplication | `
    ForEach-Object {
        $webConfigFile = [xml](Get-Content "$($_.PhysicalPath)\Web.config")
        Write-Host "Web Application: $($_.path)"
        foreach ($connString in $webConfigFile.configuration.connectionStrings.add)
        {
            Write-Host "Connection String $($connString.name): $($connString.connectionString)"
            $dbRegex = "((Initial\sCatalog)|((Database)))\s*=(?<ic>[a-z\s0-9]+?);"
            $found = $connString.connectionString -match $dbRegex
            if ($found)
            {
                Write-Host "Database: $($Matches["ic"])"
            }
        }
        Write-Host " "
    }
This post may give you an idea to start with. Basically, load the web.config file as XML and then find the node where the connection string is.
Do something like $myFile = [xml](Get-Content web.config). You can then pipe that to Get-Member ($myFile | Get-Member -MemberType Property) to start working your way into the file and see which node has it. I'm not at a computer where I can show you some screenshots to explain it more, but you can check out this chapter from the PowerShell.com "Master PowerShell" e-book, which explains working with XML very well.
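For example, a minimal sketch of that approach (using SqlConnectionStringBuilder instead of a regex to pull out the database name, with a placeholder web.config path) could be:
# Load web.config as XML and print the database name of each connection string
$myFile = [xml](Get-Content "C:\inetpub\wwwroot\site1\web.config")
foreach ($cs in $myFile.configuration.connectionStrings.add)
{
    # SqlConnectionStringBuilder parses "Initial Catalog"/"Database" for us
    $builder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder $cs.connectionString
    Write-Host "$($cs.name) dbname=$($builder.InitialCatalog)"
}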
