Stop IIS Express process based on site name - SharePoint

The project I'm working on has three individual sites running in IIS. When I make changes to one particular library, I need to restart the site using it. The way I do that now is manual: I right-click the IIS Express icon in the system tray, click 'Stop site', and then start debugging.
I would like to make that part automatic, so that whenever I start debugging it stops that particular site.
If I don't stop it, debugging just reuses the currently running site, but if I stop it, the site is restarted.
Is it even possible? I know how to find the PID, but I can't get the name of the site behind the PID.

I put together this script in PowerShell:
$site = 'Webapplication' # replace this by your site name (not case sensitive)
$process = Get-CimInstance Win32_Process -Filter "Name = 'iisexpress.exe'" |
    Where-Object { $_.CommandLine -like "*/site:`"$site`"*" }
if ($process -ne $null)
{
    Write-Host "Trying to stop $($process.CommandLine)"
    Stop-Process -Id $process.ProcessId
    Start-Sleep -Seconds 1 # Wait 1 second for good measure
    Write-Host "Process was stopped"
}
else
{
    Write-Host "Website was not running"
}
Modify the first line to replace the site name with yours. Save this file as stopiis.ps1 in your project folder (not the solution folder).
Now, in Solution Explorer, right-click the project and choose Properties.
On the left side, choose Build Events.
Put this in the 'Pre-build event command line' so it runs before compiling:
echo powershell -File "$(ProjectDir)stopiis.ps1"
powershell -File "$(ProjectDir)stopiis.ps1"
Note: you do not need to run Visual Studio as administrator, because iisexpress.exe runs under your account.
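To double-check which site a running iisexpress.exe instance belongs to (the /site: argument in its command line carries the name, which is exactly what the filter above matches on), something like this lists PID and site together:
Get-CimInstance Win32_Process -Filter "Name = 'iisexpress.exe'" |
    Select-Object ProcessId, @{ Name = 'Site'; Expression = {
        if ($_.CommandLine -match '/site:"?([^"\s]+)"?') { $Matches[1] } } }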

Cross-talk threads to allow for file system access in PowerShell

Reference: Runspace for button event in powershell
https://www.foxdeploy.com/blog/part-v-powershell-guis-responsive-apps-with-progress-bars.html
So, I believe my issue is that PowerShell is unable to access the file system from within my thread. Is there a way to solve this and access the file system from a multi-threaded application?
Back Story:
So, I run a program that calls an external command-line executable (Robocopy) to copy files from a server to a group of computers at a time. We have a classroom environment at my work, so my setup has one folder per room, and I keep a list of all our (static) addresses for each room in its respective folder. We have an update from our developers that we need to push to all of the rooms. We need to run a slow push so as not to disturb the production environment. The software is proprietary, so we can't use a typical solution like Microsoft SCCM. So I created a script to push to the rooms. While it does work, it's not a smooth operation. I'm not actually the one pushing the update, because of the slow update process; I'm just trying to make a stable, smooth-running package for the person who will be doing it. My code works outside of the thread; I tested it, I know it works.
Here is how I concluded that my code works outside of the thread: I followed the same setup as in the linked article (a button click event inside a thread running the form) and placed the actual working code there, code that was tried and tested before I wrapped the interface in a thread, after completing the back-end operation testing.
(The "Region Boe's Addition" comment refers to Boe Prox, from the link.)
In his example he updates from a command-line/PowerShell window via a function run inside a thread. I'm running an event from a button inside a thread and trying to run a separate thread for the click event(s). Outside of the thread the event works fine, but inside it doesn't work at all.
Basic Code:
# Multiple runspaces: one for the $form, and one for the click event (as per the referenced link)
$var = [PowerShell]::Create().AddScript({
    $button.Add_Click({
        # second runspace created inside the click handler for the Robocopy work
        $var2 = [PowerShell]::Create().AddScript({ <# Robocopy work here #> })
    })
})
I needed Start-Process with the -Wait switch so the list box can be updated between copies, to confirm installation at each step of the loop.
$choice = $comboBox.SelectedItem
# $drive = Get-Location
if (!(Test-Path -PathType Container -Path "L:\$choice"))
{
    # New-Item -ItemType "container" -Path . -Name $choice
    New-Item -ItemType "Directory" -Path . -Name $choice
}
# $folder = $_
# Where is it being stored at?
[System.IO.File]::ReadLines("Y:\$choice\IPs.txt") | foreach {
    ping -a -n 2 -w 2000 $_ | Out-Null
    Test-Connection -Count 2 -TimeToLive 2 $_ | Out-Null
    if ($?)
    {
        RoboCopy /Log:"L:\$folder\$_.log" $source \\$_\c$\tools
        RoboCopy /Log+:"L:\$folder\$folder-MovementLogs.log" $source \\$_\c$\tools
        Start-Process -Wait "P:\psexec.exe" -ArgumentList "\\$_ -d -e -h -s cmd /c reg import C:\tools\dump.reg"
        # Copy-Item -LiteralPath Y:\* -Destination \\$_\c$\tools
        $listBox.Items.Add($_)
    }
}
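A common way to structure this kind of GUI (a minimal sketch along the lines of the FoxDeploy pattern referenced above, with placeholder paths, not the poster's actual code): keep the form on the main runspace, run the slow copy work in a second runspace, and let a UI timer poll a synchronized hashtable to update the list box.
# Minimal sketch: form on the main runspace, copy work in a background runspace,
# progress reported through a synchronized hashtable polled by a UI timer.
Add-Type -AssemblyName System.Windows.Forms

$syncHash = [hashtable]::Synchronized(@{ Done = New-Object System.Collections.ArrayList })

$form    = New-Object System.Windows.Forms.Form
$listBox = New-Object System.Windows.Forms.ListBox; $listBox.Dock = 'Top'
$button  = New-Object System.Windows.Forms.Button;  $button.Text = 'Push'; $button.Dock = 'Bottom'
$form.Controls.AddRange(@($listBox, $button))

# The timer runs on the UI thread and drains whatever the worker has finished
$timer = New-Object System.Windows.Forms.Timer
$timer.Interval = 500
$timer.Add_Tick({
    while ($syncHash.Done.Count -gt 0) {
        [void]$listBox.Items.Add($syncHash.Done[0])
        $syncHash.Done.RemoveAt(0)
    }
})
$timer.Start()

$button.Add_Click({
    # Long-running work goes to its own runspace so the form stays responsive
    $rs = [runspacefactory]::CreateRunspace()
    $rs.Open()
    $rs.SessionStateProxy.SetVariable('syncHash', $syncHash)
    $ps = [powershell]::Create()
    $ps.Runspace = $rs
    [void]$ps.AddScript({
        foreach ($ip in [System.IO.File]::ReadLines('Y:\Room1\IPs.txt')) {  # placeholder list path
            robocopy 'C:\source' "\\$ip\c$\tools" /R:1 /W:1 | Out-Null       # placeholder source
            [void]$syncHash.Done.Add($ip)                                    # report progress
        }
    })
    [void]$ps.BeginInvoke()
})

[void]$form.ShowDialog()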

Completely disable/enable input devices (mouse + keyboard + touchpad) in Windows 10

I'm trying to disable/enable the input devices on my laptop (Windows 10) automatically (.reg file, Python code, etc.).
I tried to use DevCon, but after a lot of attempts it didn't work for my touchpad and keyboard (I tried to disable and to remove them).
I searched the web and the solutions I found don't completely disable the devices (for example, Ctrl+Alt+Delete is not blocked).
I work on a Windows 10 laptop, and you can assume that I have admin privileges.
You'll still have to check for the keyboard, but for the mouse and touchpad you can use some PowerShell commands to find out the actual device classes and InstanceIds and then turn them off from an admin-elevated PowerShell prompt.
The InstanceIds of the mouse and touchpad differ between laptop brands and models, but you can first identify them by their class, such as HIDClass. To do that, fire up a PowerShell prompt (you've already tried REG and Python, so I assume PowerShell (.ps1) is fine too) and run this command:
Get-PnpDevice | Where-Object {$_.Class -eq 'HIDClass'}
This may show two or three entries, of which one belongs to the mouse and another to the touchpad. It takes a bit of trial and error: pick an InstanceId to make the filter more specific, open an admin-elevated PowerShell (search for PowerShell and click 'Run as administrator'), and run Disable-PnpDevice like below (if the InstanceId contains 'ACPI'):
Get-PnpDevice | Where-Object {$_.Class -eq 'HIDClass' -and $_.InstanceId -like 'ACPI*'} | Disable-PnpDevice -Confirm:$false
This will disable the touchpad (on my Lenovo it did), and then you can try another InstanceId and disable the mouse the same way. Voila, both are turned off now.
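To make the trial and error easier, listing the likely device classes with their friendly names usually shows which entry is which (a small addition, not part of the original steps):
Get-PnpDevice -Class Mouse, Keyboard, HIDClass |
    Format-Table Status, Class, FriendlyName, InstanceId -AutoSize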
If you prefer this as a .ps1 script, you need a self-elevating script that can enable/disable the devices without any prompts. Save this code in a .ps1 file and then right-click > Run with PowerShell:
# Relaunch this script elevated if it is not already running as administrator
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator))
{
    $arguments = @('-NoProfile', '-ExecutionPolicy', 'Bypass', '-NoExit', '-File', "`"$($MyInvocation.MyCommand.Path)`"")
    Start-Process -FilePath PowerShell.exe -Verb RunAs -ArgumentList $arguments
    exit
}
Get-PnpDevice | Where-Object {$_.Class -eq 'HIDClass' -and $_.InstanceId -like 'ACPI*'} | Disable-PnpDevice -Confirm:$false
Read-Host
Note: if you disable the wrong or undesired device, you can re-enable it from the same admin-elevated PowerShell window: run the same filtered Get-PnpDevice command and replace Disable-PnpDevice with Enable-PnpDevice.
Let me know in the comments if you still face issues with the above commands.
Untested, but calling BlockInput() should do what you want. It blocks both keyboard and mouse input. It is, however, defined in user32.dll, so you will need to use ctypes to access it:
import ctypes
# True blocks keyboard and mouse input; BlockInput(False) or Ctrl+Alt+Del releases it
ctypes.windll.user32.BlockInput(True)

Have to build 2 solutions, one per project

We have to build two solutions centrally on a TFS server. One solution is a framework; the other contains services, which should be built separately per project so they can be deployed later via script.
In addition, the framework assemblies are copied to a (base) project within the framework solution. All projects of the second solution refer to this 'base' project.
My problem is that I have no idea how to configure the solutions, projects and builds to get the behaviour described above.
Please help.
Note: I don't want to put each service project into an MSI in order to install it. I just want to deploy the services out of a central drop folder on the TFS server.
Team Build can build multiple solutions in the Build Process Template. Just click the [...] button behind 'Projects to Build' and add both solutions.
TFS redirects the output directory of your projects, which will probably break your script that copies the output from A to the 'base project' of B. In order to turn off this redirection, set the Output location to AsConfigured.
Now TFS won't know how to copy your output to the Binaries folder, which serves as the source for the copy-to-drop-location action. To solve that, you'll need to write a PowerShell script and configure it as a post-build script.
The process to create a drop script is clearly documented on MSDN and a sample script is available from CodePlex.
##-----------------------------------------------------------------------
## <copyright file="GatherItemsForDrop.ps1">(c) http://TfsBuildExtensions.codeplex.com/. This source is subject to the Microsoft Permissive License. See http://www.microsoft.com/resources/sharedsource/licensingbasics/sharedsourcelicenses.mspx. All other rights reserved.</copyright>
##-----------------------------------------------------------------------
# Copy the binaries to the bin directory
# so that the build server can drop them
# to the staging location specified on the Build Defaults tab
#
# See
# http://msdn.microsoft.com/en-us/library/bb778394(v=vs.120).aspx
# http://msdn.microsoft.com/en-us/library/dd647547(v=vs.120).aspx#scripts
# Enable -Verbose option
[CmdletBinding()]
# Disable parameter
# Convenience option so you can debug this script or disable it in
# your build definition without having to remove it from
# the 'Post-build script path' build process parameter.
param([switch]$Disable)
if ($PSBoundParameters.ContainsKey('Disable'))
{
    Write-Verbose "Script disabled; no actions will be taken on the files."
}
# This script copies the basic file types for managed code projects.
# You can change this list to meet your needs.
$FileTypes = $("*.exe","*.dll","*.exe.config","*.pdb")
# Specify the sub-folders to include
$SourceSubFolders = $("*bin*","*obj*")
# If this script is not running on a build server, remind user to
# set environment variables so that this script can be debugged
if (-not $Env:TF_BUILD -and -not ($Env:TF_BUILD_SOURCESDIRECTORY -and $Env:TF_BUILD_BINARIESDIRECTORY))
{
    Write-Error "You must set the following environment variables"
    Write-Error "to test this script interactively."
    Write-Host '$Env:TF_BUILD_SOURCESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:TF_BUILD_SOURCESDIRECTORY = "C:\code\FabrikamTFVC\HelloWorld"'
    Write-Host '$Env:TF_BUILD_BINARIESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:TF_BUILD_BINARIESDIRECTORY = "C:\code\bin"'
    exit 1
}
# Make sure path to source code directory is available
if (-not $Env:TF_BUILD_SOURCESDIRECTORY)
{
    Write-Error ("TF_BUILD_SOURCESDIRECTORY environment variable is missing.")
    exit 1
}
elseif (-not (Test-Path $Env:TF_BUILD_SOURCESDIRECTORY))
{
    Write-Error "TF_BUILD_SOURCESDIRECTORY does not exist: $Env:TF_BUILD_SOURCESDIRECTORY"
    exit 1
}
Write-Verbose "TF_BUILD_SOURCESDIRECTORY: $Env:TF_BUILD_SOURCESDIRECTORY"
# Make sure path to binary output directory is available
if (-not $Env:TF_BUILD_BINARIESDIRECTORY)
{
    Write-Error ("TF_BUILD_BINARIESDIRECTORY environment variable is missing.")
    exit 1
}
if ([IO.File]::Exists($Env:TF_BUILD_BINARIESDIRECTORY))
{
    Write-Error "Cannot create output directory."
    Write-Error "File with name $Env:TF_BUILD_BINARIESDIRECTORY already exists."
    exit 1
}
Write-Verbose "TF_BUILD_BINARIESDIRECTORY: $Env:TF_BUILD_BINARIESDIRECTORY"
# Tell user what script is about to do
Write-Verbose "Will look for and then gather "
Write-Verbose "$FileTypes files from"
Write-Verbose "$Env:TF_BUILD_SOURCESDIRECTORY and copy them to "
Write-Verbose $Env:TF_BUILD_BINARIESDIRECTORY
# Find the files
$files = gci $Env:TF_BUILD_SOURCESDIRECTORY -recurse -include $SourceSubFolders |
    ? { $_.PSIsContainer } |
    foreach { gci -Path $_.FullName -Recurse -include $FileTypes }
if ($files)
{
    Write-Verbose "Found $($files.count) files:"
    foreach ($file in $files)
    {
        Write-Verbose $file.FullName
    }
}
else
{
    Write-Warning "Found no files."
}
# If binary output directory exists, make sure it is empty
# If it does not exist, create one
# (this happens when 'Clean workspace' build process parameter is set to True)
if ([IO.Directory]::Exists($Env:TF_BUILD_BINARIESDIRECTORY))
{
    $DeletePath = $Env:TF_BUILD_BINARIESDIRECTORY + "\*"
    Write-Verbose "$Env:TF_BUILD_BINARIESDIRECTORY exists."
    if (-not $Disable)
    {
        Write-Verbose "Ready to delete $DeletePath"
        Remove-Item $DeletePath -recurse
        Write-Verbose "Files deleted."
    }
}
else
{
    Write-Verbose "$Env:TF_BUILD_BINARIESDIRECTORY does not exist."
    if (-not $Disable)
    {
        Write-Verbose "Ready to create it."
        [IO.Directory]::CreateDirectory($Env:TF_BUILD_BINARIESDIRECTORY) | Out-Null
        Write-Verbose "Directory created."
    }
}
# Copy the binaries
Write-Verbose "Ready to copy files."
if (-not $Disable)
{
    foreach ($file in $files)
    {
        Copy $file $Env:TF_BUILD_BINARIESDIRECTORY
    }
    Write-Verbose "Files copied."
}
A better solution would probably be to have two separate builds, where the first build publishes the dependencies of the second project as a NuGet package. The Microsoft ALM Rangers have delivered a guide that explains how to set that up.
One option is to host your own NuGet feed: http://docs.nuget.org/create/hosting-your-own-nuget-feeds
By hosting your own feed, you can execute custom build activities within your build process that update the feed.
See this documentation for custom TFS build activities: http://nakedalm.com/creating-a-custom-activity-for-team-foundation-build/
See this documentation for adding PowerShell to your build process: http://blogs.technet.com/b/heyscriptingguy/archive/2014/04/21/powershell-and-tfs-the-basics-and-beyond.aspx
By hosting your own NuGet feed, your consuming solution can use the private feed and packages to deal with dependency management and versioning. Custom build activities let you update the feed via .NET or PowerShell, and you can also automate the deployment via your PowerShell scripts.
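As a rough illustration of that flow (the .nuspec name, feed URL and API key below are placeholders, not taken from the question), a post-build PowerShell step could pack the framework output and push it to the private feed:
# Hypothetical post-build step: pack the framework and push it to a private NuGet feed.
# nuget.exe is assumed to be on the PATH; adjust the version to your own scheme.
& nuget.exe pack "$Env:TF_BUILD_SOURCESDIRECTORY\Framework\MyFramework.nuspec" `
    -OutputDirectory $Env:TF_BUILD_BINARIESDIRECTORY -Version '1.0.0'
& nuget.exe push "$Env:TF_BUILD_BINARIESDIRECTORY\MyFramework.1.0.0.nupkg" `
    -Source 'http://nugetserver.example/nuget' -ApiKey $Env:NUGET_API_KEY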

Automatic deployment of solutions with PowerShell

I have a folder containing several solutions for a SharePoint application, which I want to add and install. I want to iterate over the elements in the folder and use Add-SPSolution on each. After that I want to check whether the solutions are done deploying before using Install-SPSolution. Here is the snippet I am currently working on:
# Get the location of the folder you are currently in
$dir = $(gl)
# Create a list with the .wsp solutions
$list = Get-ChildItem $dir | Where-Object { $_.Extension -eq ".wsp" }
Write-Host 'DEPLOYING SOLUTIONS...'
foreach ($my_file in $list) { Add-SPSolution -LiteralPath $my_file.FullName }
Write-Host 'SLEEP FOR 30 SECONDS'
Start-Sleep -s 30
Write-Host 'INSTALLING SOLUTIONS...'
foreach ($my_file in $list) { Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment }
Is there a way to check if the deployment is finished, and it is ready to start installing the solutions?
You need to check the SPSolution.Deployed property value in a loop; a basic solution looks like this:
do { Start-Sleep 2 } while (!((Get-SPSolution $name).Deployed))
The Deploying SharePoint 2010 Solution Packages Using PowerShell article contains more details and this comment discusses a potential caveat.
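Folded into the snippet from the question, the same check can be applied per solution, for example to block after each Install-SPSolution call until the farm reports it deployed (a sketch; it assumes $list holds the .wsp files as above and the SharePoint snap-in is loaded):
foreach ($my_file in $list)
{
    Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment
    # Block until the deployment job for this solution has finished
    do { Start-Sleep -Seconds 2 } while (!((Get-SPSolution $my_file.Name).Deployed))
}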

Get dbname from multiple web.config files with PowerShell

I would like to issue a PowerShell command to return the connection string (specifically, I am looking for the database name value) for all the web sites on a web server...
So I would like to see something like
site1 dbname=Northwind
site2 dbname=Fitch
site3 dbname=DemoDB
I have tried using the IIS Powershell snap-in... I thought I was close with this:
PS C:\Windows\system32> Get-WebApplication | Get-WebConfiguration -filter /connectionStrings/*
but... after looking at the results... my answer doesn't appear to be in there
I am very new to PowerShell - so excuse my ignorance and inexperience
Any help appreciated!
thanks!
Hopefully, this will get you started. It assumes there is a web.config file at the web application's physical path; it does not recurse to find other web.config files within the web application. It also assumes your connection strings are in the connectionStrings configuration element.
Import-Module WebAdministration
Get-WebApplication | ForEach-Object {
    $webConfigFile = [xml](Get-Content "$($_.PhysicalPath)\Web.config")
    Write-Host "Web Application: $($_.path)"
    foreach ($connString in $webConfigFile.configuration.connectionStrings.add)
    {
        Write-Host "Connection String $($connString.name): $($connString.connectionString)"
        $dbRegex = "((Initial\sCatalog)|((Database)))\s*=(?<ic>[a-z\s0-9]+?);"
        $found = $connString.connectionString -match $dbRegex
        if ($found)
        {
            Write-Host "Database: $($Matches["ic"])"
        }
    }
    Write-Host " "
}
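If the regular expression feels fragile, an alternative (not part of the original answer, and only meaningful for SQL Server-style connection strings) is to let .NET parse the string and read the database name directly:
# Inside the inner foreach, instead of the regex match:
$builder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder($connString.connectionString)
Write-Host "Database: $($builder.InitialCatalog)"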
This post may give you an idea to start with. Basically, load the web.config file as an XML document and then find the node where the connection string is.
Do something like $myFile = [xml](Get-Content web.config). You can then pipe that to Get-Member ($myFile | Get-Member -MemberType Property) to start working your way into the file and see which node has it. I'm not at a computer where I can show you screenshots to explain it more, but you can check out this chapter from the PowerShell.com "Master PowerShell" e-book, which explains working with XML very well.
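For example (the path below is just a placeholder), the nodes become directly navigable as properties:
# Sketch of the XML approach for a single site
$myFile = [xml](Get-Content 'C:\inetpub\wwwroot\site1\web.config')
foreach ($cs in $myFile.configuration.connectionStrings.add)
{
    Write-Host "$($cs.name): $($cs.connectionString)"
}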
