I've long had a bunch of VBS automations for IIS 6, including one that gets/sets complex server bindings on several farms of paired servers. Each server has dozens of apps, and each app has 3-12 host headers (a hostname, a fully qualified hostname, and a Disaster Recovery hostname), so they can be a mess to maintain manually.
I did all my VBS work using ADSI, but I'm thinking WMI is probably more flexible than ADSI from a full server maintenance perspective. Please correct me if I'm wrong. So now I'm trying to move up to PowerShell + WMI to prepare for Windows 2008 + IIS 7.5. I'm enjoying the learning process, but I've hit a roadblock on this problem.
I can get/set all properties via WMI on my IIS 6 web servers, except ServerBindings. I feel like I'm close, but I'm missing some layer of containment, and I just can't get the objects I'm building to cast over to the right automation object.
The following code gets and reads the ServerBindings just fine; I simply can't figure out a way to write my changes back. Any advice is welcome.
$objWMI = [WmiSearcher] "Select * From IISWebServerSetting"
$objWMI.Scope.Path = "\\" + $server + "\root\microsoftiisv2"
$objWMI.Scope.Options.Authentication = 6   # PacketPrivacy
$sites = $objWMI.Get()

foreach ($site in $sites)
{
    $bindings = $site.psbase.properties | ? {$_.Name -eq "ServerBindings"}
    foreach ($pair in $bindings.Value.GetEnumerator())
    {
        # The pair object is a single binding and contains the correct data
        $pair
        $pair.IP
        $pair.Port
        $pair.Hostname

        # And this line will successfully erase the contents of
        # the ServerBindings
        $bindings.Value = @{}
        # but I can't figure out what to do to update $bindings.Value
    }
    $site.Put()
}
I'm liking PowerShell so far, so thanks for any help you're able to offer.
Alright. I got distracted with major disk failures. The fun never stops.
Anyway, the solution to this problem is simpler than I'd made it:
process
{
    $bindings = $_.ServerBindings
    foreach ($binding in $bindings)
    {
        $binding.IP = $ip
        $binding.Port = $port
        $binding.Hostname = $hostname
    }
    $_.ServerBindings = $bindings
    $_.Put()
}
ServerBindings is an array, but it likes to be an array of its own kind. I was trying to build the array from scratch, but my home-rolled array didn't smell right to PowerShell. So: pull the array out of ServerBindings into a new variable, manipulate the variable, then assign the manipulated variable back to the ServerBindings property. That keeps all the right typing in place. It's smooth as silk, and seems easier than ADSI.
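For anyone who finds this later, here's a minimal end-to-end sketch of how that process block gets its input. The filter name, the site filter, and the $ip/$port/$hostname values are illustrative placeholders, not part of my original script:

filter Update-ServerBinding {
    # Pull the typed array out, modify it in place, and assign it back,
    # so the ServerBinding automation type is preserved.
    $bindings = $_.ServerBindings
    foreach ($binding in $bindings) {
        $binding.IP = $ip
        $binding.Port = $port
        $binding.Hostname = $hostname
    }
    $_.ServerBindings = $bindings
    $_.Put()
}

# Authentication level 6 is PacketPrivacy, same as in the question.
Get-WmiObject -ComputerName $server -Namespace "root\microsoftiisv2" `
    -Class IISWebServerSetting -Authentication PacketPrivacy |
    Where-Object { $_.Name -eq "W3SVC/1" } |
    Update-ServerBinding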
I'm trying to add a custom header for AIP's msip_labels to a PowerShell script that I'm writing. I've figured out how to do this with System.Net.Mail using:
$message.Headers.Add("msip_labels","MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Enabled=True; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_SiteId=00000000-1111-2222-3333-444444444444; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Owner=user2@domain.tld; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_SetDate=$((Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffffffZ")); MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Name=Internal; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Application=Microsoft Azure Information Protection; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_ActionId=ffffffff-5555-6666-7777-888888888888; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Extended_MSFT_Method=Manual")
Based on my research, this should work using Outlook 2016, in theory:
$Outlook = New-Object -ComObject Outlook.Application
$message = $Outlook.CreateItem(0)
$message.PropertyAccessor.SetProperty("http://schemas.microsoft.com/mapi/string/{00020386-0000-0000-C000-000000000046}/msip_labels", "MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Enabled=True; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_SiteId=00000000-1111-2222-3333-444444444444; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Owner=user2@domain.tld; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_SetDate=$((Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffffffZ")); MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Name=Internal; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Application=Microsoft Azure Information Protection; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_ActionId=ffffffff-5555-6666-7777-888888888888; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Extended_MSFT_Method=Manual")
$message.To = "user1@domain.tld"
$message.Cc = "user3@domain.tld"
$message.Subject = "Report"
$message.HTMLBody = @"
<p><font face="Calibri" size="3">Hello World</font></p>
"@
$message.Send()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Outlook) | Out-Null
I confirmed with a MAPI viewer that this is the same property Outlook itself sets on other emails I've sent directly from Outlook. But when I run this in my script, I get this error:
Exception setting "SetProperty": Cannot convert the "MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Enabled=True; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_SiteId=00000000-1111-2222-3333-444444444444;
MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Owner=user2@domain.tld; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_SetDate=$((Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffffffZ")); MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Name=Internal;
MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Application=Microsoft Azure Information Protection; MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_ActionId=ffffffff-5555-6666-7777-888888888888;
MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Extended_MSFT_Method=Manual" value of type "string" to type "Object".
At C:\emailtest.ps1:21 char:1
+ $message.PropertyAccessor.SetProperty("http://schemas.microsoft ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : RuntimeException
This doesn't make much sense to me: it's supposed to be a string in the MAPI schema, so I'm not sure why it thinks it should be an object. I even tried converting the values to objects using ConvertFrom-String, but that didn't work. Any advice on this would be greatly appreciated.
Try reducing the length of the string you pass as the value. Does it work correctly then?
It seems you need to use the low-level API on which Outlook is based: Extended MAPI. Unlike OOM, it doesn't have any restrictions on string length if you use the OpenProperty method. You may also consider using a third-party wrapper around that API, such as Redemption.
If you use PropertyAccessor, you must have a good grasp of its exception handling logic. Here are some roadblocks you may run into:
The body and the content of attachments of Outlook items are not accessible through PropertyAccessor.
The PropertyAccessor ignores the seconds of date/time values.
String properties are limited in size, depending on the information store type:
For Personal Folders files (.pst) and Exchange offline folders files (.ost), the limit is 4,088 bytes.
For direct online access to an Exchange mailbox or Public Folders hierarchy, the limit is 16,372 bytes.
Binary properties can be retrieved or set only if their values are under 4,088 bytes (with larger values you get an out-of-memory error).
You may find the article "Don't stumble over a stone working with the PropertyAccessor and StorageItem classes in Outlook 2007" helpful.
Appending a null character ("`0") to the end of the string worked for me. This is required by the property's type, PtypString:
https://learn.microsoft.com/en-us/openspecs/exchange_server_protocols/ms-oxcdata/0c77892e-288e-435a-9c49-be1c20c7afdb
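For concreteness, here's roughly what that looks like with the SetProperty call from the question (the label string is abbreviated; use your full value):

# The trailing "`0" terminates the string as PtypString expects.
$schema = "http://schemas.microsoft.com/mapi/string/{00020386-0000-0000-C000-000000000046}/msip_labels"
$labels = "MSIP_Label_aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee_Enabled=True"  # ...rest of the label string
$message.PropertyAccessor.SetProperty($schema, $labels + "`0")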
I'm setting up a PowerShell script to provide some user notifications. The notification is legal in nature and may be updated/changed from time to time, so it must be fairly easy to locate. It also has a few 'fill in the blank' variables that depend on the person receiving the notification.
I wanted to have a secondary PowerShell file that contained the copy (text) to be used, so something like...
$body = "By accessing this system, you agree that your name ($currentUserName) and IP address ($currentUserIPAddr) will be recorded and stored for up to ($currentUserRetentionPeriod)."
The file could be updated as needed without actually opening the script, finding the line to edit, and potentially messing up other items or just being difficult. However, I'm looping through several thousand users in a single execution, so all the $currentUser... variables will be re-used frequently. This poses a problem: $body expands the variables immediately and becomes a static string, instead of re-evaluating the variable contents each time it's used.
Is there a clever way for me to define $body a single time (i.e. not inside a loop) but still allow for re-evaluation of the internal variables? I'd also rather not split the string into multiple parts so it becomes $part1 + $var1 + $part2 + $var2... n+1 times.
A simple approach would be to just dot-source the script containing the copy whenever you need the variable "re-compiled":
BodyDef.ps1:
$body = "By accessing this system, you agree that your name ($currentUserName) and IP address ($currentUserIPAddr) will be recorded and stored for up to ($currentUserRetentionPeriod)."
Send-Notification.ps1
$bodyDefPath = (Join-Path $PSScriptRoot BodyDef.ps1)

foreach ($user in Get-Users) {
    $currentUserName = $user.UserName
    $currentUserIPAddr = $user.IPAddress
    $currentUserRetentionPeriod = $user.RetentionPeriod

    . $bodyDefPath
    Send-MailMessage -Body $body
}
The above would work just fine, but it's not very PowerShell-idiomatic, and re-reading the file over and over again is kind of silly.
As suggested in the comments, you should define a second function (or just a scriptblock) if you want to reuse the same template with different values:
Send-Notification.ps1
# You could as well define this as a function, doesn't make much difference
$NotificationSender = {
    param($User)

    $body = "By accessing this system, you agree that your name ($($User.UserName)) and IP address ($($User.IPAddress)) will be recorded and stored for up to $($User.RetentionPeriod)."
    Send-MailMessage -Body $body
}

foreach ($user in Get-Users) {
    & $NotificationSender -User $user
}
I'm currently running IIS on my server, with an app that instantiates certificates. For instance, it runs code like this:
X509Certificate2 myX509Certificate = new X509Certificate2(
    Convert.FromBase64String(byteArrayRawCertificate),
    passwordCertificate,
    X509KeyStorageFlags.Exportable |
    X509KeyStorageFlags.MachineKeySet |
    X509KeyStorageFlags.PersistKeySet);
The code works fine, but I've run into a problem on the server, in the following folder:
C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys
3 KB RSA key files keep being added to that folder; by now I have more than a million of them.
I would like to delete those files, but:
IIS uses one of them for password encryption (or perhaps for other purposes), and I don't know which one.
Deleting such a large folder can take a very long time (days).
Is there a simple way to stop these files from being created again and again? (If I don't specify MachineKeySet when instantiating my certificate, the code doesn't work.)
If not, is there a way to remove the created files without deleting the ones IIS uses?
Is there a way to detect which files IIS uses?
Thanks in advance for your help.
There is some work ahead of you. First of all, you *MUST NOT* instantiate an X509Certificate2 object from a PFX file each time you need to access it. It is a very BAD idea: it's exactly what causes a new key file to be generated in the MachineKeys folder. Instead, install the certificate into the local certificate store once, and then reference the installed certificate.
Use the X509Store.Add() method to install the certificate into the local store:
X509Certificate2 myX509Certificate = new X509Certificate2(
    Convert.FromBase64String(byteArrayRawCertificate),
    passwordCertificate,
    X509KeyStorageFlags.MachineKeySet);

X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadWrite);
store.Add(myX509Certificate);
store.Close();
The next time you need to access your certificate and private key, use the same X509Store class as follows:
X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly);
X509Certificate2 myCert = store.Certificates.Find(blablabla);
store.Close();
Instead of "blablabla", specify a search filter via X509Certificate2Collection.Find(). You can use various filter options to locate your certificate; the most commonly used is the thumbprint.
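If it helps, here's the same lookup sketched in PowerShell (the thumbprint is a placeholder; substitute your certificate's):

$store = New-Object System.Security.Cryptography.X509Certificates.X509Store("My", "LocalMachine")
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadOnly)
$myCert = $store.Certificates.Find(
    [System.Security.Cryptography.X509Certificates.X509FindType]::FindByThumbprint,
    "0123456789ABCDEF0123456789ABCDEF01234567",   # placeholder thumbprint
    $false)   # $false = also return certificates that are not currently time-valid
$store.Close()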
Regarding the large folder: if you are sure that there are no other certificates in the LocalMachine\My store, you can simply purge all of its contents and then install your certificate by using the code above.
I had the same problem, and was quite afraid to delete anything, as I had read elsewhere that there are some system-critical files in there; deleting those few could leave you with a messed-up IIS or other software.
Since I had 8,000,000 files in there, I couldn't even open the folder. What I did was write a small C# program that deletes files based on their OWNER, i.e. the user who created them. If they were created by the IIS application pool user, it should be safe to delete them (please be sure you understand what you are doing!):
// Run this from inside the MachineKeys folder, or adjust the path below.
var userNameToDeleteBy = "IIS_App_Pool_User_Goes_here!";
var allFiles = new DirectoryInfo(Directory.GetCurrentDirectory()).EnumerateFiles();
var i = 0;

foreach (var file in allFiles)
{
    var fname = file.FullName;

    // Looking up the owner is the slowest operation here.
    string user = System.IO.File
        .GetAccessControl(fname)
        .GetOwner(typeof(System.Security.Principal.NTAccount))
        .ToString();

    if (user.Contains(userNameToDeleteBy))
    {
        File.Delete(fname);
        i++;

        // Output only every 1000 deletions, to keep console I/O cheap.
        if (i % 1000 == 0) Console.WriteLine("Deleted: " + i);
    }
}
Also, if you really do need to load the cert from a file, this answer covers how to keep the folder garbage-free:
Prevent file creation when X509Certificate2 is created?
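One more option worth knowing about (my addition, not from the linked answer): on .NET Framework 4.7.2+ and .NET Core you can pass X509KeyStorageFlags.EphemeralKeySet, which keeps the private key in memory so nothing is written to MachineKeys at all. A PowerShell sketch, with placeholder variables:

# Requires a runtime that supports EphemeralKeySet (.NET Framework 4.7.2+ / .NET Core).
# $rawBase64 and $password are placeholders for your certificate data.
$flags = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::EphemeralKeySet
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2(
    [Convert]::FromBase64String($rawBase64), $password, $flags)

Note that ephemeral keys cannot be persisted to a certificate store, so this only fits scenarios where the certificate is used purely in memory.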
This is a PowerShell script that deletes all machine keys owned by a specific application pool user.
It does basically the same thing as the C# program by @halloweenlv (a PowerShell script seemed more practical):
param(
    [ValidateNotNullOrEmpty()]
    [Parameter(Mandatory=$true)]
    [string] $ApplicationPoolName,

    [string] $MachineKeysDirectory = "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys"
)

$nrOfKeysDeleted = 0

foreach ($keyPath in [IO.Directory]::EnumerateFiles($MachineKeysDirectory)) {
    $owner = Get-Acl -Path $keyPath | Select-Object -ExpandProperty "Owner"

    if ($owner -eq "IIS APPPOOL\$ApplicationPoolName") {
        [System.IO.File]::Delete($keyPath)
        $nrOfKeysDeleted++

        # Report progress every 1000 deletions
        if ($nrOfKeysDeleted % 1000 -eq 0) {
            Write-Output "Deleted $nrOfKeysDeleted keys"
        }
    }
}

Write-Output "Deleted $nrOfKeysDeleted keys"
The machine keys in this directory are system files. As mentioned before, some of them are critical for the system, so simply deleting everything is not a good idea.
Performance is OK but not great: on my server it took about 5 seconds to delete 1,000 keys, which works out to about an hour and a half for a million keys.
The Problem
We upload (large amounts of) files to SharePoint using FrontPage RPC (the put documents call). As far as we've been able to find out, setting the value of taxonomy fields through this protocol requires their WssId.
The problem is that unless terms have been explicitly used before on a list item, they don't seem to have a WssId yet. This causes uploads of documents with previously unused metadata terms to fail.
The Code
The call to TaxonomyField.GetWssIdsOfTerm in the code snippet below simply doesn't return an ID for those terms.
SPSite site = new SPSite( "http://some.site.com/foo/bar" );
SPWeb web = site.OpenWeb();
TaxonomySession session = new TaxonomySession( site );
TermStore termStore = session.TermStores[new Guid( "3ead46e7-6bb2-4a54-8cf5-497fc7229697" )];
TermSet termSet = termStore.GetTermSet( new Guid( "f21ac592-5e51-49d0-88a8-50be7682de55" ) );
Guid termId = new Guid( "a40d53ed-a017-4fcd-a2f3-4c709272eee4" );
int[] wssIds = TaxonomyField.GetWssIdsOfTerm( site, termStore.Id, termSet.Id, termId, false, 1);
foreach( int wssId in wssIds )
{
Console.WriteLine( wssId );
}
We also tried querying the taxonomy hidden list directly, with similar results.
The Cry For Help
Both confirmation and advice on how to tackle this would be appreciated. I see three possible routes to a solution:
Change the way we are uploading, either by uploading the terms in a different way, or by switching to a different protocol.
Query for the metadata WssIds in a different way. One that works for unused terms.
Write/find a tool to preresolve WssIds for all terms. Suggestions on how to do this elegantly are most welcome.
Setting the WssId value to -1 should help you. I had a similar problem (copying documents containing metadata fields between two different web applications) and spent many hours solving strange metadata issues. In the end, setting the value to -1 solved them all; even when GetWssIdsOfTerm returns a value, I use -1 and it works correctly.
Probably there is some background logic that takes care of the WssId.
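For reference, here's the shape of that, sketched in PowerShell against the server object model (my illustration, based on the standard "WssId;#Label|TermGuid" encoding for taxonomy field values; the field name and label are placeholders):

# With WssId = -1, SharePoint resolves (or creates) the lookup ID itself.
$termGuid = "a40d53ed-a017-4fcd-a2f3-4c709272eee4"
$label = "SomeTermLabel"
$item["MyManagedMetadataField"] = "-1;#$label|$termGuid"
$item.Update()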
Radek
WSS 3.0 will let me send an email to a group when a new task is added to a task list. What I would like to do is run a weekly task that sends out reminders of tasks due within certain periods, i.e. 2 days, 7 days, 14 days, etc. I thought the simplest way would be to build a little C# app that sits on the WS2K3 box and queries the WSS database. Any ideas on which tables I should be checking? More generally, is there an overall schema for the WSS 3.0 database system?
If anyone is aware of an existing solution with code, please let me know.
Thx++
Jerry
My suggestions:
Don't create a console app; create a class that inherits from SPJobDefinition.
Set SPJobLockType.Job on this timer job. This guarantees the job is executed only once in the whole farm, even if you are running multiple front-end servers.
In the timer job, open the SPSite and SPWeb objects you need, then find the SPList.
Using SPQuery, filter out only the items you need; I believe you will have to filter for the ones where Status != Complete.
Loop through the results collection (which will be of type SPListItemCollection), apply your rules, check the DueDate against DateTime.Now, and send the e-mails.
Since a task is simply an SPListItem, it has a Properties property, which is actually a property bag; you can add whatever properties you need. So add a property such as My_LastSentReminderDate and use it to check that you are not sending too much "corporate spam" :-) (there's a sketch of this inside the Execute method below).
To install your SPJobDefinition in a SharePoint farm, you can use a PowerShell script. I can give you examples, if needed.
Don't forget to set Threading.Thread.CurrentThread.CurrentCulture = Your_SPWeb_Instance.Locale; otherwise date comparisons may not work if the web has a different locale!
EDIT: This is what a typical reminder looks like in my applications:
Public Class TypicalTimer
    Inherits SPJobDefinition

    Public Sub New(ByVal spJobName As String, ByVal opApplication As SPWebApplication)
        'this way we can explicitly specify we need to lock the JOB
        MyBase.New(spJobName, opApplication, Nothing, SPJobLockType.Job)
    End Sub

    Public Overrides Sub Execute(ByVal opGuid As System.Guid)
        'whatever functionality is there in the base class...
        MyBase.Execute(Guid.Empty)

        Try
            Using oSite As SPSite = New SPSite("http://yourserver/sites/yoursite/subsite")
                Using oWeb As SPWeb = oSite.OpenWeb()
                    Threading.Thread.CurrentThread.CurrentCulture = oWeb.Locale

                    'find the task list and read the "suspects"
                    Dim oTasks As SPList = oWeb.Lists("YourTaskListTitle")
                    Dim oQuery As New SPQuery()
                    oQuery.Query = "<Where><Neq><FieldRef Name='Status'/>" & _
                                   "<Value Type='Choice'>Complete</Value></Neq></Where>"
                    Dim oUndoneTasks As SPListItemCollection = oTasks.GetItems(oQuery)

                    'extra filtering of the suspects.
                    'this can also be done in the query, but I don't know your rules
                    For Each oUndoneTask As SPListItem In oUndoneTasks
                        If oUndoneTask(SPBuiltInFieldId.TaskDueDate) IsNot Nothing AndAlso _
                           CDate(oUndoneTask(SPBuiltInFieldId.TaskDueDate)) < Now().Date Then
                            'this is where you send the mail
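                            'Sketch (my addition, not in the original answer): use the
                            'property-bag stamp suggested above so the same task is not
                            'reminded more than once per day. The property name is illustrative.
                            Dim oLastSent As Object = oUndoneTask.Properties("My_LastSentReminderDate")
                            If oLastSent Is Nothing OrElse CDate(oLastSent).Date < Now().Date Then
                                'send the reminder e-mail here, then stamp the item
                                oUndoneTask.Properties("My_LastSentReminderDate") = Now()
                                oUndoneTask.Update()
                            End If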
                        End If
                    Next
                End Using
            End Using
        Catch ex As Exception
            MyErrorHelper.LogMessage(ex)
        End Try
    End Sub
End Class
To register the timer job, I typically use this kind of script:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
[System.Reflection.Assembly]::LoadWithPartialName("Your.Assembly.Name.Here")

$spsite = [Microsoft.SharePoint.SPSite]("http://yourserver/sites/yoursite/subsite")

# Constructor arguments: the job name shown in Central Admin, and the web application
$params = @("This text shows up in your timer job list (in Central Admin)", $spsite.WebApplication)
$newTaskLoggerJob = New-Object -TypeName Your.Namespace.TypicalTimer -ArgumentList $params

$schedule = New-Object Microsoft.SharePoint.SPDailySchedule
$schedule.BeginHour = 8
$schedule.BeginMinute = 0
$schedule.BeginSecond = 0
$schedule.EndHour = 8
$schedule.EndMinute = 59
$schedule.EndSecond = 59
$newTaskLoggerJob.Schedule = $schedule
$newTaskLoggerJob.Update()
Any time you need something in SharePoint that executes periodically, 99 times out of 100 you'll need to build a timer job. These are scheduled tasks that run inside SharePoint; you can create your own, then use a feature + feature receiver to actually "install" the timer job (definition) and assign it a schedule.
For more info, see Andrew Connell's article on timer jobs.
P.S. Never query/update the databases related to SharePoint directly! This will make you "unsupported", i.e. if anything happens, Microsoft will charge (a lot of) money to come and fix it, instead of you being able to ask for regular support (if you are, say, an MSDN subscriber, you get up to 4 free support calls a year).
Don't bother trying to go directly to the database. You will have a very hard time because it's undocumented, unsupported, and not recommended. SharePoint does in fact have a full-featured object model, though.
If you reference Microsoft.SharePoint.dll (located in the Global Assembly Cache of a machine with SharePoint installed on it), you can access the data that way. The objects you'll want to start with are SPSite, SPWeb, SPList, SPQuery, and SPListItem, all of which you can find easily in a search on http://msdn.microsoft.com.
Another less flexible but code-free possibility: create several different views that include upcoming tasks, then set up alerts via the GUI for when items are added to those views.