First and foremost, I'm new to Azure Functions and have only been working with it for a couple of weeks, so please bear with me. I was tasked with taking one of our PowerShell scripts, which gets users and licenses from our Office 365 tenants, outputs them to CSV, and emails them to a monitored mailbox, and porting it over to an Azure Function.
After a lot of work I managed to get it running by calling PowerShell.exe from within my script, because some objects are returned "un-serialized", preventing them from being iterated (a known issue on GitHub).
Everything was working via Code + Test, and I set the time trigger to run at 12:00am. However, when I checked my inbox the following day I had no emails. When I checked monitoring on the function, I had the following listed for each of the tenants that was iterated, which seems to be something failing in the call to PowerShell.exe:
HResult               : -2146233087
CategoryInfo          : OperationStopped: (:) [], CryptographicException
FullyQualifiedErrorId : System.Security.Cryptography.CryptographicException
InvocationInfo        :
    ScriptLineNumber : 77
    OffsetInLine     : 5
    HistoryId        : -1
    ScriptName       : C:\home\site\wwwroot\License_Report_a-f\run.ps1
    Line             : $ScriptResult = (&$64bitPowerShellPath -WindowStyle Hidden -NonInteractive -Command $Script -Args $ApplicationId,$credential,$refreshToken,$tenantID,$client.TenantID)
    PositionMessage  : At C:\home\site\wwwroot\License_Report_a-f\run.ps1:77 char:5
                       + $ScriptResult = (&$64bitPowerShellPath -WindowStyle Hidden -NonIn …
    PSScriptRoot     : C:\home\site\wwwroot\License_Report_a-f
    PSCommandPath    : C:\home\site\wwwroot\License_Report_a-f\run.ps1
    CommandOrigin    : Internal
ScriptStackTrace      : at <ScriptBlock>, C:\home\site\wwwroot\License_Report_a-f\run.ps1: line 77
I did some investigating and found someone alluding to occasions where the Function can't read "Profile.ps1", which is where I'd put the declaration of the PowerShell.exe environment variable, so as a test I moved the assignment locally into the script. I then set an hourly schedule on the TimeTrigger and it ran fine on the hour. However, after changing the TimeTrigger back to only run at 12:00am, I was greeted with no emails again this morning and the same error, seemingly ruling out the "Profile.ps1" issue.
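For reference, the assignment I moved looks roughly like this (a minimal sketch; the exact path is an assumption based on how 32-bit processes reach the 64-bit engine on Windows, so adjust for your plan/OS):
# Sketch: point at the 64-bit PowerShell engine from the Functions host.
# The SysNative path is an assumption on my part.
$env:64bitPowerShellPath = "$env:SystemRoot\SysNative\WindowsPowerShell\v1.0\powershell.exe"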
My frustration at the moment is that the function works fine in Code + Test, but it seems that if the function is idle for an extended period of time, it can't load something properly when it spins up again. It ran successfully on the hour yesterday at 10:00, 11:00, 12:00, 13:00, 14:00 and 15:00. It was then left with no spin-ups for 9 hours, and then it failed. This morning I updated the TimeTrigger to run every hour again to see what happens, and once again the emails are coming through to me, so I'm baffled. Again, I've made a change and almost "woken the machine up", and now everything works fine again.
Has anyone seen this before, or anything similar? I'm not sure where to look next. Is there maybe some sort of cache that gets cleared if you don't run a function for x minutes / x hours which is causing the issue? I've spent a couple of hours looking on the net, but I can't see anything similar. Any help / pointers are gratefully appreciated.
So it seems I may have found the issue myself. It doesn't really explain what exactly is going on, but it appears to be related to generating the Graph access tokens used to access Office 365 itself.
After some playing around last week, I got to a point where I was no longer able to run the script without the above error, even in Code + Test in the Azure Functions interface. I decided to do some testing on the script that was run by the external PowerShell call, and when I removed the code that was generating the Graph access tokens, the script executed successfully. I then restructured the whole thing so that the tokens are generated before calling the external PS script, and passed in as arguments instead.
After testing over the weekend, I can confirm the export files were correctly generated and sent via email, so it looks to be working fine now. Below is a rough outline of the start of the script, just to showcase what I've ended up with.
# Import Modules
Import-Module MsOnline -UseWindowsPowershell
Import-Module PartnerCenter -UseWindowsPowershell
# Get Tokens
$aadGraphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.windows.net/.default' -ServicePrincipal -Tenant $tenantID
$graphToken = New-PartnerAccessToken -ApplicationId $ApplicationId -Credential $credential -RefreshToken $refreshToken -Scopes 'https://graph.microsoft.com/.default' -ServicePrincipal -Tenant $tenantID
#Connect to Msol
Connect-MsolService -AdGraphAccessToken $aadGraphToken.AccessToken -MsGraphAccessToken $graphToken.AccessToken
# Get Client List
$clientList = Get-MsolPartnerContract -All | Sort-Object -Property Name
# Loop Clients
ForEach ($client in $clientList)
{
$Script = {
param (
[Object]$aadGraphTkn,
[Object]$graphTkn,
[string]$clientTenantID
)
# Import Modules
Import-Module MsOnline
Import-Module PartnerCenter
#Connect to Msol
Connect-MsolService -AdGraphAccessToken $aadGraphTkn.AccessToken -MsGraphAccessToken $graphTkn.AccessToken
# DO OTHER THINGS HERE AND RETURN SOMETHING
}
$ScriptResult = (& $env:64bitPowerShellPath -WindowStyle Hidden -NonInteractive -Command $Script -Args $aadGraphToken,$graphToken,$client.TenantID)
}
I'm looking for a solution to my problem and am not able to find it; I've tried everything online.
I'm trying to disable our on-premises AD Connect sync. I ran it as a test, but it turns out our environment is not set up correctly for this to work and requires some restructuring.
I've followed the standard instructions of
Connect-MsolService and Set-MsolDirSyncEnabled -EnableDirSync $false
Connect works fine, but when I try to run the disable command it returns the error Set-MsolDirSyncEnabled : You cannot turn off Active Directory synchronization.
I've been told it could take a while, but I enabled it last week, and most resources I've found say "24 - 72 hours".
The command (Get-MSOLCompanyInformation).DirectorySynchronizationStatus shows Enabled and not syncing.
Can anyone assist me with this issue?
Thank you!
You try to enable (or disable) Directory synchronization in Office 365, and you are greeted by the following error message.
PS C:\> Set-MsolDirSyncEnabled -EnableDirSync $false
Set-MsolDirSyncEnabled : You cannot turn off Active Directory synchronization.
At line:1 char:1
+ Set-MsolDirSyncEnabled -EnableDirSync $false
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [Set-MsolDirSyncEnabled], MicrosoftOnlineException
+ FullyQualifiedErrorId : Microsoft.Online.Administration.Automation.DirSyncStatusChangeNotAllowedException,Microsoft.Online.Administration.Automation.SetDirSyncEnabled
The DirSyncStatusChangeNotAllowedException error in particular means that you have changed the status recently, and the service is simply preventing you from changing it back too soon.
Note: this error will occur even if the DirSync status has already been updated; it's simply a block on Microsoft's side to prevent you from changing the status too often.
Check now, or wait anywhere from 12 to 72 hours for the change to be reflected:
(Get-MsolCompanyInformation).DirectorySynchronizationStatus
NO FIX: Unfortunately, there is no way around this error. It simply means that your directory is still doing a full initial sync with Azure AD. This error message will clear once the initial sync is complete. The time will vary depending on the size of your on-premises AD, but should take no longer than 72 hours even for very large environments.
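If you'd rather not check by hand, a minimal polling sketch along these lines should work with the MSOnline module (the 30-minute interval and the 'Disabled' status value are assumptions on my part):
# Sketch: poll until directory synchronization reports as disabled
Connect-MsolService
while ((Get-MsolCompanyInformation).DirectorySynchronizationStatus -ne 'Disabled') {
    Start-Sleep -Seconds 1800  # wait 30 minutes between checks (arbitrary choice)
}
Write-Output "Directory synchronization is now disabled."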
Reference : https://www.michev.info/Blog/Post/1797/you-cannot-turn-off-active-directory-synchronization
Note: if the problem still isn't resolved, I would suggest reaching out to MS Support; they can track down where the exact issue lies.
I'm brand new to automation, and pretty new to Powershell as well, so I'm hoping this is a simple fix. :)
I'm trying to get some code to run. And for all I know, it does run, but the test pane doesn't show anything. Based on this thread: Azure powershell runbook don't show any output, I did try republishing the code and clearing my browser cache, but that didn't help in my case, so I'm thinking there's an issue with the code?
Here's my (genericized) code:
workflow DB_DailyTasks
{
Write-Output "Code starting"
inlinescript
{
[string] $SqlServerName = "myDb.database.windows.net"
$Credential = Get-AutomationPSCredential -Name "myDatabase-automation"
# Setup credentials
$ServerName = $Using:SqlServerName
$UserId = $Using:Credential.UserName
$Password = ($Using:Credential).GetNetworkCredential().Password
# Execute the udp_myProc procedure
# Create connection for each individual database
$DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
$DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
Write-Output "ConnectionState is: $(DatabaseConnection.State)"
$DbName = "myDb"
# Setup connection string for $DbName
$DatabaseConnection.ConnectionString = "Server=$ServerName; Database=$DbName; User ID=$UserId; Password=$Password;"
$DatabaseConnection.Open();
Write-Output "ConnectionState is: $(DatabaseConnection.State)"
# Create command for a specific database $DBName
$DatabaseCommand.Connection = $DatabaseConnection
Write-Output "Running udp_myProc procedure"
$DatabaseCommand.CommandText = "EXECUTE [dbo].[udp_myProc]"
$NonQueryResult = $DatabaseCommand.ExecuteNonQuery()
# Close connection to $DbName
$DatabaseConnection.Close()
}
}
...and here's what I see in the test pane when I try to test (screenshot omitted; it shows nothing), which isn't terribly helpful in knowing whether it actually ran.
Thanks in advance for any help you can provide! :)
[Edit] The code is definitely not running. The stored procedure inserts an entry into a history table, and there's no record of it running either for the tests or for when I ran the published output.
Interesting note, though - when I ran the published output, there were no errors and no warnings, but it did say, "This job does not have any output" in the Output tab.
[Edit #2]: It doesn't write anything on my local computer either??
[Edit #3]: Replaced Write-Output with Write-Host inside the inlinescript block. No change, either on Azure admin console or on my local computer.
My guess is that you created a PowerShell runbook, not a PowerShell Workflow runbook. If this is correct, then your runbook code declares a workflow called DB_DailyTasks, but never invokes it. For example, you could declare a function, but would not expect it to be automatically invoked because of that.
Unless you are certain that you need a workflow rather than a regular PowerShell runbook, I would recommend removing workflow and InlineScript from your code and dealing with regular PowerShell.
However, if you really need it to be a workflow (not recommended unless you have to use checkpoints and things like parallel and foreach -parallel), then create a runbook of the "PowerShell Workflow" type: it treats the workflow keyword differently, so your code would be correct.
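As a sketch (reusing the names from the question's code), the plain-runbook version would look something like this:
# Plain PowerShell runbook: top-level code runs directly, no workflow/InlineScript needed
$SqlServerName = "myDb.database.windows.net"
$Credential = Get-AutomationPSCredential -Name "myDatabase-automation"
$UserId = $Credential.UserName
$Password = $Credential.GetNetworkCredential().Password
$DbName = "myDb"
# Open a connection to the database
$DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
$DatabaseConnection.ConnectionString = "Server=$SqlServerName; Database=$DbName; User ID=$UserId; Password=$Password;"
$DatabaseConnection.Open()
Write-Output "ConnectionState is: $($DatabaseConnection.State)"
# Execute the udp_myProc procedure
$DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
$DatabaseCommand.Connection = $DatabaseConnection
$DatabaseCommand.CommandText = "EXECUTE [dbo].[udp_myProc]"
$NonQueryResult = $DatabaseCommand.ExecuteNonQuery()
# Close the connection
$DatabaseConnection.Close()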
My current task is to set up an automatic configuration for Microsoft Azure Backup.
What I've done so far:
I wrote scripts and tasks that copy the installer to a remote server, execute it, make sure it's installed, register the server with Azure, and set up the schedule, the file specs and everything around it.
And it all works.
The problem though: I have now received the task to also include "System State" in the backup.
I'm aware that this is about a 60-second task if you do it through the Azure console to schedule the backup. However, I have the requirement to build the script in such a way that not a single finger has to be moved to complete the whole thing.
Question: Has anyone figured out if it's possible to activate and include the System State backups (which end up under %MARSDIR%\Scratch\SSBS) within the OBPolicy using only PowerShell?
If I activate it via the console and then execute Get-OBPolicy, I find the System State listed among the other file specs.
However, I can't figure out how I would set it using New-OBFileSpec or anything like it.
Thanks in advance :)
Edit: To clarify
Assume I'm in the config window (screenshot omitted) showing the drives that can be selected for backup.
I can "check" C: by doing
New-OBFileSpec -FileSpec @("C:\")
What command should I use in PS to "check" System State?
Edit 2:
Below is the relevant part of the code.
How do I add System State to the $inclusions?
## Register Server with Azure
$credsfile = ## Path to Vault credential file
Start-OBRegistration -VaultCredentials $credsfile -Confirm:$false
# Create Policy
$newpolicy = New-OBPolicy
$sched = New-OBSchedule -DaysofWeek Monday,Tuesday,Wednesday,Thursday,Friday,Saturday,Sunday -TimesofDay 22:00
Set-OBSchedule -Policy $newpolicy -Schedule $sched
# File Spec
$inclusions = New-OBFileSpec -FileSpec @("E:\")
Add-OBFileSpec -Policy $newpolicy -FileSpec $inclusions
# Retention
$retentionpolicy = New-OBRetentionPolicy -RetentionDays 30
Set-OBRetentionPolicy -Policy $newpolicy -RetentionPolicy $retentionpolicy
## Set the Policy
Set-OBPolicy -Policy $newpolicy -Confirm:$false
# Set Machine Encryption Key
$PassPhrase = ConvertTo-SecureString -String "...." -AsPlainText -Force
Set-OBMachineSetting -EncryptionPassPhrase $PassPhrase
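One possibility, offered as an untested assumption: newer MARS agent builds ship an Add-OBSystemState cmdlet that adds System State to a policy directly, without going through New-OBFileSpec. If the installed agent has it, the sketch below is roughly how it would slot into the code above:
# Assumption: requires a MARS agent version that provides Add-OBSystemState
Add-OBSystemState -Policy $newpolicy
Set-OBPolicy -Policy $newpolicy -Confirm:$false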
11/13/2013 11:35:37 TRCW1 using local computer
11/13/2013 11:35:37 TRCE1 System.Management.ManagementException: Access denied
   at System.Management.ManagementException.ThrowWithExtendedInfo(ManagementStatus errorCode)
   at System.Management.ManagementObjectCollection.ManagementObjectEnumerator.MoveNext()
   at Microsoft.PowerShell.Commands.GetWmiObjectCommand.BeginProcessing()
Code (inside a loop of server names):
$error.clear() #clear any prior errors, otherwise same error may repeat over-and-over in trace
if ($LocalServerName -eq $line.ServerName)
{
# see if not using -ComputerName on local computer avoids the "service not found" error
Add-Content $TraceFilename "$myDate TRCW1 using local computer "
$Service = (get-wmiobject win32_service -filter "name = '$($line.ServiceName)'")
}
else
{
Add-Content $TraceFilename "$myDate TRCW2 using remote computer $($line.ServerName) not eq $LocalServerName"
$Service = (get-wmiobject win32_service -ComputerName $line.ServerName -filter "name = '$($line.ServiceName)'")
}
if ($error -ne $null)
{
Write-Host "----> $($error[0].Exception) "
Add-Content $TraceFilename "$myDate TRCE1 $($error[0].Exception)"
}
I'm reading a CSV of server names. I finally added the exception logic, only to find I'm getting an "Access Denied". This was only happening on the local server, which seems almost backwards: the local server fails, whereas the remote servers work fine. I even changed the logic to test whether it was the local server and tried leaving off the -ComputerName parameter on the WMI call (as shown in the code above), but I still get the error.
So far, my research shows the answer may lie with
set-item trustedhosts
But my main question is whether trustedhosts is applicable to local servers, or only remote servers. Wouldn't a computer always trust itself? Does it still use remoting to talk to itself?
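For reference, this is what setting it looks like (a sketch; the host name is a placeholder, and note that TrustedHosts governs WinRM remoting, while Get-WmiObject uses DCOM, so it may not apply here at all):
# TrustedHosts lives in the WSMan client configuration; run from an elevated prompt
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'server01' -Force
Get-Item WSMan:\localhost\Client\TrustedHosts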
This server apparently was part of a cluster a long time before I got here, and now it's not. I'm also suspicious of that.
When I run interactively the script works fine, it's only when I schedule it and run it under a service account that it fails with the access denied. The Service Account is local Admin on that box.
I'm using get-wmiobject win32_service instead of get-service because it returns extra info I need to lookup the process, and date/time the service was started using another WMI call.
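In case it's useful, this is roughly how that lookup chains together (a sketch; the service name is a placeholder):
# Win32_Service exposes the hosting ProcessId; Win32_Process gives its start time
$svc = Get-WmiObject Win32_Service -Filter "Name='wuauserv'"
$proc = Get-WmiObject Win32_Process -Filter "ProcessId=$($svc.ProcessId)"
# Convert the DMTF datetime string to a .NET DateTime
[System.Management.ManagementDateTimeConverter]::ToDateTime($proc.CreationDate)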
Running on Win 2008/R2.
Update 11/13/2013 5:27 PM below.
I have just verified that the problem happens on more than one server. [I took the scripts and ran them on another server.] My CSV input includes a list of servers to monitor. The ones outside of my own server always return results. The ones to my own server, that omit the -ComputerName fail. (I have tried with and without the -ComputerName parm for the local server).
Are you running the script "as administrator" (UAC)? When your token is built for the local instance, if you have UAC enabled and you didn't run it "as administrator", the local administrator security token is removed. Connecting to a different machine over the network (a) completely bypasses UAC, and (b) when the target evaluates your token, your group memberships are fully evaluated, and thus you get "administrator" access.
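A quick way to verify this, if it helps: the snippet below reports whether the current PowerShell process token is actually elevated.
# Returns True when the current process is running elevated
$identity = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
$principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)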
Probably unrelated, but I've just run across two 2008 R2 servers out of 10 on my system that reject the FIRST performance counter I'm collecting, but only when running as a scheduled task. If I run it interactively, it works at least 95% of the time. I'm collecting Disk Seconds/Read and Seconds/Write, so it's the reads that don't show, for these two servers only. I flipped the order and, what do you know, the writes don't report. I just added one drive's Seconds/Transfer as a sacrificial lamb at the start of my counter list, and voilà, now I don't get ACCESS DENIED on the reads and writes.
$counterlist = @("\\$server\PhysicalDisk(0*)\Avg. Disk sec/Transfer",
                 "\\$server\PhysicalDisk(*)\Avg. Disk sec/Read",
                 "\\$server\PhysicalDisk(*)\Avg. Disk sec/Write")
$counters = $counterlist | Get-Counter
I'm building a script to read the Security log from several computers. I can read the Security log from my local machine with no problem using the Get-EventLog command, but the problem is that I can't run it against a remote machine (the script is for PowerShell v1). The command below never returns any results, although with any other LogFile it works perfectly:
gwmi -Class Win32_NTLogEvent | where {$_.LogFile -eq "Security"}
I've done some research, and it seems to be an impersonation issue, but the -Impersonation option for Get-WmiObject does not seem to be implemented. Is there any way around this problem? The solution could be running Get-EventLog on a remote machine somehow, or dealing with the impersonation issue so that the Security log can be accessed.
Thanks
You could use .NET directly instead of going through WMI. The script block below will give you the first entry in the Security log:
$logs = [System.Diagnostics.EventLog]::GetEventLogs('computername')
$security = $logs | ? {$_.log -like 'Security'}
$security.entries[0]
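Note that reading the Security log requires administrative rights on the target machine. Also, the Entries collection comes back oldest-first, so to see the most recent events instead:
# The Entries collection is chronological; take the newest ten
$security.Entries | Select-Object -Last 10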
Have you tried using the -Credential parameter? Also, use the -Filter parameter instead of Where-Object: the filter returns just the Security events, whereas Where-Object retrieves ALL events from all logs and only then performs the filtering.
gwmi Win32_NTLogEvent -filter "LogFile='Security'" -computer comp1,comp2 -credential domain\user
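If you only need particular events, the WQL filter can be narrowed further. A sketch (event ID 4624, successful logon, is just an illustrative choice):
# Hypothetical example: only successful-logon events from the Security log
gwmi Win32_NTLogEvent -Filter "LogFile='Security' AND EventCode=4624" -ComputerName comp1 -Credential domain\user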