I have a simple Azure PowerShell function, triggered by Event Grid and connected to a storage container. The basic function works without any gripe.
The moment I add module dependencies, the runtime barfs. As you can see, I included the Storage, Accounts, and Compute dependencies. I tried both the complete-version and wildcard-version forms. Both fail. I'd appreciate it if someone could tell me what is amiss.
Requirements.psd1
# This file enables modules to be automatically managed by the Functions service.
# See https://aka.ms/functionsmanageddependency for additional information.
#
@{
# For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
# To use the Az module in your function app, please uncomment the line below.
Az = '8.*'
Accounts = '2.*'
Compute = '4.*'
Storage = '4.*'
}
In order for these modules to be included, I added import statements to profile.ps1:
# import statements
Import-Module Az.Accounts
Import-Module Az.Compute
Import-Module Az.Storage
# Authenticate with Azure PowerShell using MSI.
# Remove this if you are not planning on using MSI or Azure PowerShell.
if ($env:MSI_SECRET) {
    Disable-AzContextAutosave -Scope Process | Out-Null
    Connect-AzAccount -Identity
}
The run.ps1 file is simple. It parses the input, destructures everything for processing, and attempts to invoke a script on a remote VM; this is where I need the Compute module.
param($eventGridEvent, $TriggerMetadata)
# Make sure to pass hashtables to Out-String so they're logged correctly
# $eventGridEvent | Out-String | Write-Host
# $response = $eventGridEvent | ConvertTo-JSON -Depth 5 | Out-String -Width 200
$topic = $eventGridEvent.topic
$api = $eventGridEvent.data.api
$file = $eventGridEvent.data.url
$fileType = $eventGridEvent.data.contentType
# check for container name and file type and invoke appropriate
# operation in VM
Write-Information "A file change operation happened in $topic"
Write-Host "A file change operation happened in $topic"
Write-Host "$api was invoked on $file of type $fileType"
Write-Host "START execution of script in remote host"
Invoke-AzVMRunCommand -ResourceGroupName 'autosys100-rg' -Name 'autosyslinuxvm' -CommandId 'RunShellScript' -ScriptPath 'install_nginx.sh'
Write-Host "COMPLETED execution of script on remote host"
There's nothing fancy here; it's ludicrously basic :(
Here's the error
Connected!
2022-07-25T21:41:19 Welcome, you are now connected to log-streaming service. The default timeout is 2 hours. Change the timeout with the App Setting SCM_LOGSTREAM_TIMEOUT (in seconds).
2022-07-25T21:41:28.181 [Information] Executing 'Functions.egExample' (Reason='EventGrid trigger fired at 2022-07-25T21:41:28.1674630+00:00', Id=718522d6-3705-4be8-b4dc-1e1f93383c9f)
2022-07-25T21:41:28.236 [Warning] The first managed dependency download is in progress, function execution will continue when it's done. Depending on the content of requirements.psd1, this can take a few minutes. Subsequent function executions will not block and updates will be performed in the background.
2022-07-25T21:41:28.333 [Error] Executed 'Functions.egExample' (Failed, Id=718522d6-3705-4be8-b4dc-1e1f93383c9f, Duration=115ms)
Result: Failure
Exception: Failed to install function app dependencies. Error: 'Failed to get latest version for module 'Storage' with major version '4'.'
Stack:   at Microsoft.Azure.Functions.PowerShellWorker.DependencyManagement.DependencyManager.WaitOnDependencyInstallationTask() in /mnt/vss/_work/1/s/src/DependencyManagement/DependencyManager.cs:line 246
   at Microsoft.Azure.Functions.PowerShellWorker.DependencyManagement.DependencyManager.WaitForDependenciesAvailability(Func`1 getLogger) in /mnt/vss/_work/1/s/src/DependencyManagement/DependencyManager.cs:line 164
   at Microsoft.Azure.Functions.PowerShellWorker.RequestProcessor.ProcessInvocationRequest(StreamingMessage request) in /mnt/vss/_work/1/s/src/RequestProcessor.cs:line 247
2022-07-25T21:41:38.569 [Information] Executing 'Functions.egExample' (Reason='EventGrid trigger fired at 2022-07-25T21:41:38.5683456+00:00', Id=57490df5-c718-436e-87e3-211406b00f9d)
2022-07-25T21:41:39.236 [Warning] The Function app may be missing the 'Az.Accounts' module. If 'Az.Accounts' is available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency.
2022-07-25T21:41:40.002 [Error] ERROR: The specified module 'Az.Accounts' was not loaded because no valid module file was found in any module directory.
Exception             :
    Type    : System.IO.FileNotFoundException
    Message : The specified module 'Az.Accounts' was not loaded because no valid module file was found in any module directory.
    HResult : -2147024894
TargetObject          : Az.Accounts
CategoryInfo          : ResourceUnavailable: (Az.Accounts:String) [Import-Module], FileNotFoundException
FullyQualifiedErrorId : Modules_ModuleNotFound,Microsoft.PowerShell.Commands.ImportModuleCommand
InvocationInfo        :
    MyCommand        : Import-Module
    ScriptLineNumber : 13
    OffsetInLine     : 1
    HistoryId        : 1
    ScriptName       : C:\home\site\wwwroot\profile.ps1
    Line             : Import-Module Az.Accounts
    PositionMessage  : At C:\home\site\wwwroot\profile.ps1:13 char:1
                       + Import-Module Az.Accounts
                       + ~~~~~~~~~~~~~~~~~~~~~~~~~
    PSScriptRoot     : C:\home\site\wwwroot
    PSCommandPath    : C:\home\site\wwwroot\profile.ps1
    InvocationName   : Import-Module
    CommandOrigin    : Internal
ScriptStackTrace      : at <ScriptBlock>, C:\home\site\wwwroot\profile.ps1: line 13
PipelineIterationInfo :
Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException : Result: ERROR: The specified module 'Az.Accounts' was not loaded because no valid module file was found in any module directory.
Exception: The specified module 'Az.Accounts' was not loaded because no valid module file was found in any module directory.
Stack:
Your requirements.psd1 has incorrect module names, as you have forgotten the Az. prefix.
@{
    # For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
    # To use the Az module in your function app, please uncomment the line below.
    # Az = '8.*'
    'Az.Accounts' = '2.*'
    'Az.Compute' = '4.*'
    'Az.Storage' = '4.*'
}
I would usually recommend not installing the entire Az module, as it takes a long time. However, the Compute, Accounts, and Storage modules are installed along with the Az module, so you wouldn't need to specify both.
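If you did want the whole rollup instead, requirements.psd1 would reduce to the single entry the template comments already hint at (expect a much longer first download):
@{
    # Pulls in Az.Accounts, Az.Compute, Az.Storage and the rest of the Az rollup.
    'Az' = '8.*'
}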
You also don't need the imports in the function's profile.ps1.
If the modules are available on the PSModulePath, you can just use their cmdlets directly in your run.ps1.
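Concretely, the profile.ps1 from the question can shrink back to just the managed-identity boilerplate; this is the question's own file with the three Import-Module lines removed:
# profile.ps1 - no explicit imports needed; managed dependencies land on
# PSModulePath and the Az cmdlets are auto-loaded on first use.
if ($env:MSI_SECRET) {
    Disable-AzContextAutosave -Scope Process | Out-Null
    Connect-AzAccount -Identity
}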
Related
I currently have a script similar to the one described in https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/configure-always-encrypted-keys-using-powershell?view=sql-server-ver15
However, that script uses PowerShell 5, which isn't available to me on the Linux agents in our Azure DevOps environment. Because of this, the majority of the Azure SQL commands aren't available to us. Is there an alternative to these SqlServer module cmdlets?
The recommended alternative is to use the cloud console, but you may be able to get away with Linux PowerShell like so:
I can't guarantee that this all works, but that specific command only creates a SqlColumnMasterKeySettings object containing information about the location of your column master key. You can probably just create one manually, but you'll need to know the exact values. I would recommend running it from a Windows machine first to see what the values should be for your environment.
# On Windows [Optional]
$cmkSettings = New-SqlAzureKeyVaultColumnMasterKeySettings -KeyURL $akvKey.Key.Kid
$cmkSettings | Format-List KeystoreProviderName,KeyPath
KeystoreProviderName : # Take these strings
KeyPath : # And use them in your script on linux
# Now on Linux, using the values above:
$cmkSettings = [Microsoft.SqlServer.Management.PowerShell.AlwaysEncrypted.SqlColumnMasterKeySettings]::new("YourKeystoreProviderName","YourKeyPath")
New-SqlColumnMasterKey -Name 'CMK1' -InputObject $database -ColumnMasterKeySettings $cmkSettings
# Success!
The key settings properties are just strings that get saved to your SQL Instance, so this should work fine. The harder part is authenticating to Azure to create keys from your master key, but you can try importing the desktop version of the commands like so:
# Start a NEW powershell session without the sqlserver module:
pwsh
# Get the module directory:
$d = (Get-Item (Get-Module SqlServer).path).DirectoryName
# Import the desktop version of these assemblies:
Import-Module "$d/Microsoft.SqlServer.Diagnostics.Strace.dll"
Import-Module "$d/Microsoft.SqlServer.Management.PSSnapins.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AzureAuthenticationManagement.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AlwaysEncrypted.Types.dll"
# Then import the module normally (there will be errors - you can ignore these)
Import-Module SqlServer
# Now you may be able to authenticate to Azure to generate new keys:
# Required to generate new keys
# NOTE: -Interactive fails on linux
Add-SqlAzureAuthenticationContext -ClientID "YourID" -Secret "YourSecret" -Tenant "YourTenant"
# Create a key using your master key:
New-SqlColumnEncryptionKey -Name 'CEK1' -InputObject $database -ColumnMasterKey 'CMK1'
This worked on my installation of CentOS 7 / pwsh 7.1.3. Make sure you have SqlServer version 21.1.18245 (only 10 days old at the moment of writing), as many new SQL commands were ported over for pwsh 7.1.
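If you want to pin that exact version, a one-liner like this should do it (the scope is my choice; adjust to taste):
# Install the SqlServer module version mentioned above and confirm what's available.
Install-Module SqlServer -RequiredVersion 21.1.18245 -Scope CurrentUser -Force
Get-Module SqlServer -ListAvailable | Select-Object Name, Version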
I'm issuing this command:
azcopy sync "D:\Releases\Test\MyApp" "http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=REDACTED"
...and I'm getting this error:
error parsing the input given by the user. Failed with error Unable to infer the source 'D:\Releases\Test\MyApp' / destination 'http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=-REDACTED-
I would have thought my source was pretty clear.
Can anyone see anything wrong with my syntax?
I believe you have run into an issue with azcopy: it does not support the local emulator (at least for the sync command). There's an open issue on GitHub for this: https://github.com/Azure/azure-storage-azcopy/issues/554.
Basically the issue comes from the following lines of code, which return the location as Unknown in the case of storage emulator URLs:
func inferArgumentLocation(arg string) common.Location {
    if arg == pipeLocation {
        return common.ELocation.Pipe()
    }
    if startsWith(arg, "http") {
        // Let's try to parse the argument as a URL
        u, err := url.Parse(arg)
        // NOTE: sometimes, a local path can also be parsed as a url. To avoid thinking it's a URL, check Scheme, Host, and Path
        if err == nil && u.Scheme != "" && u.Host != "" {
            // Is the argument a URL to blob storage?
            switch host := strings.ToLower(u.Host); true {
            // Azure Stack does not have the core.windows.net
            case strings.Contains(host, ".blob"):
                return common.ELocation.Blob()
            case strings.Contains(host, ".file"):
                return common.ELocation.File()
            case strings.Contains(host, ".dfs"):
                return common.ELocation.BlobFS()
            case strings.Contains(host, benchmarkSourceHost):
                return common.ELocation.Benchmark()
            // enable targeting an emulator/stack
            case IPv4Regex.MatchString(host):
                return common.ELocation.Unknown() // This is what gets returned in the case of storage emulator URLs.
            }
            if common.IsS3URL(*u) {
                return common.ELocation.S3()
            }
        }
    }
    return common.ELocation.Local()
}
I had this same issue when trying to do a sync from my local machine to Azure Blob storage.
This was the command I was running:
azcopy sync "C:\AzureStorageTest\my-app\*" "https://myapptest.z16.web.core.windows.net/$web"
But I got the error below:
INFO: The parameters you supplied were Source: 'c:\AzureStorageTest\my-app' of type Local, and Destination: 'https://myapptest.z16.web.core.windows.net/' of type Local
INFO: Based on the parameters supplied, a valid source-destination combination could not automatically be found. Please check the parameters you supplied. If they are correct, please specify an exact source and destination type using the --from-to switch. Valid values are two-word phases of the form BlobLocal, LocalBlob etc. Use the word 'Blob' for Blob Storage, 'Local' for the local file system, 'File' for Azure Files, and 'BlobFS' for ADLS Gen2. If you need a combination that is not supported yet, please log an issue on the AzCopy GitHub issues list.
error parsing the input given by the user. Failed with error Unable to infer the source 'C:\AzureStorageTest\my-app' / destination 'https://myapptest.z16.web.core.windows.net/'.

PS C:\Users\promise> azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net/$web" -from-to localblob
Error: unknown shorthand flag: 'f' in -from-to
Here's how I solved it:
I was missing the ?[SAS] token at the end of the Blob storage URL. Also, as of this writing, the azcopy sync command does not seem to support the --from-to switch. So instead of this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net/$web"
I had this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.blob.core.windows.net/%24web?[SAS]" --recursive
Note:
The format is azcopy sync "/path/to/dir" "https://[account].blob.core.windows.net/[container]/[path/to/directory]?[SAS]" --recursive. You only need to modify "/path/to/dir", [account], and [container]/[path/to/directory]; everything else stays as it is.
My actual Blob storage location is https://myapptest.blob.core.windows.net/$web, but I used https://myapptest.blob.core.windows.net/%24web, because a literal $ throws an error when used; %24 is its URL-encoded form.
Don't use the static-website address like I did (https://myapptest.z16.web.core.windows.net/); that got me frustrated with lots of errors. Use the blob storage address (https://myapptest.blob.core.windows.net) instead.
The --recursive flag is already inferred in this directory sync operation, so you may consider leaving it out. I included it for clarity's sake.
That's all. I hope this helps.
OK, after much futzing I was finally able to get this to work using Azurite and PowerShell. It's clear that neither the Azure CLI nor AzCopy is well-tested under emulation.
Here's a rough-and-tumble script that can be called from a pipeline:
[CmdletBinding()]
param(
    [Parameter(Mandatory)][string] $Container,
    [Parameter(Mandatory)][string] $Source
)

$Context = New-AzureStorageContext -Local
$BlobNames = Get-AzureStorageBlob -Context $Context -Container $Container | % { $_.Name }

$FilesToSync = gci $Source\* -Include RELEASES, Setup.exe
$Packages = gci $Source -Filter *.nupkg
$Packages | % {
    If (!($BlobNames.Contains($_.Name))) {
        $FilesToSync += $_
    }
}

$FilesToSync | Set-AzureStorageBlobContent -Context $Context -Container $Container -Force
Note that this is highly customized for my Squirrel deployments (*.nupkg, RELEASES, Setup.exe), so you'll want to adjust it for your own environment.
Azurite can be set to always-on using a Scheduled Task to run this command every hour:
powershell -command "Start-Process azurite-blob.cmd -PassThru -ArgumentList '--blobHost 0.0.0.0'"
The argument sets Azurite to listen on any IP so that it can be reached from other computers on the network. I punched a hole in the firewall for ports 10000-10002.
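If you'd rather script that firewall hole than click through the UI, a sketch like this should work (the rule name is mine):
# Allow inbound TCP to Azurite's blob/queue/table ports.
New-NetFirewallRule -DisplayName 'Azurite' -Direction Inbound -Protocol TCP -LocalPort '10000-10002' -Action Allow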
Be careful to set the Task to run under the same account that was used to install Azurite, otherwise the Task won't be able to see azurite-blob.cmd (it's in %AppData%\npm, which is added to PATH during installation).
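For reference, the Scheduled Task itself can also be created from PowerShell; a minimal sketch (the task name is mine, and the account caveat above still applies):
# Register an hourly task that launches Azurite listening on all interfaces.
$action  = New-ScheduledTaskAction -Execute 'powershell' -Argument '-command "Start-Process azurite-blob.cmd -PassThru -ArgumentList ''--blobHost 0.0.0.0''"'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 1)
Register-ScheduledTask -TaskName 'Azurite' -Action $action -Trigger $trigger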
The command line needs the --recursive option. See https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs?toc=/azure/storage/blobs/toc.json
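Applied to the command from the question, that suggestion would look like this (SAS redacted as before; whether sync accepts an emulator endpoint at all is the separate issue covered above):
azcopy sync "D:\Releases\Test\MyApp" "http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=REDACTED" --recursive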
In an Azure Function, I am trying to load a PowerShell module but am getting the error Assembly with same name is already loaded.
Code Sample
Import-Module "D:\home\site\wwwroot\HelloWorld\modules\MsrcSecurityUpdates\1.7.2\MsrcSecurityUpdates.psd1"
Error Message
Import-Module : Assembly with same name is already loaded
At C:\home\site\wwwroot\HelloWorld\run.ps1:25 char:5
+ Import-Module "D:\home\site\wwwroot\HelloWorld\modules\MsrcSecuri ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Import-Module], FileLoadException
+ FullyQualifiedErrorId : FormatXmlUpdateException,Microsoft.PowerShell.Commands.ImportModuleCommand
Some additional background:
This code was working yesterday. I have made a lot of edits, so I cannot clearly state that the same code which was working yesterday is now failing.
I am editing the code directly via the browser.
I have restarted the web app to potentially flush out any assemblies loaded by my code. It did not make a difference.
I checked whether the module is available with the following, which reports that MsrcSecurityUpdates is NOT installed:
if (-not (Get-Module -Name "MsrcSecurityUpdates"))
{
    Write-Output "MsrcSecurityUpdates NOT installed";
}
else
{
    Write-Output "MsrcSecurityUpdates YES installed";
}
I downloaded the module with
Save-Module -Name MsrcSecurityUpdates -Path "C:\TEMP" -Force
and subsequently uploaded it to the Azure Function file share using the Kudu console, as per the steps outlined in this Stack Overflow question.
This module seems to conflict with other modules in your app, or with assemblies loaded explicitly from your code. It is also possible that the module content is corrupted.
First of all, I would recommend relying on the Managed Dependencies feature instead of uploading the module via Kudu. Just include a reference to your module in the requirements.psd1 file at the root of your app:
@{
    ...
    'MsrcSecurityUpdates' = '1.*'
}
If you edit this file in the Portal, you may need to restart your app. The next time you invoke any function, the latest version of this module will be automatically installed from the PowerShell Gallery and will be available on PSModulePath, so you can import it without specifying any path:
Import-Module MsrcSecurityUpdates
Try this on a brand new app without any other modules: MsrcSecurityUpdates will be loaded. However, if you are still getting the same error, this means MsrcSecurityUpdates is in conflict with other modules your app is using. You can narrow it down by removing other modules from your app (including cleaning up the modules uploaded via Kudu) and reducing your code.
[UPDATE] Potential workarounds:
Try to import (Import-Module) the modules in a certain fixed order, to make sure the more recent assembly versions are loaded first. This may or may not help, depending on the design of the modules.
Try executing commands from one of the modules in a separate process (using PowerShell jobs or sessions, or even invoking pwsh.exe); a minimal sketch follows.
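A hedged sketch of the separate-process route using a PowerShell job (the cmdlet called inside is just an example from the module; substitute your own):
# Run the conflicting module in its own process so its assemblies never
# meet the ones already loaded in the Functions worker.
$result = Start-Job -ScriptBlock {
    Import-Module MsrcSecurityUpdates
    Get-MsrcSecurityUpdate   # example call
} | Wait-Job | Receive-Job
$result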
I'm trying to download and run a PowerShell script (from blob storage) using the Run PowerShell artifact on an existing VM in Azure DevTest Labs.
I get the following error, and I assume I am doing something stupid.
& : The term './script.ps1' is not recognized as the name of a cmdlet,
function, script file, or operable program. Check the spelling of the name, or
if a path was included, verify that the path is correct and try again.
At line:1 char:3
+ & ./script.ps1
+   ~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (./script.ps1:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
Here is my setup...
I have also tried the JSON array syntax, which gave the same result, and an invalid URL, which gave a 404 error. So it seems as if it is downloading my script but then failing to find it.
Below is info I wrote a while back.
A few items to note:
Folder structure is not supported as of this writing, so the script needs to be at the root of the container.
Ensure your blob is public.
First you will need your file in Azure storage. Once uploaded in your container, click the file to get to its properties and copy the URL field.
As an example, I have created the following Run.ps1 script file and uploaded it to storage as a blob:
param (
    [string]$drive = "c:\",
    [string]$folderName = "DefaultFolderName"
)

New-Item -Path $drive -Name $folderName -ItemType "directory"
Now, while adding the 'Run PowerShell' artifact to the VM, we provide the following information:
File URI(s): URL field copied from earlier step. (eg. https://myblob.blob.core.windows.net/mycontainer/Run.ps1)
Script to Run: Name of the PS1 script, (eg. Run.ps1)
Script Arguments: Arguments as you would write them at the end of your command (eg. -drive "d:\" -folderName "MyFolder")
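Put together, the artifact downloads the blob and effectively invokes it on the VM like this (an illustration that matches the & ./script.ps1 form in the error above):
& ./Run.ps1 -drive "d:\" -folderName "MyFolder"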
I'm trying to install an Octopus Tentacle as part of an Azure deployment using the PowerShell DSC extension.
I've installed OctopusDSC under the Automation account, and it appears in the module list:
ResourceGroupName : RESOURCEGROUP
AutomationAccountName : AUTOMATIONUSER
Name : OctopusDSC
IsGlobal : False
Version :
SizeInBytes : 0
ActivityCount : 0
CreationTime : 22/02/2017 14:03:07 +00:00
LastModifiedTime : 22/02/2017 14:04:42 +00:00
ProvisioningState : Succeeded
I've then created a PowerShell script with a basic install that tries to import the module (first few lines below):
Configuration installoctopus
{
    Import-DscResource -ModuleName OctopusDSC
But then I get the error during deployment:
Unable to load resource 'OctopusDSC': Resource not found.

At C:\Packages\Plugins\Microsoft.Powershell.DSC\2.22.0.0\DSCWork\installoctopus2.0\installoctopus2.ps1:8 char:7
+ cTentacleAgent OctopusTentacle
+
I've tried with Import-DscResource -Module OctopusDSC as well as Import-DscResource -Module *, but I get the same errors.
One of the first parts of the OctopusDSC documentation says:
First, ensure the OctopusDSC module is on your $env:PSModulePath. Then you can create and apply configuration like this.
But I didn't have to do this for the cChoco DSC module, which works fine (and I'm unsure how to do it as part of a DSC configuration anyway). Is this a different type of module that requires extra import options? Is it actually a PowerShell module that is required to be on the guest VM, despite being in the Azure Automation module list?
The OctopusDSC resource needs to be on the guest VM for the Import-DscResource -ModuleName OctopusDSC command to succeed there. So make sure it's in the ZIP file that contains your configuration script.
The easiest way to get all the needed resources into the zip file is to create it with the Publish-AzureRmVMDscConfiguration cmdlet, using the OutputArchivePath parameter. But for that cmdlet to find the module, it must be on $env:PSModulePath on the machine where you run the cmdlet. So: 1) install OctopusDSC into PSModulePath (on the "build" machine), and then 2) run the cmdlet.
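A sketch of those two steps (the paths are illustrative):
# 1) Put OctopusDSC on PSModulePath on the build machine.
Install-Module OctopusDSC -Scope CurrentUser

# 2) Build the DSC zip; the cmdlet bundles the resources the configuration imports.
Publish-AzureRmVMDscConfiguration -ConfigurationPath .\installoctopus.ps1 -OutputArchivePath .\installoctopus.ps1.zip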
Alternatively, you can manually add the OctopusDSC module to the zip file. Usually this just means putting the module folder into the zip, but depending on the resource it can mean more than that (I don't know of a good doc on manually creating it). Still, it's trivial to try this route and see if it works, as sketched below.
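The manual route looks roughly like this (a sketch; I'm assuming the layout the Publish cmdlet produces, with the configuration script and module folders at the zip root):
# Unpack the archive, drop the module folder beside the script, re-zip.
Expand-Archive .\installoctopus.ps1.zip .\dscwork
Copy-Item "$env:ProgramFiles\WindowsPowerShell\Modules\OctopusDSC" .\dscwork\OctopusDSC -Recurse
Compress-Archive .\dscwork\* .\installoctopus.ps1.zip -Force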