Alternative to New-SqlAzureKeyVaultColumnMasterKeySettings - azure

I currently have a script similar to the one described in https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/configure-always-encrypted-keys-using-powershell?view=sql-server-ver15
However, that script requires PowerShell 5, which isn't available to me on the Linux agents in our Azure DevOps environment. Because of this, the majority of the Azure SQL commands aren't available to us. Is there an alternative to these SqlServer module cmdlets?

The recommended alternative is to use the cloud console, but you may be able to get away with Linux PowerShell like so:
I can't guarantee that all of this works, but that specific command only creates a SqlColumnMasterKeySettings object containing information about the location of your column master key. You can probably just create one manually, but you'll need to know the exact values. I'd recommend running it from a Windows machine first to see what the values should be for your environment.
# On Windows [Optional]: inspect the values the cmdlet produces
$cmkSettings = New-SqlAzureKeyVaultColumnMasterKeySettings -KeyURL $akvKey.Key.Kid
$cmkSettings | Format-List KeystoreProviderName, KeyPath
# KeystoreProviderName : <take this string>
# KeyPath              : <and this one, and use both in your script on Linux>
# Now on Linux, using the values above:
$cmkSettings = [Microsoft.SqlServer.Management.PowerShell.AlwaysEncrypted.SqlColumnMasterKeySettings]::new("YourKeystoreProviderName","YourKeyPath")
New-SqlColumnMasterKey -Name 'CMK1' -InputObject $database -ColumnMasterKeySettings $cmkSettings
# Success!
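For reference, $database in the call above is assumed to be a database object from the SqlServer module; a minimal sketch of obtaining one (server name, database name and credential are placeholders):
# Placeholder server/database names - adjust for your environment
$cred = Get-Credential
$database = Get-SqlDatabase -ServerInstance "yourserver.database.windows.net" -Name "YourDatabase" -Credential $cred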
The key settings properties are just strings that get saved to your SQL Instance, so this should work fine. The harder part is authenticating to Azure to create keys from your master key, but you can try importing the desktop version of the commands like so:
# Start a NEW powershell session without the sqlserver module:
pwsh
# Get the module directory (use -ListAvailable since the module isn't imported yet):
$d = (Get-Item (Get-Module -ListAvailable SqlServer | Select-Object -First 1).Path).DirectoryName
# Import the desktop version of these assemblies:
Import-Module "$d/Microsoft.SqlServer.Diagnostics.Strace.dll"
Import-Module "$d/Microsoft.SqlServer.Management.PSSnapins.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AzureAuthenticationManagement.dll"
Import-Module "$d/Microsoft.SqlServer.Management.AlwaysEncrypted.Types.dll"
# Then import the module normally (there will be errors - you can ignore these)
Import-Module SqlServer
# Now you may be able to authenticate to Azure to generate new keys:
# Required to generate new keys
# NOTE: -Interactive fails on linux
Add-SqlAzureAuthenticationContext -ClientID "YourID" -Secret "YourSecret" -Tenant "YourTenant"
# Create a key using your master key:
New-SqlColumnEncryptionKey -Name 'CEK1' -InputObject $database -ColumnMasterKey 'CMK1'
This worked on my installation of CentOS 7 / pwsh 7.1.3. Make sure you have SqlServer version 21.1.18245 (only 10 days old at the time of writing), as many of the SQL commands were only recently ported to pwsh 7.1.
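To check what the agent actually has installed, and to pull a newer module from the PowerShell Gallery if needed, something like this should do it:
# List the SqlServer module versions visible to this agent
Get-Module -ListAvailable SqlServer | Select-Object Name, Version
# Install/update from the PowerShell Gallery if it's older than 21.1.18245
Install-Module SqlServer -MinimumVersion 21.1.18245 -Scope CurrentUser -Force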

AzCopy Sync command is failing

I'm issuing this command:
azcopy sync "D:\Releases\Test\MyApp" "http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=REDACTED"
...and I'm getting this error:
error parsing the input given by the user. Failed with error Unable to infer the source 'D:\Releases\Test\MyApp' / destination 'http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=-REDACTED-
I would have thought my source was pretty clear.
Can anyone see anything wrong with my syntax?
I believe you have run into an issue with azcopy where it does not support the local storage emulator (at least for the sync command). There's an open issue on GitHub for this: https://github.com/Azure/azure-storage-azcopy/issues/554.
Basically the issue comes from the following lines of code, which return the location as Unknown for storage emulator URLs:
func inferArgumentLocation(arg string) common.Location {
    if arg == pipeLocation {
        return common.ELocation.Pipe()
    }
    if startsWith(arg, "http") {
        // Let's try to parse the argument as a URL
        u, err := url.Parse(arg)
        // NOTE: sometimes, a local path can also be parsed as a url. To avoid thinking it's a URL, check Scheme, Host, and Path
        if err == nil && u.Scheme != "" && u.Host != "" {
            // Is the argument a URL to blob storage?
            switch host := strings.ToLower(u.Host); true {
            // Azure Stack does not have the core.windows.net
            case strings.Contains(host, ".blob"):
                return common.ELocation.Blob()
            case strings.Contains(host, ".file"):
                return common.ELocation.File()
            case strings.Contains(host, ".dfs"):
                return common.ELocation.BlobFS()
            case strings.Contains(host, benchmarkSourceHost):
                return common.ELocation.Benchmark()
            // enable targeting an emulator/stack
            case IPv4Regex.MatchString(host):
                return common.ELocation.Unknown() // This is what gets returned in case of a storage emulator URL.
            }
            if common.IsS3URL(*u) {
                return common.ELocation.S3()
            }
        }
    }
    return common.ELocation.Local()
}
I had this same issue when trying to do a sync from my local machine to Azure Blob storage.
This was the command I was running:
azcopy sync "C:\AzureStorageTest\my-app\*" "https://myapptest.z16.web.core.windows.net/$web"
But I got the error below:
INFO: The parameters you supplied were Source: 'c:\AzureStorageTest\my-app' of type Local, and Destination: 'https://myapptest.z16.web.core.windows.net/' of type Local
INFO: Based on the parameters supplied, a valid source-destination combination could not automatically be found. Please check the parameters you supplied. If they are correct, please specify an exact source and destination type using the --from-to switch. Valid values are two-word phases of the form BlobLocal, LocalBlob etc. Use the word 'Blob' for Blob Storage, 'Local' for the local file system, 'File' for Azure Files, and 'BlobFS' for ADLS Gen2. If you need a combination that is not supported yet, please log an issue on the AzCopy GitHub issues list.
error parsing the input given by the user. Failed with error Unable to infer the source 'C:\AzureStorageTest\my-app' / destination 'https://myapptest.z16.web.core.windows.net.z16.web.core.windows.net/'.
PS C:\Users\promise> azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net.z16.web.core.windows.net/$web" -from-to localblob
Error: unknown shorthand flag: 'f' in -from-to
Here's how I solved it:
I was missing the ?[SAS] argument at the end of the Blob storage location. Also, as of this writing, the azcopy sync command does not seem to support the --from-to switch. So instead of this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.z16.web.core.windows.net/$web"
I had this:
azcopy sync "C:\AzureStorageTest\my-app" "https://myapptest.blob.core.windows.net/%24web?[SAS]" --recursive
Note:
The format is azcopy sync "/path/to/dir" "https://[account].blob.core.windows.net/[container]/[path/to/directory]?[SAS]" --recursive. You only need to modify "/path/to/dir", [account] and [container]/[path/to/directory]; everything else stays the same (see the sketch after these notes).
My actual Blob storage location is https://myapptest.blob.core.windows.net/$web, but I used https://myapptest.blob.core.windows.net/%24web instead, because the $ character throws an error when used directly, so it is URL-encoded as %24.
Don't use the blob website address (https://myapptest.z16.web.core.windows.net/) like I did; that got me frustrated with lots of errors. Use the blob storage address (https://myapptest.blob.core.windows.net) instead.
The --recursive flag is already inferred in this directory sync operation, so you may consider leaving it out. I included it for clarity's sake.
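As a sketch of that format wired into a deployment script (the account, container and SAS values here are placeholders, not real ones):
# Placeholder values - substitute your own storage account, container and SAS token
$account   = "myapptest"
$container = "%24web"                       # URL-encoded name of the $web container
$sas       = "<paste your SAS token here>"
$dest      = "https://{0}.blob.core.windows.net/{1}?{2}" -f $account, $container, $sas
azcopy sync "C:\AzureStorageTest\my-app" $dest --recursive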
That's all.
I hope this helps
OK, after much futzing I was finally able to get this to work using Azurite and PowerShell. It's clear that neither the Azure CLI nor AzCopy is well-tested under emulation.
Here's a rough-and-tumble script that can be called from a pipeline:
[CmdletBinding()]
param(
    [Parameter(Mandatory)][string] $Container,
    [Parameter(Mandatory)][string] $Source
)
# Connect to the local (Azurite) storage emulator
$Context = New-AzureStorageContext -Local
# Names of blobs already in the container
$BlobNames = Get-AzureStorageBlob -Context $Context -Container $Container | % { $_.Name }
# Always sync the Squirrel metadata files
$FilesToSync = gci $Source\* -Include RELEASES, Setup.exe
# Only upload packages that aren't already in the container
$Packages = gci $Source -Filter *.nupkg
$Packages | % {
    If (!($BlobNames.Contains($_.Name))) {
        $FilesToSync += $_
    }
}
$FilesToSync | Set-AzureStorageBlobContent -Context $Context -Container $Container -Force
Note that this is highly customized for my Squirrel deployments (*.nupkg, RELEASES, Setup.exe), so you'll want to adjust it for your own environment.
Azurite can be set to always-on using a Scheduled Task to run this command every hour:
powershell -command "Start-Process azurite-blob.cmd -PassThru -ArgumentList '--blobHost 0.0.0.0'"
The argument sets Azurite to listen on any IP so that it can be reached from other computers on the network. I punched a hole in the firewall for ports 10000-10002.
Be careful to set the Task to run under the same account that was used to install Azurite, otherwise the Task won't be able to see azurite-blob.cmd (it's in %AppData%\npm, which is added to PATH during installation).
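If you'd rather script that setup than click through Task Scheduler, here's a rough sketch using the ScheduledTasks and NetSecurity cmdlets (available on Server 2012/Windows 8 and later; the task name and repetition window are my own assumptions):
# Register an hourly task that launches Azurite (run this as the account that installed Azurite)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-Command Start-Process azurite-blob.cmd -PassThru -ArgumentList '--blobHost 0.0.0.0'"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 1) -RepetitionDuration (New-TimeSpan -Days 365)
Register-ScheduledTask -TaskName "Azurite" -Action $action -Trigger $trigger
# Punch the firewall hole for the Azurite ports
New-NetFirewallRule -DisplayName "Azurite" -Direction Inbound -Protocol TCP -LocalPort 10000-10002 -Action Allow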
The command line needs the --recursive option. See https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs?toc=/azure/storage/blobs/toc.json

Transfer file from Windows to Linux without using 3rd party software and using Shell.Application only

How can I transfer a file from Windows Server to Linux without using 3rd-party software? I can only use a pure PowerShell script to transfer a zip file.
I'm using PowerShell v2.0 (I know it's pretty old, but I don't have the privileges to update to a current version; I can only use Shell.Application scripting).
Telnet connects successfully.
The destination server has the private/public key installed (which I generated from my server using PuTTYgen), but I have no privileges to install PuTTY or WinSCP.
$timestamp = (Get-Date).AddMonths(-1).ToString('yyyy-MM-dd')
$todaysDate = (Get-Date).AddDays(-1)
$source = "D:\Testing\*.csv", "D:\Testing\*.csv"
$target = "D:\Testing\bin\$timestamp.zip"
$housekeepZipFile = "D:\Testing\bin\*.zip"
$locationToTransfer = "D:\Testing\bin\*.zip"
$mftFileTransfer = "UserName#192.168.0.50:/UserName/Outbox"
Get-ChildItem -Path $locationToTransfer -Recurse | Where-Object {
    $_.LastWriteTime -gt (Get-Date).AddDays(-1)
} | Copy-Item -Destination $mftFileTransfer -Force
Is my syntax correct? I just tried it, but it seems no file is received.
I'm using Windows Server 2008.
As Ansgar already commented, keys are used with SSH/SFTP. There's no built-in support for SSH/SFTP in PowerShell or in Windows Server 2008. If you need to use SSH/SFTP, you have to use 3rd-party software or a library.
And as already said above, you do not need install privileges to use WinSCP or PuTTY/psftp.
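To illustrate that last point: the standalone pscp.exe (no installation, just a copied binary) can be driven from PowerShell v2 with the existing .ppk key. A sketch, where the tool path, key path and host are placeholders:
# pscp.exe is the standalone PuTTY secure copy client - no install required
$pscp = "D:\Tools\pscp.exe"
& $pscp -batch -i "D:\Keys\mykey.ppk" "D:\Testing\bin\*.zip" "UserName@192.168.0.50:/UserName/Outbox"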

Clearing Azure Redis Cache using PowerShell during deployment

When deploying new versions of our web application to Azure App Service, I have a requirement to clear out the data in the associated Azure Redis Cache. This is to ensure that we don't return old versions of items which have schema changes in the new version.
We're deploying using Octopus Deploy, and I have previously tried executing the following PowerShell command to Reset the cache:
Reset-AzureRmRedisCache -ResourceGroupName "$ResourceGroup" -Name "$PrimaryCacheName" -RebootType "AllNodes" -Force
This works successfully but it's a bit heavy-handed and we're having intermittent connection issues which I suspect are caused by the fact that we're rebooting Redis and dropping existing connections.
Ideally, I'd just like to execute a FLUSHALL command via PowerShell. Is this a better approach, and is it possible to execute in PowerShell using the StackExchange.Redis library?
The Reset-AzureRmRedisCache cmdlet restarts nodes of an Azure Redis Cache instance, which I agree is a bit of an overkill for your requirement.
Yes, it is possible to execute a Redis FLUSHALL command in PowerShell.
As a prerequisite, you should install the Redis CLI and make sure the redis-cli executable is on your PATH (or set an environment variable pointing to its location).
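For example, assuming redis-cli.exe has been unzipped to C:\tools\redis (a placeholder path), adding it to the current session's PATH could look like:
# Make redis-cli discoverable for this PowerShell session only
$env:Path += ";C:\tools\redis"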
Then you can execute the Redis CLI commands from PowerShell as shown below.
Invoke-Command -ScriptBlock { redis-cli -h <hostname>.redis.cache.windows.net -p <redisPort> -a <password> }
Invoke-Command -ScriptBlock { redis-cli flushall }
The way I eventually implemented this is to call the StackExchange.Redis library via PowerShell, so you'll need to have a copy of this DLL somewhere handy. During my deployment I have access to the connection string, and this function strips out the host and port to connect to the server. It works without needing to open the non-SSL port, and the connection string allows admin access to the cache:
function FlushCache($RedisConnString)
{
    # Extract the Host/Port from the start of the connection string (ignore the remainder)
    # e.g. MyUrl.net:6380,password=abc123,ssl=True,abortConnect=False
    $hostAndPort = $RedisConnString.Substring(0, $RedisConnString.IndexOf(","))
    # Split the Host and Port e.g. "MyUrl.net:6380" --> ["MyUrl.net", "6380"]
    $RedisCacheHost, $RedisCachePort = $hostAndPort.split(':')
    Write-Host "Flushing cache on host - $RedisCacheHost - Port $RedisCachePort" -ForegroundColor Yellow
    # Add the Redis type from the assembly
    $asm = [System.Reflection.Assembly]::LoadFile("StackExchange.Redis.dll")
    # Open a connection
    [object]$redis_cache = [StackExchange.Redis.ConnectionMultiplexer]::Connect("$RedisConnString,allowAdmin=true", $null)
    # Flush the cache
    $redisServer = $redis_cache.GetServer($RedisCacheHost, $RedisCachePort, $null)
    $redisServer.FlushAllDatabases()
    # Dispose connection
    $redis_cache.Dispose()
    Write-Host "Cache flush done" -ForegroundColor Yellow
}
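Calling it from the deployment step is then a one-liner (the connection string below is the illustrative one from the comment above):
# Example invocation - pass the connection string your deployment already has access to
FlushCache "MyUrl.net:6380,password=abc123,ssl=True,abortConnect=False"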
I have used the Windows port of netcat to clear Redis cache remotely from my Windows machine, like so:
$redisCommands = "SELECT $redisDBIndex`r`nFLUSHDB`r`nQUIT`r`n"
$redisCommands | .\nc $redisServer 6379
Where $redisDBIndex is the index of the Redis database you want to clear (or simply use the FLUSHALL command if you want to clear everything) and $redisServer is your Redis server. Simply pipe the commands to nc.
I have also documented it here: https://jaeyow.github.io/fullstack-developer/automate-redis-cache-flush-in-powershell/#
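If netcat isn't available on the machine running the deployment, the same inline commands can be pushed over a raw TCP socket straight from PowerShell. A sketch, reusing $redisServer and $redisDBIndex from above and assuming the non-SSL port 6379 is enabled; $redisPassword is a placeholder for the cache access key:
# Fire-and-forget: send inline Redis commands over TCP without reading the replies
$client = New-Object System.Net.Sockets.TcpClient($redisServer, 6379)
$writer = New-Object System.IO.StreamWriter($client.GetStream())
$writer.NewLine = "`r`n"
$writer.WriteLine("AUTH $redisPassword")
$writer.WriteLine("SELECT $redisDBIndex")
$writer.WriteLine("FLUSHDB")
$writer.WriteLine("QUIT")
$writer.Flush()
$client.Close()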

OctopusDSC module in Azure DSC not found

I'm trying to install an Octopus Tentacle as part of an Azure deploy using the PowerShell DSC extension.
I've installed OctopusDSC under the Automation account and it appears in the module list:
ResourceGroupName : RESOURCEGROUP
AutomationAccountName : AUTOMATIONUSER
Name : OctopusDSC
IsGlobal : False
Version :
SizeInBytes : 0
ActivityCount : 0
CreationTime : 22/02/2017 14:03:07 +00:00
LastModifiedTime : 22/02/2017 14:04:42 +00:00
ProvisioningState : Succeeded
I've then created a PowerShell script with a basic install that tries to import the module (first few lines below):
Configuration installoctopus
{
Import-DscResource -ModuleName OctopusDSC
But then I get the error during deployment:
Unable to load resource 'OctopusDSC': Resource not found.\r\n\r\nAt C:\Packages\Plugins\Microsoft.Powershell.DSC\2.22.0.0\DSCWork\installoctopus2.0\installoctopus2.ps1:8 char:7\r\n+ cTentacleAgent OctopusTentacle\r\n+
I've tried Import-DscResource -Module OctopusDSC as well as Import-DscResource -Module *, but I get the same errors.
One of the first parts of the OctopusDSC documentation says:
First, ensure the OctopusDSC module is on your $env:PSModulePath. Then you can create and apply configuration like this.
but I didn't have to do this for the cChoco DSC module, which works fine (and I'm unsure how to do it as part of a DSC configuration anyway). Is this a different type of module that requires extra import options? Is it actually a PowerShell module that needs to be on the guest VM, despite being listed in the Azure Automation modules?
The OctopusDSC resource needs to be on the guest VM for the Import-DscResource -ModuleName OctopusDSC call to succeed there, so make sure it's in the ZIP file that contains your configuration script.
The easiest way to get all the needed resources into the zip file is to create it with the Publish-AzureRMVMDSCConfiguration cmdlet and use its OutputArchivePath param. But for that cmdlet to find the module, it must be on $env:PSModulePath on the machine where you run the cmdlet. So: 1) install OctopusDSC into a PSModulePath location on the "build" machine, and then 2) run the cmdlet.
Alternatively, you can manually add the OctopusDSC module to the zip file. Usually this just means putting the module folder in the zip, but depending on the resource it can mean more than that (I don't know of a good doc on creating the archive manually); still, it's trivial to try this route and see if it works.
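A rough sketch of that two-step flow on the build machine (paths and the configuration file name are illustrative):
# 1) Put OctopusDSC somewhere on $env:PSModulePath on the build machine
Install-Module OctopusDSC -Scope CurrentUser -Force
# 2) Build the configuration archive - DSC resource modules found on PSModulePath get bundled in
Publish-AzureRmVMDscConfiguration -ConfigurationPath .\installoctopus.ps1 -OutputArchivePath .\installoctopus.ps1.zip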

Chef WebPI cookbook fails install in Azure

I set up a new Win2012 VM in Azure with the Chef plugin and connected it to manage.chef.io. I added a cookbook which uses the WebPI cookbook to install Service Bus and its dependencies. The install fails with the following error:
“Error opening installation log file. Verify that the specified log file location exists and is writable.”
After some searching, it looks like this is not new in Azure, based on this 2013 blog post: https://nemetht.wordpress.com/2013/02/27/web-platform-installer-in-windows-azure-startup-tasks/
It offers a hack to temporarily disable security on the folder, but I'm looking for a better solution.
Any ideas?
More of the log output:
Started installing: 'Microsoft Windows Fabric V1 RTM'
.
Install completed (Failure): 'Microsoft Windows Fabric V1 RTM'
.
WindowsFabric_1_0_960_0 : Failed.
Error opening installation log file. Verify that the specified log file location exists and is writable.
DependencyFailed: Microsoft Windows Fabric V1 CU1
DependencyFailed: Windows Azure Pack: Service Bus 1.1
.
..
Verifying successful installation...
Microsoft Visual C++ 2012 SP1 Redistributable Package (x64) True
Microsoft Windows Fabric V1 RTM False
Log Location: C:\Windows\system32\config\systemprofile\AppData\Local\Microsoft\Web Platform Installer\logs\install\2015-05-11T14.15.51\WindowsFabric.txt
Microsoft Windows Fabric V1 CU1 False
Windows Azure Pack: Service Bus 1.1 False
Install of Products: FAILURE
STDERR:
---- End output of "WebpiCmd.exe" /Install /products:ServiceBus_1_1 /suppressreboot /accepteula /Log:c:/chef/cache/WebPI.log ----
Ran "WebpiCmd.exe" /Install /products:ServiceBus_1_1 /suppressreboot /accepteula /Log:c:/chef/cache/WebPI.log returned -1
A Chef contact (thanks Bryan!) helped me understand this issue better. Some WebPI packages do not respect the explicit log path provided to WebPIcmd.exe. The author should fix the package to use the provided log path when it is set. So the options became:
1. Have the author fix the package.
2. Run Chef in a new scheduled task as a different user which has access to the AppData folder.
3. Edit the cookbook to perform/unperform a registry edit that temporarily moves the AppData folder to a location the System user has access to (either in my custom cookbook or in a fork of the WebPI cookbook).
Obviously, waiting on the author (Microsoft in this case) to fix the package would not happen quickly.
Changing how the Azure VM runs Chef doesn't make sense, considering the whole idea is to provide the configuration at provisioning time and have it just work. Plus, changing the default setup may have unintended consequences and puts us in a non-standard environment.
In the short term, I decided to alter the registry in my custom cookbook.
registry_key 'HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders' do
  values [{
    :name => "Local AppData",
    :type => :expand_string,
    :data => "%~dp0appdata"
  }]
  action :create
end
webpi_product 'ServiceBus_1_1' do
  accept_eula true
  action :install
end
webpi_product 'ServiceBus_1_1_CU1' do
  accept_eula true
  action :install
end
registry_key 'HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders' do
  values [{
    :name => "Local AppData",
    :type => :expand_string,
    :data => '%%USERPROFILE%%\AppData\Local'
  }]
end
This change could also be made in the WebPI cookbook itself to fix this issue for all dependent cookbooks. I decided not to pursue that until the WebPI team responds to a feature request for the framework to verify that packages respect the log path:
http://forums.iis.net/t/1225061.aspx?WebPI+Feature+Request+Validate+product+package+log+path+usage
Please go and reply to this thread to try to get the team to help protect against this common package issue.
Here is the solution with PowerShell.
I had the same error while installing the Service Fabric SDK during VMSS VM creation; the SYSTEM user was used there as well.
Issue: when I connected with RDP as my "admin" user and ran the same install, it worked.
Solution: change the registry entry as above, install, then reset it back.
Here is my solution using PowerShell.
I installed two .reg files into the %TEMP% folder. They contain the original and the temporary exported key/value (both are hex(2), i.e. REG_EXPAND_SZ: the temporary one decodes to %TEMP%, the original to %USERPROFILE%\AppData\Local):
plugin-sf-SDK-temp.reg
Windows Registry Editor Version 5.00
[HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Local AppData"=hex(2):25,00,54,00,45,00,4d,00,50,00,25,00,00,00
plugin-sf-SDK-orig.reg
Windows Registry Editor Version 5.00
[HKEY_USERS\.DEFAULT\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Local AppData"=hex(2):25,00,55,00,53,00,45,00,52,00,50,00,52,00,4f,00,46,00,\
49,00,4c,00,45,00,25,00,5c,00,41,00,70,00,70,00,44,00,61,00,74,00,61,00,5c,\
00,4c,00,6f,00,63,00,61,00,6c,00,00,00
Integrate the following code into your custom PowerShell script:
Write-Output "Reset LocalApp Folder to TEMP"
Start-Process "$($env:windir)\regedit.exe" `
-ArgumentList "/s", "$($env:TEMP)\plugin-sf-SDK-temp.reg"
## replace the following lines with your installation - here my SF SDK installation via WebWPIcmd
Write-Output "Installing /Products:MicrosoftAzure-ServiceFabric-CoreSDK"
Start-Process "$($env:programfiles)\microsoft\web platform installer\WebPICMD.exe" `
-ArgumentList '/Install', `
'/Products:"MicrosoftAzure-ServiceFabric-CoreSDK"', `
'/AcceptEULA', "/Log:$($env:TEMP)\WebPICMD-install-service-fabric-sdk.log" `
-NoNewWindow -Wait `
-RedirectStandardOutput "$($env:TEMP)\WebPICMD.log" `
-RedirectStandardError "$($env:TEMP)\WebPICMD.error.log"
Write-Output "Reset LocalApp Folder to ORIG"
Start-Process "$($env:windir)\regedit.exe" `
-ArgumentList "/s", "$($env:TEMP)\plugin-sf-SDK-orig.reg"
