How to upload to Azure blob storage from a DataSet or a DataTable - azure

I am currently working in PowerShell, trying to get some data from my Azure SQL database. I have successfully fetched some data into a DataSet. However, I cannot figure out how to upload it to Azure Blob Storage without first saving it locally as a CSV.
The DataSet must be converted to CSV and uploaded to the blob as a CSV without being saved locally.
This is what I have so far:
$SQLServer = "xxxxxxx"
$SQLDBName = "xxxxxx"
$uid ="xxxxxxxx"
$pwd = "xxxxxxx"
$SqlQuery = "SELECT * from Dim.xxxxxx;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server = $SQLServer; Database = $SQLDBName; Integrated Security = False; User ID = $uid; Password = $pwd;"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $SqlQuery
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$csv = $DataSet.Tables[0] | ConvertTo-Csv -Delimiter ";" -NoTypeInformation
Set-AzStorageBlobContent -File $csv -Context $context -Container "xxxxxx"
However, the last line gives me this error:
Set-AzStorageBlobContent : Cannot convert 'System.Object[]' to the
type 'System.String' required by parameter 'File'. Specified method is
not supported.
I know I'm doing something wrong, but I cannot figure out how to convert the DataSet and upload it at the same time. Or maybe there is another way?

According to the documentation for Set-AzStorageBlobContent, this is not possible:
The Set-AzStorageBlobContent cmdlet uploads a local file to an Azure Storage blob.
Source: https://learn.microsoft.com/en-us/powershell/module/az.storage/set-azstorageblobcontent?view=azps-2.8.0
The reason you are receiving that error message is that the command expects a file name, as a string, for the -File parameter, not the content of the blob. Even if you converted the Object[] to a String, it still would not work, as the command would then try to find a file at that path.
I recommend you use the Blob Storage REST API to achieve this, in particular the Put Blob operation. You will have to craft an HTTP request yourself.
The other option is to use the Blob Storage .NET API, as you are able to use .NET classes from PowerShell.
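For example, here is a rough sketch of the Put Blob (REST) approach with Invoke-RestMethod; it assumes you already have a SAS token with write permission on the container, and the account, container and token values below are placeholders:
$csvText = ($DataSet.Tables[0] | ConvertTo-Csv -Delimiter ";" -NoTypeInformation) -join "`r`n"
$sasToken = "?sv=..."   # hypothetical SAS token with write (w) permission
$blobUri = "https://<storageaccount>.blob.core.windows.net/<container>/dataset.csv$sasToken"
# Put Blob requires the x-ms-blob-type header; the request body is simply the CSV text
Invoke-RestMethod -Method Put -Uri $blobUri -Headers @{ "x-ms-blob-type" = "BlockBlob" } -Body $csvText -ContentType "text/csv"
This avoids touching the local disk entirely, at the cost of handling authentication (here via SAS) yourself.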

Just let Databricks manage it. The SQL Data Warehouse connector for Databricks will handle the intermediate storage for you: load your dataframe, then write to the DW using the connector.
https://docs.databricks.com/data/data-sources/azure/sql-data-warehouse.html

A solution I found was to create a temporary file in PowerShell instead.
First I declare a variable with New-TemporaryFile. After this I export my DataSet as CSV to the $file variable. Once that is done, I can upload it to my Azure Blob Storage.
So the solution is:
$file = New-TemporaryFile
$DataSet.Tables[0] | Export-Csv -Path $file -Delimiter ";" -NoTypeInformation
Set-AzStorageBlobContent -File $file -Container "xxxxxx" -Context $context -Blob "dataset" -Force
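If you want to tidy up afterwards, the temporary file can be removed once the upload has completed, for example:
Remove-Item -Path $file -Force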

Related

Export Azure key vault secrets as json list (or file)

Azure PowerShell - how to create a JSON file with Azure Key Vault secrets
(I know how to read the secrets with PowerShell.) What I don't know is how to put the key/value pairs together and export them as a file.
So, given I have this
secret1Name - secret1Value
secret2Name - secret2Value
secret3Name - secret3Value
I need a file saved to the file system
{
"secret1Name": "secret1Value",
"secret2Name": "secret2Value",
"secret3Name": "secret3Value"
}
I found that there is something like this for reading from a file
$globalParametersJson = Get-Content $globalParametersFilePath
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)
And I think I need help doing the same for writing a file.
Can anyone help?
I have tried to reproduce your scenario and received the expected results.
What I understood from your question is that you want to write the secrets to a file, so below is the answer for that.
First, I created an empty JSON file, copied its path, and followed the Microsoft documentation.
Then I executed the below script:
$y="C:\Users\vs\emo.json"
$secret = Get-AzKeyVaultSecret -VaultName "rithkey"
$secretnames=$secret.Name
$Target = @()
$result = New-Object -TypeName PSObject
foreach($em in $secretnames )
{
$Target=Get-AzKeyVaultSecret -VaultName rithkey -Name $em
$x=Get-AzKeyVaultSecret -VaultName rithkey -AsPlainText -Name $em
$result | Add-Member -Type NoteProperty -Name $Target.Name -Value $x
}
$result | ConvertTo-Json | Set-Content $y
Now we can check the emo.json file, since we used Set-Content to write the output to it.

Read file data from Azure Storage FileShare

I am trying to read data from the files (JSON files) that exist in my Azure Storage file share. I don't want to download the file; I just want to read and get the data. The following command downloads the file for me instead of reading the data and storing it in a variable. Please help.
$fileContents = Get-AzStorageFileContent -ShareName $fileShareName -Path $filePath -Context (Get-AzStorageAccount -ResourceGroupName $ResourceGroupName -Name "name").Context -OutVariable fileContents
How can I read the data, and potentially post files to the file share as well? I have been following this documentation but could not find a solution.
I tried this in my environment and got the results below.
Initially I tried the commands below, which read the file from the Azure file share by downloading it first; that worked.
Commands:
$accountName = "venkat123"
$accountKey = "<storage account key >"
$fileshareName = "fileshare1"
$Name = "directory1/file.json"
$content="<downloaded path\file.json>"
$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
Get-AzStorageFileContent -ShareName $fileShareName -Path $Name -Destination "file.json" -Context $ctx
Get-content -path $content
As a workaround, if you need to read the files in the Azure file share without downloading them, you can use the Python code below.
Code:
from azure.storage.file import FileService
storageAccount='venkat123'
accountKey='xxxxxxx'
file_service = FileService(account_name=storageAccount, account_key=accountKey)
share_name="fileshare1"
directory_name="directory1"
file_name="file.json"
file = file_service.get_file_to_text(share_name, directory_name, file_name)
print(file.content)
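If you would rather stay in PowerShell, one option is the same temporary-file trick used in the blob upload answer above: download to a throwaway file, read it into a variable, then delete the file. A rough sketch, reusing the share, path and context variables defined above:
$tempFile = New-TemporaryFile
Get-AzStorageFileContent -ShareName $fileshareName -Path $Name -Destination $tempFile.FullName -Context $ctx -Force
$fileContents = Get-Content -Path $tempFile.FullName -Raw | ConvertFrom-Json
Remove-Item -Path $tempFile.FullName -Force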

Parse Excel variables into PowerShell

I am trying to create a PowerShell script that will grab the variables from an Excel sheet and then add them to the PowerShell command.
In the Excel sheet there are 3 columns whose data I am interested in (Name, ResourceGroup, Location).
For each line, I want to parse those values into the variable fields of the PowerShell command.
I have created the PowerShell to do what I need, but it would be better if it could loop through and pull this data, as I am currently just re-running the command with different machine info manually copied from the Excel sheet.
With @Theo's help, I am working with this version of the script now:
Import-Csv -Path 'c:\scripts\vmtest.csv' | ForEach-Object {
    # get the VM first, so its properties can be used below
    $vm = Get-AzVM -ResourceGroupName $_.ResourceGroup -Name $_.Name
    # combine the VMName with suffix '-Snapshot'
    $snapshotName = $vm.Name + "-Snapshot"
    $SnapshotStorage = "Azure-Snapshots"
    # using splatting for better readability
    $configParams = @{
        SourceUri    = $vm.StorageProfile.OsDisk.ManagedDisk.Id
        Location     = $_.Location
        CreateOption = 'copy'
    }
    $snapshot = New-AzSnapshotConfig @configParams
    New-AzSnapshot -Snapshot $snapshot -SnapshotName $snapshotName -ResourceGroupName $SnapshotStorage
}
If, as you have commented, you now have the data stored in a CSV file that might look something like this:
Name,ResourceGroup,Location
PRD-ITM001,SJAVIRTUALMACHINES,uksouth
TST-GRSSQL001,SJAVIRTUALMACHINES,uksouth
it has become very simple to import that data and loop through the records like below:
Import-Csv -Path 'c:\scripts\vmtest.csv' | ForEach-Object {
    # combine the VMName with suffix '-Snapshot'
    $snapshotName = '{0}-Snapshot' -f $_.Name
    $SnapshotStorage = "Azure-Snapshots"
    $vm = Get-AzVM -ResourceGroupName $_.ResourceGroup -Name $_.Name
    # using splatting for better readability
    $configParams = @{
        SourceUri    = $vm.StorageProfile.OsDisk.ManagedDisk.Id
        Location     = $_.Location
        CreateOption = 'copy'
    }
    $snapshot = New-AzSnapshotConfig @configParams
    New-AzSnapshot -Snapshot $snapshot -SnapshotName $snapshotName -ResourceGroupName $_.ResourceGroup
}
Note that the above code assumes your CSV uses the (default) comma as the delimiter character. If in your case this is some other character, append the -Delimiter parameter followed by the character your CSV uses.
Inside a ForEach-Object {..} loop, the $_ automatic variable references the current record from the CSV.
I used splatting for better readability of the code. This helps with cmdlets that take a long list of parameters and eliminates the use of the backtick.
Based on the shared requirement, we understood that you want to pull the ResourceGroupName and VMName values from the Excel sheet and use those values further in the script.
Using the PSExcel module, we have written the below PowerShell script, which pulls the ResourceGroupName and VMName from Excel and runs the Get-AzVM cmdlet.
Before running the script, run Connect-AzAccount followed by the Save-AzContext cmdlet, which saves the current authentication information for use in other PowerShell sessions:
Connect-AzAccount
Save-AzContext -Path C:\test.json
Here is the PowerShell script:
$currentDir = "C:\Program Files\WindowsPowerShell\Modules" ##pass the path of the PSexcel Module
Import-Module $currentDir"\PSExcel"
Import-AzContext -Path C:\test.json ##passing the azcontext file path which was saved earlier
$ExcelFile = "Give here the path of the current folder where scripts are stored"
$objExcel = New-Excel -Path $ExcelFile
$WorkBook = $objExcel|Get-Workbook
ForEach($Worksheet in @($Workbook.Worksheets)){
$totalNoOfRecords = $Worksheet.Dimension.Rows
$totalNoOfItems = $totalNoOfRecords-1
# Declare the starting positions first row and column names
$rowNo,$colResourceGroupName = 1,1
$rowNo,$colVMName = 1,2
if ($totalNoOfRecords -gt 1){
#Loop to get values from excel file
for($i=1;$i -le ($totalNoOfRecords-1);$i++){
$ResourceGroupName=$Worksheet.Cells.Item($rowNo+$i,$colResourceGroupName).Value
$VMName=$Worksheet.Cells.Item($rowNo+$i,$colVMName).Value
Get-AzVM -ResourceGroupName $ResourceGroupName -Name $VMName |select -Property Name,ResourceGroupName,Location
}
}
}
For more information, refer to this blog post on how to read an Excel file using the PSExcel module in PowerShell.

Write-SqlTableData to import a CSV into an Azure SQL Server table

The below command creates a new table, test, for me, but it doesn't insert any data into it.
Write-SqlTableData -TableName test -ServerInstance myservername -DatabaseName mydb -SchemaName dbo -Credential $mycreds -InputData $data -Force
This is the error message I get:
Write-SqlTableData : Cannot access destination table '[Mydb].[dbo].
[test]'.
At line:1 char:1
+ Write-SqlTableData -TableName test -ServerInstance myinstance
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: ([dbo].[test]:Table) [Write-SqlTableData], InvalidOperationException
+ FullyQualifiedErrorId : WriteToTableFailure,Microsoft.SqlServer.Management.PowerShell.WriteSqlTableData
Any ideas are appreciated.
UPDATE
This is the code to populate data.
$data = import-csv 'C:\Users\azure-user\myfile.csv'
This is what the file looks like -
"State","ProviderNbr","Entity","Address","City","Zip","Phone","Ownership","StarRating"
"AL","017000","ALABAMA DEPARTMENT OF HEALTH HOME CARE","201 MONROE STREET, SUITE 1200", "ALABAMA", "32423", "3233233233", "Alabama", "4"
This is a weird one - as you say, in Azure Read-SqlTableData works but Write-SqlTableData doesn't. From the discussion on MSDN here, I think it's something to do with the Azure environment making it hard for the cmdlet to interpret the 'ServerInstance' parameter.
Example 5 in Microsoft's Write-SqlTableData documentation, "Write data to an existing table of an Azure SQL Database", shows the way forward - we need to instantiate an SMO reference to the table and feed that to the cmdlet instead. Unfortunately the example Microsoft gives contains a small error (you can't do $table = $db.Tables["MyTable1"] to get the table; it doesn't work).
Here's a modified version of that example:
$csvPath = "C:\Temp\mycsv.csv"
$csvDelimiter = ","
# Set your connection string to Azure SQL DB.
$connString = 'Data Source=server;Persist Security Info=True'
$cred = Get-Credential -Message "Enter your SQL Auth credentials"
$cred.Password.MakeReadOnly()
$sqlcred = New-Object -TypeName System.Data.SqlClient.SqlCredential -ArgumentList $cred.UserName,$cred.Password
# Create a SqlConnection and finally get access to the SMO Server object.
$sqlcc = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $connString,$sqlcred
$sc = New-Object -TypeName Microsoft.SqlServer.Management.Common.ServerConnection -ArgumentList $sqlcc
$srv = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $sc
# Get access to table 'MyTable1' on database 'MyDB'.
# Note: both objects are assumed to exist already.
$db = $srv.Databases["MyDB"]
$tableSmo = $db.Tables | Where-Object {$_.Name -eq "MyTable1"}
# leading comma makes an array with one item
# this makes PowerShell pass the entire contents of the file directly to the Write-SqlTableData cmdlet, which in turn can do a bulk-insert
, (Import-Csv -Path $csvPath -Delimiter $csvDelimiter) | Write-SqlTableData -InputObject $tableSmo
$sc.Disconnect()
If you're using integrated security you can leave out all the $cred and $sqlcred parts and just create the SqlConnection using $sqlcc = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $connString
Note: this worked for me with "a SQL Server running on a VM in Azure", in that I was having the same error as you ('cannot access destination table') and this approach fixed it. I haven't tested it with an "Azure SQL Database", i.e. SQL Server as a service, but from the Microsoft documentation it sounds like this should work.
According to my test, we cannot use the command Write-SqlTableData to import a CSV file into Azure SQL; we can only use it to import a CSV file into an on-premises SQL Server.
So if you want to import a CSV file into Azure SQL with PowerShell, you can use the Invoke-SqlCmd command to insert the records one by one. For example:
$Database = ''
$Server = '.database.windows.net'
$UserName = ''
$Password = ''
$CSVFileName = ''
$text = "CREATE TABLE [dbo].[Colors2](
[id] [int] NULL,
[value] [nvarchar](30) NULL
) "
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query $text
$CSVImport = Import-CSV -Path $CSVFileName
ForEach ($CSVLine in $CSVImport){
$Id = [int]$CSVLine.Id
$Value = $CSVLine.value
$SQLInsert = "INSERT INTO [dbo].[Colors2] (id, value)
VALUES('$Id', '$Value');"
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query $SQLInsert
}
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query "select * from [dbo].[Colors2]"
Besides this, you can also use other approaches (such as BULK INSERT) to implement it. For further information, please refer to https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql?view=azuresqldb-current.
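As a rough sketch of the BULK INSERT route (this assumes the CSV has already been staged in a blob container that is publicly readable, or that a database scoped credential has been set up for it; the account, container and file names are placeholders):
$createSource = "CREATE EXTERNAL DATA SOURCE MyBlobSource WITH (TYPE = BLOB_STORAGE, LOCATION = 'https://<account>.blob.core.windows.net/<container>');"
$bulkInsert = "BULK INSERT [dbo].[Colors2] FROM 'colors.csv' WITH (DATA_SOURCE = 'MyBlobSource', FORMAT = 'CSV', FIRSTROW = 2);"
# create the external data source once, then bulk-load the staged CSV (FIRSTROW = 2 skips the header row)
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query $createSource
Invoke-SQLCmd -ServerInstance $Server -Database $Database -Username $UserName -Password $Password -Query $bulkInsert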

Continuous integration and Deploy SSAS tabular to Azure Analysis Services

I am trying to deploy an SSAS tabular model to an Azure Analysis Services server through the MS Build and Release process.
I am able to successfully execute Invoke-ProcessASDatabase, but I am having problems with deploying new objects to the Azure server.
I am using a command line task to deploy the tabular model with the below command:
"C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"
and it fails with this error:
Authentication failed: User ID and Password are required when user interface is not available.
I don't see a way to provide credentials in my command line task.
I was also trying to automate model deployment.
Here is the PowerShell script I wrote; hope this helps.
$msBuildPath = Get-MSBuildToPath
$Microsoft_AnalysisServices_Deployment_Exe_Path = Get-Microsoft_AnalysisServices_Deployment_Exe_Path
# Build the .smproj project
& $msBuildPath $projPath "/p:Configuration=validation" /t:Build
Get-ChildItem $binPath | Copy -Destination $workingFolder -Recurse
$secureStringRecreated = ConvertTo-SecureString -String $AnalysisServerPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($AnalysisServerUserName, $secureStringRecreated)
#$plainText = $cred.GetNetworkCredential().Password
#region begin Update Model.deploymenttargets
# Read Model.deploymenttargets
[xml]$deploymenttargets = Get-Content -Path $deploymenttargetsFilePath
$deploymenttargets.DeploymentTarget.Database = $AnalysisDatabase
$deploymenttargets.DeploymentTarget.Server = $AnalysisServer
$deploymenttargets.DeploymentTarget.ConnectionString = "DataSource=$AnalysisServer;Timeout=0;UID=$AnalysisServerUserName;Password=$AnalysisServerPassword;"
$deploymenttargets.Save($deploymenttargetsFilePath);
#endregion
#region begin Update Model.deploymentoptions
# Read Model.deploymentoptions
[xml]$deploymentoptions = Get-Content -Path $deploymentoptionsFilePath
# Update ProcessingOption to DoNotProcess, otherwise the correct xmla file won't be generated.
$deploymentoptions.Deploymentoptions.ProcessingOption = 'DoNotProcess'
$deploymentoptions.Deploymentoptions.TransactionalDeployment = 'false'
$deploymentoptions.Save($deploymentoptionsFilePath);
#endregion
# Create xmla deployment file.
& $Microsoft_AnalysisServices_Deployment_Exe_Path $asdatabaseFilePath /s:$logFilePath /o:$xmlaFilePath
#region begin Update .xmla
# Add the password to the .xmla file.
$xmladata = Get-Content -Path $xmlaFilePath | ConvertFrom-Json
foreach ($ds in $xmladata.createOrReplace.database.model.dataSources){
$ds.Credential.AuthenticationKind = 'Windows'
$ds.Credential.Username = $AnalysisServerUserName
#Add password property to the object.
$ds.credential | Add-Member -NotePropertyName Password -NotePropertyValue $AnalysisServerPassword
}
$xmladata | ConvertTo-Json -depth 100 | Out-File $xmlaFilePath
#endregion
#Deploy model xmla.
Invoke-ASCmd -InputFile $xmlaFilePath -Server $AnalysisServer -Credential $cred
