How to import a CSV into Azure Table Storage

I am trying to upload a CSV file with 1000+ records and 6 columns (listed below as CsvHeaders) to Azure Table Storage, but the upload ends with an error. Do I need to add PartitionKey and RowKey columns?
The CSV columns (CsvHeaders):
res1
res2
res3
res4
res5
res6
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value[0]
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzStorageTable -Name $tableName -Context $ctx
$CsvContents = Import-Csv -Path $Path
$CsvHeaders = ($CsvContents[0] | Get-Member -MemberType NoteProperty).Name | Where-Object {$_ -ne "RowKey" -and $_ -ne "PartitionKey"}
Foreach ($CsvContent in $CsvContents)
{
    $PartitionKey = $CsvContent.PartitionKey
    $RowKey = $CsvContent.RowKey
    $Entity = New-Object "Microsoft.Azure.Cosmos.Table.DynamicTableEntity" "$PartitionKey", "$RowKey"
    Foreach ($CsvHeader in $CsvHeaders)
    {
        $Value = $CsvContent.$CsvHeader
        $Entity.Properties.Add($CsvHeader, $Value)
    }
    Write-Verbose "Inserting the entity into table storage."
    # Execute is a method of the CloudTable property, not of the Get-AzStorageTable result itself
    $result = $Table.CloudTable.Execute([Microsoft.Azure.Cosmos.Table.TableOperation]::Insert($Entity))
}
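Yes: every entity in Azure Table Storage requires a PartitionKey and a RowKey, so the CSV must either contain those two columns or the script must derive them before the DynamicTableEntity is built. As a minimal sketch (the choice of key values is an assumption; use whichever column is actually unique per row):
# Hypothetical key derivation for a CSV without PartitionKey/RowKey columns:
# a constant partition plus res1 (assumed unique here) as the row key.
$PartitionKey = "CsvImport"
$RowKey = $CsvContent.res1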

Related

Find/list the unused storage accounts in Azure using PowerShell

I am trying to get the list of unused/inactive storage accounts in Azure using PowerShell. The script below returns the storage account name and the last-modified date of all my Azure storage accounts, but I need to list only the unused storage account names, not all of them, so I need to add some condition/filter to achieve this. Please assist me in solving this. Thanks in advance.
It outputs the results into a table detailing the name and last-modified date of your Azure storage accounts.
& {
    foreach ($storageAccount in Get-AzStorageAccount) {
        $storageAccountName = $storageAccount.StorageAccountName
        $resourceGroupName = $storageAccount.ResourceGroupName
        # Get storage account key
        $storageAccountKey = (Get-AzStorageAccountKey -Name $storageAccountName -ResourceGroupName $resourceGroupName).Value[0]
        # Create storage account context using the key above
        $context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
        # Get the last modified date
        $lastModified = Get-AzStorageContainer -Context $context | Sort-Object -Property @{Expression = {$_.LastModified.DateTime}} | Select-Object -Last 1 -ExpandProperty LastModified
        # Collect the information to output to a table when the loop has completed
        New-Object psobject -Property @{
            Name = $storageAccountName;
            LastModified = $lastModified.DateTime;
            ResourceGroupName = $resourceGroupName
        }
    }
} | Format-Table Name, LastModified, ResourceGroupName -AutoSize
I tried to reproduce the same in my environment and got the same result.
By using the same script, I got the storage account name and the last-modified date of the Azure storage accounts.
To get only the unused/inactive storage accounts in Azure using PowerShell, I modified the script as below.
I agree with @Niclas: you need to make use of the Get-Date command.
& {
    foreach ($storageAccount in Get-AzStorageAccount) {
        $storageAccountName = $storageAccount.StorageAccountName
        $resourceGroupName = $storageAccount.ResourceGroupName
        $storageAccountKey = (Get-AzStorageAccountKey -Name $storageAccountName -ResourceGroupName $resourceGroupName).Value[0]
        $context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
        $lastModified = Get-AzStorageContainer -Context $context | Sort-Object -Property @{Expression = {$_.LastModified.DateTime}} | Select-Object -Last 1 -ExpandProperty LastModified
        # Only keep accounts whose newest container is older than the cutoff
        $unusedacc = (Get-Date).AddDays(-10)
        if ($lastModified.DateTime -lt $unusedacc) {
            New-Object psobject -Property @{
                Name = $storageAccountName;
                LastModified = $lastModified.DateTime;
                ResourceGroupName = $resourceGroupName
            }
        }
    }
} | Format-Table Name, LastModified, ResourceGroupName -AutoSize
Note: based on your requirement, you can change the number of days in the line $unusedacc = (Get-Date).AddDays(-10).
If there are no unused storage accounts, it will return empty results.
Use Get-Date
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/get-date
and use Where-Object.
# ADD THIS
$lastModDate = (Get-Date).AddDays(-5).Date
$lastMod = $lastModified | Where-Object { ($_.DateTime).Date -lt $lastModDate }
# If $lastMod.DateTime is NOT empty, then:
if ($lastMod.DateTime) {
    # Write-Host "variable is NOT null " + $storageAccountName # For testing purposes
    # Collect the information to output to a table when the loop has completed
    New-Object psobject -Property @{
        Name = $storageAccountName;
        LastModified = $lastMod.DateTime; # CHANGE THIS
        ResourceGroupName = $resourceGroupName
    }
}
https://www.techielass.com/find-unused-storage-accounts-in-azure/

PowerShell - Return Azure blob size by first split string value

So I am trying to break down the size of the contents of an Azure blob container. I can get how much storage is being used as a whole; what I am trying to do is break it down by the first part of the filename.
And this is my sad attempt. Could someone point me in the right direction, please?
$ResourceGroup = "RG"
$StorageAccountName = "SAN"
$ContainerName = "CN"
$storageAccount = Get-AzStorageAccount `
-ResourceGroupName $ResourceGroup `
-Name $StorageAccountName
$Context = $storageAccount.Context
$Blobs = Get-AzStorageBlob -Container $ContainerName -Context $Context
$length = 0
$Blobs | ForEach-Object {$length = $length + $_.Length }
#$Blobs.Name.Split("_",2)[0]
$Blobs | Select-Object Name, Length
$TotalSize = [math]::Round(($length / 1024 / 1024 / 1024 / 1024),2)
Write-Host "Total Size: $TotalSize Terabytes"
Current Output.
ABCD_History_20221127_110045 9306112
ABCD_History_20221204_110052 11010048
ABCD_History_20221211_110045 10616832
EFGH_20220327_110201 48562176
EFGH_20220403_110159 46596096
Total Size: 29.63 Terabytes
Desired Output
ABCD 30932992
EFGH 95158272
Total Size: 29.63 Terabytes
I have reproduced this in my environment, got the expected results as below, and followed this Microsoft document:
$ResourceGroup = "XX"
$StorageAccountName = "rith"
$ContainerName = "rithwik"
$storageAccount = Get-AzStorageAccount `
-ResourceGroupName $ResourceGroup `
-Name $StorageAccountName
$Context = $storageAccount.Context
$Blobs = Get-AzStorageBlob -Container $ContainerName -Context $Context
$length = 0
$Blobs | ForEach-Object {$length = $length + $_.Length }
$Blobs | Select-Object Name, Length
After that, I used the code below to get the required output:
# Take the 3-character prefix of every blob name (member enumeration applies Substring to each name)
$BlobName = $Blobs.Name.Substring(0,3)
$Target = @()
foreach ($emo in $BlobName)
{
    $bn = $emo
    $x = 0
    # Sum the lengths of all blobs whose names match this prefix
    foreach ($b in $Blobs)
    {
        if ($b.Name -match $bn)
        {
            $x = $x + $b.Length
        }
    }
    $out = $bn + $x
    $Target += $out
}
$Target | Select-Object -Unique
If you want 4 letters in the output, give 4 in the Substring call instead of 3.
If you want a space between the name and the length, use the following in place of $out = $bn + $x in the loop:
$out = $bn + " " + $x
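Alternatively, a shorter sketch (not part of the original answer) that groups on the prefix before the first underscore, which matches the desired output more directly:
# Group blobs by the part of the name before the first "_" and sum each group's bytes
$Blobs | Group-Object { $_.Name.Split('_')[0] } |
    Select-Object Name, @{ Label = 'Bytes'; Expression = { ($_.Group | Measure-Object -Property Length -Sum).Sum } }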

Writing the data to Azure Data Lake Store - PowerShell Scripting

I need to write the data to Azure Data Lake Storage rather than my local D:\ drive. I am fetching the ADF trigger information via PowerShell and want to load the data into an Azure Data Lake container directory rather than into blob storage.
ADF -> PowerShell -> Azure Data Lake
I want to load the data into an Azure Data Lake directory inside a container, as YYYY (folder) -> MM (folder) -> DD (folder) -> data file in .CSV.
Here is my code to write the data to the local machine; I need to convert it to load the data into Data Lake Storage. To hide the username and password I have used a mechanism with a password file and an AES key file.
Any help and suggestions will be appreciated.
CODE :
# 1- Connect to Azure Account
$username = "xyz@abc.com"
$password = Get-Content D:\Powershell\new\passwords\password.txt | ConvertTo-SecureString -Key (Get-Content D:\Powershell\new\passwords\aes.key)
$credential = New-Object System.Management.Automation.PsCredential($username,$password)
#Connect-AzureRmAccount -Credential $credential | out-null
Connect-AzAccount -Credential $credential | out-null
# 2 - Input Area
$subscriptionName = 'Data Analytics'
$resourceGroupName = 'DataLake-Gen2'
$dataFactoryName = 'dna-production-gen2'
# 3 - (All Triggers Information)
$ErrorActionPreference="SilentlyContinue"
Stop-Transcript | out-null
$ErrorActionPreference = "Continue"
Start-Transcript -path D:\Powershell\new\TriggerInfo.txt -append
Get-AzDataFactoryV2Trigger -ResourceGroupName $resourceGroupName -DataFactoryName $dataFactoryName
Stop-Transcript
# read the file as a single, multiline string using the -Raw switch
$triggers = Get-Content "D:\Powershell\new\TriggerInfo.txt" -Raw
# split the text in 'trigger' text blocks on the empty line
# loop through these blocks (skip any possible empty textblock)
$triggers = ($triggers -split '(\r?\n){2,}' | Where-Object {$_ -match '\S'}) | ForEach-Object {
    # and parse the data into hashtables
    $today = Get-Date
    $yesterday = $today.AddDays(-1)
    $data = $_ -replace ':', '=' | ConvertFrom-StringData
    $splat = @{
        ResourceGroupName = $data.ResourceGroupName
        DataFactoryName = $data.DataFactoryName
        TriggerName = $data.TriggerName
        TriggerRunStartedAfter = $yesterday
        TriggerRunStartedBefore = $today
    }
    Get-AzDataFactoryV2TriggerRun @splat
} | Export-Csv -Path 'D:\Powershell\new\Output.csv' -Encoding UTF8 -NoTypeInformation
# 4 - To extract the final output from the Output File.
Import-Csv D:\Powershell\new\Output.csv -DeLimiter "," |
Select-Object 'TriggerRunTimestamp', 'ResourceGroupName','DataFactoryName','TriggerName','TriggerRunId','TriggerType','Status' |
Export-Csv -Path 'D:\Powershell\new\Finalresult.csv' -Encoding UTF8 -NoTypeInformation -Force
Code tried to upload the file from the local system:
$storageAccount = Get-AzStorageAccount -ResourceGroupName "DataLake-Gen2" -AccountName "dna2020gen2"
$ctx = $storageAccount.Context
$filesystemName = "dev"
$dirname = "triggers/"
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory
$localSrcFile = "D:\Powershell\new\passwords\password.txt"
$destPath = $dirname + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force
I am able to upload the file, but I am not able to write the command output to the data lake.
Regarding the issue, please refer to the following script:
$username = "xyz@abc.com"
$password = ConvertTo-SecureString "" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PsCredential($username,$password)
#Connect-AzureRmAccount -Credential $credential | out-null
Connect-AzAccount -Credential $credential
$dataFactoryName=""
$resourceGroupName=""
# get dataFactory triggers
$triggers=Get-AzDataFactoryV2Trigger -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName
$datas = @()
foreach ($trigger in $triggers) {
    # get the trigger run history
    $today = Get-Date
    $yesterday = $today.AddDays(-1)
    $splat = @{
        ResourceGroupName = $trigger.ResourceGroupName
        DataFactoryName = $trigger.DataFactoryName
        TriggerName = $trigger.Name
        TriggerRunStartedAfter = $yesterday
        TriggerRunStartedBefore = $today
    }
    $historys = Get-AzDataFactoryV2TriggerRun @splat
    if ($null -ne $historys) {
        # create data
        foreach ($history in $historys) {
            $obj = [PSCustomObject]@{
                'TriggerRunTimestamp' = $history.TriggerRunTimestamp
                'ResourceGroupName' = $history.ResourceGroupName
                'DataFactoryName' = $history.DataFactoryName
                'TriggerName' = $history.TriggerName
                'TriggerRunId' = $history.TriggerRunId
                'TriggerType' = $history.TriggerType
                'Status' = $history.Status
            }
            # add data to the array
            $datas += $obj
        }
    }
}
# convert data to csv string
$contents =(($datas | ConvertTo-Csv -NoTypeInformation) -join [Environment]::NewLine)
# upload to Azure Data Lake Store Gen2
#1. Create a sas token
$accountName="testadls05"
$fileSystemName="test"
$filePath="data.csv"
$account = Get-AzStorageAccount -ResourceGroupName andywin7 -Name $accountName
$sas= New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
-Permission "racwdlup" -StartTime (Get-Date).AddMinutes(-10) `
-ExpiryTime (Get-Date).AddHours(2) -Context $account.Context
$baseUrl ="https://{0}.dfs.core.windows.net/{1}/{2}{3}" -f $accountName , $fileSystemName, $filePath, $sas
#2. Create file
$endpoint =$baseUrl +"&resource=file"
Invoke-RestMethod -Method Put -Uri $endpoint -Headers @{"Content-Length" = 0} -UseBasicParsing
#3 append data
$endpoint =$baseUrl +"&action=append&position=0"
Invoke-RestMethod -Method Patch -Uri $endpoint -Headers @{"Content-Length" = $contents.Length} -Body $contents -UseBasicParsing
#4 flush data
$endpoint =$baseUrl + ("&action=flush&position={0}" -f $contents.Length)
Invoke-RestMethod -Method Patch -Uri $endpoint -UseBasicParsing
#Check the result (get data)
Invoke-RestMethod -Method Get -Uri $baseUrl -UseBasicParsing
For more details, please refer to the Azure Data Lake Storage Gen2 REST API documentation (the Path - Create and Path - Update operations used above) and the New-AzStorageAccountSASToken documentation.
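An alternative sketch (not part of the original answer, assuming the Az.Storage module is available): write the CSV to a temporary file, then upload it with New-AzDataLakeGen2Item, the same cmdlet the question already uses, into the YYYY/MM/DD folder layout the question asks for. The "triggers" prefix is taken from the question; the other variables come from the script above.
# Sketch: upload the trigger-run CSV into <filesystem>/triggers/YYYY/MM/DD/
$now = Get-Date
$destPath = "triggers/{0:yyyy}/{0:MM}/{0:dd}/TriggerRuns.csv" -f $now
$tempFile = Join-Path $env:TEMP "TriggerRuns.csv"
$datas | Export-Csv -Path $tempFile -Encoding UTF8 -NoTypeInformation
New-AzDataLakeGen2Item -Context $account.Context -FileSystem $fileSystemName -Path $destPath -Source $tempFile -Force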

Allocated/Unused data disk space VM Wise

(Get-AzureRmVM -ResourceGroupName "<rgname>" -Name "<vmname>").StorageProfile.DataDisks
Or
((Get-AzureRmVM -ResourceGroupName "<rgname>" -Name "<vmname>").StorageProfile).DataDisks.DiskSizeGB
But this shows only the size of the data disks. I want to get the allocated and unused data disk space. Please, someone help.
It seems you want to list all the blobs in that storage account. You can use the PowerShell command Get-AzureStorageBlob to list them, like this:
Login-AzureRmAccount
$RGName = "ResourceGroupName"
$SAName = "storageaccountname"
$ConName = "containername"
$Keylist = Get-AzureRmStorageAccountKey -ResourceGroupName $RGName -StorageAccountName $SAName
$Key = $Keylist[0].Value
$Ctx = New-AzureStorageContext -StorageAccountName $SAName -StorageAccountKey $Key
$List = Get-AzureStorageBlob -Blob *.vhd -Container $ConName -Context $Ctx
# Emit one object per blob with its name and lease state
foreach ($Blob in $List) {
    $TempObj = New-Object -TypeName PSCustomObject
    $TempObj | Add-Member -Name BlobName -MemberType NoteProperty -Value $Blob.Name
    $TempObj | Add-Member -Name LeaseState -MemberType NoteProperty -Value $Blob.ICloudBlob.Properties.LeaseState
    $TempObj
}
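If the goal is the allocated size of each unmanaged disk's VHD blob, a small follow-on sketch (assuming the data disks are unmanaged VHDs in this container):
# Show each VHD blob with its allocated length in GB
$List | Select-Object Name, @{ Label = 'SizeGB'; Expression = { [math]::Round($_.Length / 1GB, 2) } }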

How to get size of Azure Container in PowerShell

Similar to this question How to get size of Azure CloudBlobContainer
How can one get the size of an Azure container in PowerShell? I can see a suggested script at https://gallery.technet.microsoft.com/scriptcenter/Get-Billable-Size-of-32175802, but want to know if there is a simpler way to do it in PowerShell.
With Azure PowerShell, you can list all blobs in the container with Get-AzureStorageBlob, using the Container and Context parameters:
$ctx = New-AzureStorageContext -StorageAccountName youraccountname -StorageAccountKey youraccountkey
$blobs = Get-AzureStorageBlob -Container containername -Context $ctx
The output of Get-AzureStorageBlob is an array of AzureStorageBlob objects, each with an ICloudBlob property whose Properties hold the blob length; sum the lengths of all blobs to get the content length of the container.
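For example, a minimal sketch of the summation step (the account and container names above are placeholders):
# Sum the Length of every blob to get the container's total content size in bytes
$totalBytes = ($blobs | Measure-Object -Property Length -Sum).Sum
"Container size: {0:N2} MB" -f ($totalBytes / 1MB)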
The following PowerShell script is a simple translation of the C# code in the accepted answer to the question How to get size of Azure CloudBlobContainer. Hope this suits your needs.
Login-AzureRmAccount
$accountName = "<your storage account name>"
$keyValue = "<your storage account key>"
$containerName = "<your container name>"
$storageCred = New-Object Microsoft.WindowsAzure.Storage.Auth.StorageCredentials ($accountName, $keyValue)
$storageAccount = New-Object Microsoft.WindowsAzure.Storage.CloudStorageAccount ($storageCred, $true)
$container = $storageAccount.CreateCloudBlobClient().GetContainerReference($containerName)
$length = 0
$blobs = $container.ListBlobs($null, $true, [Microsoft.WindowsAzure.Storage.Blob.BlobListingDetails]::None, $null, $null)
$blobs | ForEach-Object {$length = $length + $_.Properties.Length}
$length
Note: the leading Login-AzureRmAccount command loads the necessary .dll for you. If you know the path of "Microsoft.WindowsAzure.Storage.dll", you can replace it with [Reflection.Assembly]::LoadFile("$StorageLibraryPath") | Out-Null. The path is usually like "C:\Program Files\Microsoft SDKs\Azure.NET SDK\v2.7\ToolsRef\Microsoft.WindowsAzure.Storage.dll".
Here's my solution, which I hammered through today. The above examples didn't give me what I wanted, which was (1) a byte sum of all blobs in a container and (2) a list of each blob + path + size, so the results can be compared to a du -b on Linux (the origin).
Login-AzureRmAccount
$ResourceGroupName = ""
$StorageAccountName = ""
$StorageAccountKey = ""
$ContainerName = ""
New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
# Don't NEED the resource group but, without it, the screen fills with red as it searches each RG...
$size = 0
$blobs = Get-AzureRmStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageAccountName -ErrorAction Ignore | Get-AzureStorageBlob -Container $ContainerName
foreach ($blob in $blobs) {$size = $size + $blob.length}
write-host "The container is $size bytes."
$properties = @{Expression={$_.Name};Label="Name";width=180}, @{Expression={$_.Length};Label="Bytes";width=80}
$blobs | ft $properties | Out-String -width 800 | Out-File -Encoding ASCII AzureBlob_files.txt
I then moved the file to Linux to do some flip-flopping of it and of the find output, to create a list of files to feed into blobxfer. A solution to a different problem, but perhaps suitable for your needs as well.
