Writing data to Azure Data Lake Store - PowerShell scripting - Azure

I need to write the data to Azure Data Lake Storage rather than to my local D:\ drive. I am fetching ADF trigger information via PowerShell and want to load the data into a directory in an Azure Data Lake container rather than into blob storage.
ADF -> PowerShell -> Azure Data Lake
I want to load the data into an Azure Data Lake directory inside the container as YYYY (folder) -> MM (folder) -> DD (folder) -> data file in .CSV.
Here is my code that writes the data to the local machine; I need to convert it to load the data into Data Lake Storage. To hide the username and password I have used a password file with an AES encryption key file.
Any help and suggestions will be appreciated.
CODE:
# 1- Connect to Azure Account
$username = "xyz#abc.com"
$password = Get-Content D:\Powershell\new\passwords\password.txt | ConvertTo-SecureString -Key (Get-Content D:\Powershell\new\passwords\aes.key)
$credential = New-Object System.Management.Automation.PsCredential($username,$password)
#Connect-AzureRmAccount -Credential $credential | out-null
Connect-AzAccount -Credential $credential | out-null
# 2 - Input Area
$subscriptionName = 'Data Analytics'
$resourceGroupName = 'DataLake-Gen2'
$dataFactoryName = 'dna-production-gen2'
# 3 - (All Triggers Information)
$ErrorActionPreference="SilentlyContinue"
Stop-Transcript | out-null
$ErrorActionPreference = "Continue"
Start-Transcript -path D:\Powershell\new\TriggerInfo.txt -append
Get-AzDataFactoryV2Trigger -ResourceGroupName $resourceGroupName -DataFactoryName $dataFactoryName
Stop-Transcript
# read the file as a single, multiline string using the -Raw switch
$triggers = Get-Content "D:\Powershell\new\TriggerInfo.txt" -Raw
# split the text in 'trigger' text blocks on the empty line
# loop through these blocks (skip any possible empty textblock)
$triggers = ($triggers -split '(\r?\n){2,}'| Where-Object {$_ -match '\S'}) | ForEach-Object {
# and parse the data into Hashtables
$today = Get-Date
$yesterday = $today.AddDays(-1)
$data = $_ -replace ':', '=' | ConvertFrom-StringData
$splat = @{
ResourceGroupName = $data.ResourceGroupName
DataFactoryName = $data.DataFactoryName
TriggerName = $data.TriggerName
TriggerRunStartedAfter = $yesterday
TriggerRunStartedBefore = $today
}
Get-AzDataFactoryV2TriggerRun @splat
} | Export-Csv -Path 'D:\Powershell\new\Output.csv' -Encoding UTF8 -NoTypeInformation
# 4 - To extract the final output from the Output File.
Import-Csv D:\Powershell\new\Output.csv -Delimiter "," |
Select-Object 'TriggerRunTimestamp', 'ResourceGroupName','DataFactoryName','TriggerName','TriggerRunId','TriggerType','Status' |
Export-Csv -Path 'D:\Powershell\new\Finalresult.csv' -Encoding UTF8 -NoTypeInformation -Force
Code I tried to upload the file from the local system:
$storageAccount = Get-AzStorageAccount -ResourceGroupName "DataLake-Gen2" -AccountName "dna2020gen2"
$ctx = $storageAccount.Context
$filesystemName = "dev"
$dirname = "triggers/"
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory
$localSrcFile = "D:\Powershell\new\passwords\password.txt"
$destPath = $dirname + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force
I am able to upload the file but not able to write the command output to datalake.
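A minimal sketch of the piece I am missing, combining the Finalresult.csv export above with the Gen2 upload snippet (this assumes the export has already run and reuses $ctx and the dev filesystem; the triggers/YYYY/MM/DD layout is my reading of the requirement):
$localSrcFile = "D:\Powershell\new\Finalresult.csv"
$filesystemName = "dev"
# build triggers/YYYY/MM/DD/ from today's date
$dirname = "triggers/{0:yyyy}/{0:MM}/{0:dd}/" -f (Get-Date)
# create the dated folder first, then upload the CSV into it
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory -ErrorAction SilentlyContinue
$destPath = $dirname + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force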

Regarding the issue, please refer to the following script:
$username = "xyz#abc.com"
$password =ConvertTo-SecureString "" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PsCredential($username,$password)
#Connect-AzureRmAccount -Credential $credential | out-null
Connect-AzAccount -Credential $credential
$dataFactoryName=""
$resourceGroupName=""
# get dataFactory triggers
$triggers=Get-AzDataFactoryV2Trigger -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName
$datas = @()
foreach ($trigger in $triggers) {
# get the trigger run history
$today = Get-Date
$yesterday = $today.AddDays(-1)
$splat = @{
ResourceGroupName = $trigger.ResourceGroupName
DataFactoryName = $trigger.DataFactoryName
TriggerName = $trigger.Name
TriggerRunStartedAfter = $yesterday
TriggerRunStartedBefore = $today
}
$historys = Get-AzDataFactoryV2TriggerRun @splat
if($historys -ne $null){
# create date
foreach($history in $historys){
$obj = [PsCustomObject]@{
'TriggerRunTimestamp' = $history.TriggerRunTimestamp
'ResourceGroupName' = $history.ResourceGroupName
'DataFactoryName' = $history.DataFactoryName
'TriggerName' = $history.TriggerName
'TriggerRunId' = $history.TriggerRunId
'TriggerType' = $history.TriggerType
'Status' = $history.Status
}
# add data to an array
$datas += $obj
}
}
}
# convert data to csv string
$contents =(($datas | ConvertTo-Csv -NoTypeInformation) -join [Environment]::NewLine)
# upload to Azure Data Lake Store Gen2
#1. Create a sas token
$accountName="testadls05"
$fileSystemName="test"
$filePath="data.csv"
$account = Get-AzStorageAccount -ResourceGroupName andywin7 -Name $accountName
$sas= New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
-Permission "racwdlup" -StartTime (Get-Date).AddMinutes(-10) `
-ExpiryTime (Get-Date).AddHours(2) -Context $account.Context
$baseUrl ="https://{0}.dfs.core.windows.net/{1}/{2}{3}" -f $accountName , $fileSystemName, $filePath, $sas
#2. Create file
$endpoint =$baseUrl +"&resource=file"
Invoke-RestMethod -Method Put -Uri $endpoint -Headers @{"Content-Length" = 0} -UseBasicParsing
#3 append data
$endpoint =$baseUrl +"&action=append&position=0"
Invoke-RestMethod -Method Patch -Uri $endpoint -Headers @{"Content-Length" = $contents.Length} -Body $contents -UseBasicParsing
#4 flush data
$endpoint =$baseUrl + ("&action=flush&position={0}" -f $contents.Length)
Invoke-RestMethod -Method Patch -Uri $endpoint -UseBasicParsing
#Check the result (get data)
Invoke-RestMethod -Method Get -Uri $baseUrl -UseBasicParsing
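To land the file in the YYYY/MM/DD folder layout asked for in the question, the only change that should be needed in the script above is $filePath (a sketch; if the DFS endpoint does not create the intermediate folders when the file is created, create each level first with a resource=directory PUT):
# build the path as e.g. 2021/01/15/data.csv before composing $baseUrl
$filePath = "{0:yyyy}/{0:MM}/{0:dd}/data.csv" -f (Get-Date)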

Related

Find/list the unused storage accounts in Azure using PowerShell

I am trying to get the list of unused/inactive storage accounts in Azure using PowerShell. Below is the script I am trying; it provides the storage account name and last modified date of your Azure storage accounts, but I need to list only the unused storage account names, not all of them, so I need to add some condition/filter to achieve that. Please assist me in solving this. Thanks in advance.
It will output the results into a table detailing the name and last modified date of your Azure storage accounts.
& {
foreach ($storageAccount in Get-AzStorageAccount) {
$storageAccountName = $storageAccount.StorageAccountName
$resourceGroupName = $storageAccount.ResourceGroupName
# Get storage account key
$storageAccountKey = (Get-AzStorageAccountKey -Name $storageAccountName -ResourceGroupName $resourceGroupName).Value[0]
# Create storage account context using above key
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
# Get the last modified date
$lastModified = Get-AzStorageContainer -Context $context | Sort-Object -Property @{Expression = {$_.LastModified.DateTime}} | Select-Object -Last 1 -ExpandProperty LastModified
# Collect the information to output to a table when the for loop has completed
New-Object psobject -Property @{
Name = $storageAccountName;
LastModified = $lastModified.DateTime;
ResourceGroupName = $resourceGroupName
}
}
} | Format-Table Name, LastModified, ResourceGroupName -autosize
I tried to reproduce the same in my environment and got the same result as below:
By using the same script, I got the storage account name and last modified date of the Azure storage accounts.
To get only the unused/inactive storage accounts in Azure using PowerShell, I modified the script as below:
I agree with @Niclas; you need to make use of the Get-Date command.
& {
foreach ($storageAccount in Get-AzStorageAccount) {
$storageAccountName = $storageAccount.StorageAccountName
$resourceGroupName = $storageAccount.ResourceGroupName
$storageAccountKey = (Get-AzStorageAccountKey -Name $storageAccountName -ResourceGroupName $resourceGroupName).Value[0]
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$lastModified = Get-AzStorageContainer -Context $context | Sort-Object -Property @{Expression = {$_.LastModified.DateTime}} | Select-Object -Last 1 -ExpandProperty LastModified
$unusedacc = (Get-Date).AddDays(-10)
if ($lastModified.DateTime -lt $unusedacc) {
New-Object psobject -Property @{
Name = $storageAccountName;
LastModified = $lastModified.DateTime;
ResourceGroupName = $resourceGroupName
}
}
}
} | Format-Table Name, LastModified, ResourceGroupName -autosize
Note: Based on your requirement, you can change the number of days in the line $unusedacc = (Get-Date).AddDays(-10).
If there are no unused Storage accounts, then it will return blank results like below:
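One edge case the script above does not cover (my own addition, not part of the original script): a storage account with no containers at all returns a $null $lastModified, so the -lt comparison is made against $null. If you want such accounts flagged explicitly, the check could be split, for example:
if ($null -eq $lastModified) {
    # the account has no containers at all, so flag it explicitly
    New-Object psobject -Property @{
        Name = $storageAccountName;
        LastModified = 'No containers';
        ResourceGroupName = $resourceGroupName
    }
}
elseif ($lastModified.DateTime -lt $unusedacc) {
    New-Object psobject -Property @{
        Name = $storageAccountName;
        LastModified = $lastModified.DateTime;
        ResourceGroupName = $resourceGroupName
    }
}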
Use Get-Date
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/get-date
and use Where-Object.
# ADD THIS
$lastModDate = (get-date).AddDays(-5).Date
$lastMod = $lastModified | Where-Object { ($_.DateTime).Date -lt $lastModDate}
# If $lastMod.DateTime is NOT empty, then:
if ($lastMod.DateTime) {
# Write-Host "variable is NOT null " + $storageAccountName # For testing purpose
# Collect the information to output to a table when the for loop has completed
New-Object psobject -Property @{
Name = $storageAccountName;
LastModified = $lastMod.DateTime; # CHANGE THIS
ResourceGroupName = $resourceGroupName
}
}
https://www.techielass.com/find-unused-storage-accounts-in-azure/
With your script:
With my changes:

How to split different values in PowerShell by a line

With this script I am able to fetch all the tags that a VM has, but I want the output so that each key and its value are separated by a line break, i.e. each key and its value appear on a different line, like this:
reference image
# Sign into Azure Portal
connect-azaccount
# Fetch the Virtual Machines from the subscription
$azureVMDetails = get-azvm
# Fetch the NIC details from the subscription
$azureNICDetails = Get-AzNetworkInterface | ?{ $_.VirtualMachine -NE $null}
#Fetching Virtual Machine Details
$virtual_machine_object = $null
$virtual_machine_object = @()
#Iterating over the NIC Interfaces under the subscription
foreach($azureNICDetail in $azureNICDetails){
#Fetching the VM Name
$azureVMDetail = $azureVMDetails | ? -Property Id -eq $azureNICDetail.VirtualMachine.id
#Fetching the VM Tags
foreach($azureDetail in $azureVMDetails) {
$vm_tags = $azureVMDetail | Select-Object -Property (
@{name='Tags'; expression = {($_.tags.GetEnumerator().ForEach({ '{0} : {1}' -f $_.key, $_.value }) -join ';')}}
)
}
#VM Details export
$virtual_machine_object_temp = new-object PSObject
$virtual_machine_object_temp | add-member -membertype NoteProperty -name "name" -Value $azureVMDetail.Name
$virtual_machine_object_temp | add-member -membertype NoteProperty -name "comments" -Value ($vm_tags.Tags -join ';')
$virtual_machine_object += $virtual_machine_object_temp
}
#Report format and path
$virtual_machine_object | Export-Csv "C:\Users\JOHN\Desktop\Inventory\Final Scripts\VM_details_$(get-date -f dd.MM.yyyy).csv" -NoTypeInformation -Force
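If the only requirement is for each key and value to sit on its own line inside the comments column, a small tweak to the Tags expression above may be enough (a sketch; Export-Csv quotes the multi-line cell, so how it displays depends on the viewer, e.g. Excel shows the line breaks while a plain text editor shows one quoted field):
$vm_tags = $azureVMDetail | Select-Object -Property @{
    name       = 'Tags'
    # join with a newline instead of ';' so each "key : value" pair lands on its own line in the cell
    expression = { ($_.Tags.GetEnumerator().ForEach({ '{0} : {1}' -f $_.Key, $_.Value }) -join [Environment]::NewLine) }
}
# the later ($vm_tags.Tags -join ';') can stay as it is, since Tags is already a single string by that point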
I tried to reproduce the same in my environment and got the results successfully by using the below PowerShell script:
$vmdetail = Get-AzVM -Name testvm | Select-Object -ExpandProperty Tags
$keys = @($vmdetail.Keys)
$values = @($vmdetail.Values)
foreach ($i in 0..($keys.Count - 1))
{
$ErrorActionPreference = 'SilentlyContinue'
[array]$report += [pscustomobject]@{
key = $keys[$i]
name = $values[$i]
}
}
$report | Export-Csv -Path "C:\Users\ruk1.csv" -NoTypeInformation
Response:
The output is successfully exported in the csv file like below:

How to loop through multiple Azure subscriptions in parallel

I am trying to use nested ForEach-Object -Parallel to loop through multiple Azure subscriptions in parallel and get data from all VMs in one go. I am using the code below:
$SubsJob = Get-AzSubscription -WarningAction SilentlyContinue | Where-Object {$_.Name -match 'abc'} | ForEach-Object -Parallel {
$Context = Set-AzContext -Tenant $_.TenantId -SubscriptionId $_.SubscriptionId
[System.String]$ScriptBlock = {Get-Process}
$VMsJob = Get-AzVM | ForEach-Object -Parallel {
$FileName = $using:Context.Subscription.Name + "_$($_.Name)_" + (Get-Random) + ".ps1"
Out-File -FilePath $FileName -InputObject $using:ScriptBlock -NoNewline
$Output = Invoke-AzVMRunCommand -Name $_.Name -ResourceGroupName $_.ResourceGroupName -CommandId 'RunPowerShellScript' -ScriptPath $FileName
$PSCustomObject = [PSCustomObject]@{Subscription = $using:Context.Subscription.Name; ServerName = $_.Name; Output = $Output}
#Remove-Item -Path $FileName -Force -ErrorAction SilentlyContinue
Write-Output $PSCustomObject
} -ThrottleLimit 200 -AsJob
Write-Output $VMsJob
} -ThrottleLimit 200 -AsJob
I am not able to get it to work, and I am not sure what is wrong. One thing I observed while debugging is that the Get-AzVM command is getting the VM list from all the subscriptions rather than from the specific subscription. I noticed that by looking at the files generated by Out-File -FilePath $FileName:
sub1_server1_980337551.ps1
sub2_server1_42701325.ps1
server1 is only present in sub1, but it is being picked up in sub2 as well.
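One likely explanation (my reading, not confirmed in the thread): the Az context is shared process-wide, so the parallel Set-AzContext calls in the outer loop race with each other and Get-AzVM runs against whichever subscription was set last. A hedged workaround is to switch subscriptions sequentially and only parallelize the per-VM Run Command calls, for example:
$results = Get-AzSubscription -WarningAction SilentlyContinue |
    Where-Object { $_.Name -match 'abc' } |
    ForEach-Object {
        # sequential subscription switch, so Get-AzVM is scoped to one subscription at a time
        $Context = Set-AzContext -Tenant $_.TenantId -SubscriptionId $_.SubscriptionId
        $subName = $Context.Subscription.Name
        [System.String]$ScriptBlock = { Get-Process }
        Get-AzVM | ForEach-Object -Parallel {
            # only the slow Run Command calls run in parallel
            $fileName = "$($using:subName)_$($_.Name)_$(Get-Random).ps1"
            Out-File -FilePath $fileName -InputObject $using:ScriptBlock -NoNewline
            $output = Invoke-AzVMRunCommand -Name $_.Name -ResourceGroupName $_.ResourceGroupName -CommandId 'RunPowerShellScript' -ScriptPath $fileName
            Remove-Item -Path $fileName -Force -ErrorAction SilentlyContinue
            [PSCustomObject]@{ Subscription = $using:subName; ServerName = $_.Name; Output = $output }
        } -ThrottleLimit 50
    }
$results
This trades the outer-level parallelism for correct scoping, so it is slower across subscriptions, but it avoids the shared-context race.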

How to import csv to az storage table

I am trying to upload a CSV file to Azure Table Storage; it has 1000+ records and 6 columns, shown below as CsvHeaders. Unfortunately, it ended up with an error. Do I need to add PartitionKey and RowKey columns?
The CSV columns (CsvHeaders):
res1
res2
res3
res4
res5
res6
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value[0]
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzStorageTable -Name $tableName -Context $ctx
$CsvContents = Import-Csv -Path $Path
$CsvHeaders = ($CsvContents[0] | Get-Member -MemberType NoteProperty).Name | Where{$_ -ne "RowKey" -and $_ -ne "PartitionKey"}
Foreach($CsvContent in $CsvContents)
{
$PartitionKey = $CsvContent.PartitionKey
$RowKey = $CsvContent.RowKey
$Entity = New-Object "Microsoft.Azure.Cosmos.Table.DynamicTableEntity" "$PartitionKey", "$RowKey"
Foreach($CsvHeader in $CsvHeaders)
{
$Value = $CsvContent.$CsvHeader
$Entity.Properties.Add($CsvHeader, $Value)
}
Write-Verbose "Inserting the entity into table storage."
$result = $Table.Execute([Microsoft.Azure.Cosmos.Table.TableOperation]::Insert($Entity))
}
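If the CSV really has only the six res1..res6 columns, that is the likely cause of the error: every table entity needs a PartitionKey and a RowKey, and the script above would be passing empty strings to DynamicTableEntity. A hedged variant that derives the keys when the columns are missing (my own convention, reusing $table, $Path, $CsvContents and $CsvHeaders from above; note that Execute lives on $table.CloudTable rather than on the wrapper object returned by Get-AzStorageTable):
$rowIndex = 0
Foreach($CsvContent in $CsvContents)
{
# fall back to a fixed PartitionKey and a numeric RowKey when the CSV has no key columns
$PartitionKey = if ($CsvContent.PartitionKey) { $CsvContent.PartitionKey } else { "import" }
$RowKey       = if ($CsvContent.RowKey)       { $CsvContent.RowKey }       else { "{0:D7}" -f $rowIndex }
$rowIndex++
$Entity = New-Object "Microsoft.Azure.Cosmos.Table.DynamicTableEntity" "$PartitionKey", "$RowKey"
Foreach($CsvHeader in $CsvHeaders)
{
$Entity.Properties.Add($CsvHeader, $CsvContent.$CsvHeader)
}
$result = $table.CloudTable.Execute([Microsoft.Azure.Cosmos.Table.TableOperation]::Insert($Entity))
}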

Read from txt file and convert to managed disks

I've got a list of virtual machines in Azure which I'm trying to convert to managed disks.
I have a list of VMs; I read from the list and export to CSV, capturing the resource group name and VM name, however I seem to get VMs from the whole subscription.
Also, when I import the CSV and run $comps, it returns the correct information from the CSV, however I can't seem to pass the values through to the next lines.
CSV format is
ResourceGroupName Name
RG-01 vm-01
RG-01 vm-02
RG-01 vm-03
RG-01 vm-04
The code I'm trying is
Login-AzureRmAccount
$sub = Get-AzureRmSubscription | ogv -PassThru
Select-AzureSubscription -SubscriptionId $sub
$virtualmachines = Get-Content C:\temp\vm.txt | % {
Get-Azurermvm | select ResourceGroupName,Name | export-csv c:\temp\vm.csv -NoClobber -NoTypeInformation -Append
}
$comps = Import-Csv c:\temp\Vm.csv |
foreach ($Comp in $comps)
{
Stop-AzureRmVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name -Force
ConvertTo-AzureRmVMManagedDisk -ResourceGroupName $_.ResourceGroupName -VMName $_.Name
}
Thanks in advance.
For your issue, you export the virtual machines to a CSV file and use it in the foreach code. So, it's unnecessary to use this command:
$virtualmachines = Get-Content C:\temp\vm.txt | % {
Get-Azurermvm | select ResourceGroupName,Name | export-csv c:\temp\vm.csv -NoClobber -NoTypeInformation -Append
}
And since your VMs are all in one resource group, you can get them with -ResourceGroupName directly.
The pipeline into the foreach is also unnecessary. You can use the following code, which makes a few small changes to your code, and it works well.
Login-AzureRmAccount
$sub = Get-AzureRmSubscription | ogv -PassThru
Select-AzureRmSubscription -Subscription $sub
Get-Azurermvm –ResourceGroupName RG-01 | select ResourceGroupName,Name | export-csv c:\temp\vm.csv -NoClobber -NoTypeInformation -Append
$comps = Import-Csv c:\temp\Vm.csv
foreach ($Comp in $comps)
{
Stop-AzureRmVM -ResourceGroupName $Comp.ResourceGroupName -Name $Comp.Name -Force
ConvertTo-AzureRmVMManagedDisk -ResourceGroupName $Comp.ResourceGroupName -VMName $Comp.Name
}
This is the screenshot of my result.
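If the intent was to keep driving the conversion from the original vm.txt list rather than from a whole resource group, a hedged variant (assuming vm.txt holds one VM name per line and all of the VMs live in RG-01) could look like this:
Login-AzureRmAccount
$sub = Get-AzureRmSubscription | Out-GridView -PassThru
Select-AzureRmSubscription -Subscription $sub
# read one VM name per line and convert each VM in turn
Get-Content C:\temp\vm.txt | ForEach-Object {
    $vm = Get-AzureRmVM -ResourceGroupName 'RG-01' -Name $_
    Stop-AzureRmVM -ResourceGroupName $vm.ResourceGroupName -Name $vm.Name -Force
    ConvertTo-AzureRmVMManagedDisk -ResourceGroupName $vm.ResourceGroupName -VMName $vm.Name
}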
