Azure DevOps - PowerShell Output as File

I have written a PowerShell script in Azure DevOps to get all Azure Active Directory users as JSON. I want to save the output as a JSON file in Azure Repos.
My code:
Install-Module -Name "AzureAD" -Force
Import-Module -Name "AzureAD"
[string]$userName = 'TEST@TEST.TEST'
[string]$userPassword = 'TEST'
[securestring]$secStringPassword = ConvertTo-SecureString $userPassword -AsPlainText -Force
[pscredential]$credObject = New-Object System.Management.Automation.PSCredential ($userName, $secStringPassword)
Connect-AzureAD -Credential $credObject
Get-AzureADUser | Select-Object @{N='email';E={$_.UserPrincipalName}} | ConvertTo-Json | Out-File -FilePath .\UserExport.txt

Here is an example:
YOUR-COMMAND | Out-File -FilePath c:\PATH\TO\FOLDER\OUTPUT.txt
You seem to have this right already. Have you tried checking what it writes to the txt file first? Or does it simply not save the file?

Hm... can you give an example of the JSON? By default, the depth in ConvertTo-Json is 2.
So you may have to change it like this :
Get-AzureADUser | Select-Object @{N='email';E={$_.UserPrincipalName}} | ConvertTo-Json -Depth 100 | Out-File -FilePath .\UserExport.txt
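To actually land the file in Azure Repos, the pipeline script can write the JSON into the checked-out repository and push it back with git. A minimal sketch, assuming the job checks out the repo with persistCredentials: true and the build service account has Contribute permission on the repo (both are assumptions, not shown in the question):
# Export the users as JSON into the checked-out repository
Get-AzureADUser | Select-Object @{N='email';E={$_.UserPrincipalName}} | ConvertTo-Json -Depth 100 | Out-File -FilePath "$env:BUILD_SOURCESDIRECTORY\UserExport.json"
# Commit and push the file back to Azure Repos
Set-Location $env:BUILD_SOURCESDIRECTORY
git config user.email "build@example.com" # hypothetical commit identity
git config user.name "Azure Pipelines"
git add UserExport.json
git commit -m "Update AAD user export"
git push origin HEAD:master # assumes the default branch is 'master'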


Exported .CSV file comes up empty

I need a quick script to convert device names to Object IDs so that I can perform a bulk upload in Intune. I have the device names saved as a .csv, which I import. After running the script, the output BulkObjectID.csv comes up empty (0 kb). I am not sure what I could be doing wrong. Thanks in advance for any help.
connect-azuread
$csv = Import-Csv C:\Tools\NEW.csv
$ObjectID = @()
foreach ($DisplayName in $csv){
$ObjectID += get-AzureADDevice -Filter "DisplayName eq '$._DisplayName'" | Select ObjectID
}
$ObjectID
$ObjectID | export-csv -path 'C:\Tools\BulkObjectID.csv' -append
I tried to reproduce the same in my environment and got the results below.
I have a few Azure AD devices existing in my tenant, like below:
I created one CSV file with the display names of the above devices, like this:
Now, I ran the same script as you and got the same output, with an empty (0 kb) BulkObjectID.csv file, like below:
Connect-AzureAD
$csv = Import-Csv C:\test\new.csv
$ObjectID = @()
foreach ($DisplayName in $csv){
$ObjectID += get-AzureADDevice -Filter "DisplayName eq '$._DisplayName'" | Select ObjectID
}
$ObjectID
$ObjectID | export-csv -path 'C:\test\BulkObjectID.csv' -append
When I checked the folder, an empty (0 kb) BulkObjectID.csv file was present, like below:
This happens because the filter string "DisplayName eq '$._DisplayName'" matches nothing: $. is not a valid variable reference (so the $ is kept literally), and each loop variable is a whole CSV row object rather than the display-name string. To resolve this, modify your script by making a few changes like below:
Connect-AzureAD
$csv = Import-Csv C:\test\new.csv
$ObjectID = @()
foreach ($DisplayName in $csv)
{
$name = $DisplayName.DisplayName
$ObjectID = Get-AzureADDevice -Filter "DisplayName eq '$name'" | Select-Object ObjectId
$ObjectID
$ObjectID | Export-Csv -Path 'C:\test\ObjectID.csv' -Append
}
When I checked the folder, the ObjectID.csv file was present with the device IDs, like below:
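As a variation, the loop output can also be collected once and exported in a single call, which avoids appending inside the loop. A minimal sketch using the same paths as above; the filter interpolates $($row.DisplayName) explicitly, since the original "$._DisplayName" expands to a literal string and matches nothing:
Connect-AzureAD
$csv = Import-Csv C:\test\new.csv
# Collect everything the loop emits into one array
$results = foreach ($row in $csv) {
Get-AzureADDevice -Filter "DisplayName eq '$($row.DisplayName)'" | Select-Object ObjectId
}
$results | Export-Csv -Path 'C:\test\BulkObjectID.csv' -NoTypeInformation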

Query ObjectId of ConditionalAccessLocationCondition

I am writing a script to write to Azure; I basically want to find a user, create a network location, and create a conditional access policy. This is what I have so far. The trouble is that $secmon_guid and $location_policy_guid do not work. If I manually put the values in, it works.
# Run these commands first to connect and install without the #
Install-Module -Name AzureAD -AllowClobber -Force # Answer Y to install NuGet. Run once on workstation running script.
Install-Module -Name Microsoft.Graph.Identity.SignIns -Force # Install this to allow us to setup a trusted location. Run once on workstation running script.
Install-Module MSOnline -Force #Allow us to edit users. Run once on workstation running script.
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine #Set execution policy to allow our script to do things.
Import-Module -Name AzureAD #The following 3 commands are run for each client.
Connect-AzureAD # Use GA credentials from Glue
Connect-MsolService #Reauthenticate if necessary.
Get-AzureADMSConditionalAccessPolicy #This will list out all of the existing CA policies. This is a good opportunity to get them into documentation.
Connect-MgGraph #This enables Graph; you will need to approve the request in the popup window.
#Set variable for account name
Set-Variable -name "account" -Value "secmon"
#Create named location for the IP address
$ipRanges = New-Object -TypeName Microsoft.Open.MSGraph.Model.IpRange
$ipRanges.cidrAddress = "IP ADDR"
New-AzureADMSNamedLocationPolicy -OdataType "#microsoft.graph.ipNamedLocation" -DisplayName "Blackpoint IP Address for SecMon" -IsTrusted $true -IpRanges $ipRanges
#Disable MFA for secmon
Get-MsolUser -SearchString "secmon" | Set-MsolUser -StrongAuthenticationRequirements @()
#Get the Azure AD GUID for use later
$secmon_guid = Get-MsolUser -SearchString "secmon" | Select ObjectID
#Name the policy
$name = "Allow Secmon Only from Blackpoint IP"
#Enable the policy. Set to Disabled to test.
$state = "Enabled"
#Get location GUID and save to variable
$location_policy_guid = Get-AzureADMSNamedLocationPolicy | Where-Object -Property DisplayName -Contains 'Blackpoint IP Address for SecMon' | Select-Object -Property Id
#Working on this
#Create the overarching condition set for CA, this is the container.
$conditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
#Include all applications - This might be able to be removed?
$conditions.Applications = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessApplicationCondition
$conditions.Applications.IncludeApplications = 'All'
#Create the user condition and include secmon
$conditions.Users = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessUserCondition
$conditions.Users.IncludeUsers = $secmon_guid
#Add new location policy to CA policy
$conditions.Locations = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessLocationCondition
$conditions.Locations.IncludeLocations = $location_policy_guid
#Grant access control to CA policy
$controls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$controls._Operator = "OR"
$controls.BuiltInControls = "block"
#End work
New-AzureADMSConditionalAccessPolicy `
-DisplayName $name `
-State $state `
-Conditions $conditions `
-GrantControls $controls
The error I get is due to poorly formatted GUIDs; the values I am pulling are not correct. How can I fix this? Any help is greatly appreciated!
New-AzureADMSConditionalAccessPolicy : Error occurred while executing NewAzureADMSConditionalAccessPolicy
Code: BadRequest
Message: 1054: Invalid location value: #{Id=1234GUID}.
InnerError:
RequestId: 5678GUID
Where you define the variables, you need to use -ExpandProperty on the Select-Object statement, e.g.:
$secmon_guid = Get-MsolUser -SearchString "secmon" | Select -ExpandProperty ObjectID
Otherwise, you would have to access your current variable like so:
$conditions.Users.IncludeUsers = $secmon_guid.ObjectID
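Applied to the script above, the same fix (together with -eq instead of -Contains for a plain string comparison) makes both GUID variables hold bare strings instead of objects:
$secmon_guid = Get-MsolUser -SearchString "secmon" | Select-Object -ExpandProperty ObjectId
$location_policy_guid = Get-AzureADMSNamedLocationPolicy | Where-Object DisplayName -eq 'Blackpoint IP Address for SecMon' | Select-Object -ExpandProperty Id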

Writing the data to Azure Data Lake Store - Powershell Scripting

I need to write the data to Azure Data Lake Storage rather than my local D:\ drive. I am fetching the ADF trigger information via PowerShell and want to load the data into an Azure Data Lake container directory rather than into blob storage.
ADF -> PowerShell -> Azure Data Lake
I want to load the data into an Azure Data Lake directory inside a container, as YYYY (folder) -> MM (folder) -> DD (folder) -> data file in .CSV.
Here is my code to write the data to the local machine; I need to convert it to load the data into Data Lake Storage. To hide the username and password, I use a mechanism with a password file and an AES encryption key file.
Any help and suggestions will be appreciated.
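For reference, the password and AES key files read by the script are typically generated once with the standard ConvertFrom-SecureString pattern; this is a sketch of that setup, not code from the question:
# One-time setup: create a 256-bit AES key and an encrypted password file
$key = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($key)
$key | Set-Content D:\Powershell\new\passwords\aes.key
Read-Host "Password" -AsSecureString | ConvertFrom-SecureString -Key $key | Set-Content D:\Powershell\new\passwords\password.txt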
CODE :
# 1- Connect to Azure Account
$username = "xyz#abc.com"
$password = Get-Content D:\Powershell\new\passwords\password.txt | ConvertTo-SecureString -Key (Get-Content D:\Powershell\new\passwords\aes.key)
$credential = New-Object System.Management.Automation.PsCredential($username,$password)
#Connect-AzureRmAccount -Credential $credential | out-null
Connect-AzAccount -Credential $credential | out-null
# 2 - Input Area
$subscriptionName = 'Data Analytics'
$resourceGroupName = 'DataLake-Gen2'
$dataFactoryName = 'dna-production-gen2'
# 3 - (All Triggers Information)
$ErrorActionPreference="SilentlyContinue"
Stop-Transcript | out-null
$ErrorActionPreference = "Continue"
Start-Transcript -path D:\Powershell\new\TriggerInfo.txt -append
Get-AzDataFactoryV2Trigger -ResourceGroupName $resourceGroupName -DataFactoryName $dataFactoryName
Stop-Transcript
# read the file as a single, multiline string using the -Raw switch
$triggers = Get-Content "D:\Powershell\new\TriggerInfo.txt" -Raw
# split the text in 'trigger' text blocks on the empty line
# loop through these blocks (skip any possible empty textblock)
$triggers = ($triggers -split '(\r?\n){2,}'| Where-Object {$_ -match '\S'}) | ForEach-Object {
# and parse the data into Hashtables
$today = Get-Date
$yesterday = $today.AddDays(-1)
$data = $_ -replace ':', '=' | ConvertFrom-StringData
$splat = @{
ResourceGroupName = $data.ResourceGroupName
DataFactoryName = $data.DataFactoryName
TriggerName = $data.TriggerName
TriggerRunStartedAfter = $yesterday
TriggerRunStartedBefore = $today
}
Get-AzDataFactoryV2TriggerRun @splat
} | Export-Csv -Path 'D:\Powershell\new\Output.csv' -Encoding UTF8 -NoTypeInformation
# 4 - To extract the final output from the Output File.
Import-Csv D:\Powershell\new\Output.csv -Delimiter "," |
Select-Object 'TriggerRunTimestamp', 'ResourceGroupName','DataFactoryName','TriggerName','TriggerRunId','TriggerType','Status' |
Export-Csv -Path 'D:\Powershell\new\Finalresult.csv' -Encoding UTF8 -NoTypeInformation -Force
Code I tried for uploading the file from the local system:
$storageAccount = Get-AzStorageAccount -ResourceGroupName "DataLake-Gen2" -AccountName "dna2020gen2"
$ctx = $storageAccount.Context
$filesystemName = "dev"
$dirname = "triggers/"
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory
$localSrcFile = "D:\Powershell\new\passwords\password.txt"
$filesystemName = "dev"
$dirname = "triggers/"
$destPath = $dirname + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force
I am able to upload the file but not able to write the command output to datalake.
Regarding the issue, please refer to the following script:
$username = "xyz@abc.com"
$password = ConvertTo-SecureString "" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PsCredential($username,$password)
#Connect-AzureRmAccount -Credential $credential | out-null
Connect-AzAccount -Credential $credential
$dataFactoryName=""
$resourceGroupName=""
# get dataFactory triggers
$triggers=Get-AzDataFactoryV2Trigger -DataFactoryName $dataFactoryName -ResourceGroupName $resourceGroupName
$datas = @()
foreach ($trigger in $triggers) {
# get the trigger run history
$today = Get-Date
$yesterday = $today.AddDays(-1)
$splat = @{
ResourceGroupName = $trigger.ResourceGroupName
DataFactoryName = $trigger.DataFactoryName
TriggerName = $trigger.Name
TriggerRunStartedAfter = $yesterday
TriggerRunStartedBefore = $today
}
$historys = Get-AzDataFactoryV2TriggerRun @splat
if($historys -ne $null){
# create date
foreach($history in $historys){
$obj = [PSCustomObject]@{
'TriggerRunTimestamp' = $history.TriggerRunTimestamp
'ResourceGroupName' = $history.ResourceGroupName
'DataFactoryName' = $history.DataFactoryName
'TriggerName' = $history.TriggerName
'TriggerRunId' = $history.TriggerRunId
'TriggerType' = $history.TriggerType
'Status' = $history.Status
}
# add data to an array
$datas += $obj
}
}
}
# convert data to csv string
$contents =(($datas | ConvertTo-Csv -NoTypeInformation) -join [Environment]::NewLine)
# upload to Azure Data Lake Store Gen2
#1. Create a sas token
$accountName="testadls05"
$fileSystemName="test"
$filePath="data.csv"
$account = Get-AzStorageAccount -ResourceGroupName andywin7 -Name $accountName
$sas= New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
-Permission "racwdlup" -StartTime (Get-Date).AddMinutes(-10) `
-ExpiryTime (Get-Date).AddHours(2) -Context $account.Context
$baseUrl ="https://{0}.dfs.core.windows.net/{1}/{2}{3}" -f $accountName , $fileSystemName, $filePath, $sas
#2. Create file
$endpoint =$baseUrl +"&resource=file"
Invoke-RestMethod -Method Put -Uri $endpoint -Headers @{"Content-Length" = 0} -UseBasicParsing
#3 append data
$endpoint =$baseUrl +"&action=append&position=0"
Invoke-RestMethod -Method Patch -Uri $endpoint -Headers @{"Content-Length" = $contents.Length} -Body $contents -UseBasicParsing
#4 flush data
$endpoint =$baseUrl + ("&action=flush&position={0}" -f $contents.Length)
Invoke-RestMethod -Method Patch -Uri $endpoint -UseBasicParsing
#Check the result (get data)
Invoke-RestMethod -Method Get -Uri $baseUrl -UseBasicParsing
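One caveat with the REST calls above: the append and flush positions are byte offsets, while $contents.Length counts characters, and the two only coincide for pure-ASCII data. A safer variant (an assumption worth verifying against your data) works with the UTF-8 byte length:
# Use the UTF-8 byte count for Content-Length and the flush position
$bytes = [System.Text.Encoding]::UTF8.GetBytes($contents)
$endpoint = $baseUrl + "&action=append&position=0"
Invoke-RestMethod -Method Patch -Uri $endpoint -Headers @{"Content-Length" = $bytes.Length} -Body $bytes -UseBasicParsing
$endpoint = $baseUrl + ("&action=flush&position={0}" -f $bytes.Length)
Invoke-RestMethod -Method Patch -Uri $endpoint -UseBasicParsing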
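Alternatively, to get the YYYY -> MM -> DD folder layout asked about, one could write the CSV to a temporary file and upload it with New-AzDataLakeGen2Item, as in the question's own upload snippet. A sketch, assuming the DataLake-Gen2 resource group, dna2020gen2 account, and dev filesystem from above:
# Build a date-based destination path such as triggers/2021/01/15/TriggerRuns.csv
$storageAccount = Get-AzStorageAccount -ResourceGroupName "DataLake-Gen2" -AccountName "dna2020gen2"
$ctx = $storageAccount.Context
$destPath = "triggers/{0:yyyy}/{0:MM}/{0:dd}/TriggerRuns.csv" -f (Get-Date)
# Export the collected trigger runs and upload the file
$tmpFile = Join-Path $env:TEMP 'TriggerRuns.csv'
$datas | Export-Csv -Path $tmpFile -Encoding UTF8 -NoTypeInformation
New-AzDataLakeGen2Item -Context $ctx -FileSystem "dev" -Path $destPath -Source $tmpFile -Force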

How can I simply get the powerstate from within a powershell workflow automation runbook in Azure?

I have a PowerShell workflow runbook that automates starting and shutting down VMs in Azure. I updated the modules in an automation account (so I could use it for other things) and it has stopped the script from working. I have fixed most of the broken pieces, but the bit that is still not working is obtaining the power state, e.g. PowerState/deallocated, so that the VM can be shut down or started up. Here is my code:
$vmFullStatus = Get-AzureRmVM -ResourceGroupName test1 -Name test1 -Status
$vmStatusJson = $vmFullStatus | ConvertTo-Json -depth 100
$vmStatus = $vmStatusJson | ConvertFrom-Json
$vmStatusCode = $vmStatus.Statuses[1].code
Write-Output " VM Status Code: $vmStatusCode"
The VM Status Code Write-Output is now blank in the output of the runbook, but it outputs fine in a standard shell. I only have limited experience with workflow runbooks, but I believe the result needs to be converted to JSON so the workflow can use it.
I think the issue may lie with the statuses, as when the object is converted to JSON it displays:
"Statuses": [
"Microsoft.Azure.Management.Compute.Models.InstanceViewStatus",
"Microsoft.Azure.Management.Compute.Models.InstanceViewStatus"
],
which no longer shows the PowerState. How can I get the power state of a VM from within a PowerShell workflow runbook so it can be used? Thanks
I have tried an inline script and it does work if you specify a vm name:
$vmStatusCode = InlineScript {
$vmFullStatus = Get-AzureRmVM -ResourceGroupName test1 -Name test1 -Status
$vmStatusJson = $vmFullStatus | ConvertTo-Json -depth 100
$vmStatus = $vmStatusJson | ConvertFrom-Json
$vmStatus.Statuses[1].code
}
But it doesn't work when you pass variables:
$vmFullStatus = Get-AzureRmVM -ResourceGroupName $vm.ResourceGroupName -Name $vm.Name -Status
Get-AzureRmVM : Cannot validate argument on parameter 'ResourceGroupName'. The argument is null or empty. Provide an
argument that is not null or empty, and then try the command again.
It needs to be run without an inline script. Any ideas?
I forgot to add the $using: prefix:
$vmStatusCode = InlineScript {
$vmFullStatus = Get-AzureRmVM -ResourceGroupName $using:vm.ResourceGroupName -Name $using:vm.Name -Status
$vmStatusJson = $vmFullStatus | ConvertTo-Json -depth 100
$vmStatus = $vmStatusJson | ConvertFrom-Json
$vmStatus.Statuses[1].code
}
This now works!
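For completeness, a minimal workflow sketch of the working pattern (the workflow name and VM enumeration are placeholders, not from the original runbook):
workflow Get-VmPowerState
{
$vms = Get-AzureRmVM
foreach ($vm in $vms)
{
# InlineScript runs ordinary PowerShell; $using: passes workflow variables in
$vmStatusCode = InlineScript {
$vmFullStatus = Get-AzureRmVM -ResourceGroupName $using:vm.ResourceGroupName -Name $using:vm.Name -Status
$vmStatus = $vmFullStatus | ConvertTo-Json -Depth 100 | ConvertFrom-Json
$vmStatus.Statuses[1].code
}
Write-Output "$($vm.Name) power state: $vmStatusCode"
}
}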

Read from txt file and convert to managed disks

I've got a list of virtual machines in Azure which I'm trying to convert to managed disks.
I have a list of VMs; I read from the list and export to CSV, capturing ResourceGroupName and VM name. However, I seem to get VMs from the whole subscription.
Also, when I attempt to import the CSV, running $comps returns the correct information from the CSV, but I can't seem to pass the rows through to the next lines.
The CSV format is:
ResourceGroupName Name
RG-01 vm-01
RG-01 vm-02
RG-01 vm-03
RG-01 vm-04
The code I'm trying is
Login-AzureRmAccount
$sub = Get-AzureRmSubscription | ogv -PassThru
Select-AzureSubscription -SubscriptionId $sub
$virtualmachines = Get-Content C:\temp\vm.txt | % {
Get-Azurermvm | select ResourceGroupName,Name | export-csv c:\temp\vm.csv -NoClobber -NoTypeInformation -Append
}
$comps = Import-Csv c:\temp\Vm.csv |
foreach ($Comp in $comps)
{
Stop-AzureRmVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name -Force
ConvertTo-AzureRmVMManagedDisk -ResourceGroupName $_.ResourceGroupName -VMName $_.Name
}
Thanks in advance..
For your issue: you export the virtual machines to a CSV file and use that in the foreach code, so it is unnecessary to use this command:
$virtualmachines = Get-Content C:\temp\vm.txt | % {
Get-Azurermvm | select ResourceGroupName,Name | export-csv c:\temp\vm.csv -NoClobber -NoTypeInformation -Append
}
And since your VMs are all in one resource group, you can get them with -ResourceGroupName directly.
The pipeline into the foreach is also wrong: foreach is a statement, not a cmdlet, so it cannot sit at the end of a pipeline, and inside the loop you need $Comp rather than $_. You can use the following code, where I made a few small changes to yours, and it works well:
Login-AzureRmAccount
$sub = Get-AzureRmSubscription | ogv -PassThru
Select-AzureRmSubscription -Subscription $sub
Get-AzureRmVM -ResourceGroupName RG-01 | Select-Object ResourceGroupName,Name | Export-Csv c:\temp\vm.csv -NoClobber -NoTypeInformation -Append
$comps = Import-Csv c:\temp\Vm.csv
foreach ($Comp in $comps)
{
Stop-AzureRmVM -ResourceGroupName $Comp.ResourceGroupName -Name $Comp.Name -Force
ConvertTo-AzureRmVMManagedDisk -ResourceGroupName $Comp.ResourceGroupName -VMName $Comp.Name
}
This is the screenshot of my result.
