Filter on Name of Azure Disk Resources - azure

Hi, I'm building a DevOps pipeline and trying to get a list of disks to query.
To make the code a bit neater, I'm trying to understand whether there's a better way of doing this. We currently have disks named disk_2, disk2, or disk-2 (that's one example; there are up to 8 disks per VM). I could use
Get-AzDisk | ? {$_.name -like "*disk-2*" -or $_.Name -like "*disk2*" -or $_.name -like "*disk_2*"}
But I was thinking: could I create a list, maybe something like this: $list = 'disk_1','disk1','disk-1','disk_2','disk2','disk-2','disk_3','disk3','disk-3', etc.
Then reference the list in the PowerShell pipeline.
Get-AzDisk | ? {$_.name -like "*disk-2*" -or $_.Name -like "*$list*"}
This doesn't seem to work. It will be running in Azure DevOps in an automated pipeline.

It's maybe not easier to read when you're not familiar with regex, but it's much less to type. It could be like this:
Get-AzDisk |
Where-Object {$_.name -match 'disk(-|_)?\d'}
... or this:
Get-AzDisk |
Where-Object {$_.name -match 'disk[-_]?\d'}
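A quick local sketch of how that regex behaves; the disk names here are hypothetical, standing in for whatever Get-AzDisk returns:

```powershell
# Sample disk names covering the three observed separators (hypothetical data).
$names = 'vm01-disk-2', 'vm01-disk2', 'vm01-disk_2', 'vm01-osdisk'

# -match uses .NET regex: 'disk', an optional '-' or '_', then a digit.
$matched = $names | Where-Object { $_ -match 'disk[-_]?\d' }

$matched   # -> vm01-disk-2, vm01-disk2, vm01-disk_2
```

The character class `[-_]` covers both separators, and `?` makes the separator optional, so all three naming styles match with one pattern.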

Related

Looking to validate that certain string is present in a text file, send warning if not

I have a process where files containing data are generated in separate locations, saved to a networked location, and merged into a single file.
At the end of the process, I would like to check that all locations are present in that merged file, and be notified if not.
I'm having trouble finding a way to identify that a string specific to each location isn't present, to be used in an if statement, but it doesn't seem to be identifying the string correctly.
I have tried :
get-childitem -filter *daily.csv.ready \\x.x.x.x\data\* -recurse | where-object {$_ -notin 'D,KPI,KPI,1,'}
I know it's probably easier to do nothing if it is present, and perform the warning action if not, but I'm curious if this can be done in the reverse.
Thank you,
As Doug Maurer points out, your command does not search through the content of the files output by the Get-ChildItem command, because what that cmdlet emits are System.IO.FileInfo (or, potentially, System.IO.DirectoryInfo) instances containing metadata about the matching files (directories) rather than their content.
In other words: the automatic $_ variable in your Where-Object command refers to an object describing a file rather than its content.
However, you can pipe System.IO.FileInfo instances to the Select-String cmdlet, which indeed searches the input files' content:
Get-ChildItem -Filter *daily.csv.ready \\x.x.x.x\data\* -Recurse |
Where-Object { $_ | Select-String -Quiet -NotMatch 'D,KPI,KPI,1,' }
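A minimal sketch of that pipeline using temporary files in place of the network share; the file names and content are placeholders, with the marker string taken from the question:

```powershell
# Create two sample files: one containing the marker, one not (placeholder data).
$dir = Join-Path ([IO.Path]::GetTempPath()) 'select-string-demo'
New-Item -ItemType Directory -Path $dir -Force | Out-Null
Set-Content -Path (Join-Path $dir 'a_daily.csv.ready') -Value 'D,KPI,KPI,1,'
Set-Content -Path (Join-Path $dir 'b_daily.csv.ready') -Value 'other,data'

# Keep only files whose content lacks the marker; -Quiet returns a Boolean
# instead of match objects, which is all Where-Object needs.
$missing = Get-ChildItem -Path $dir -Filter *daily.csv.ready |
    Where-Object { $_ | Select-String -Quiet -NotMatch 'D,KPI,KPI,1,' }

$missing.Name   # -> b_daily.csv.ready
```

Note that with -NotMatch, -Quiet returns $true as soon as any line fails to match, which is fine for single-record files like these but worth knowing for multi-line content.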

Azure Powershell: How Do I search for files in a BLOB storage quickly?

We store log files in an Azure storage account, sorted in directories, by date and customer, like this:
YYYY/MM/DD/customerNo/.../.../somestring.customerNo.applicationID.log
I need to parse some of these files automatically every day which works fine. However, all I know is the prefix mentioned above and the suffix, they might be in different subdirectories.
So this is how I did it:
$files = (Get-AzStorageBlob -Container logfiles -Context $context) | Where-Object { $_.Name -like "*$customerId.$appID.txt" }
This was fast while there weren't any log files, but now after a year this search takes ages. I read somewhere that it would be faster to search by prefix than by suffix. Unfortunately, I have to use the suffix, but I now use the date as a prefix as well. I tried to improve it by doing this:
$date = Get-Date -UFormat "%Y/%m/%d"
$prefix = "$date/$customerId/"
$files = (Get-AzStorageBlob -Container logfiles -Context $context) | Where-Object { $_.Name -like "$prefix*$customerId.$appID.txt" }
However, there is no improvement whatsoever; it takes just as long as before. And it feels like the time the search takes is increasing exponentially with the amount of log files (a few hundred thousand files totaling a few tens of GBs).
I get a status message which stays there literally for half an hour.
From what I understand, Azure's BLOB storage does not have a hierarchical file system that supports folders, so the "/" are part of the BLOB name and are being interpreted as folders by client software.
However, that does not help me speeding up the search. Any suggestions on how to improve the situation?
Azure Blob Storage supports server-side filtering of blobs by prefix; however, your code is not taking advantage of that.
$files = (Get-AzStorageBlob -Container logfiles -Context $context) | Where-Object { $_.Name -like "$prefix*$customerId.$appID.txt" }
Essentially the code above is listing all blobs and then doing the filtering on the client side.
To speed up the search, please modify your code to something like:
$files = (Get-AzStorageBlob -Container logfiles -Prefix $prefix -Context $context) | Where-Object { $_.Name -like "$prefix*$customerId.$appID.txt" }
I simply passed the prefix in the Prefix parameter. Now you'll receive only the blobs whose names start with the prefix.
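The wildcard logic itself can be sketched locally with hypothetical blob names; only the pattern matching is demonstrated here, since the actual server-side listing still needs Get-AzStorageBlob -Prefix against the storage account:

```powershell
# Hypothetical blob names following the YYYY/MM/DD/customerNo/... layout.
$customerId = '1234'
$appID      = '42'
$prefix     = '2023/07/01/1234/'

$blobNames = @(
    '2023/07/01/1234/a/b/somestring.1234.42.log'
    '2023/07/01/1234/a/somestring.1234.99.log'
    '2023/07/02/1234/somestring.1234.42.log'
)

# Same -like pattern shape as in the answer: known prefix, anything, known suffix.
$blobNames | Where-Object { $_ -like "$prefix*$customerId.$appID.log" }
# -> 2023/07/01/1234/a/b/somestring.1234.42.log
```

In the real pipeline, the -Prefix parameter trims the candidate set on the server, and the -like filter then only has to scan one day's worth of blobs for the suffix.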

Compare two Files and list only the differences of 2nd File using Powershell

I'm trying to get the current list of Azure VMs on the first run of the script, which stores it to a CSV file in a storage account.
On the 2nd run, the current list should be compared with the existing CSV file in the storage account; in case any VMs were decommissioned, that should be recorded and stored in a 2nd file in the storage account.
This works fine for me, but the issue is that when we create a new Azure VM, it also gets added to the decommissioned CSV list.
$Difference = Compare-Object $existingVmCsv $vmrecordFile -Property VmName -PassThru | Select-Object VmName,ResourceGroupName,SubscriptionName
I tried a couple of side indicators, but they didn't work:
$Difference = Compare-Object -ReferenceObject @($vmrecordFile | Select-Object) -DifferenceObject @($existingVmCsv | Select-Object) -PassThru -Property VmName,ResourceGroupName,SubscriptionName | Where-Object {$_sideIndicator -eq "<="}
$Difference = Compare-Object -ReferenceObject $vmrecordFile -DifferenceObject $existingVmCsv -PassThru -Property VmName,ResourceGroupName,SubscriptionName | Where-Object {$_sideIndicator -eq "<="}
Thank you, user Cpt.Whale - Stack Overflow. Posting your suggestion as an answer to help other community members.
It seems you have a typo in the syntax: object property references should use a ".", like Where-Object { $_.SideIndicator -eq '<=' }
'<=' indicates that the property value appears only in the -ReferenceObject set.
References: powershell - compare two files and update the differences to 2nd file - Stack Overflow, Powershell : How to Compare Two Files, and List Differences | Dotnet Helpers (dotnet-helpers.com), and compare-object not working : PowerShell (reddit.com)
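A minimal sketch of the corrected comparison, with in-memory placeholder objects standing in for the two CSV snapshots (VM names are hypothetical):

```powershell
# Previous run's snapshot (reference set); 'vm2' was since decommissioned.
$existingVmCsv = @(
    [pscustomobject]@{ VmName = 'vm1' }
    [pscustomobject]@{ VmName = 'vm2' }
)
# Current run's list (difference set); 'vm3' is newly created.
$vmrecordFile = @(
    [pscustomobject]@{ VmName = 'vm1' }
    [pscustomobject]@{ VmName = 'vm3' }
)

# '<=' keeps only VMs present in the reference (old) set, i.e. decommissioned
# ones; newly created VMs come back as '=>' and are filtered out.
$decommissioned = Compare-Object -ReferenceObject $existingVmCsv -DifferenceObject $vmrecordFile -Property VmName -PassThru |
    Where-Object { $_.SideIndicator -eq '<=' }

$decommissioned.VmName   # -> vm2
</test>
```

Note the argument order matters: with the old snapshot as -ReferenceObject, '<=' means "gone from the current list", which is exactly the decommissioned set.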

PowerShell: update O365 AD bulk attributes through csv file

We are trying to bulk update our Azure Active Directory. We have an Excel CSV list of UserPrincipalNames for which we will update the Title, Department, and Office attributes.
# Get List of Clinical CMs
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Import-csv $PATH
# Pass CMs into Function
ForEach ($UPN in $CMs) {
    # Do AD Update Task Here
    Set-MsolUser -UserPrincipalName $UPN -Title "Case Manager" -Department "Clinical" -Office "Virtual"
}
The CSV:
User.1#domain.com
User.2#domain.com
User.3#domain.com
The Set-MsolUser command will work on its own, but it is not working as intended in this ForEach loop. Any help or insight is greatly appreciated.
As Jim Xu commented, here is my comment as an answer.
The input file you show us is not a CSV file; instead, it is a list of UPN values, each on a separate line.
To read these values as a string array, the easiest thing to do is to use Get-Content:
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Get-Content -Path $PATH
Of course, although it is massive overkill, it can also be done using the Import-Csv cmdlet:
$CMs = (Import-Csv -Path $PATH -Header upn).upn
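A quick sketch of both approaches against a temporary file standing in for the real path (sample UPNs are placeholders):

```powershell
# Write a header-less list of UPNs, one per line (placeholder values).
$tmp = Join-Path ([IO.Path]::GetTempPath()) 'upn-list.txt'
Set-Content -Path $tmp -Value 'user.1@example.com', 'user.2@example.com'

# Get-Content returns the lines as a plain string array.
$CMs = Get-Content -Path $tmp

# Import-Csv with a synthetic header yields the same values via a property.
$CMsViaCsv = (Import-Csv -Path $tmp -Header upn).upn

$CMs.Count   # -> 2
```

Either way, each element is now a string suitable for -UserPrincipalName, rather than the whole row object Import-Csv would otherwise produce.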

PowerShell - Export-CSV only listing last of Text File

This may be really ridiculous and simple, but I am missing something. I have a very basic script block, but the output is doing something funky. The code block is:
Get-Content C:\test.txt | ForEach-Object {
    Get-ADComputer -Identity $_ -Properties description |
        Select-Object name, description |
        Export-Csv -Path 'C:\test.csv' -NoTypeInformation -Force
}
The strange thing is that if I comment out the Export-Csv cmdlet, the code works perfectly, grabbing everything in the text file and listing all the descriptions (as it should). However, when I include it, the export only lists the very last item in the text file. I have read a few other questions similar to this but with no success. Like I said, I am sure it is something simple, but I can't seem to find it.
Thank You!
I found it: I was missing -Append in Export-Csv, and that worked.
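A minimal sketch of the fix, using placeholder objects instead of Get-ADComputer output (names and descriptions are hypothetical). Without -Append, each iteration's Export-Csv overwrites the file, which is why only the last computer survived:

```powershell
$out = Join-Path ([IO.Path]::GetTempPath()) 'export-demo.csv'
Remove-Item -Path $out -ErrorAction SilentlyContinue

# Placeholder rows standing in for Get-ADComputer | Select-Object output.
$rows = 'pc1', 'pc2' | ForEach-Object {
    [pscustomobject]@{ name = $_; description = "desc for $_" }
}

# Option 1: the accepted fix - append inside the loop instead of overwriting.
$rows | ForEach-Object { $_ | Export-Csv -Path $out -NoTypeInformation -Append }

# Option 2: simpler - export once, after the loop, so the file is written once.
$rows | Export-Csv -Path $out -NoTypeInformation -Force

(Import-Csv $out).Count   # -> 2
```

Option 2 also avoids re-opening the file once per computer, which tends to be faster on long input lists.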
