Compare two files and list only the differences of the 2nd file using PowerShell - Azure

I'm trying to get the current list of Azure VMs on the first run of the script -> stored to a Storage Account in a CSV file.
On the 2nd run, the current list should be compared with the existing CSV file in the Storage Account; in case any VMs were decommissioned, they should be recorded and stored in a 2nd file in the Storage Account.
This works fine for me, but the issue is that when we create a new Azure VM, it also gets added to the decommission CSV list.
$Difference = Compare-Object $existingVmCsv $vmrecordFile -Property VmName -PassThru | Select-Object VmName,ResourceGroupName,SubscriptionName
I tried a couple of side indicators but they didn't work:
$Difference = Compare-Object -ReferenceObject @($vmrecordFile | Select-Object) -DifferenceObject @($existingVmCsv | Select-Object) -PassThru -Property VmName,ResourceGroupName,SubscriptionName | Where-Object {$_sideIndicator -eq "<="}
$Difference = Compare-Object -ReferenceObject $vmrecordFile -DifferenceObject $existingVmCsv -PassThru -Property VmName,ResourceGroupName,SubscriptionName | Where-Object {$_sideIndicator -eq "<="}

Thank you, user Cpt.Whale - Stack Overflow. Posting your suggestion as an answer to help other community members.
It seems you have a typo in the syntax. Object property references should use a ".", like Where-Object { $_.sideIndicator -eq '<=' }
'<=' indicates that the property value appears only in the -ReferenceObject set.
References: powershell - compare two files and update the differences to 2nd file - Stack Overflow; Powershell: How to Compare Two Files, and List Differences | Dotnet Helpers (dotnet-helpers.com); compare-object not working : PowerShell (reddit.com)
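For completeness, a minimal corrected sketch (the local CSV paths are hypothetical placeholders; in the real script both lists come from the Storage Account). Note the orientation: with the previous run as -ReferenceObject, '<=' is exactly the decommissioned set, and newly created VMs fall out as '=>' and are excluded:
# Hypothetical paths; substitute the files downloaded from the Storage Account
$existingVmCsv = Import-Csv -Path '.\existing-vms.csv'   # previous run
$vmrecordFile  = Import-Csv -Path '.\current-vms.csv'    # current run
# '<=' = only in the reference (previous) set = decommissioned
$Difference = Compare-Object -ReferenceObject $existingVmCsv -DifferenceObject $vmrecordFile -Property VmName -PassThru |
Where-Object { $_.SideIndicator -eq '<=' } |
Select-Object VmName, ResourceGroupName, SubscriptionName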

Related

Looking to validate that a certain string is present in a text file, send a warning if not

I have a process where files containing data are generated in separate locations, saved to a networked location, and merged into a single file.
At the end of the process, I would like to check that all locations are present in that merged file, and notify me if not.
I am having a problem finding a way to identify that a string specific to each location isn't present, to be used in an if statement, but it doesn't seem to be identifying the string correctly.
I have tried:
get-childitem -filter *daily.csv.ready \\x.x.x.x\data\* -recurse | where-object {$_ -notin 'D,KPI,KPI,1,'}
I know it's probably easier to do nothing if it is present, and perform the warning action if not, but I'm curious if this can be done in the reverse.
Thank you,
As Doug Maurer points out, your command does not search through the content of the files output by the Get-ChildItem command, because what that cmdlet emits are System.IO.FileInfo (or, potentially, System.IO.DirectoryInfo) instances containing metadata about the matching files (directories) rather than their content.
In other words: the automatic $_ variable in your Where-Object command refers to an object describing a file rather than its content.
However, you can pipe System.IO.FileInfo instances to the Select-String cmdlet, which indeed searches the input files' content:
Get-ChildItem -Filter *daily.csv.ready \\x.x.x.x\data\* -Recurse |
Where-Object { $_ | Select-String -Quiet -NotMatch 'D,KPI,KPI,1,' }
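And if you want the reverse form you asked about - do nothing when the string is present, warn when it is missing - a small sketch (the warning text is made up; the path and pattern are from the question):
Get-ChildItem -Filter *daily.csv.ready -Path \\x.x.x.x\data\* -Recurse |
ForEach-Object {
    # -Quiet returns $true as soon as one line contains the pattern; negate to detect absence
    if (-not ($_ | Select-String -Quiet -SimpleMatch 'D,KPI,KPI,1,')) {
        Write-Warning "Marker string missing from $($_.FullName)"
    }
}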

Azure Powershell: How Do I search for files in a BLOB storage quickly?

We store log files in an Azure storage account, sorted in directories, by date and customer, like this:
YYYY/MM/DD/customerNo/.../.../somestring.customerNo.applicationID.log
I need to parse some of these files automatically every day, which works fine. However, all I know is the prefix mentioned above and the suffix; they might be in different subdirectories.
So this is how I did it:
$files = (Get-AzStorageBlob -Container logfiles -Context $context) | Where-Object { $_.Name -like "*$customerId.$appID.txt" }
This was fast while there weren't any log files, but now after a year this search takes ages. I read somewhere that it would be faster to search by prefix than by suffix. Unfortunately, I have to use the suffix, but I now use the date as a prefix as well. I tried to improve it by doing this:
$date = Get-Date -UFormat "%Y/%m/%d"
$prefix = "$date/$customerId/"
$files = (Get-AzStorageBlob -Container logfiles -Context $context) | Where-Object { $_.Name -like "$prefix*$customerId.$appID.txt" }
However, there is no improvement whatsoever; it takes just as long as before. And it feels like the time the search takes is increasing exponentially with the amount of log files (a few hundred thousand in a few tens of GBs).
I get a status message which stays there literally for half an hour.
From what I understand, Azure's BLOB storage does not have a hierarchical file system that supports folders, so the "/" are part of the BLOB name and are being interpreted as folders by client software.
However, that does not help me speed up the search. Any suggestions on how to improve the situation?
Azure Blob Storage supports server-side filtering of blobs by prefix; however, your code is not taking advantage of that.
$files = (Get-AzStorageBlob -Container logfiles -Context $context) | Where-Object { $_.Name -like "$prefix*$customerId.$appID.txt" }
Essentially the code above is listing all blobs and then doing the filtering on the client side.
To speed up the search, please modify your code to something like:
$files = (Get-AzStorageBlob -Container logfiles -Prefix $prefix -Context $context) | Where-Object { $_.Name -like "$prefix*$customerId.$appID.txt" }
I simply passed the prefix in the -Prefix parameter. Now you'll receive only the blobs whose names start with the prefix.
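With hundreds of thousands of blobs, you can additionally page through the listing instead of materializing it all at once - a sketch using Get-AzStorageBlob's documented -MaxCount/-ContinuationToken pattern (container, context, and variable names as in the question):
$maxCount = 5000
$token = $null
$files = do {
    # Fetch one page of blobs under the prefix, filtered server-side
    $page = Get-AzStorageBlob -Container logfiles -Prefix $prefix -Context $context -MaxCount $maxCount -ContinuationToken $token
    # Keep only the names matching the suffix
    $page | Where-Object { $_.Name -like "*$customerId.$appID.txt" }
    # The last blob of a full page carries the token for the next page; it is $null when the listing is done
    $token = if ($page) { $page[-1].ContinuationToken } else { $null }
} while ($null -ne $token)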

Delete rows in a .CSV file containing specific character with Powershell

I receive an automatic weekly export from a system in .csv format. It contains a lot of usernames with the initials of the users (e.g. "fl", "nk"). A few of them have their first and last names, separated by a dot (e.g. firstname.lastname). These are the ones that have to be deleted from the .csv file.
My goal here is to write a PowerShell script which deletes all rows containing the character "." (dot) and then saves the same .csv file by overwriting it.
Since I'm very new to PowerShell, I'd highly appreciate a more detailed answer including the potential code. I tried various examples from similar issues I found here, but none of them worked and/or I got error messages, mostly because my syntax wasn't correct.
I tried this code:
Get-Content "D:\file.csv" | Where-Object {$_ -notmatch '\.'} | Set-Content "D:\File.csv"-Force -NoTypeInformation
As Mathias says, it is helpful to see what you have tried so we can help you come to a working result. It is easy to give you something like this:
$csv = Import-Csv -Path C:\Temp\temp.csv -Delimiter ";"
$newCSV = @()
foreach($row in $csv){
    if(!$row.username -or $row.username -notlike "*.*"){
        $newCSV += $row
    }
}
$newCSV | Export-Csv -Path C:\Temp\temp.csv -Delimiter ";" -NoTypeInformation
The above code eliminates rows that have a dot in the username field. It leaves rows with an empty username intact via the 'if(!$row.username' part. But I have no idea whether this is helpful, since there is no example CSV file and no way to know what you have tried so far ;)
Note that I always prefer using ";" as the delimiter, because opening the file in Excel will then already be correctly separated. If the current file uses ',' as a delimiter, you will need to change that when importing the CSV.
You were very close! For this you don't need a loop, you just need to do it using the correct cmdlets:
(Import-Csv -Path 'D:\file.csv' -Delimiter ';') |
Where-Object { $_.Initials -notmatch '\.' } |
Export-Csv -Path 'D:\file.csv' -Delimiter ';' -Force -NoTypeInformation
Get-Content simply reads a text file and returns the lines as a string array, whereas Import-Csv parses the structure and creates objects with properties from the header line.
The brackets around the Import-Csv are needed to ensure the importing/parsing of the file is completely done before piping the results through. Without that, the resulting file may become completely empty because you cannot read and overwrite the same file at the same time.
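To make that difference concrete, a quick sketch with a hypothetical two-column file:
# Suppose C:\Temp\temp.csv contains:
#   username;city
#   fl;Berlin
#   firstname.lastname;Hamburg
Get-Content 'C:\Temp\temp.csv'                              # 3 plain strings, header line included
Import-Csv 'C:\Temp\temp.csv' -Delimiter ';'                # 2 objects with .username and .city properties
(Import-Csv 'C:\Temp\temp.csv' -Delimiter ';')[0].username  # 'fl'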

Filter on Name of Azure Disk Resources

Hi, I'm building a DevOps pipeline and trying to get a list of disks to query.
To try and make the code a bit neater, I'm just trying to understand if there's a better way of doing this. We currently have disks named disk_2, disk2, or disk-2; this is an example with up to 8 disks per VM. I could use:
Get-AzDisk | ? {$_.name -like "*disk-2*" -or $_.Name -like "*disk2*" -or $_.name -like "*disk_2*"}
But I was thinking, could I create a list, maybe something like this: $list = disk_1,disk1,disk-1,disk_2,disk2,disk-2,disk_3,disk3,disk-3,etc..
Then reference the list in the PowerShell pipeline:
Get-AzDisk | ? {$_.name -like "*disk-2*" -or $_.Name -like "*$list*"}
This doesn't seem to work. It will be running in Azure DevOps in an automated pipeline.
Maybe not easier to read when you're not familiar with regex, but much less to type, would be something like this:
Get-AzDisk |
Where-Object {$_.name -match 'disk(-|_)?\d'}
... or this:
Get-AzDisk |
Where-Object {$_.name -match 'disk[-_]?\d'}
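If you'd rather keep the explicit list idea from the question, you can also collapse the list into a single alternation pattern instead of chaining -or clauses (a sketch; the names are the question's examples):
# Escape each candidate name and join them with '|' into one regex
$list = 'disk_1','disk1','disk-1','disk_2','disk2','disk-2'
$pattern = ($list | ForEach-Object { [regex]::Escape($_) }) -join '|'
Get-AzDisk | Where-Object { $_.Name -match $pattern }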

PowerShell: update O365 AD bulk attributes through csv file

We are trying to bulk update our Azure Active Directory. We have an Excel CSV list of UserPrincipalNames for which we will update the Title, Department, and Office attributes:
# Get List of Clinical CMs
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Import-csv $PATH
# Pass CMs into Function
ForEach ($UPN in $CMs) {
    # Do AD Update Task Here
    Set-Msoluser -UserPrincipalName $UPN -Title "Case Manager" -Department "Clinical" -Office "Virtual"
}
The CSV:
User.1@domain.com
User.2@domain.com
User.3@domain.com
The Set-MsolUser command will work on its own, but it is not working as intended in this ForEach loop. Any help or insight is greatly appreciated.
As Jim Xu commented, here is my comment as an answer.
The input file you show us is not a CSV file; instead, it is a list of UPN values, each on a separate line.
To read these values as a string array, the easiest thing to do is to use Get-Content:
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Get-Content -Path $PATH
Of course, although massive overkill, it can be done using the Import-Csv cmdlet:
$CMs = (Import-Csv -Path $PATH -Header upn).upn
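Putting it together, a sketch of the corrected loop (path and attribute values are from the question; with Get-Content, each $UPN is a plain string, which is why it now binds cleanly to -UserPrincipalName):
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Get-Content -Path $PATH   # one UPN string per line
ForEach ($UPN in $CMs) {
    # $UPN is a string like 'User.1@domain.com', not an object with properties
    Set-MsolUser -UserPrincipalName $UPN -Title "Case Manager" -Department "Clinical" -Office "Virtual"
}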
