compare 2 rows of IP Subnets - excel

I need to be able to compare 2 rows of IP Subnets and tell if there is overlap.
For example:
In Row 1 I have a /24 and I need to check whether this /24 exists in Row 2 (either as the same /24 or within a supernet, a /21 for instance)
so:
ROW 1: 192.168.2.0/24
ROW 2: 192.168.0.0/21
Result -> Row 1 exists in Row 2
I am not sure how to do this in Excel.
Does anybody have any ideas?

If you want to do it in PowerShell, you can use this script:
Clear-Host

# Import CSV file with IPs (delimiter ";")
$rowlist = Import-Csv -Path "C:\rows_directory\rowlist.csv" -Delimiter ";"
$row1 = $rowlist | Select-Object -ExpandProperty "row1"
$row2 = $rowlist | Select-Object -ExpandProperty "row2"

foreach ($string in $row1) {
    if ($string -in $row2) {
        Write-Output "ROW1: $string exist in ROW2"
    }
}
I filled the file with test values (including 123 in both columns), and the result was:
ROW1: 123 exist in ROW2

For this I would create a function to find the base address (as a UInt32) of the IP address in the subnet concerned:
Function Get-IPBase($Address) {
    $IP, $SubNet = $Address.Split('/', 2)
    $Bytes = ([IPAddress]$IP).GetAddressBytes()
    if ([BitConverter]::IsLittleEndian) { [Array]::Reverse($Bytes) }
    [BitConverter]::ToUInt32($Bytes, 0) -BAnd -BNot ([UInt32][Math]::Pow(2, $SubNet) - 1)
}
Example of what the function returns:
Get-IPBase 192.168.2.0/24
3221225472
Get-IPBase 192.168.0.0/24
3221225472
Then, do a self-join using this Join-Object script/Join-Object module (see also: In PowerShell, what's the best way to join two tables into one?):
Import-Csv .\My.csv | Join -On { Get-IPBase $_.Row1 } -Eq { Get-IPBase $_.Row2 }
Please add more details to your question (such as what you tried yourself and a sample list; see also: How to Ask) if you would like a more in-depth explanation or have problems implementing this.
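If you prefer not to rely on the external Join-Object module, a true containment check can be done by masking both network addresses with the supernet's prefix length and comparing the results. A minimal sketch (the function names are just for illustration, not part of the answer above):

Function ConvertTo-UInt32IP($IP) {
    # convert a dotted IPv4 address to its UInt32 value
    $Bytes = ([IPAddress]$IP).GetAddressBytes()
    if ([BitConverter]::IsLittleEndian) { [Array]::Reverse($Bytes) }
    [BitConverter]::ToUInt32($Bytes, 0)
}
Function Test-SubnetContains($Super, $Sub) {
    $SuperIP, $SuperLen = $Super.Split('/', 2)
    $SubIP,   $SubLen   = $Sub.Split('/', 2)
    if ([int]$SuperLen -gt [int]$SubLen) { return $false }   # a longer prefix cannot contain a shorter one
    # network mask for the supernet's prefix length
    $Mask = [UInt32]([Math]::Pow(2, 32) - [Math]::Pow(2, 32 - [int]$SuperLen))
    ((ConvertTo-UInt32IP $SubIP) -band $Mask) -eq ((ConvertTo-UInt32IP $SuperIP) -band $Mask)
}
Test-SubnetContains '192.168.0.0/21' '192.168.2.0/24'   # True
Test-SubnetContains '192.168.0.0/21' '10.0.0.0/24'      # False

You could then loop the Row1 values against the Row2 values with this check instead of joining on equal base addresses.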

Related

Query Multi-Domain for SamAccountName using PowerShell

I'm trying to populate an employee ID column in a CSV file by querying Active Directory against another column in the CSV file called "OwnerEmail" (which is the user principal name). The problem is that the users in the OwnerEmail column do not all belong to the same domain. How can I query 2 domains at one time?
Table for reference:

Employee ID | OwnerEmail     | DeptNumber | Costs
            | test@mail.com  | 0894       | 4654.45
            | test2@mail.com | 3453       | 4994.15
This is what I have tried so far. The script isn't working and there are no error messages. Any ideas?
$Domains = 'us.domain.corp', 'uk.domain.corp'
$CSVImport | Select-Object @{
    Name = "employeeID"
    Expression = {
        foreach ($user in $CSVImport) {
            foreach ($Domain in $Domains) {
                $user = (Get-ADUser -Filter "UserPrincipalName -eq '$($_.OwnerEmail)'" -Server $Domain -Properties 'SamAccountName').SamAccountName
            }
        }
    }
}, * | Select-Object employeeID, DepartmentNumber, OwnerEmail, @{Name="Costs"; Expression={"$ $($_.Cost)"}} | Export-Csv "$Env:temp/$OutputFile" -NoTypeInformation
How can I query 2 domains at one time?
There is no need to do this. You could query both at once with multithreading, but that seems like overkill. What I would recommend is to query all users at once, per domain; the code below may seem awfully complicated but should be pretty efficient. See the inline comments for details.
# Import the Csv
$CSVImport = Import-Csv path\to\thecsv.csv

# Create an LDAP filter to query all users at once
# This filter would look like this for example:
# (|(userPrincipalName=test@mail.com)(userPrincipalName=test2@mail.com))
$filter = "(|"
foreach ($email in $CSVImport.OwnerEmail) {
    if (-not [string]::IsNullOrWhiteSpace($email)) {
        $filter += "(userPrincipalName=$email)"
    }
}
$filter += ")"

# For each Domain, use the same filter and get all existing users
'us.domain.corp', 'uk.domain.corp' | ForEach-Object { $map = @{} } {
    foreach ($user in Get-ADUser -LDAPFilter $filter -Server $_) {
        # and store them in a hashtable where
        # the Keys are their `UserPrincipalName`
        # and the Values are the attribute of interest (`SamAccountName`)
        $map[$user.UserPrincipalName] = $user.SamAccountName
    }
}

# Now we can simply use a calculated property with `Select-Object`
$CSVImport | Select-Object @{N='EmployeeID'; E={ $map[$_.OwnerEmail] }}, * |
    Export-Csv "$Env:temp/$OutputFile" -NoTypeInformation
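As a side note, the LDAP filter could also be built without += string concatenation; a minimal sketch using -join against the same $CSVImport:

$clauses = $CSVImport.OwnerEmail |
    Where-Object { -not [string]::IsNullOrWhiteSpace($_) } |
    ForEach-Object { "(userPrincipalName=$_)" }
$filter = "(|$(-join $clauses))"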

Need to apply an if condition based on a check in Powershell

I am new to PowerShell. I am getting the details of the Azure Data Factory linked services, but after the get I need to use contains to check whether an element exists. In Python I would just check if a string is in a list, but in PowerShell I am not quite sure how. Please check the code below.
$output = Get-AzDataFactoryV2LinkedService -ResourceGroupName $ResourceGroupName -DataFactoryName "xxxxxxxx" | Format-List
The output of this command is (sample given below):
LinkedServiceName : abcdef
ResourceGroupName : ghijk
DataFactoryName : lmnopq
Properties : Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService
So now I try to do this:
if ($output.Properties -contains "Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService") {
Write-Output "test output"
}
But $output.Properties gives the properties of that JSON object.
I need to check whether "Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService" exists in the output variable and then perform the required operations. Please help me with this.
The -contains operator requires a collection and an element. Here's a basic example of its proper use:
$collection = @(1,2,3,4)
$element1 = 5
$element2 = 3
if ($collection -contains $element1) {'yes'} else {'no'}
if ($collection -contains $element2) {'yes'} else {'no'}
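With this collection, the first comparison prints no (5 is not in the collection) and the second prints yes (3 is).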
What you've done is ask PowerShell to look in an object that isn't a collection for a [string] element whose value equals the type name of that same object.
What you need to do is inspect this object:
$output.Properties | format-list *
Then once you figure out what needs to be present inside of it, create a new condition.
$output.Properties.something -eq 'some string value'
...assuming that your value is a string, for example.
I would recommend watching some beginner tutorials.
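For this particular case, a type check is probably what you are after. A minimal sketch, assuming $output holds the objects returned by Get-AzDataFactoryV2LinkedService without piping them through Format-List (Format-List turns the output into formatting objects, which is why .Properties no longer behaves as expected):

$output = Get-AzDataFactoryV2LinkedService -ResourceGroupName $ResourceGroupName -DataFactoryName "xxxxxxxx"
foreach ($linkedService in $output) {
    # compare the full .NET type name of the Properties object
    if ($linkedService.Properties.GetType().FullName -eq 'Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService') {
        Write-Output "test output"
    }
}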

How do I optimise performance when selecting first 2 rows per group from Excel spreadsheet in PowerShell?

I have a requirement to select the first X rows for each unique agent ID. My method below works, but it runs into performance issues when the spreadsheet has over about 5k results to consider. I am hopeful that you very smart people can help me optimise the approach to require less processing.
I am using ImportExcel to import the spreadsheet of call records, then I filter out uninteresting rows and I'm left with $UsableCalls as my pool of calls to be evaluated. Sometimes, this pool has only 2k rows. Sometimes it has 16k. It's possible that it might have even more. Unfortunately, it seems like the max this method can support is around 5k-ish results. Anything over that and the process hangs. So if I have 5k rows, then I can really only handle getting the first 1 call per agent. If I have 2k, then I can get the first 2 calls per agent. The number of calls per agent is selectable, and I'd like to have the option to get up to the first 5 calls per agent, but that simply won't work with the way it processes right now.
Ultimately, the goal is to select the first X# calls (rows) for each agent. I then export that as a second spreadsheet. This is the only method I could come up with, but I am certainly open to suggestion.
Here is what I have, how can I improve it?
# this custom function allows the user to select a digit, uses wpf messagebox
$numberagent = select-numberperagent

# collect the unique agent IDs
$UniqueAgentIDs = @()
$UniqueAgentIDs += ($UsableCalls | Select-Object -Property "Agent ID" -Unique)."Agent ID"

# select first X# of each agent's calls
$CallsPerAgent = @()
foreach ($UniqueAgent in $UniqueAgentIDs) {
    $CallsPerAgent += ($UsableCalls | ? {$_."Agent ID" -eq "$UniqueAgent"}) | select -First $numberagent
} #close foreach uniqueagent
And here is an example of one of the custom PS Objects in the variable $usableCalls:
PS C:\> $usableCalls[0]
DateTime : 2022-03-03 11:06:16.063
DigitDialed : 781
Agent ID : 261
Agent Name : CCM
Skill group : PAYE.
CallType : PAYE
PPSN : 81
DNIS : 10
ANI : 772606677789
Disposition : Handled
Duration : 818
RingTime : 12
DelayTime : 0
HoldTime : 0
TalkTime : 14
WorkTime : 31
The first thing to improve the speed is to not use += to add stuff to an array.
By doing so, on each addition, the entire array needs to be rebuilt in memory. Better let PowerShell do the collecting of data for you:
# collect the unique agent IDs
$UniqueAgentIDs = ($UsableCalls | Select-Object -Property 'Agent ID' -Unique).'Agent ID'

# select first X# of each agent's calls
$CallsPerAgent = foreach ($UniqueAgent in $UniqueAgentIDs) {
    $UsableCalls | Where-Object {$_.'Agent ID' -eq $UniqueAgent} | Select-Object -First $numberagent
}
Without really knowing what objects are in your variable $UsableCalls, you might even be better off using Group-Object to group all calls on the agent's ID and loop over these groups:
$CallsPerAgent = $UsableCalls | Group-Object -Property 'Agent ID' | ForEach-Object {
$_.Group | Select-Object -First $numberagent
}
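If the result then has to go out as a second spreadsheet, the ImportExcel module you already use for the import can also write it; a minimal sketch (the output path is just an example):

$CallsPerAgent | Export-Excel -Path 'C:\Temp\FirstCallsPerAgent.xlsx' -AutoSize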

Azure Portal - Assign tags to Resource Groups using Powershell and reading data from Excel spreadsheet or CSV

I am looking to apply a new set of tags to a number of Azure Resource Groups. The tags must adhere to a naming convention that is a concatenation of a number of Microsoft Excel column values, read row by row for each Resource Group.
For example, let's use the below as an example of the first 6 columns:
Column A: Azure Subscription Name
Column B: Resource Group Name
Column C: Business Unit
Column D: Cost Center
Column E: Project Code
Column F: Service or Application Name
The proposed tag will then be a concatenation of for example:
[Column A] - [Column B] - [Column C] - [Column E] - [Column F]
The resultant tag for the first row should therefore look like the below example:
"Subscription 1 : RG-01 : BU-A1 : 1001 : WebApp-1"
Powershell would probably be the preferred solution to complete this task and I'd welcome any suggestions or ideas on how to achieve this.
Updated answer 0406:
The script below works for a multi-row scenario:
Note: to use the Set-AzContext cmdlet, please install the Az.Accounts 2.2.7 module.
# load the .csv file
$file_path = "D:\test\t1.csv"
# define the tag name, you can change it as per your need
$tag_name = "mytag111"
# loop the values in the .csv file
Import-Csv -Path $file_path | ForEach-Object {
    # define a tag value with empty string
    $tag_value = ""
    # define a variable for subscription name
    $subscription_name = ""
    # define a variable for resource group name
    $resource_group_name = ""
    foreach ($property in $_.PSObject.Properties) {
        # here, get the subscription name from the csv file
        if ($property.Name -eq "Az Sub Name") {
            $subscription_name = $property.Value
        }
        # here, get the resource group name from the csv file
        if ($property.Name -eq "RG Name") {
            $resource_group_name = $property.Value
        }
        # exclude the "Cost Ctr" column
        if (!($property.Name -eq "Cost Ctr")) {
            # then we loop all the values from each row, and then concatenate them
            # here, we just don't want to add the colon (:) at the end of the value from the "Svc/App" column
            if (!($property.Name -eq "Svc/App")) {
                $tag_value += $property.Value + " : "
            }
            else {
                $tag_value += $property.Value
            }
        }
    }
    # change the context as per the subscription
    Set-AzContext -Subscription $subscription_name
    # get the existing tags for the resource group
    $tags = (Get-AzResourceGroup -Name $resource_group_name).Tags
    # add the new tag to the existing tags, so the existing tags will not be removed
    $tags += @{$tag_name = $tag_value}
    # set the tags
    Set-AzResourceGroup -Name $resource_group_name -Tag $tags
}
"completed********"
Here are the test data:
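A hypothetical reconstruction of such a file: the column names "Az Sub Name", "RG Name", "Cost Ctr" and "Svc/App" are the ones the script checks for, while the "BU" and "Proj Code" headers and all values are made up for illustration:

Az Sub Name,RG Name,BU,Cost Ctr,Proj Code,Svc/App
Subscription 1,RG-01,BU-A1,CC-100,1001,WebApp-1

With that row, the script above would produce the tag value "Subscription 1 : RG-01 : BU-A1 : 1001 : WebApp-1".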
The script works fine as per my testing. Please let me know if you have any issues with it.
Original answer:
Please correct me if I misunderstood you.
I wrote some simple code to read data from the .csv file and then add tags to the resource group.
Note that in my testing there is only one row of data in the .csv file and the resource group name is hard-coded; please feel free to modify the code to meet your requirements.
And to use the Set-AzResourceGroup cmdlet, you should make sure the Az.Resources module is installed.
The .csv file:
The PowerShell code:
# load the .csv file
$file_path = "D:\test\t1.csv"
# define the tag name, you can change it as per your need
$tag_name = "mytag111"
# define a tag value with empty string
$tag_value = ""
# loop the values in the .csv file
Import-Csv -Path $file_path | ForEach-Object {
    foreach ($property in $_.PSObject.Properties) {
        # exclude the "Cost Ctr" column
        if (!($property.Name -eq "Cost Ctr")) {
            # then we loop all the values from each row, and then concatenate them
            # here, we just don't want to add the colon (:) at the end of the value from the "Svc/App" column
            if (!($property.Name -eq "Svc/App")) {
                $tag_value += $property.Value + " : "
            }
            else {
                $tag_value += $property.Value
            }
        }
    }
    # get the existing tags for the resource group
    $tags = (Get-AzResourceGroup -Name yyrg11).Tags
    # add the new tag to the existing tags, so the existing tags will not be removed
    $tags += @{$tag_name = $tag_value}
    # set the tags
    Set-AzResourceGroup -Name yyrg11 -Tag $tags
}
"completed********"
"completed********"
The test result: in the Azure portal, the tag is added to the resource group.
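As a side note, the per-row concatenation could also be written with -join instead of appending " : " manually and special-casing the last column; a minimal sketch, assuming the same CSV and column names as above:

Import-Csv -Path $file_path | ForEach-Object {
    $parts = $_.PSObject.Properties |
        Where-Object { $_.Name -ne 'Cost Ctr' } |
        ForEach-Object { $_.Value }
    $tag_value = $parts -join ' : '
    # ... continue with Get-AzResourceGroup / Set-AzResourceGroup as above
}

Because -join only puts the separator between elements, there is no trailing " : " to clean up.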

How to get the SharePoint workflow status for each SP item with a PowerShell script?

The workflow status data that I want is in the column with the same name as the document library. However, I cannot access the data in that column using the column name displayed in SharePoint. I need the internal column name if I am going to access that column with the code below.
$list = $SPWeb.Lists["document library name here"]
$items = $list.Items
$count = 0
foreach ($item in $items)
{
    # (typically you put the column name you want where SPWorkflowStatusColumnName is)
    if ($item["SPWorkflowStatusColumnName"] -eq "Completed")
    {
        $count = $count + 1
    }
}
The internal column name is eight characters long, after removing spaces and non-alphanumeric characters.
Alternatively, to find the internal name of the column, just click on the column name in the list view and check the URL: the value after SortField= will be the internal name of the column.
For example, in my case the URL looks like this:
http://[server]/[sitecollection]/[site]/listname/Forms/AllItems.aspx?View={view GUID}&SortField=TradingS&SortDir=Asc
and my field's internal name is "TradingS".
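If clicking through the list view is not convenient, the internal names can also be read straight from the list's field collection; a minimal sketch, assuming the same $SPWeb and library name as in the code above:

$list = $SPWeb.Lists["document library name here"]
# list every column's display name next to its internal name
$list.Fields | Select-Object Title, InternalName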
