Adding key and value to JSON using PowerShell and Excel

I have an Excel file which I'm reading from to populate an ARM parameter JSON file.
The code I use is:
$ws = $wb.Worksheets.Item(1)
$data = Get-Content -Path "$path\$jsonfile" -Raw | ConvertFrom-Json
$Row = 2
$col = 2
$data.parameters.client.value = $ws.Cells.Item($Row, $col).Value()
$data.parameters.user.value = $ws.Cells.Item($Row, $col).Offset(1, 0).Value()
$data.parameters.business.value = $ws.Cells.Item($Row, $col).Offset(2, 0).Value()
$data.parameters.dev.value = $ws.Cells.Item($Row, $col).Offset(3, 0).Value()
$data | ConvertTo-Json -Depth 9 | % {
[System.Text.RegularExpressions.Regex]::Unescape($_)
} | Set-Content -Path "$newpath\$JSONFile"
I need the business value to be a "key" : "value" pair.
In my Excel sheet I have the fields as below:
Name       Value    Key
$client    Client1
$user      User1
$business  Bus-key  bus-value
I can't work out how to add the "key" : "value" to the Excel sheet and get PowerShell to read it and populate the parameter file.
The client and user values are plain strings, so they work fine.
I was hoping for some direction on where I'm going wrong
Thanks in advance :)

From what I understood from your screenshot, I think this might be of help to you:
$MyJson = '
{
"FortinetTags": {
"value":"6EB3B02F-50E5-4A3E-8CB8-2E129258317D"
}
}' | ConvertFrom-Json
# We will create a hash table containing the required key/value pair
$MyJson.FortinetTags.value = @{'provider' = $MyJson.FortinetTags.value}
$MyJson | ConvertTo-Json
It will update your JSON to this:
{
"FortinetTags": {
"value": {
"provider": "6EB3B02F-50E5-4A3E-8CB8-2E129258317D"
}
}
}
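Applied to the original question, a minimal sketch (the $busKey/$busValue names and the assumption that the key sits one column to the right of the value, in column C, are mine) would read both cells for the business row and assign a hashtable:
# Business row: value in column B as in the question; key assumed to be in the next column (C)
$busValue = $ws.Cells.Item($Row, $col).Offset(2, 0).Value()
$busKey   = $ws.Cells.Item($Row, $col).Offset(2, 1).Value()
# Assigning a hashtable makes ConvertTo-Json emit a nested "key" : "value" object
$data.parameters.business.value = @{ $busKey = $busValue }
The rest of the pipeline (ConvertTo-Json and Set-Content) can stay as it is.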

Related

PowerShell - Add imported csv column into newly exported csv

I was hoping someone could help me out. I am trying to get the date a license was assigned to a user and export it to a new CSV. The import CSV contains the UserPrincipalName. I was able to narrow it down to show only the license I want, but having the UPN appear next to the license/date would complete this script. Thanks in advance.
$getusers = Import-csv -Path 'C:\test\userlist.csv'
foreach ($user in $getusers) {
(Get-AzureADUser -searchstring $User.UserPrincipalName).assignedplans | where {$_.Service -eq 'MicrosoftOffice'} | Select-Object Service,AssignedTimeStamp |
Export-CSV -Path "C:\test\userlist-export.csv" -notypeinformation
}
I would do it this way: first query the user and store it in a variable, then filter the AssignedPlans where Service equals MicrosoftOffice. To construct the output objects you can use [pscustomobject]. Worth noting, the call to Export-Csv should be the last statement in your pipeline (it shouldn't be inside the loop), otherwise you would be replacing the CSV with a new value on each loop iteration instead of appending data.
Import-Csv -Path 'C:\test\userlist.csv' | ForEach-Object {
$azUser = Get-AzureADUser -ObjectId $_.UserPrincipalName
foreach($plan in $azUser.AssignedPlans) {
if($plan.Service -eq 'MicrosoftOffice') {
[pscustomobject]@{
UserPrincipalName = $azUser.UserPrincipalName
Service = $plan.Service
AssignedTimeStamp = $plan.AssignedTimeStamp
}
}
}
} | Export-Csv "C:\test\userlist-export.csv" -NoTypeInformation
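If you would rather keep the export inside the loop, Export-Csv -Append is an option; the single pipeline above simply avoids reopening the file for every user. A rough sketch of that variant (same cmdlets, different placement):
Import-Csv -Path 'C:\test\userlist.csv' | ForEach-Object {
    $azUser = Get-AzureADUser -ObjectId $_.UserPrincipalName
    foreach ($plan in ($azUser.AssignedPlans | Where-Object { $_.Service -eq 'MicrosoftOffice' })) {
        # -Append adds rows instead of overwriting the file on each iteration
        [pscustomobject]@{
            UserPrincipalName = $azUser.UserPrincipalName
            Service           = $plan.Service
            AssignedTimeStamp = $plan.AssignedTimeStamp
        } | Export-Csv "C:\test\userlist-export.csv" -NoTypeInformation -Append
    }
}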

How to modify excel data and export to text file using PowerShell script?

First time poster here. Apologies if I am not following best practices for posting this question.
I am very new to scripting and PowerShell.
Problem:
I have data in an excel sheet in this format.
Excel Data Image Link
I want to modify and export this data into a text file. In this format.
Required Output Image Link
So far I have tried to modify the Excel data by accessing each cell, using code similar to what is shown below.
for (($i = 1); $i -lt 4; $i++)
{
$column=$ExcelWorkSheet.Columns.Item(1).Rows.Item($i).Text
$dataType=$ExcelWorkSheet.Columns.Item(2).Rows.Item($i).Text
$c1=("`"" + "$column" + "`""+":")
$c2=("`"" + "$dataType" + "`"" + ",")
$ExcelWorkSheet.Columns.Item(1).Rows.Item($i).Value=$c1
$ExcelWorkSheet.Columns.Item(2).Rows.Item($i).Value=$c2
}
I am still not sure if this is the correct way to go.
What would be the best way to solve this?
Just want to understand what I should do to solve this problem. I am not looking for the exact code.
Step by step instructions or some resources would be helpful.
Thanks!
This might help... maybe...
# Import Stuff
$Data = Import-Csv -Path .\Desktop\data.csv
# New Array
$Output = @()
# Run through Unique Owners
foreach ($Owner in ($Data | Select-Object OWNER -Unique)) {
$Lines = $Data | Where-Object {$_.OWNER -eq $Owner.OWNER}
# Lazy way to do a bit of checking, if same then use it or Break
if ($Lines[0].TABLE_NAME -eq $Lines[1].TABLE_NAME) {
$Out_TableName = $Lines[0].TABLE_NAME
# ID and NAME data
$Out_ID = $Lines | Where-Object {$_.COLUMN_NAME -eq "ID"} | Select-Object COLUMN_NAME, DATA_TYPE, DATA_LENGTH
$Out_NAME = $Lines | Where-Object {$_.COLUMN_NAME -eq "NAME"} | Select-Object COLUMN_NAME, DATA_TYPE, DATA_LENGTH
} else {
# Show the user that something went wrong with this owner's data
Write-Host "Problem with Owner ""$($Owner.OWNER)"" Data?!" -ForegroundColor Red
Break
}
# Output into the array in format
$Output += #"
"$($Owner.OWNER).$($Out_TableName)":{
"$($Out_ID.COLUMN_NAME)": "$($Out_ID.DATA_TYPE) ($($Out_ID.DATA_LENGTH))",
"$($Out_NAME.COLUMN_NAME)": "$($Out_NAME.DATA_TYPE) ($($Out_NAME.DATA_LENGTH))"
}
"#
}
# Put Output in a text file
$Output | Set-Content .\Desktop\output.txt -Force
I should add that I had your data in a CSV like this...
OWNER,TABLE_NAME,COLUMN_NAME,DATA_TYPE,DATA_LENGTH
A,Employee,ID,NUMBER,22
A,Employee,NAME,VARCHAR2,22
B,Department,ID,NUMBER,23
B,Department,NAME,VARCHAR2,24
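If the data is still sitting in an .xlsx workbook rather than a CSV, one option (a sketch, assuming Excel is installed and using a hypothetical file path) is to save the sheet out as CSV via the Excel COM object first and then feed it to the code above:
# Save the first worksheet as CSV, then reuse the Import-Csv based code above
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$wb = $excel.Workbooks.Open('C:\temp\data.xlsx')      # hypothetical path
$wb.Worksheets.Item(1).SaveAs('C:\temp\data.csv', 6)  # 6 = xlCSV
$wb.Close($false)
$excel.Quit()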

Extracting most recent activity/logon date

I am extracting computers from our Azure portal and I am specifically looking for devices which have had no activity in over 120 days.
The issue is that every time I extract this info, it only gives me the activity/logon of the initial registration for the device.
Some of our devices have multiple registrations as they are passed from user to user.
So when I extract the info it is not accurate.
I used the code below, which I thought should work, but I still get devices that have newer activity than my target date. I also see an error when running the script; I have included it below, beneath the code I have used.
Could someone assist? I feel I am close.
clear-host
$maxDate = Get-Date '6/1/19'
#Create a hash table to remove dupes
$set = @{}
#create our final array
$cleanSet = @()
#load data
$data = Get-MsolDevice -All | select-object -Property ObjectID, ApproximateLastLogonTimestamp, DisplayName
#for each item in the pipeline, add it to our hash table
foreach($_ in $data) {
#if the item isn't in the hash table, add it
if(!$set.Contains($_.ObjectID)) {
$set.Add($_.ObjectID, $_)
}
else {
#if we have a more recent date for the item, update the date so we only have the most recent one
if((Get-Date $_.ApproximateLastLogonTimestamp) -gt (Get-Date $set[$_.ObjectId].ApproximateLastLogonTimestamp)) {
$set[$_.ObjectID].ApproximateLastLogonTimestamp = $_.ApproximateLastLogonTimestamp
}
}
}
#now that we have the most recent date for each item, remove ones newer than our target date.
$set.GetEnumerator() | ForEach-Object{
if((get-date $_.Value.ApproximateLastLogonTimestamp) -lt $maxDate) {
$cleanSet += $_.Value
}
}
$cleanSet | select-object -Property ObjectID, DisplayName | export-csv "C:\Users\tesyuser\Desktop\Project Work\Stale machine on Azure\Exported CSV\2HIE-Stale-Device-List.csv" -NoTypeInformation
Error:
Get-Date : Cannot bind parameter 'Date' to the target. Exception
setting "Date": "Cannot convert null to type "System.DateTime"." At
line:29 char:18
Hi, so my code should now look like this?
$maxDate = Get-Date '6/1/19'
#Create a hash table to remove dupes
$set = @{}
#create our final array
$cleanSet = @()
#load data
$data = Get-MsolDevice -All | select-object -Property ObjectID, ApproximateLastLogonTimestamp, DisplayName
#for each item in the pipeline, add it to our hash table
$data | ForEach-Object {
#if the item isn't in the hash table, add it
if(!$set.Contains($_.ObjectID)) {
$set.Add($_.ObjectID, $_)
}
else {
#if we have a more recent date for the item, update the date so we only have the most recent one
if((Get-Date $_.ApproximateLastLogonTimestamp) -gt (Get-Date $set[$_.ObjectId].ApproximateLastLogonTimestamp)) {
$set[$_.ObjectID].ApproximateLastLogonTimestamp = $_.ApproximateLastLogonTimestamp
}
}
}
#now that we have the most recent date for each item, remove ones newer than our target date.
$set.GetEnumerator() | ForEach-Object{
if((get-date $_.Value.ApproximateLastLogonTimestamp) -lt $maxDate) {
$cleanSet += $_.Value
}
}
$cleanSet | select-object -Property ObjectID, DisplayName | export-csv "C:\Users\birrelld\Desktop\Project Work\Stale machine on Azure\Exported CSV\3HIE-Stale-Device-List.csv" -NoTypeInformation
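Regarding the error itself: Get-Date throws "Cannot convert null" when ApproximateLastLogonTimestamp is empty, which happens for devices that never reported a logon. A minimal sketch of a guard inside the loop (the rest of the script unchanged):
$data | ForEach-Object {
    # Skip devices with no recorded logon; Get-Date cannot convert $null
    if ($null -eq $_.ApproximateLastLogonTimestamp) { return }
    if (!$set.Contains($_.ObjectID)) {
        $set.Add($_.ObjectID, $_)
    }
    elseif ((Get-Date $_.ApproximateLastLogonTimestamp) -gt (Get-Date $set[$_.ObjectID].ApproximateLastLogonTimestamp)) {
        # Keep only the most recent date for each device
        $set[$_.ObjectID].ApproximateLastLogonTimestamp = $_.ApproximateLastLogonTimestamp
    }
}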

List down column headers and get the maximum length of string per column

I'm looking for a translation of my Excel formula in a form of a script in Powershell, vbscript or Excel VBA. I'm trying to get the list of column headers and the max length of string under it.
Normally, what I do is manually open the .txt file in Excel; from there I can get the header names. Next, I create an array formula, for example =MAX(LEN(A1:A100000)). This will get the max length of string in the column. I'll do the same formula for the other columns.
Right now I can't do this since the files have increased to 1 GB in size and I can't open them anymore; my desktop crashes. It is probably also because they have more than 1 million rows, which Excel can't handle. My friend suggested PowerShell, but I have limited knowledge there; I don't know if it can be done in VBScript or Excel VBA.
Thanks in advance for your help.
The code below works for .csv files but not with delimited .txt files:
$fileName = "C:\Desktop\EFile.csv"
<#
Sample format of c:\temp\data.csv
"id","name","grade","address"
"1","John","Grade-9","test1"
"2","Ben","Grade-9","test12222"
"3","Cathy","Grade-9","test134343"
#>
$colCount = (Import-Csv $fileName | Get-Member | Where-Object {$_.MemberType -eq 'NoteProperty'} | Measure-Object).Count
$csv = Import-Csv $fileName
$csvHeaders = ($csv | Get-Member -MemberType NoteProperty).name
$dict = @{}
foreach($header in $csvHeaders) {
$dict.Add($header,0)
}
foreach($row in $csv)
{
foreach($header in $csvHeaders)
{
if($dict[$header] -le ($row.$header).Length)
{
$dict[$header] =($row.$header).Length
}
}
}
$dict.Keys | % { "key = $_ , Column Length = " + $dict.Item($_) }
This is how I get my data.
$data = #"
"id","name","grade","address"
"1","John","Grade-9","test1"
"2","Ben","Grade-9","test12222"
"3","Cathy","Grade-9","test134343"
"#
$csv = ConvertFrom-Csv -Delimiter ',' $data
But you should get your data like this
$fileName = "C:\Desktop\EFile.csv"
$csv = Import-Csv -Path $fileName
And then
# Extract the header names
$headers = $csv | Get-Member -MemberType NoteProperty | Select-Object -ExpandProperty Name
# Capture output in $result variable
$result = foreach($header in $headers) {
# Select all items in $header column, find the longest, and select the item for output
$maximum = $csv | Select-Object -ExpandProperty $header | Measure-Object -Maximum | Select-Object -ExpandProperty Maximum
# Generate new object holding the information.
# This will end up in $results
[pscustomobject]@{
Header = $header
Max = $maximum.Length
String = $maximum
}
}
# Simple output
$result | Format-Table
This is what I get:
Header Max String
------ --- ------
address 10 test134343
grade 7 Grade-9
id 1 3
name 4 John
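One caveat on the snippet above: Measure-Object -Maximum compares the strings themselves, so it returns the lexicographically largest value rather than the longest one (note the name row: John is returned even though Cathy is longer). A variant that sorts by length, reusing the same $csv and $headers, might look like this:
$result = foreach ($header in $headers) {
    # Pick the entry with the greatest string length rather than the "largest" string
    $longest = $csv | Select-Object -ExpandProperty $header |
        Sort-Object -Property Length -Descending | Select-Object -First 1
    [pscustomobject]@{
        Header = $header
        Max    = $longest.Length
        String = $longest
    }
}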
Alternatively, if you have memory issues dealing with large files, you may have to get a bit more dirty with the .NET framework. This snippet processes one csv line at a time, instead of reading the entire file into memory.
$fileName = "$env:TEMP\test.csv"
$delimiter = ','
# Open a StreamReader
$reader = [System.IO.File]::OpenText($fileName)
# Read the headers and turn it into an array, and trim away any quotes
$headers = $reader.ReadLine() -split $delimiter | % { $_.Trim('"''') }
# Prepare a hashtable for the results
$result = @{}
# So long as there's more data, keep running
while(-not $reader.EndOfStream) {
# Read a single line and process it as csv
$csv = $reader.ReadLine() | ConvertFrom-Csv -Header $headers -Delimiter $delimiter
# Determine if the item in the result hashtable is smaller than the current, using the header as a key
foreach($header in $headers) {
$item = $csv | Select-Object -ExpandProperty $header
if($result[$header].Maximum -lt $item.Length) {
$result[$header] = [pscustomobject]@{
Header = $header
Maximum = $item.Length
String = $item
}
}
}
}
# Clean up our spent resource
$reader.Close()
# Simple output
$result.Values | Format-Table
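On the original .txt problem: both Import-Csv and the streaming reader above accept a delimiter, so the same approaches should work on a delimited .txt file once the delimiter is passed explicitly (a sketch, assuming the file is tab-delimited):
# Only the file name and delimiter change for a tab-delimited .txt file
$fileName  = "C:\Desktop\EFile.txt"
$delimiter = "`t"
$csv = Import-Csv -Path $fileName -Delimiter $delimiter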

PowerShell - paste data into Excel

Today I have just thrown together this PowerShell script which
takes a tab-delimited text file,
reads it into memory,
makes a variable number of filter queries based on distinct values of a certain column,
creates a new empty Excel workbook, and
adds each of the subsets of filtered data to a new Excel worksheet.
The last step is where I am stuck. Currently my code puts a few lines of data into a range in the worksheet, in the form of unrolled/transposed "key: value" entries, resulting in a horizontal data layout. The same range of data is always overwritten.
I want data in the form of a vertical layout, i.e., data in columns, just the same way as if the CSV file was imported with the import-file-wizard of MS Excel.
Is there a simpler way to do it than below?
I admit, some of the PowerShell features are pasted in here in a cargo-cult mode of programming. Please note that I have no PowerShell experience whatsoever. I did some batchfile, VBScript, and VBA coding a few years back. So, other criticisms are also welcome.
PARAM (
[Parameter(ValueFromPipeline = $true)]
$infile = ".\04-2011\110404-13.txt"
)
PROCESS {
echo " $infile"
Write-Host "Num Args:" $args.Length;
$xl = New-Object -comobject Excel.Application;
$xl.Visible = $true;
$Workbook = $xl.Workbooks.Add();
$content = Import-Csv -delimiter "`t" $infile;
$ports = $content | Select-Object Port# | Sort-Object Port# -Unique -Descending;
$ports | ForEach-Object {
$p = $_;
Write-Host $p.{Port#};
$Worksheet = $Workbook.Worksheets.Add();
$workSheet.Name = [string]::Format("{0} {1}", "PortNo", $p.{Port#});
$filtered = $content | Where-Object {$_.{Port#} -eq $p.{Port#} };
$filtered | ForEach-Object {
Write-Host $_.{ObsDateTime}, $_.{Port#}
}
$filtered | clip.exe;
$range = $Workbook.ActiveSheet.Range("a2", "a$($filtered.count)");
$Workbook.ActiveSheet.Paste($range, $false);
}
$xl.Quit()
}
Data Output Example
Wrong
Port# : 1
Obs# : 1
Exp_Flux : 0,99
IV Cdry : 406.96
IV Tcham : 16.19
IV Pressure : 100.7
IV H2O : 9.748
IV V3 : 11.395
IV V4 : 0.759
IV RH : 53.12
Right
Port# Obs# Exp_Flux IV Cdry IV Tcham IV Pressure IV H2O IV V3 IV V4 IV RH
1 1 0,99 406.96 16.19 100.7 9.748 11.395 0.759 53.12
Try Export-Xls; it looks very nice. I never had the chance to use it, but (virtually) knowing the person who worked on it, I'm sure you will be happy with it. If you go with it, feedback here would be appreciated.
POSSIBLE WORKAROUND FOR UNORDERED PROPERTIES IN Export-Xls
The function Add-Array2Clipboard could be changed so that it accepts a new input parameter: an array providing the name of the properties ordered as required.
Then you can change the section where get-member is used. A silly example:
"z", "a", "c" | %{ get-member -name $_ -inputobject $thecurrentobject }
This is just an example of how you can achieve ordered properties from get-member.
I've used $Workbook.ActiveSheet.Cells.Item($row, $col).Value2 to be able to pinpoint more precisely where to put the data when exporting to Excel.
Something like
$row = 1
Get-Content $file | Foreach-Object {
$cols = $_.split("`t")
for ($i = 0; $i -lt $cols.count; $i++)
{
$Workbook.ActiveSheet.Cells.Item($row, $i+1).Value2 = $cols[$i]
}
$row++
}
Warning: dry-coded! You'll probably need some try..catch as well.
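On the try..catch point, a minimal cleanup sketch (using the $xl and $Workbook objects from the question) so Excel is released even if the cell-filling loop throws:
try {
    # ... fill the cells as shown above ...
}
finally {
    # Always close the workbook and quit Excel, even on error
    $Workbook.Close($true)
    $xl.Quit()
    [System.Runtime.InteropServices.Marshal]::ReleaseComObject($xl) | Out-Null
}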
I used a modified Export-Xls function, a bit different from what user empo suggested.
This is my call to it
Export-Xls $filtered -Path $outfile -WorksheetName "$wn" -SheetPosition "end" | Out-Null # -SheetPosition "end";
However, the current release of Export-Xls re-orders the columns of the in-memory representation of the CSV text file. I want the data columns of the text file in their original order, so I had to hack and simplify the original code as follows:
function Add-Array2Clipboard {
param (
[PSObject[]]$ConvertObject,
[switch]$Header
)
process{
$array = @();
$line =""
if ($Header) {
$line = @()
$row = $ConvertObject | Select -First 1
$row.psobject.properties | Foreach {$line += "$($_.Name)" }
$array += [String]::Join("`t", $line)
}
else {
foreach($row in $ConvertObject){
$line =""
$vals = @()
$row.psobject.properties | Foreach {$vals += $_.Value}
$array += [String]::Join("`t", $vals)
}
}
$array | clip.exe;
}
}
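For completeness, a hypothetical usage sketch of the modified function: copy the header row to the clipboard and paste it, then copy the data rows and paste them below; because the clipboard text is tab-delimited, Excel splits it into columns (the cell addresses are an assumption):
Add-Array2Clipboard -ConvertObject $filtered -Header
$Workbook.ActiveSheet.Paste($Workbook.ActiveSheet.Range("A1"), $false)
Add-Array2Clipboard -ConvertObject $filtered
$Workbook.ActiveSheet.Paste($Workbook.ActiveSheet.Range("A2"), $false)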
