PowerShell - paste data into Excel

I have just thrown together a PowerShell script that takes a tab-delimited text file, reads it into memory, builds a variable number of filter queries based on the distinct values of a certain column, creates a new empty Excel workbook, and adds each filtered subset of the data to a new Excel worksheet.
The last step is where I am stuck. Currently my code puts a few lines of data into a range in the worksheet, in the form of unrolled/transposed "key: value" entries, resulting in a horizontal data layout. The same range of data is always overwritten.
I want the data in a vertical layout, i.e. in columns, just as if the CSV file had been imported with MS Excel's import wizard.
Is there a simpler way to do it than below?
I admit, some of the PowerShell features are pasted in here in a cargo-cult mode of programming. Please note that I have no PowerShell experience whatsoever. I did some batchfile, VBScript, and VBA coding a few years back. So, other criticisms are also welcome.
PARAM (
    [Parameter(ValueFromPipeline = $true)]
    $infile = ".\04-2011\110404-13.txt"
)
PROCESS {
    echo " $infile"
    Write-Host "Num Args:" $args.Length
    $xl = New-Object -ComObject Excel.Application
    $xl.Visible = $true
    $Workbook = $xl.Workbooks.Add()
    $content = Import-Csv -Delimiter "`t" $infile
    $ports = $content | Select-Object Port# | Sort-Object Port# -Unique -Descending
    $ports | ForEach-Object {
        $p = $_
        Write-Host $p.{Port#}
        $Worksheet = $Workbook.Worksheets.Add()
        $Worksheet.Name = [string]::Format("{0} {1}", "PortNo", $p.{Port#})
        $filtered = $content | Where-Object { $_.{Port#} -eq $p.{Port#} }
        $filtered | ForEach-Object {
            Write-Host $_.{ObsDateTime}, $_.{Port#}
        }
        $filtered | clip.exe
        $range = $Workbook.ActiveSheet.Range("a2", "a$($filtered.count)")
        $Workbook.ActiveSheet.Paste($range, $false)
    }
    $xl.Quit()
}
Data Output Example
Wrong
Port# : 1
Obs# : 1
Exp_Flux : 0,99
IV Cdry : 406.96
IV Tcham : 16.19
IV Pressure : 100.7
IV H2O : 9.748
IV V3 : 11.395
IV V4 : 0.759
IV RH : 53.12
Right
Port# Obs# Exp_Flux IV Cdry IV Tcham IV Pressure IV H2O IV V3 IV V4 IV RH
1 1 0,99 406.96 16.19 100.7 9.748 11.395 0.759 53.12

Try Export-Xls; it looks very nice. I've never had the chance to use it, but (virtually) knowing the person who worked on it, I'm sure you'll be very happy with it. If you go with it, feedback here would be appreciated.
POSSIBLE WORKAROUND FOR UNORDERED PROPERTIES IN Export-Xls
The function Add-Array2Clipboard could be changed so that it accepts a new input parameter: an array providing the names of the properties in the required order.
Then you can change the section where Get-Member is used. A silly example:
"z", "a", "c" | %{ get-member -name $_ -inputobject $thecurrentobject }
This is just an example of how you can get ordered properties out of Get-Member.
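For instance, here is a minimal sketch of that idea (the -PropertyOrder parameter and the helper name are made up for illustration, they are not part of the released Export-Xls):
# Hypothetical sketch: build the header line in a caller-supplied order instead
# of the order Get-Member happens to return the properties in.
function Get-OrderedHeaderLine {
    param (
        [PSObject]$InputObject,
        [string[]]$PropertyOrder   # e.g. 'Port#', 'Obs#', 'Exp_Flux'
    )
    $names = $PropertyOrder | ForEach-Object {
        (Get-Member -Name $_ -InputObject $InputObject -MemberType NoteProperty).Name
    }
    [string]::Join("`t", $names)
}
# Usage (assuming $filtered from the question):
# Get-OrderedHeaderLine -InputObject ($filtered | Select-Object -First 1) -PropertyOrder 'Port#', 'Obs#', 'Exp_Flux'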

I've used $Workbook.ActiveSheet.Cells.Item($row, $col).Value2 to be able to pinpoint more precisely where to put the data when exporting to Excel.
Something like
$row = 1
Get-Content $file | ForEach-Object {
    $cols = $_.Split("`t")
    for ($i = 0; $i -lt $cols.Count; $i++)
    {
        $Workbook.ActiveSheet.Cells.Item($row, $i + 1).Value2 = $cols[$i]
    }
    $row++
}
Warning: dry-coded! You'll probably need some try..catch as well.
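A rough sketch of what that error handling could look like, reusing $file, $Workbook, and $xl from the snippets above (the cleanup in the finally block is just the usual COM-release step, shown as an illustration):
try {
    $row = 1
    Get-Content $file | ForEach-Object {
        $cols = $_.Split("`t")
        for ($i = 0; $i -lt $cols.Count; $i++) {
            # Any COM failure (locked workbook, bad cell value, ...) lands in catch
            $Workbook.ActiveSheet.Cells.Item($row, $i + 1).Value2 = $cols[$i]
        }
        $row++
    }
}
catch {
    Write-Warning "Failed while writing row ${row}: $_"
}
finally {
    # Close Excel and release the COM object whether or not the write succeeded
    $xl.Quit()
    [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($xl)
}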

I used a modified Export-Xls function, a bit different from what user empo suggested.
This is my call to it
Export-Xls $filtered -Path $outfile -WorksheetName "$wn" -SheetPosition "end" | Out-Null
However, the current release of Export-Xls re-orders the columns of the in-memory representation of the CSV text file. I want the data columns of the text file in their original order, so I had to hack and simplify the original code as follows:
function Add-Array2Clipboard {
    param (
        [PSObject[]]$ConvertObject,
        [switch]$Header
    )
    process {
        $array = @()
        $line = ""
        if ($Header) {
            $line = @()
            $row = $ConvertObject | Select -First 1
            $row.psobject.properties | Foreach { $line += "$($_.Name)" }
            $array += [String]::Join("`t", $line)
        }
        else {
            foreach ($row in $ConvertObject) {
                $line = ""
                $vals = @()
                $row.psobject.properties | Foreach { $vals += $_.Value }
                $array += [String]::Join("`t", $vals)
            }
        }
        $array | clip.exe
    }
}

Related

How to modify Excel data and export to a text file using a PowerShell script?

First time poster here. Apologies if I am not following best practices for posting this question.
I am very new to scripting and PowerShell.
Problem:
I have data in an Excel sheet in this format.
Excel Data Image Link
I want to modify this data and export it to a text file, in this format.
Required Output Image Link
So far I have tried to modify the Excel data by accessing each cell, using code similar to the snippet below.
for ($i = 1; $i -lt 4; $i++)
{
    $column = $ExcelWorkSheet.Columns.Item(1).Rows.Item($i).Text
    $dataType = $ExcelWorkSheet.Columns.Item(2).Rows.Item($i).Text
    $c1 = ("`"" + "$column" + "`"" + ":")
    $c2 = ("`"" + "$dataType" + "`"" + ",")
    $ExcelWorkSheet.Columns.Item(1).Rows.Item($i).Value = $c1
    $ExcelWorkSheet.Columns.Item(2).Rows.Item($i).Value = $c2
}
I am still not sure if this is the correct way to go. What would be the best way to solve this?
I just want to understand what I should do to solve this problem; I am not looking for the exact code.
Step-by-step instructions or some resources would be helpful.
Thanks!
This might help... maybe...
# Import Stuff
$Data = Import-Csv -Path .\Desktop\data.csv
# New Array
$Output = @()
# Run through Unique Owners
foreach ($Owner in ($Data | Select-Object OWNER -Unique)) {
    $Lines = $Data | Where-Object { $_.OWNER -eq $Owner.OWNER }
    # Lazy way to do a bit of checking, if same then use it or Break
    if ($Lines[0].TABLE_NAME -eq $Lines[1].TABLE_NAME) {
        $Out_TableName = $Lines[0].TABLE_NAME
        # ID and NAME data
        $Out_ID = $Lines | Where-Object { $_.COLUMN_NAME -eq "ID" } | Select-Object COLUMN_NAME, DATA_TYPE, DATA_LENGTH
        $Out_NAME = $Lines | Where-Object { $_.COLUMN_NAME -eq "NAME" } | Select-Object COLUMN_NAME, DATA_TYPE, DATA_LENGTH
    } else {
        # Show the user that something is wrong with this owner's data
        Write-Host "Problem with Owner ""$($Owner.OWNER)"" Data?!" -ForegroundColor Red
        Break
    }
    # Output into the array in format
    $Output += @"
"$($Owner.OWNER).$($Out_TableName)":{
"$($Out_ID.COLUMN_NAME)": "$($Out_ID.DATA_TYPE) ($($Out_ID.DATA_LENGTH))",
"$($Out_NAME.COLUMN_NAME)": "$($Out_NAME.DATA_TYPE) ($($Out_NAME.DATA_LENGTH))"
}
"@
}
# Put Output in a text file
$Output | Set-Content .\Desktop\output.txt -Force
I should add that I had your data in a CSV like this...
OWNER,TABLE_NAME,COLUMN_NAME,DATA_TYPE,DATA_LENGTH
A,Employee,ID,NUMBER,22
A,Employee,NAME,VARCHAR2,22
B,Department,ID,NUMBER,23
B,Department,NAME,VARCHAR2,24
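If your data is still sitting in the workbook rather than in a CSV, a rough sketch along these lines could pull it into the same shape first (the file path and the assumption that row 1 holds the headers are mine, not from your post):
# Hypothetical sketch: read the used range of the first worksheet into objects
# with the same property names the snippet above expects (OWNER, TABLE_NAME, ...).
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$wb = $excel.Workbooks.Open('C:\Temp\data.xlsx')   # path is an assumption
$ws = $wb.Worksheets.Item(1)
$used = $ws.UsedRange
$headers = 1..$used.Columns.Count | ForEach-Object { $used.Cells.Item(1, $_).Text }
$Data = for ($r = 2; $r -le $used.Rows.Count; $r++) {
    $props = [ordered]@{}
    for ($c = 1; $c -le $used.Columns.Count; $c++) {
        $props[$headers[$c - 1]] = $used.Cells.Item($r, $c).Text
    }
    [pscustomobject]$props
}
$wb.Close($false)
$excel.Quit()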

PowerShell script to extract data from multiple text files into an Excel spreadsheet

I'm pretty new to PowerShell and have been struggling for a few days.
I have multiple text files in a folder with specific data that I would like to extract into an excel spreadsheet.
Each file looks like this:
Client n° : xxx Client name : xxx
Computer status
pc group 1 :
n°1 OK n°2 Disconnected n°3 Unresponsive
n°4 Unreachable host n°5 Unresponsive
Data read 11/11/20 12:50:07
Version: x.x.x
I would like to have an output file that looks like this:
Client name and n° OK Disconnected Unresponsive Unreachable host version
xxx/xxx 1 1 2 1 x.x.x
For the status columns I want to display the total number of PCs with that status, not the PC numbers.
At the moment I'm working with multiple .bat files that search for each status and output one file per status:
find /c "Disconnected" *.* > disconnected.txt
find /c "Unresponsive" *.* > unresponsive.txt
Then I sort every single output in Excel, which takes too much time, so I was wondering whether it is possible to automate this task with a script.
I really don't have any knowledge of PowerShell, only basic batch commands.
Let's assume your files are all in one folder and all of them have the .txt extension.
Then you need to loop through these files and parse the data you need from them:
# create a Hashtable to add the different status values in
$status = @{'OK' = 0; 'Disconnected' = 0; 'Unresponsive' = 0; 'Unreachable host' = 0}
# loop through the files in your path and parse the information out
$result = Get-ChildItem -Path 'D:\Test' -Filter '*.txt' -File | ForEach-Object {
    switch -Regex -File $_.FullName {
        '^Client n°\s*:\s*([^\s]+)\s+Client name\s*:\s*(.+)$' {
            # start collecting data for this client
            $client = '{0}/{1}' -f $matches[2], $matches[1]
            # reset the Hashtable to keep track of the status values
            $status = @{'OK' = 0; 'Disconnected' = 0; 'Unresponsive' = 0; 'Unreachable host' = 0}
        }
        '^n°\d+' {
            # increment the various statuses in the Hashtable
            ($_ -split 'n°\d+').Trim() | Where-Object { $_ } | ForEach-Object { $status[$_]++ }
        }
        '^Version:\s(.+)$' {
            $version = $matches[1]
            # since this is the last line for this client, output the collected data as object
            [PsCustomObject]@{
                'Client name and n°' = $client
                'OK'                 = $status['OK']
                'Disconnected'       = $status['Disconnected']
                'Unresponsive'       = $status['Unresponsive']
                'Unreachable host'   = $status['Unreachable host']
                'Version'            = $version
            }
        }
    }
}
# output on screen
$result | Format-Table -AutoSize
# output to CSV file
$result | Export-Csv -Path 'D:\Test\clientdata.csv' -UseCulture -NoTypeInformation
Result on screen:
Client name and n° OK Disconnected Unresponsive Unreachable host Version
------------------ -- ------------ ------------ ---------------- -------
xxx/xxx 1 1 2 1 x.x.x
I used this as an exercise to test my abilities. I created three of the same file, with different data, and tested this script. As long as they are text files in the directory, the script will iterate through each file and pull the data from each one as you stated it needs to be. If a stray text file gets added, the script does not know nor care and will treat it like the others; whatever data it can find, it will output to the Excel file. Lastly, the file is set to save itself and then immediately close.
It starts by creating the Excel file, then the workbook. (I commented out the naming of the worksheet; if you like you can add it back.) It then finds all text files in a directory and searches the text for the specific content you specified above.
Throughout the script I commented as much as I thought might be needed to assist with modification later on.
Output formatted like this:
Excel Output
#Create an Excel file
$excel = New-Object -ComObject excel.application
$excel.Visible = $True
#Add a workbook
$workbook = $excel.Workbooks.Add()
#Use the first worksheet for the output
$worksheet = $workbook.Worksheets.Item(1)
<#Rename Worksheet
$worksheet.Name = 'Client name and #'#>
#Create the column headers
$worksheet.Cells.Item(1,1) = 'Client name and n°'
$worksheet.Cells.Item(1,2) = 'OK'
$worksheet.Cells.Item(1,3) = 'Disconnected'
$worksheet.Cells.Item(1,4) = 'Unresponsive'
$worksheet.Cells.Item(1,5) = 'Unreachable'
$worksheet.Cells.Item(1,6) = 'Version'
$worksheet.Cells.Item(1,7) = 'Date Gathered'
$move = "C:\Users\iNet\Desktop\Testing"
$root = "C:\Users\iNet\Desktop\Testing"
$files = Get-ChildItem -Path $root -Filter *.txt
#Starting on row 2
[int]$i = 2
ForEach ($file in $files) {
    $location = $root + "\" + $file
    #Format your client data to output what you want to see.
    $ClientData = Select-String -Path "$location" -Pattern "Client"
    $ClientData = $ClientData.Line
    $ClientData = $ClientData -replace "Client n° :" -replace ""
    $ClientData = $ClientData -replace "Client name :" -replace "|"
    $row = $i
    $Column = 1
    $worksheet.Cells.Item($row,$column) = "$ClientData"
    #Data read date
    $DataReadDate = Select-String -Path "$location" -Pattern "Data read"
    $DataReadDate = $DataReadDate.Line
    $DataReadDate = $DataReadDate -replace "Data read " -replace ""
    #Data read date, you asked for everything but this.
    $row = $i
    $Column = 7
    $worksheet.Cells.Item($row,$column) = "$DataReadDate"
    #Version
    $Version = Select-String -Path "$location" -Pattern "Version:"
    $Version = $Version.Line
    $Version = $Version -replace "Version: " -replace ""
    $row = $i
    $Column = 6
    $worksheet.Cells.Item($row,$column) = "$Version"
    #How many times Unresponsive shows up
    $Unresponsive = (Get-Content "$location" | Select-String -Pattern "Unresponsive").Length
    $row = $i
    $Column = 4
    $worksheet.Cells.Item($row,$column) = "$Unresponsive"
    #How many times Disconnected shows up
    $Disconnected = (Get-Content "$location" | Select-String -Pattern "Disconnected").Length
    $row = $i
    $Column = 3
    $worksheet.Cells.Item($row,$column) = "$Disconnected"
    #How many times Unreachable host shows up
    $Unreachable = (Get-Content "$location" | Select-String -Pattern "Unreachable host").Length
    $row = $i
    $Column = 5
    $worksheet.Cells.Item($row,$column) = "$Unreachable"
    #How many times OK shows up
    $OK = (Get-Content "$location" | Select-String -Pattern "OK").Length
    $row = $i
    $Column = 2
    $worksheet.Cells.Item($row,$column) = "$OK"
    #Iterate by one so each text file goes to its own line.
    $i++
}
#Save document
$output = "\Output.xlsx"
$FinalOutput = $move + $output
#Saving & closing the file
$workbook.SaveAs($FinalOutput)
$excel.Quit()

Adding key and value to JSON using PowerShell and Excel

I have an Excel file which I'm reading from to populate an ARM parameter JSON file.
The code I use is
$ws = $wb.Worksheets.Item(1)
$data = Get-Content -Path "$path\$jsonfile" -Raw | ConvertFrom-Json
$Row = 2
$col = 2
$data.parameters.client.value = $ws.Cells.Item($Row, $col).Value()
$data.parameters.user.value = $ws.Cells.Item($Row, $col).Offset(1, 0).Value()
$data.parameters.business.value = $ws.Cells.Item($Row, $col).Offset(2, 0).Value()
$data.parameters.dev.value = $ws.Cells.Item($Row, $col).Offset(3, 0).Value()
$data | ConvertTo-Json -Depth 9 | % {
    [System.Text.RegularExpressions.Regex]::Unescape($_)
} | Set-Content -Path "$newpath\$JSONFile"
I need the business value to be a "key" : "value"
In my Excel sheet I have the fields as below:
Name Value Key
$client Client1
$user User1
$business Bus-key bus-value
I can't work out how to add the "key" : "value" to the Excel sheet and get PowerShell to read it and populate the parameter file.
The Client and user values are strings so work fine.
I was hoping for some direction on where I'm going wrong
Thanks in advance :)
From what I understood from your screenshot, I think this might be of help to you:
$MyJson = '
{
    "FortinetTags": {
        "value":"6EB3B02F-50E5-4A3E-8CB8-2E129258317D"
    }
}' | ConvertFrom-Json
# We will create a hash table containing the required key/value pair
$MyJson.FortinetTags.value = @{'provider' = $MyJson.FortinetTags.value}
$MyJson | ConvertTo-Json
It will update your JSON to this:
{
    "FortinetTags": {
        "value": {
            "provider": "6EB3B02F-50E5-4A3E-8CB8-2E129258317D"
        }
    }
}
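Applied to the worksheet layout in the question, something along these lines could build that hash table from the Key and Value columns (the column numbers and offsets are assumptions based on the Name/Value/Key table shown above):
# Sketch under assumptions: column 2 holds Value, column 3 holds Key, and the
# $business row sits two rows below $Row, matching the Offset(2, 0) calls above.
$busValue = $ws.Cells.Item($Row, 2).Offset(2, 0).Value()
$busKey   = $ws.Cells.Item($Row, 3).Offset(2, 0).Value()
$data.parameters.business.value = @{ $busKey = $busValue }
$data | ConvertTo-Json -Depth 9 | Set-Content -Path "$newpath\$JSONFile"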

List down column headers and get the maximum length of string per column

I'm looking for a translation of my Excel formula into a script in PowerShell, VBScript, or Excel VBA. I'm trying to get the list of column headers and the maximum string length under each one.
Normally, what I do is manually open the .txt file in Excel; from there I can get the header names. Next, I create an array formula, for example =MAX(LEN(A1:A100000)), which gets the maximum string length in the column, and I apply the same formula to the other columns.
Right now I can't do this since the files have grown to 1 GB in size and I can't open them anymore; my desktop crashes. It may also be because they have more than 1 million rows, which Excel can't handle. My friend suggested PowerShell, but I have limited knowledge there, and I don't know if it can be done in VBScript or Excel VBA.
Thanks in advance for your help.
The code below works for .csv files but not with delimited .txt files:
$fileName = "C:\Desktop\EFile.csv"
<#
Sample format of c:\temp\data.csv
"id","name","grade","address"
"1","John","Grade-9","test1"
"2","Ben","Grade-9","test12222"
"3","Cathy","Grade-9","test134343"
#>
$colCount = (Import-Csv $fileName | Get-Member | Where-Object {$_.MemberType -eq 'NoteProperty'} | Measure-Object).Count
$csv = Import-Csv $fileName
$csvHeaders = ($csv | Get-Member -MemberType NoteProperty).name
$dict = #{}
foreach($header in $csvHeaders) {
$dict.Add($header,0)
}
foreach($row in $csv)
{
foreach($header in $csvHeaders)
{
if($dict[$header] -le ($row.$header).Length)
{
$dict[$header] =($row.$header).Length
}
}
}
$dict.Keys | % { "key = $_ , Column Length = " + $dict.Item($_) }
This is how I get my data.
$data = #"
"id","name","grade","address"
"1","John","Grade-9","test1"
"2","Ben","Grade-9","test12222"
"3","Cathy","Grade-9","test134343"
"#
$csv = ConvertFrom-Csv -Delimiter ',' $data
But you should get your data like this
$fileName = "C:\Desktop\EFile.csv"
$csv = Import-Csv -Path $fileName
And then
# Extract the header names
$headers = $csv | Get-Member -MemberType NoteProperty | Select-Object -ExpandProperty Name
# Capture output in $result variable
$result = foreach ($header in $headers) {
    # Select all items in $header column, find the longest, and select the item for output
    $maximum = $csv | Select-Object -ExpandProperty $header | Measure-Object -Maximum | Select-Object -ExpandProperty Maximum
    # Generate new object holding the information.
    # This will end up in $result
    [pscustomobject]@{
        Header = $header
        Max    = $maximum.Length
        String = $maximum
    }
}
# Simple output
$result | Format-Table
This is what I get:
Header Max String
------ --- ------
address 10 test134343
grade 7 Grade-9
id 1 3
name 4 John
Alternatively, if you have memory issues dealing with large files, you may have to get a bit more dirty with the .NET framework. This snippet processes one csv line at a time, instead of reading the entire file into memory.
$fileName = "$env:TEMP\test.csv"
$delimiter = ','
# Open a StreamReader
$reader = [System.IO.File]::OpenText($fileName)
# Read the headers and turn it into an array, and trim away any quotes
$headers = $reader.ReadLine() -split $delimiter | % { $_.Trim('"''') }
# Prepare a hashtable for the results
$result = #{}
# So long as there's more data, keep running
while(-not $reader.EndOfStream) {
# Read a single line and process it as csv
$csv = $reader.ReadLine() | ConvertFrom-Csv -Header $headers -Delimiter $delimiter
# Determine if the item in the result hashtable is smaller than the current, using the header as a key
foreach($header in $headers) {
$item = $csv | Select-Object -ExpandProperty $header
if($result[$header].Maximum -lt $item.Length) {
$result[$header] = [pscustomobject]#{
Header = $header
Maximum = $item.Length
String = $item
}
}
}
}
# Clean up our spent resource
$reader.Close()
# Simple output
$result.Values | Format-Table
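One note on the original problem statement: both snippets above assume a comma delimiter, but delimited .txt files can be handled the same way, since Import-Csv and the StreamReader variant both take a configurable delimiter. A sketch, assuming a tab-delimited file (swap in whatever delimiter your files actually use):
# For delimited .txt files, point at the .txt file and pass the delimiter;
# everything else in the snippets above stays the same.
$fileName = "C:\Desktop\EFile.txt"
$csv = Import-Csv -Path $fileName -Delimiter "`t"
# And in the StreamReader variant, just change:
$delimiter = "`t"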

PowerShell script to monitor IIS logs for 500 errors every 10 minutes

I'm trying to set up a script to monitor IIS 7.5 logs for 500 errors. I can get it to do that OK, but I would like it to check every 30 minutes, and naturally I don't want it to warn me about 500 errors it has already reported.
As you can see from the script below, I have added a $time variable to take this into account; however, I can't seem to find a way to use this variable. Any help would be appreciated.
#Set Time Variable -30
$time = (Get-Date -Format hh:mm:ss (Get-Date).addminutes(-30))
# Location of IIS LogFile
$File = "C:\Users\here\Documents\IIS-log\" + "u_ex" + (get-date).ToString("yyMMdd") + ".log"
# Get-Content gets the file, pipe to Where-Object and skip the first 3 lines.
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Strip out the other rows that contain the header (happens on iisreset)
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
    $NewColumn = New-Object System.Data.DataColumn $Column, ([string])
    $IISLog.Columns.Add($NewColumn)
}
# Loop through each Row and add the Rows.
foreach ($Row in $Rows) {
    $Row = $Row.Split(" ")
    $AddRow = $IISLog.newrow()
    for ($i = 0; $i -lt $Count; $i++) {
        $ColumnName = $Columns[$i]
        $AddRow.$ColumnName = $Row[$i]
    }
    $IISLog.Rows.Add($AddRow)
}
$IISLog | select time,csuristem,scstatus
OK, with KevinD's help, PowerGUI, and a fair bit of trial and error, I got it working as I expected. Here's the finished product.
#Set Time Variable -30
$time = (Get-Date -Format "HH:mm:ss" (Get-Date).AddMinutes(-30))
# Location of IIS LogFile
$File = "C:\Users\here\Documents\IIS-log\" + "u_ex" + (get-date).ToString("yyMMdd") + ".log"
# Get-Content gets the file, pipe to Where-Object and skip the first 3 lines.
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Strip out the other rows that contain the header (happens on iisreset)
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
    $NewColumn = New-Object System.Data.DataColumn $Column, ([string])
    $IISLog.Columns.Add($NewColumn)
}
# Loop through each Row and add the Rows.
foreach ($Row in $Rows) {
    $Row = $Row.Split(" ")
    $AddRow = $IISLog.newrow()
    for ($i = 0; $i -lt $Count; $i++) {
        $ColumnName = $Columns[$i]
        $AddRow.$ColumnName = $Row[$i]
    }
    $IISLog.Rows.Add($AddRow)
}
$IISLog | select @{n="Time"; e={Get-Date -Format "HH:mm:ss" ("$($_.time)")}},csuristem,scstatus | ? { $_.time -ge $time }
Thanks again Kev, you're a good man. Hope this code helps someone else out there.
Try changing your last line to:
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},csuristem,scstatus | ? { $_.DateTime -ge $time }
In the select, we're concatenating the date and time fields, and converting them to a date object, then selecting rows where this field is greater than your $time variable.
You'll also need to change your $time variable:
$time = (Get-Date).AddMinutes(-30)
You want a DateTime object here, not a string.
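A minimal sketch of what that comparison does, with made-up field values just to illustrate:
# Hypothetical values standing in for one parsed IIS log row
$dateField = '2012-10-08'    # the log's 'date' field
$timeField = '13:42:10'      # the log's 'time' field
# Combine them into a single [datetime] and compare against the cutoff
$entryTime = Get-Date "$dateField $timeField"
$time = (Get-Date).AddMinutes(-30)
$entryTime -ge $time    # $true only if the entry is from the last 30 minutes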
