Delete extra rows in an Excel file with PowerShell?

I have been tasked with automating part of the logging process on an SPLA server owned by the company. My task is to date, archive, and remove the old files, then move on to generating a report to be emailed to the department. This task is supposed to be run at the end of every week.
I figured PowerShell would be the best option to complete this task. This is my first time working with PowerShell, so I had a bit of learning to do.
My question:
Is it possible to loop through an Excel worksheet and delete unused rows using this script?
My condition would be: if there are two empty rows -> delete one row and keep going.
I am taking info from the log and splitting it into a CSV, then converting the CSV to an Excel file for formatting.
(Sample screenshot of the Excel spreadsheet omitted; the others vary in how many excess rows sit between blocks of information.)
Get-ChildItem C:\ScriptsDirectory1\*.log | ForEach-Object {
    $logLines = Get-Content $_.FullName #Read the log file ($input is an automatic variable, so a different name is used)
    $a = Get-Date #Save the current date (for the if/else wrapper)
    #=============================================#
    # File Name Changer #
    #=============================================#
    $x = $_.LastWriteTime.ToShortDateString() #Save the LastWriteTime as a string
    $new_folder_name = Get-Date $x -Format yyyy.MM.dd #Build the new folder name from that date
    $des_path = "C:\Archive\ArchivedLogs\$new_folder_name" #Destination folder inside the archive directory
    #=============================================#
    $data = $logLines[1..($logLines.Length - 1)] #Everything except the first line of the log
    $maxLength = 0
    $objects = ForEach ($record in $data) { #Loop through each record in the array
        $split = $record -split ": " #Split each record at the ": " string
        If ($split.Length -gt $maxLength) {
            $maxLength = $split.Length
        }
        $properties = @{} #Hashtable of numbered properties for this record
        For ($i = 0; $i -lt $split.Length; $i++) { #Add the split values to the hashtable
            $properties.Add([String]($i + 1), $split[$i])
        }
        New-Object -TypeName PSObject -Property $properties
    }
    $objects | Format-Table
    $headers = [String[]](1..$maxLength)
    $objects |
        Select-Object $headers |
        Export-Csv -NoTypeInformation -Path "C:\Archive\CSVReports\$new_folder_name.csv" #Export the CSV using the new folder name to prevent overwrites
    if (Test-Path $des_path) { #If the archive folder exists, move the log into it
        Move-Item $_.FullName $des_path
    } else {
        New-Item -ItemType Directory -Path $des_path
        Move-Item $_.FullName $des_path
    }
} #End of Parser
#===============================================================================#
#======================================#========================================#
#===============================================================================#
# File Archiver and Zipper (After Parse/CSV) #
#===============================================================================#
#======================================#========================================#
#===============================================================================#
$files = Get-ChildItem C:\Archive\ArchivedLogs #Fill the $files variable with the new files in the Archive directory
#********************************#
#Loop Through and Compress/Delete#
#********************************#
foreach ($file in $files) {
    Write-Zip $file "C:\Archive\ArchivedLogs\$file.zip" -Level 9 #Write the compressed file (Write-Zip comes from the PSCX module)
} #End of Archiver
Remove-Item C:\Archive\ArchivedLogs\* -Exclude *.zip -Recurse #Remove the unneeded files within the archive folder
#Run the Formatting and Conversion script for the CSV-to-XLSX
#C:\ScriptsDirectory1\Script\TestRunner1.ps1 #<--- Can be run using an Invoke call
#===============================================================================#
#======================================#========================================#
#===============================================================================#
# CSV to XLSX Format/Conversion #
#===============================================================================#
#======================================#========================================#
#===============================================================================#
Get-ChildItem C:\Archive\CSVReports | ForEach-Object {
    $excel_file_path = $_.FullName #File path used to open the workbook for formatting
    $Excel = New-Object -ComObject Excel.Application #Start a new Excel application
    $Excel.Visible = $True
    $Excel.DisplayAlerts = $False
    $Excel_Workbook = $Excel.Workbooks.Open($excel_file_path) #Create the workbook variable and open the workbook at that path
    $FileName = $_.BaseName #Save the base file name of the current file
    $Excel.ActiveSheet.ListObjects.Add(1,$Excel_Workbook.ActiveSheet.UsedRange,0,1)
    $Excel_Workbook.ActiveSheet.UsedRange.EntireColumn.AutoFit()
    $SPLA1wksht = $Excel_Workbook.Worksheets.Item(1) #Reference the first worksheet (SPLA1wksht)
    #*******************************************************#
    #               Formatting for Title Cell               #
    #*******************************************************#
    $SPLA1wksht.Name = 'SPLA Info Report' #Change the worksheet name
    $SPLA1wksht.Cells.Item(1,1) = $FileName #Title (date of log) in cell A1
    $SPLA1wksht.Cells.Item(1,2) = 'SPLA Weekly Report' #Title for all Excel reports
    $SPLA1wksht.Cells.Item(1,2).Font.Size = 18 #Cells.Item takes (row, column)
    $SPLA1wksht.Cells.Item(1,2).Font.Bold = $True
    $SPLA1wksht.Cells.Item(1,2).Font.Name = "Cambria"
    $SPLA1wksht.Cells.Item(1,2).Font.ThemeFont = 1
    $SPLA1wksht.Cells.Item(1,2).Font.ThemeColor = 5
    $SPLA1wksht.Cells.Item(1,2).Font.Color = 8210719
    #*******************************************************#
    #************************************#
    #        Adjust and Merge Cell B1    #
    #************************************#
    $range = $SPLA1wksht.Range("b1","h2")
    $range.Style = 'Title'
    $range = $SPLA1wksht.Range("b1","g2")
    $range.VerticalAlignment = -4108 #Center align vertically (-4108 is xlCenter)
    #************************************#
    #***********************************************************************#
    #                 Horizontal Centering for all cells                    #
    #***********************************************************************#
    $SPLA1wksht.Range("a1","a500").HorizontalAlignment = -4108 #Center all cells in column A (-4108 is xlCenter)
    $SPLA1wksht.Range("b1","b500").HorizontalAlignment = -4108 #Center all cells in column B
    #**********************************************#
    # Deleting blank rows this way proved ineffective: logs that contain
    # different data end up with a different number of rows, which offsets
    # these deletions. The method deletes the first blank row, then moves on
    # to the next run of blank lines and deletes them one line at a time
    # until the blank spots are in perfect format.
    #**********************************************#
    #$SPLA1wksht.Cells.Item(2,1).EntireRow.Delete()
    #
    #$SPLA1wksht.Cells.Item(4,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(4,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(4,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(4,1).EntireRow.Delete()
    #
    #$SPLA1wksht.Cells.Item(19,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(19,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(19,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(19,1).EntireRow.Delete()
    #
    #$SPLA1wksht.Cells.Item(25,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(25,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(25,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(25,1).EntireRow.Delete()
    #
    #$SPLA1wksht.Cells.Item(33,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(33,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(33,1).EntireRow.Delete()
    #$SPLA1wksht.Cells.Item(33,1).EntireRow.Delete()
    #**********************************************#
    #*****************************************************************#
    #                Final Export as a CSV-to-XLSX file               #
    #*****************************************************************#
    $Excel_Workbook.SaveAs("C:\Archive\ExcelReports\$FileName.xlsx",51) #Save the file in the proper location (51 = .xlsx)
    $Excel_Workbook.Saved = $True
    $Excel.Quit()
    # Find a way to optimize this process
    # Potential optimization places:
    #   1.) Don't open and close an Excel instance for every file; instead reuse one, write the changes, and save
    #   2.) Change the way empty rows are handled instead of separate calls each time
} #End of Format/Converter
#******End******#
#---------------#--#--------------#
#---------------------------------#
# What to Add to the Script #
#---------------------------------#
#---------------#--#--------------#
# -[/] <-Complete -[] <- Incomplete
# -[] Archive or delete CSV Files
# -[] Add an If/Else statement that checks if files are >7 days old (see the sketch after this list)
# -[] Compile a weekender report that indicates any SPLA programs changed to keep compliance
# -[] Filter for only SPLA files (Need a list)
# -[] Loop through CSV/Excel file and delete empty rows
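One of the to-do items above, checking whether files are more than 7 days old, only needs a LastWriteTime comparison. This is a minimal sketch, assuming the same C:\Archive\ArchivedLogs folder used earlier; what you then do with the old files (zip, move, or remove) is a placeholder:
# Sketch: select files older than 7 days (the action on each file is a placeholder)
$cutoff = (Get-Date).AddDays(-7)
$oldFiles = Get-ChildItem C:\Archive\ArchivedLogs -File | Where-Object { $_.LastWriteTime -lt $cutoff }
foreach ($old in $oldFiles) {
    Write-Host "Older than 7 days: $($old.FullName)" # e.g. replace with Move-Item or Remove-Item
}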
The following code worked to run through the spreadsheet and remove the empty rows:
for ($i = 350; $i -ge 0; $i--) {
    If ($SPLA1wksht.Cells.Item($i, 1).Text -eq "") {
        $Range = $SPLA1wksht.Cells.Item($i, 1).EntireRow
        [void]$Range.Delete()
        echo $i
    }
    If ($SPLA1wksht.Cells.Item($i, 2).Text -eq "") {
        $Range = $SPLA1wksht.Cells.Item($i, 2).EntireRow
        [void]$Range.Delete()
        echo $i
    }
    If ($i -eq 2) { break }
}
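A variant of the same idea that avoids the hard-coded 350-row limit is to take the row count from the used range and walk it from the bottom up; deleting from the bottom matters because each deletion shifts the rows below it up by one. This is only a sketch against the same $SPLA1wksht COM worksheet used above, treating a row as blank when its first two cells are both empty:
$lastRow = $SPLA1wksht.UsedRange.Rows.Count # last used row (the data starts in row 1 here)
for ($row = $lastRow; $row -ge 2; $row--) {
    if ($SPLA1wksht.Cells.Item($row, 1).Text -eq "" -and $SPLA1wksht.Cells.Item($row, 2).Text -eq "") {
        [void]$SPLA1wksht.Cells.Item($row, 1).EntireRow.Delete()
    }
}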

This should be relatively straightforward:
$file = "C:\path\to\file.csv"
$csv = Import-Csv $file
foreach ($row in $csv) {
    # logic to decide whether to drop this row
    # $csv is an array, so you could set a row to $null (or filter it out) to "delete" it
}
# spit out the updated sheet
$csv | Export-Csv $file -NoTypeInformation
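As a concrete version of that idea, blank rows can be filtered out with Where-Object before re-exporting. A minimal sketch, using the placeholder path from the answer above and treating a row as blank when every column is empty:
$file = "C:\path\to\file.csv" # placeholder path
$rows = Import-Csv $file
$kept = $rows | Where-Object {
    # keep a row if at least one of its properties has a non-empty value
    @($_.PSObject.Properties.Value | Where-Object { "$_".Trim() -ne "" }).Count -gt 0
}
$kept | Export-Csv $file -NoTypeInformation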

Related

Powershell script to extract data from multiple text files into an excel spreadsheet

I'm pretty new to PS and have been struggling for a few days.
I have multiple text files in a folder with specific data that I would like to extract into an Excel spreadsheet.
Each file looks like this:
Client n° : xxx Client name : xxx
Computer status
pc group 1 :
n°1 OK n°2 Disconnected n°3 Unresponsive
n°4 Unreachable host n°5 Unresponsive
Data read 11/11/20 12:50:07
Version: x.x.x
I would like to have an output file that looks like this:
Client name and n° OK Disconnected Unresponsive Unreachable host version
xxx/xxx 1 1 2 1 x.x.x
For the status columns, it's the number of PCs with each status that I would like to display, not the PC n°.
At the moment I'm working with multiple .bat files that search for a status and output one file per status:
find /c "Disconnected" *.* > disconnected.txt
find /c "Unresponsive" *.* > unresponsive.txt
Then I sort every single output in Excel, which takes me too much time. I was wondering if it was possible to automate this task with a script.
I really don't have any knowledge of PS, only basic batch commands.
Let's assume your files are all in one folder and all of them have the .txt extension.
Then you need to loop through these files and parse the data you need from it:
# create a Hashtable to hold the different status values
$status = @{'OK' = 0; 'Disconnected' = 0; 'Unresponsive' = 0; 'Unreachable host' = 0}
# loop through the files in your path and parse the information out
$result = Get-ChildItem -Path 'D:\Test' -Filter '*.txt' -File | ForEach-Object {
    switch -Regex -File $_.FullName {
        '^Client n°\s*:\s*([^\s]+)\s+Client name\s*:\s*(.+)$' {
            # start collecting data for this client
            $client = '{0}/{1}' -f $matches[2], $matches[1]
            # reset the Hashtable that keeps track of the status values
            $status = @{'OK' = 0; 'Disconnected' = 0; 'Unresponsive' = 0; 'Unreachable host' = 0}
        }
        '^n°\d+' {
            # increment the various statuses in the Hashtable
            ($_ -split 'n°\d+').Trim() | Where-Object { $_ } | ForEach-Object { $status[$_]++ }
        }
        '^Version:\s(.+)$' {
            $version = $matches[1]
            # since this is the last line for this client, output the collected data as an object
            [PsCustomObject]@{
                'Client name and n°' = $client
                'OK'                 = $status['OK']
                'Disconnected'       = $status['Disconnected']
                'Unresponsive'       = $status['Unresponsive']
                'Unreachable host'   = $status['Unreachable host']
                'Version'            = $version
            }
        }
    }
}
# output on screen
$result | Format-Table -AutoSize
# output to CSV file
$result | Export-Csv -Path 'D:\Test\clientdata.csv' -UseCulture -NoTypeInformation
Result on screen:
Client name and n° OK Disconnected Unresponsive Unreachable host Version
------------------ -- ------------ ------------ ---------------- -------
xxx/xxx 1 1 2 1 x.x.x
I used this as an exercise to test my abilities. I created three files of the same format, with different data, and tested this script. As long as they are text files in the directory, the script will iterate through each one and pull the data from it as you stated it needs to be. If a stray text file gets added, the script does not know nor care and will treat it like the others. If there is data it can find, it will, and it will output that data to the Excel file. Lastly, the file is set to save itself and then immediately close.
It starts by creating the Excel file, then the workbook. (I commented out the naming of the workbook; if you like, you can add it back.) It then finds all text files in a directory and searches the text for the specific content you specified above.
Throughout the script I commented as much as I thought might be needed to assist with modification later on.
Output is formatted like the Excel screenshot attached to the answer (omitted here).
#Create An Excel File
$excel = New-Object -ComObject excel.application
$excel.visible = $True
#Add Workbook
$workbook = $excel.Workbooks.Add()
<#Rename Workbook
$workbook= $workbook.Worksheets.Item(1)
$workbook.Name = 'Client name and #'#>
#create the column headers
$workbook.Cells.Item(1,1) = 'Client name and n°'
$workbook.Cells.Item(1,2) = 'OK'
$workbook.Cells.Item(1,3) = 'Disconnected'
$workbook.Cells.Item(1,4) = 'Unresponsive'
$workbook.Cells.Item(1,5) = 'Unreachable'
$workbook.Cells.Item(1,6) = 'Version'
$workbook.Cells.Item(1,7) = 'Date Gathered'
$move = "C:\Users\iNet\Desktop\Testing"
$root = "C:\Users\iNet\Desktop\Testing"
$files = Get-ChildItem -Path $root -Filter *.txt
#Starting on Row 2
[int]$i = 2
ForEach ($file in $files){
$location = $root+"\"+$file
#Format your client data to output what you want to see.
$ClientData = select-string -path "$location" -pattern "Client"
$ClientData = $ClientData.line
$ClientData = $ClientData -replace "Client n° :" -replace ""
$ClientData = $ClientData -replace "Client name :" -replace "|"
$row = $i
$Column = 1
$workbook.Cells.Item($row,$column)= "$ClientData"
#Data Read Date
$DataReadDate = select-string -path "$location" -pattern "Data read"
$DataReadDate = $DataReadDate.line
$DataReadDate = $DataReadDate -replace "Data read " -replace ""
#Data Read Date, you asked for everything but this.
$row = $i
$Column = 7
$workbook.Cells.Item($row,$column)= "$DataReadDate"
#Version
$Version = select-string -path "$location" -pattern "Version:"
$Version = $Version.line
$Version = $Version -replace "Version: " -replace ""
$row = $i
$Column = 6
$workbook.Cells.Item($row,$column)= "$Version"
#How Many Times Unresponsive Shows Up
$Unresponsive = (Get-Content "$location" | select-string -pattern "Unresponsive").length
$row = $i
$Column = 4
$workbook.Cells.Item($row,$column)= "$Unresponsive"
#How Many Times Disconnected Shows Up
$Disconnected = (Get-Content "$location" | select-string -pattern "Disconnected").length
$row = $i
$Column = 3
$workbook.Cells.Item($row,$column)= "$Disconnected"
#How Many Times Unreachable host Shows Up
$Unreachable = (Get-Content "$location" | select-string -pattern "Unreachable host").length
$row = $i
$Column = 5
$workbook.Cells.Item($row,$column)= "$Unreachable"
#How Many Times OK Shows Up
$OK = (Get-Content "$location" | select-string -pattern "OK").length
$row = $i
$Column = 2
$workbook.Cells.Item($row,$column)= "$OK"
#Iterate by one so each text file goes to its own line.
$i++
}
#Save Document
$output = "\Output.xlsx"
$FinalOutput = $move+$output
#saving & closing the file
$workbook.SaveAs($FinalOutput)
$excel.Quit()

Importing a column from 1 Excel sheet to Another and Comparing the Values

I'm working on a script that basically takes 2 Excel files and will compare the values from a certain column in both files to each other, and will import any differences found to a CSV file named for that day the task was executed.
function MatrexDiff(){
# Task is sch to happen daily to find errors
$action = New-ScheduledTaskAction -Execute "MatrexDiff.ps1" -Argument "C:\zach\MatrexDiff.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 5am
Register-ScheduledTask -Action $action -Trigger $trigger -TaskName "Matrex Differences" -Description "Finds difference between RPM and Matrex Inventories"
# Makes a txt file than converts txt file to a csv file to hold any differences
$csv = New-Item -Path "C:\Backup" -Name "Difference" -ItemType file -Force
$date = (Get-Date).ToString("dd-MM-yyyy")
$new = "C:\Data\" + $date + "_Difference.csv"
$csv | Export-Csv $new
# imports the lists that will be compared to each other to find and print differences on _Difference .csv file
$zachPath = "C:\zach\zach.xlsx"
$tehyaPath = "C:\zach\Tehya.xlsx"
$xl = New-Object -c excel.application
$xl.displayAlerts = $false
$wb2 = $xl.Workbooks.Open($zachPath, $null, $true)
# copys targeted data in the column
$wb1 = $xl.Workbooks.Open($tehyaPath)
$needColumn = $wb1.Sheets.Item('Sheet1')
$ColumnToCopy = $needColumn.Range("C1").EnitreColumn
$ColumnToCopy.Copy
# paste the copied column into the other excel file
$sheetToUse = $wb2.Sheets.Item('Sheet1')
$ColumnToFill = $sheetToUse.Range("C1").EnitreColumn
$ColumnToFill.Paste($ColumnToCopy)
# (poor attempt) exporting the file with the pasted data to Back folder
$wb2.Close($false) | Export-Csv $wb2
$wb1.Close($true)
$xl.Quit()
}

Checking file names in a directory with entries in an excel spreadsheet; What am I doing wrong?

I'm attempting to write a PowerShell script (my first ever, so be gentle) to go through all the file names in a directory and check if they exist in an excel spreadsheet that I have. If a file name does exist in both, I want to move/copy that file to a new directory.
Right now it runs with no errors, but nothing actually happens.
So far I have:
#open excel sheet
$objexcel=new-object -com excel.application
$workbook=$objexcel.workbooks.open("<spreadsheet location>")
#use Sheet2
$worksheet = $workbook.sheets.Item(2)
#outer loop: loop through each file in directory
foreach ($_file in (get-childitem -path "<directory to search>"))
{
$filename = [system.IO.path]::GetFileNameWithoutExtension($_)
#inner loop: check with every entry in excel sheet (if is equal)
$intRowCount = ($worksheet.UsedRange.Rows).count
for ($intRow = 2 ; $intRow -le $intRowCount ; $intRow++)
{
$excelname = $worksheet.cells.item($intRow,1).value2
if ($excelname -eq $filename)
{ #move to separate folder
Copy-Item -path $_file -Destination "<directory for files to be copied to>"
}
#else do nothing
}
}
#close excel sheet
$workbook.close()
$objexcel.quit()
You're trying to define $filename based on the current object ($_), but that variable isn't populated in a foreach loop:
$filename = [system.IO.path]::GetFileNameWithoutExtension($_)
Because of that $filename is always $null and therefore never equal to $excelname.
Replace the foreach loop with a ForEach-Object loop if you want to use $_. I'd also recommend reading the Excel cell values into an array outside that loop. That improves performance and allows you to use the array in a -contains filter, which removes the need for having an inner loop in the first place.
$intRowCount = ($worksheet.UsedRange.Rows).count
$excelnames = for ($intRow = 2; $intRow -le $intRowCount; $intRow++) {
$worksheet.cells.item($intRow,1).value2
}
Get-ChildItem -Path "<directory to search>" |
Where-Object { $excelnames -contains $_.BaseName } |
Copy-Item -Destination "<directory for files to be copied to>"
On a more general note: you shouldn't use variable names starting with an underscore. They're too easily confused with properties of the current object variable ($_name vs. $_.name).

How to export a CSV to Excel using Powershell

I'm trying to export a complete CSV to Excel by using PowerShell. I'm stuck at a point where static column names are used, and that doesn't work if my CSV has generic, unknown header names.
Steps to reproduce
Open your PowerShell ISE and copy & paste the following standalone code. Run it with F5
"C:\Windows\system32\WindowsPowerShell\v1.0\powershell_ise.exe"
Get-Process | Export-Csv -Path $env:temp\process.csv -NoTypeInformation
$processes = Import-Csv -Path $env:temp\process.csv
$Excel = New-Object -ComObject excel.application
$workbook = $Excel.workbooks.add()
$i = 1
foreach($process in $processes)
{
$excel.cells.item($i,1) = $process.name
$excel.cells.item($i,2) = $process.vm
$i++
}
Remove-Item $env:temp\process.csv
$Excel.visible = $true
What it does
The script will export a list of all active processes as a CSV to your temp folder. This file is only for our example. It could be any CSV with any data
It reads in the newly created CSV and saves it under the $processes variable
It creates a new and empty Excel workbook where we can write data
It iterates through all rows (?) and writes all values from the name and vm column to Excel
My questions
What if I don't know the column headers? (In our example name and vm). How do I address values where I don't know their header names?
How do I count how many columns a CSV has? (after reading it with Import-Csv)
I just want to write an entire CSV to Excel with Powershell
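For reference on those two questions: after Import-Csv, each row is an object whose property names are the CSV headers, so both the header names and the column count can be read from the data itself. A small sketch, reusing the $processes variable from the example above:
$headers = $processes[0].PSObject.Properties.Name # column headers, without hard-coding them
$columnCount = $headers.Count                     # how many columns the CSV has
$headers
"Columns: $columnCount"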
Oops, I entirely forgot this question. In the meantime I came up with a solution.
This PowerShell script converts a CSV to XLSX in the background.
Gimmicks are:
Preserves all CSV values as plain text, like =B1+B2 or 0000001.
You don't see #Name or anything like that. No autoformatting is done.
Automatically chooses the right delimiter (comma or semicolon) according to your regional settings
Autofit columns
PowerShell Code
### Set input and output path
$inputCSV = "C:\somefolder\input.csv"
$outputXLSX = "C:\somefolder\output.xlsx"
### Create a new Excel Workbook with one empty sheet
$excel = New-Object -ComObject excel.application
$workbook = $excel.Workbooks.Add(1)
$worksheet = $workbook.worksheets.Item(1)
### Build the QueryTables.Add command
### QueryTables does the same as when clicking "Data » From Text" in Excel
$TxtConnector = ("TEXT;" + $inputCSV)
$Connector = $worksheet.QueryTables.add($TxtConnector,$worksheet.Range("A1"))
$query = $worksheet.QueryTables.item($Connector.name)
### Set the delimiter (, or ;) according to your regional settings
$query.TextFileOtherDelimiter = $Excel.Application.International(5)
### Set the format to delimited and text for every column
### A trick to create an array of 2s is used with the preceding comma
$query.TextFileParseType = 1
$query.TextFileColumnDataTypes = ,2 * $worksheet.Cells.Columns.Count
$query.AdjustColumnWidth = 1
### Execute & delete the import query
$query.Refresh()
$query.Delete()
### Save & close the Workbook as XLSX. Change the output extension for Excel 2003
$Workbook.SaveAs($outputXLSX,51)
$excel.Quit()
I am using excelcnv.exe to convert csv into xlsx and that seemed to work properly.
You will have to change the directory to where your excelcnv is. If 32 bit, it goes to Program Files (x86)
Start-Process -FilePath 'C:\Program Files\Microsoft Office\root\Office16\excelcnv.exe' -ArgumentList "-nme -oice ""$xlsFilePath"" ""$xlsToxlsxPath"""
This topic really helped me, so I'd like to share my improvements.
All credit goes to nixda; this is based on his answer.
For those who need to convert multiple CSVs in a folder, just modify the directory. Output file names will be identical to the input, just with another extension.
Take care with the cleanup at the end: if you'd like to keep the original CSVs, you might not want to remove them.
It can easily be modified to save the xlsx in another directory, as shown in the sketch after the script.
$workingdir = "C:\data\*.csv"
$csv = dir -path $workingdir
foreach($inputCSV in $csv){
$outputXLSX = $inputCSV.DirectoryName + "\" + $inputCSV.Basename + ".xlsx"
### Create a new Excel Workbook with one empty sheet
$excel = New-Object -ComObject excel.application
$excel.DisplayAlerts = $False
$workbook = $excel.Workbooks.Add(1)
$worksheet = $workbook.worksheets.Item(1)
### Build the QueryTables.Add command
### QueryTables does the same as when clicking "Data » From Text" in Excel
$TxtConnector = ("TEXT;" + $inputCSV)
$Connector = $worksheet.QueryTables.add($TxtConnector,$worksheet.Range("A1"))
$query = $worksheet.QueryTables.item($Connector.name)
### Set the delimiter (, or ;) according to your regional settings
### $Excel.Application.International(3) = ,
### $Excel.Application.International(5) = ;
$query.TextFileOtherDelimiter = $Excel.Application.International(5)
### Set the format to delimited and text for every column
### A trick to create an array of 2s is used with the preceding comma
$query.TextFileParseType = 1
$query.TextFileColumnDataTypes = ,2 * $worksheet.Cells.Columns.Count
$query.AdjustColumnWidth = 1
### Execute & delete the import query
$query.Refresh()
$query.Delete()
### Save & close the Workbook as XLSX. Change the output extension for Excel 2003
$Workbook.SaveAs($outputXLSX,51)
$excel.Quit()
}
## To exclude an item, use the '-exclude' parameter (wildcards if needed)
remove-item -path $workingdir -exclude *Crab4dq.csv
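As an example of that last point, saving the converted files to a separate folder only needs the $outputXLSX line changed; the folder below is just a placeholder:
$xlsxFolder = "D:\ConvertedReports" # placeholder target folder
if (-not (Test-Path $xlsxFolder)) { New-Item -ItemType Directory -Path $xlsxFolder | Out-Null }
$outputXLSX = Join-Path $xlsxFolder ($inputCSV.BaseName + ".xlsx")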
Why would you bother? Load your CSV into Excel like this:
$csv = Join-Path $env:TEMP "process.csv"
$xls = Join-Path $env:TEMP "process.xlsx"
$xl = New-Object -COM "Excel.Application"
$xl.Visible = $true
$wb = $xl.Workbooks.OpenText($csv)
$wb.SaveAs($xls, 51)
You just need to make sure that the CSV export uses the delimiter defined in your regional settings. Override with -Delimiter if need be.
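For example, if your regional list separator is a semicolon, the export step from the earlier snippet could pass it explicitly (same temp path as above; adjust the delimiter to your locale):
Get-Process | Export-Csv -Path (Join-Path $env:TEMP "process.csv") -NoTypeInformation -Delimiter ';'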
Edit: A more general solution that should preserve the values from the CSV as plain text. Code for iterating over the CSV columns taken from here.
$csv = Join-Path $env:TEMP "input.csv"
$xls = Join-Path $env:TEMP "output.xlsx"
$xl = New-Object -COM "Excel.Application"
$xl.Visible = $true
$wb = $xl.Workbooks.Add()
$ws = $wb.Sheets.Item(1)
$ws.Cells.NumberFormat = "#"
$i = 1
Import-Csv $csv | ForEach-Object {
$j = 1
foreach ($prop in $_.PSObject.Properties) {
if ($i -eq 1) {
$ws.Cells.Item($i, $j++).Value = $prop.Name
} else {
$ws.Cells.Item($i, $j++).Value = $prop.Value
}
}
$i++
}
$wb.SaveAs($xls, 51)
$wb.Close()
$xl.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($xl)
Obviously this second approach won't perform too well, because it's processing each cell individually.
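If that becomes a problem, the usual workaround is to build the sheet contents in memory and hand them to Excel in a single Range assignment instead of cell by cell. This is only a sketch of that idea, reusing the $csv path and the $ws worksheet from the snippet above; it writes everything as strings, matching the NumberFormat = "#" setting:
# Sketch: write the header and all data rows in one Range.Value2 assignment
$data = @(Import-Csv $csv)
$headers = $data[0].PSObject.Properties.Name
$rowCount = $data.Count + 1 # +1 for the header row
$colCount = $headers.Count
$buffer = New-Object 'object[,]' $rowCount, $colCount
for ($c = 0; $c -lt $colCount; $c++) { $buffer[0, $c] = $headers[$c] }
for ($r = 0; $r -lt $data.Count; $r++) {
    for ($c = 0; $c -lt $colCount; $c++) {
        $buffer[($r + 1), $c] = [string]$data[$r].($headers[$c])
    }
}
$range = $ws.Range($ws.Cells.Item(1, 1), $ws.Cells.Item($rowCount, $colCount))
$range.Value2 = $buffer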
If you want to convert CSV to Excel without Excel being installed, you can use the great .NET library EPPlus (under LGPL license) to create and modify Excel Sheets and also convert CSV to Excel really fast!
Preparation
Download the latest stable EPPlus version
Extract EPPlus to your preferred location (e.g. to $HOME\Documents\WindowsPowerShell\Modules\EPPlus)
Right Click EPPlus.dll, select Properties and at the bottom of the General Tab click "Unblock" to allow loading of this dll. If you don't have the rights to do this, try [Reflection.Assembly]::UnsafeLoadFrom($DLLPath) | Out-Null
Detailed Powershell Commands to import CSV to Excel
# Create temporary CSV and Excel file names
$FileNameCSV = "$HOME\Downloads\test.csv"
$FileNameExcel = "$HOME\Downloads\test.xlsx"
# Create CSV File (with first line containing type information and empty last line)
Get-Process | Export-Csv -Delimiter ';' -Encoding UTF8 -Path $FileNameCSV
# Load EPPlus
$DLLPath = "$HOME\Documents\WindowsPowerShell\Modules\EPPlus\EPPlus.dll"
[Reflection.Assembly]::LoadFile($DLLPath) | Out-Null
# Set CSV Format
$Format = New-object -TypeName OfficeOpenXml.ExcelTextFormat
$Format.Delimiter = ";"
# use Text Qualifier if your CSV entries are quoted, e.g. "Cell1","Cell2"
$Format.TextQualifier = '"'
$Format.Encoding = [System.Text.Encoding]::UTF8
$Format.SkipLinesBeginning = '1'
$Format.SkipLinesEnd = '1'
# Set Preferred Table Style
$TableStyle = [OfficeOpenXml.Table.TableStyles]::Medium1
# Create Excel File
$ExcelPackage = New-Object OfficeOpenXml.ExcelPackage
$Worksheet = $ExcelPackage.Workbook.Worksheets.Add("FromCSV")
# Load CSV File with first row as heads using a table style
$null=$Worksheet.Cells.LoadFromText((Get-Item $FileNameCSV),$Format,$TableStyle,$true)
# Load CSV File without table style
#$null=$Worksheet.Cells.LoadFromText($file,$format)
# Fit Column Size to Size of Content
$Worksheet.Cells[$Worksheet.Dimension.Address].AutoFitColumns()
# Save Excel File
$ExcelPackage.SaveAs($FileNameExcel)
Write-Host "CSV File $FileNameCSV converted to Excel file $FileNameExcel"
This is a slight variation that worked better for me.
$csv = Join-Path $env:TEMP "input.csv"
$xls = Join-Path $env:TEMP "output.xlsx"
$xl = new-object -comobject excel.application
$xl.visible = $false
$Workbook = $xl.workbooks.open($CSV)
$Worksheets = $Workbook.worksheets
$Workbook.SaveAs($XLS,1)
$Workbook.Saved = $True
$xl.Quit()
I had some problems getting the other examples to work.
EPPlus and other libraries produce the Office Open XML format, which is not quite the same as what you get when you save from Excel as xlsx.
mack's example of opening the CSV and just re-saving didn't work; I never managed to get the ',' delimiter to be used correctly.
Ansgar Wiechers' example has a slight error, which I found the answer for in the comments.
Anyway, this is a complete working example. Save it in a file CsvToExcel.ps1:
param (
[Parameter(Mandatory=$true)][string]$inputfile,
[Parameter(Mandatory=$true)][string]$outputfile
)
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$wb = $excel.Workbooks.Add()
$ws = $wb.Sheets.Item(1)
$ws.Cells.NumberFormat = "#"
write-output "Opening $inputfile"
$i = 1
Import-Csv $inputfile | Foreach-Object {
$j = 1
foreach ($prop in $_.PSObject.Properties)
{
if ($i -eq 1) {
$ws.Cells.Item($i, $j) = $prop.Name
} else {
$ws.Cells.Item($i, $j) = $prop.Value
}
$j++
}
$i++
}
$wb.SaveAs($outputfile,51)
$wb.Close()
$excel.Quit()
write-output "Success"
Execute with:
.\CsvToExcel.ps1 -inputfile "C:\Temp\X\data.csv" -outputfile "C:\Temp\X\data.xlsx"
I found this while passing through, looking for answers on how to compile a set of CSVs into a single Excel doc with the worksheets (tabs) named after the CSV files. It is a nice function. Sadly, I cannot run it on my network :( so I do not know how well it works.
Function Release-Ref ($ref)
{
([System.Runtime.InteropServices.Marshal]::ReleaseComObject(
[System.__ComObject]$ref) -gt 0)
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
}
Function ConvertCSV-ToExcel
{
<#
.SYNOPSIS
Converts one or more CSV files into an excel file.
.DESCRIPTION
Converts one or more CSV files into an excel file. Each CSV file is imported into its own worksheet with the name of the
file being the name of the worksheet.
.PARAMETER inputfile
Name of the CSV file being converted
.PARAMETER output
Name of the converted excel file
.EXAMPLE
Get-ChildItem *.csv | ConvertCSV-ToExcel -output 'report.xlsx'
.EXAMPLE
ConvertCSV-ToExcel -inputfile 'file.csv' -output 'report.xlsx'
.EXAMPLE
ConvertCSV-ToExcel -inputfile @("test1.csv","test2.csv") -output 'report.xlsx'
.NOTES
Author: Boe Prox
Date Created: 01SEPT210
Last Modified:
#>
#Requires -version 2.0
[CmdletBinding(
SupportsShouldProcess = $True,
ConfirmImpact = 'low',
DefaultParameterSetName = 'file'
)]
Param (
[Parameter(
ValueFromPipeline=$True,
Position=0,
Mandatory=$True,
HelpMessage="Name of CSV/s to import")]
[ValidateNotNullOrEmpty()]
[array]$inputfile,
[Parameter(
ValueFromPipeline=$False,
Position=1,
Mandatory=$True,
HelpMessage="Name of excel file output")]
[ValidateNotNullOrEmpty()]
[string]$output
)
Begin {
#Configure regular expression to match full path of each file
[regex]$regex = "^\w\:\\"
#Find the number of CSVs being imported
$count = ($inputfile.count -1)
#Create Excel Com Object
$excel = new-object -com excel.application
#Disable alerts
$excel.DisplayAlerts = $False
#Show Excel application
$excel.Visible = $False
#Add workbook
$workbook = $excel.workbooks.Add()
#Remove other worksheets
$workbook.worksheets.Item(2).delete()
#After the first worksheet is removed,the next one takes its place
$workbook.worksheets.Item(2).delete()
#Define initial worksheet number
$i = 1
}
Process {
ForEach ($input in $inputfile) {
#If more than one file, create another worksheet for each file
If ($i -gt 1) {
$workbook.worksheets.Add() | Out-Null
}
#Use the first worksheet in the workbook (also the newest created worksheet is always 1)
$worksheet = $workbook.worksheets.Item(1)
#Add name of CSV as worksheet name
$worksheet.name = "$((GCI $input).basename)"
#Open the CSV file in Excel, must be converted into complete path if no already done
If ($regex.ismatch($input)) {
$tempcsv = $excel.Workbooks.Open($input)
}
ElseIf ($regex.ismatch("$($input.fullname)")) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}
Else {
$tempcsv = $excel.Workbooks.Open("$($pwd)\$input")
}
$tempsheet = $tempcsv.Worksheets.Item(1)
#Copy contents of the CSV file
$tempSheet.UsedRange.Copy() | Out-Null
#Paste contents of CSV into existing workbook
$worksheet.Paste()
#Close temp workbook
$tempcsv.close()
#Select all used cells
$range = $worksheet.UsedRange
#Autofit the columns
$range.EntireColumn.Autofit() | out-null
$i++
}
}
End {
#Save spreadsheet
$workbook.saveas("$pwd\$output")
Write-Host -Fore Green "File saved to $pwd\$output"
#Close Excel
$excel.quit()
#Release processes for Excel
$a = Release-Ref($range)
}
}

Is there a faster way to parse an excel document with Powershell?

I'm interfacing with an MS Excel document via PowerShell. Each Excel document may have around 1000 rows of data.
Currently this script seems to read the Excel file and write a value to screen at a rate of 1 record every 0.6 seconds. At first glance that seems extremely slow.
This is my first time reading an Excel file with PowerShell; is this the norm? Is there a faster way for me to read and parse the Excel data?
Here is the script output (trimmed for readability)
PS P:\Powershell\ExcelInterfaceTest> .\WRIRMPTruckInterface.ps1 test.xlsx
3/20/2013 4:46:01 PM
---------------------------
2 078110
3 078108
4 078107
5 078109
<SNIP>
242 078338
243 078344
244 078347
245 078350
3/20/2013 4:48:33 PM
---------------------------
PS P:\Powershell\ExcelInterfaceTest>
Here is the Powershell script:
########################################################################################################
# This is a common function I am using which will release excel objects
########################################################################################################
function Release-Ref ($ref) {
([System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$ref) -gt 0)
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
}
########################################################################################################
# Variables
########################################################################################################
########################################################################################################
# Creating excel object
########################################################################################################
$objExcel = new-object -comobject excel.application
# Set to false to not open the app on screen.
$objExcel.Visible = $False
########################################################################################################
# Directory location where we have our excel files
########################################################################################################
$ExcelFilesLocation = "C:/ShippingInterface/" + $args[0]
########################################################################################################
# Open our excel file
########################################################################################################
$UserWorkBook = $objExcel.Workbooks.Open($ExcelFilesLocation)
########################################################################################################
# Here Item(1) refers to sheet 1 of the workbook. If we want to access sheet 10, we have to modify the code to Item(10)
########################################################################################################
$UserWorksheet = $UserWorkBook.Worksheets.Item(2)
########################################################################################################
# This is a counter that will help iterate through the loop. It is simply a row counter.
# I am starting the row count at 2 because the first row in my case is a header, so we don't need to read the header data.
########################################################################################################
$intRow = 2
$a = Get-Date
write-host $a
write-host "---------------------------"
Do {
# Reading the first column of the current row
$TicketNumber = $UserWorksheet.Cells.Item($intRow, 1).Value()
write-host $intRow " " $TicketNumber
$intRow++
} While ($UserWorksheet.Cells.Item($intRow,1).Value() -ne $null)
$a = Get-Date
write-host $a
write-host "---------------------------"
########################################################################################################
# Exiting the excel object
########################################################################################################
$objExcel.Quit()
########################################################################################################
#Release all the objects used above
########################################################################################################
$a = Release-Ref($UserWorksheet)
$a = Release-Ref($UserWorkBook)
$a = Release-Ref($objExcel)
In his blog entry Speed Up Reading Excel Files in PowerShell, Robert M. Toups, Jr. explains that while loading Excel from PowerShell is fast, actually reading the Excel cells is very slow. On the other hand, PowerShell can read a text file very quickly, so his solution is to load the spreadsheet in PowerShell, use Excel's native CSV export to save it as a CSV file, then use PowerShell's standard Import-Csv cmdlet to process the data blazingly fast. He reports that this has given him up to a 20 times faster import process!
Leveraging Toups’ code, I created an Import-Excel function that lets you import spreadsheet data very easily.
My code adds the capability to select a specific worksheet within an Excel workbook, rather than just using the default worksheet (i.e. the active sheet at the time you saved the file). If you omit the –SheetName parameter, it uses the default worksheet.
function Import-Excel([string]$FilePath, [string]$SheetName = "")
{
$csvFile = Join-Path $env:temp ("{0}.csv" -f (Get-Item -path $FilePath).BaseName)
if (Test-Path -path $csvFile) { Remove-Item -path $csvFile }
# convert Excel file to CSV file
$xlCSVType = 6 # SEE: http://msdn.microsoft.com/en-us/library/bb241279.aspx
$excelObject = New-Object -ComObject Excel.Application
$excelObject.Visible = $false
$workbookObject = $excelObject.Workbooks.Open($FilePath)
SetActiveSheet $workbookObject $SheetName | Out-Null
$workbookObject.SaveAs($csvFile,$xlCSVType)
$workbookObject.Saved = $true
$workbookObject.Close()
# cleanup
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workbookObject) |
Out-Null
$excelObject.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excelObject) |
Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
# now import and return the data
Import-Csv -path $csvFile
}
These supplemental functions are used by Import-Excel:
function FindSheet([Object]$workbook, [string]$name)
{
$sheetNumber = 0
for ($i=1; $i -le $workbook.Sheets.Count; $i++) {
if ($name -eq $workbook.Sheets.Item($i).Name) { $sheetNumber = $i; break }
}
return $sheetNumber
}
function SetActiveSheet([Object]$workbook, [string]$name)
{
if (!$name) { return }
$sheetNumber = FindSheet $workbook $name
if ($sheetNumber -gt 0) { $workbook.Worksheets.Item($sheetNumber).Activate() }
return ($sheetNumber -gt 0)
}
If the data is static (no formulas involved, just data in cells), you can access the spreadsheet as an ODBC data source and execute SQL (or at least SQL-like) queries against it. Have a look at this reference for setting up your connectionstring (each worksheet in a workbook will be a "table" for this exercise), and use System.Data to query it the same as you would a regular database (Don Jones wrote a wrapper function for this which may help).
This should be faster than launching Excel & picking through cell by cell.
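A minimal sketch of that approach, using the ACE OLE DB provider through System.Data (a close cousin of the ODBC route the answer mentions). The file path and sheet name below are placeholders, and the ACE provider must be installed and match your process bitness:
# Sketch: query a worksheet like a database table via OLE DB (path and sheet name are placeholders)
$path = 'C:\ShippingInterface\test.xlsx'
$connectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=$path;Extended Properties='Excel 12.0 Xml;HDR=YES'"
$connection = New-Object System.Data.OleDb.OleDbConnection $connectionString
$command = New-Object System.Data.OleDb.OleDbCommand -ArgumentList 'SELECT * FROM [Sheet2$]', $connection
$adapter = New-Object System.Data.OleDb.OleDbDataAdapter $command
$table = New-Object System.Data.DataTable
$connection.Open()
[void]$adapter.Fill($table)
$connection.Close()
$table | Select-Object -First 5 | Format-Table -AutoSize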
