Automate creating configurations from template and Excel

I'm having trouble automating a configuration.
I have a configuration template and need to change the hostname (marked as YYY) and the IP address (marked as XXX; only the third octet needs replacement) according to a list of values from Excel.
Now I have a list of 100 different sites and IPs, and I want to end up with 100 different configurations.
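For illustration, a fragment of such a template might look like this (a hypothetical example, not the actual template):
hostname YYY
!
interface Vlan1
 ip address 10.1.XXX.1 255.255.255.0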
A friend suggested the following PowerShell code, but it doesn't create any files:
$replaceValues = Import-Csv -Path "\\ExcelFile.csv"
$file = "\\Template.txt"
$contents = Get-Content -Path $file
foreach ($replaceValue in $replaceValues)
{
    $contents = $contents -replace "YYY", $replaceValue.hostname
    $contents = $contents -replace "XXX", $replaceValue.site
    Copy-Item $file "$($file.$replaceValue.hostname)"
    Set-Content -Path "$($file.$replaceValue.hostname)" -Value $contents
    echo "$($file.$replaceValue.hostname)"
}

Your code overwrites the same $contents string in the loop, so once the values are replaced in the first iteration, there are no YYY or XXX placeholders left to replace.
You need to keep the template text intact and create a fresh copy of it inside the loop; that copy can then be altered the way you want. Every subsequent iteration will then start off with a fresh copy of the template.
There is also no need to first copy the template file to a new location and then overwrite it with the new contents; Set-Content is happy to create a new file for you if it does not already exist.
Try:
$replaceValues = Import-Csv -Path 'D:\Test\Values.csv'
$template = Get-Content -Path 'D:\Test\Template.txt'
foreach ($item in $replaceValues) {
    $content = $template -replace 'YYY', $item.hostname -replace 'XXX', $item.site
    $newFile = Join-Path -Path 'D:\Test' -ChildPath ('{0}.txt' -f $item.hostname)
    Write-Host "Creating file '$newFile'"
    $content | Set-Content -Path $newFile
}
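For reference, the code assumes the CSV file has hostname and site column headers, where site holds the value that replaces XXX (the third octet). A minimal example file (the values are made up) could look like this:
hostname,site
SITE001,10
SITE002,11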


Referring to Files as variables in multiple ForEach loops [PowerShell]

I am making some progress on my script that automatically updates links in Excel files without opening them. I have successfully made the function work on a single file, with inputs of the file name and the text to replace. Now I am trying to scale this so that it performs the same actions on all files in the script directory.
Here is how the script goes, with comments on the steps:
# This part will be responsible for fetching the week number to replace from the script directory name; currently I am testing with a pre-determined number
# $wk = Get-Item -Path $PWD | Select-Object -Property BaseName
# $wknn = "$wk".Substring(13,2) -as [int]
$wknn = 41
$wkold = $wknn-1
$wkprev = $wknn-2
$DefaultFiles = Get-ChildItem | Where-Object {$_.Name -like "*.xls*"}
ForEach($File in $DefaultFiles)
{
    # Build ZIP file name
    $zipFile = $_ -ireplace [regex]::Escape(".xlsb"), ".zip"
    # Create temporary folder
    $parent = [System.IO.Path]::GetTempPath();
    [string] $guid = [System.Guid]::NewGuid();
    $tempFolder = Join-Path $parent $guid;
    New-Item -ItemType Directory -Path $tempFolder;
    # Rename file to ZIP
    Rename-Item -Path $_ -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    C:\7z\7za.exe x "$zipFile" -o"$tempFolder"
    # Replace old text with new text. First replace wk-1 with wk, then wk-2 with wk-1
    $fileNames = Get-ChildItem -Path $tempFolder -Recurse -Include *.rels
    foreach ($file in $fileNames)
    {
        (Get-Content -ErrorAction SilentlyContinue $file.FullName) |
            ForEach-Object { $_ -replace $wkold, $wknn } |
            ForEach-Object { $_ -replace $wkprev, $wkold } |
            Set-Content $file.FullName
    }
    # Changing working folder because 7Zip option -w doesn't work
    Set-Location -Path $tempFolder
    # Update archive with new files. Not using Compress-Archive because it changes the ZIP format
    C:\7z\7za.exe u -r "$zipFile" *.*
    # Rename file back to XLSB
    Rename-Item -Path $zipFile -NewName $_
    # Move the final .xlsb file back to the script root
    Move-Item -Path $_ -Destination $PSScriptRoot
    # Set location to script root to start over
    Set-Location -Path $PSScriptRoot
}
I am running into problems with the ForEach loop. I am unsure how to refer to the file name within the first loop at the "Build ZIP file name" step, and how to refer to the output file when I want to move it to the script root afterwards. I also suspect that nesting ForEach loops may not be as simple and may require extra steps in the code, but as I am just starting out I don't have the experience and could not find a simple answer to my problem.
I would really appreciate some assistance with the syntax in my code :)
I would create a temporary folder outside the main loop and set the working directory to that folder, then remove the folder and reset the working directory when all is done.
Also, there is no need to rename the finished ZIP file first and then move it back to its original location, because you can do both at the same time with the Move-Item cmdlet.
Try:
$wknn = 41
$wkold = $wknn - 1
$wkprev = $wknn - 2
$7zipExe = 'C:\7z\7za.exe' # path to 7za.exe
$sourcePath = $PSScriptRoot # path to where the Excel files are
# Create temporary folder
$tempFolder = Join-Path -Path ([System.IO.Path]::GetTempPath()) -ChildPath ((New-Guid).Guid)
$null = New-Item -ItemType Directory -Path $tempFolder -Force
# retrieve a collection of Excel files (FullNames only).
# If you ONLY want .xlsb files, set the Filter to '*.xlsb'
$excelFiles = (Get-ChildItem -Path $sourcePath -Filter '*.xls*' -File).FullName
# Changing working folder because 7Zip option -w doesn't work
Set-Location -Path $tempFolder
foreach ($File in $excelFiles) {
    # Build ZIP file name
    $zipFile = [System.IO.Path]::ChangeExtension($File, '.zip')
    # Rename file to ZIP
    Rename-Item -Path $File -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    & $7zipExe x "$zipFile" -o"$tempFolder" | Out-Null
    # Replace old text with new text. First replace wk-1 with wk, then wk-2 with wk-1
    Get-ChildItem -Path $tempFolder -Recurse -Filter '*.rels' -File | ForEach-Object {
        (Get-Content -Path $_.FullName -Raw) -replace $wkold, $wknn -replace $wkprev, $wkold |
            Set-Content $_.FullName
    }
    # Update archive with new files. Not using Compress-Archive because it changes the ZIP format
    & $7zipExe u -r "$zipFile" *.* | Out-Null
    # Rename and move the updated file back to the original location (overwrite)
    Move-Item -Path $zipFile -Destination $File -Force
    # Remove all files from the temporary folder to start fresh
    Remove-Item -Path "$tempFolder\*" -Recurse -Force
}
# Set location back to script root
Set-Location -Path $PSScriptRoot
# remove the temporary folder
Remove-Item -Path $tempfolder -Recurse -Force
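One caveat that applies to both versions: the left-hand side of -replace is a regex pattern, so a week number like 40 would also match inside a longer number such as 140. If that can occur in your .rels files, anchoring the pattern with word boundaries is a possible safeguard (a sketch, assuming the week number appears as a standalone number):
# Replace only standalone occurrences of the week numbers
(Get-Content -Path $_.FullName -Raw) -replace "\b$wkold\b", $wknn -replace "\b$wkprev\b", $wkold |
    Set-Content $_.FullName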

Editing Connection String in a Boatload of XLSB Documents

I have a couple hundred .xlsb files that need their connection string and command text changed in an easily programmable way. They are all buried in different folders deep in the file system. How can I use PowerShell or some other program to go through and edit them all, so I don't have to do it manually?
I've started looking into PowerShell and Format-Hex. I figured I could ask, and someone else may be able to set me on the right track. What needs to be done is to recursively search the filesystem from a certain point, detect whether "this string" and the number "11111" are in the connection string and command text (respectively) of all xlsb files, and if they are, replace them with "that string" and the number "22222", all in xlsb files. I've also looked into using Python, but the libraries I found did not mention editing this setting, so I figured some sort of hex detection and replacement would be easier.
Would it be possible to have more info on what a "connection string" is? To my knowledge this is not part of the properties of an xlsb file.
I suppose it to be the string used to create an ODBC connection, so the text you want to modify will be within the code of a macro.
So three issues:
Recursively find all xlsb files within a folder
$Fllt = gci "*.xlsb" -r
Open them in Excel
$Excl = New-Object -ComObject Excel.Application
$Fllt | %{$Excl.Workbooks.Open($_.Fullname)}
Replace "this string" with "that string" and "11111" with "22222" in every macro. This is much more difficult.
My suggestion:
#Generation of a test file
$Excl = New-Object -ComObject Excel.Application
$xlve = $Excl.Version
New-ItemProperty -Path "HKCU:\Software\Microsoft\Office\$xlve\Excel\Security" `
    -Name AccessVBOM -Value 1 -Force | Out-Null
New-ItemProperty -Path "HKCU:\Software\Microsoft\Office\$xlve\Excel\Security" `
    -Name VBAWarnings -Value 1 -Force | Out-Null
@'
Sub Co()
ConnectionString = "this string"
CommandText = "11111"
End Sub
'@ | Out-File C:\Temp\Test.txt -Encoding ascii
$Wkbk = $Excl.Workbooks.Add()
$Wkbk.VBProject.VBComponents.Import("C:\Temp\Test.txt") | Out-Null
$Wkbk.SaveAs("C:\Temp\Test.xlsb", 50)
$Excl.Quit()
#Get The files
$Fllt = gci -Path C:\Temp\ -Include *.xlsb -r
#Open Excel and set the security parameters to be able to modify macros
$Excl = New-Object -ComObject Excel.Application
$xlve = $Excl.Version
New-ItemProperty -Path "HKCU:\Software\Microsoft\Office\$xlve\Excel\Security" `
    -Name AccessVBOM -Value 1 -Force | Out-Null
New-ItemProperty -Path "HKCU:\Software\Microsoft\Office\$xlve\Excel\Security" `
    -Name VBAWarnings -Value 1 -Force | Out-Null
#Loop through the files and modify the macros
$path = "C:\Temp\ModuleVBATemp.txt" #Temp text file to copy and modify the macros
foreach ($File in $Fllt) {
    $Wkbk = $Excl.Workbooks.Open($File.Fullname)
    if ($Wkbk.HasVBProject) <# Test if there are any macros #> {
        foreach ($Vbco in $Wkbk.VBProject.VBComponents) {
            if ($Vbco.Type -eq '1') <# Only modify the modules #> {
                # Modification of the script
                $Vbco.Export($path) | Out-Null
                (gc $path) -replace "this string","that string" -replace "11111","22222" |
                    Out-File $path -Encoding ascii
                $Wkbk.VBProject.VBComponents.Remove($Vbco)
                $Wkbk.VBProject.VBComponents.Import($path) | Out-Null
            }
        }
    }
    $Wkbk.Close($true) # Save the file
}
$Excl.Quit()
It is working on my test file; I hope that your configuration is similar.
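One extra note: Excel COM automation can leave an Excel.exe process running even after Quit() if COM references are still held. A common cleanup pattern to add after $Excl.Quit() (a sketch, not part of the original answer) is:
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($Excl)
[GC]::Collect()
[GC]::WaitForPendingFinalizers()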

add column to merged csv file

OK, here's what I have code-wise:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
(get-content $a) | set-content $b
This pulls the data of all the files into one merged file, but I need one additional item: I need to pull the name of each individual file and append it to the first column of the merged file, for several hundred files at a time.
Not tested but something like this should do it:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
Get-ChildItem $a | % {
    # -PassThru is required here; without it, Add-Member produces no output to export
    Import-Csv $_.FullName | Add-Member -MemberType NoteProperty -Name 'File Name' -Value $_.Name -PassThru
} | Export-Csv $b -NoTypeInformation
Assuming the CSV files each have the same column headings, I would lean toward using Import-CSV instead of Get-Content so that you can work with the CSV contents as arrays of objects with named properties.
Then all you need to do is iterate through each item of the array and append a property containing the file path, which you can do using the Add-Member cmdlet. Once that's done, export the array of objects using the Export-CSV cmdlet.
$directory = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\"
$search = $directory + "*.csv"
$exportpath = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
$paths = get-childitem $search
$objectArrays = @()
$paths | % {
    $filepath = $_.FullName
    $objectArray = Import-Csv $filepath
    $objectArray | % {
        Add-Member -InputObject $_ -Name "SourceFile" -Value $filepath -MemberType NoteProperty
    }
    $objectArrays += $objectArray
}
$objectArrays | Export-Csv -Path $exportpath -NoTypeInformation
This puts the SourceFile property as the last column of the output CSV file.
OK, a simplification: search the target folder, pipe to a ForEach-Object loop (shorthand % used), capture the file name in a variable, import the CSV, add the source file using the Select-Object cmdlet, convert it back to CSV, end the loop, and pipe to the destination file.
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
GCI $a | %{$FileName=$_.Name; Import-Csv $_ | Select @{l='SourceFile';e={$FileName}},* | ConvertTo-Csv -NoType} | Set-Content $b
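As a quick illustration (the file name and columns are made up): if one input file sites.csv has the columns Name and Value, the merged output gains SourceFile as the first column, roughly like this:
SourceFile,Name,Value
sites.csv,Alpha,1
sites.csv,Beta,2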

PowerShell - Optimizing a very, very large CSV and text file search and replace

I have a directory with ~3000 text files in it, and I'm doing periodic search-and-replaces on those text files as I transition a program to a new server.
Each text file may have an average of ~3000 lines, and I need to search the files for maybe 300-1000 terms at a time.
I'm replacing the server prefix, which is related to the string I'm searching for. So for every one of the CSV entries, I'm looking for Search_String and \\Old_Server\Search_String, and making sure that after the program completes, the result is \\New_Server\Search_String.
I cobbled together a PowerShell program, and it works. But it's so slow I've never seen it complete.
Any suggestions for making it faster?
EDIT 1:
I changed Get-Content as suggested, but it still took 3 minutes to search two files (~8000 lines) for 9 separate search terms. I must still be doing something wrong; a Notepad++ search and replace would still be way faster if done manually 9 times.
I'm not sure how to get rid of the first Get-Content, because I want to make a copy of the file for backup before I make any changes to it.
EDIT 2:
So this is an order of magnitude faster; it's searching a file in maybe 10 seconds. But now it doesn't write changes to files, and it only searches the first file in the directory! I didn't change that code, so I don't know why it broke.
EDIT 3:
Success! I adapted a solution posted below to make it much, much faster. It's searching each file in a couple of seconds now. I may reverse the loop order, so that it loads the file into the array and then searches and replaces each entry in the CSV rather than the other way around. I'll post that if I get it to work.
Final script is below for reference.
# Get input from the user
$old = Read-Host 'Enter the old cimplicity qualifier (F24, IRF3 etc.)'
$new = Read-Host 'Enter the new cimplicity qualifier (CB3, F24_2 etc.)'
$DirName = Get-Date -Format "yyyy_MM_dd_hh_mm"
New-Item -ItemType Directory -Path $DirName -Force
New-Item "$DirName\log.txt" -ItemType File -Force -Value "`nMatched CTX files on $DirName`n"
$logfile = "$DirName\log.txt"
$VerbosePreference = "SilentlyContinue"
$points = Import-Csv SearchAndReplace.csv -Header find # Import the CSV file
#$ctxfiles = Get-ChildItem . -Include *.ctx | select -expand FullName # Import local directory of CTX files
$points | ForEach-Object { # For each row of points in the CSV file
    $findvar = $_.find # Store column 1 as the string to search for
    $OldQualifiedPoint = "\\\\" + $old + "\\" + $findvar # Escape each individual backslash so it's not read as regex
    $NewQualifiedPoint = "\\" + $new + "\" + $findvar # Escaped slashes are NOT required in the replacement string
    $DuplicateNew = "\\\\" + $new + "\\" + "\\\\" + $new + "\\"
    $QualifiedNew = "\\" + $new + "\"
    dir . *.ctx | # Grab all CTX files
        select -expand FullName | # grab all of those file names and...
        foreach { # iterate through each file
            $DateTime = Get-Date -Format "hh:mm:ss"
            $FileName = $_
            Write-Host "$DateTime - $FindVar - Checking $FileName"
            $FileCopied = 0
            # Check file contents, and copy matching files to the newly created directory
            If (Select-String -Path $_ -Pattern $findvar -Quiet) {
                If (!($FileCopied)) {
                    Copy $FileName -Destination $DirName
                    $FileCopied = 1
                    Add-Content $logfile "`n$DateTime - Found $Findvar in $FileName"
                    Write-Host "$DateTime - Found $Findvar in $FileName"
                }
                $FileContent = Get-Content $FileName -ReadCount 0
                $FileContent = $FileContent -replace $OldQualifiedPoint,$NewQualifiedPoint -replace $findvar,$NewQualifiedPoint -replace $DuplicateNew,$QualifiedNew
                $FileContent | Set-Content $FileName
            }
        }
}
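To make the three-step replace chain concrete, here is a hypothetical trace (the qualifier and point names are made up, not from the original post):
# With $old = 'F24', $new = 'CB3' and $findvar = 'TANK1':
#   '\\F24\TANK1'  -replace $OldQualifiedPoint  ->  '\\CB3\TANK1'   (old prefix swapped)
#   a bare 'TANK1' -replace $findvar            ->  '\\CB3\TANK1'   (unqualified point gains the new prefix)
#   the second replace also hits points already qualified in step 1,
#   yielding '\\CB3\\\CB3\TANK1'; the final -replace $DuplicateNew,$QualifiedNew
#   collapses the doubled prefix back to '\\CB3\TANK1'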
If I'm reading this correctly, you should be able to read a 3000-line file into memory and do those replaces as an array operation, eliminating the need to iterate through each line. You can also chain those replace operations into a single command.
dir . *.ctx | # Grab all CTX files
    select -expand FullName | # grab all of those file names and...
    foreach { # iterate through each file
        $DateTime = Get-Date -Format "hh:mm:ss"
        $FileName = $_
        Write-Host "$DateTime - $FindVar - Checking $FileName"
        # Check file contents, and copy matching files to the newly created directory
        If (Select-String -Path $_ -Pattern $findvar -Quiet) {
            Copy $FileName -Destination $DirName
            Add-Content $logfile "`n$DateTime - Found $Findvar in $FileName"
            Write-Host "$DateTime - Found $Findvar in $FileName"
            $FileContent = Get-Content $FileName -ReadCount 0
            $FileContent = $FileContent -replace $OldQualifiedPoint,$NewQualifiedPoint -replace $findvar,$NewQualifiedPoint -replace $DuplicateNew,$QualifiedNew
            $FileContent | Set-Content $FileName
        }
    }
On another note, Select-String will take the file path as an argument, so you don't have to do a Get-Content and then pipe that to Select-String.
Yes, you can make it much faster by not using Get-Content... use a StreamReader instead:
$file = New-Object System.IO.StreamReader -Arg "test.txt"
while (($line = $file.ReadLine()) -ne $null) {
    # $line has your line
}
$file.Dispose()
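If you also need to write the modified lines back out, a StreamReader is typically paired with a StreamWriter; here is a minimal sketch (the file names and the replacement pattern are placeholders, not from the original answer):
$reader = New-Object System.IO.StreamReader -Arg "test.txt"
$writer = New-Object System.IO.StreamWriter -Arg "test_fixed.txt"
while (($line = $reader.ReadLine()) -ne $null) {
    # Replace as each line streams through, without loading the whole file into memory
    $writer.WriteLine(($line -replace 'aaa', 'bbb'))
}
$reader.Dispose()
$writer.Dispose()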
I wanted to use PowerShell for this and created a script like the one below:
$filepath = "input.csv"
$newfilepath = "input_fixed.csv"
filter num2x { $_ -replace "aaa","bbb" }
Measure-Command {
    Get-Content -ReadCount 1000 $filepath | num2x | Add-Content $newfilepath
}
It took 19 minutes on my laptop to process a 6.5 GB file. The code above reads the file in batches (using -ReadCount) and uses a filter, which should optimize performance.
But then I tried FART and it did the same thing in 3 minutes! Quite a difference!

Renaming many folders in PowerShell

I have over 1000 files that have to be renamed.
The folders and/or files are grouped by location, so the first four characters are the same for each file; there are four or five different locations. I need to delete the first few characters of each folder's name.
Example:
Old file: ABC_Doe, Jane
New file: Doe, Jane
Any suggestions as to the quickest way to carry this out?
I've tried all of the following:
1st Attempt
$a = Get-ChildItem C:\example
$b = Where-Object {$_.name -like "*ABC_*"}
$cmdlet_name = "Rename-Item"
$d = (cmdlet_name $a $b)
invoke-expression $d
2nd Attempt
$e = Get-ChildItem C:\example
$f = $e.TrimStart("ABC_")
3rd Attempt
Rename-Item -{$_.name -like "*ASD*", ""}
Try this: get all child items (files only), remove ABC_ by replacing it (with nothing), and rename each file. To rename files in sub-directories, add the -Recurse switch to the Get-ChildItem command:
Get-ChildItem c:\example -Filter ABC_* | Where-Object {!$_.PSIsContainer} | Rename-Item -NewName { ($_.BaseName -replace '^ABC_') + $_.Extension }
UPDATE
Actually, this should work as well and is much shorter (there is no need to append the file extension, because renaming is performed on the full file name):
Get-ChildItem c:\example -Filter ABC_* | Where-Object {!$_.PSIsContainer} | Rename-Item -NewName { $_.Name -replace '^ABC_' }
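Since this renames files in bulk, it can be worth previewing the result first; Rename-Item supports the standard -WhatIf switch, which reports what would be renamed without touching anything:
Get-ChildItem c:\example -Filter ABC_* | Where-Object {!$_.PSIsContainer} | Rename-Item -NewName { $_.Name -replace '^ABC_' } -WhatIf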
get-childItem ABC_* | rename-item -newname { $_.name -replace 'ABC_','' }
Source: get-help rename-item -full
