How to apply the PowerShell source in Excel column B to the folder named by the value in Excel column A

There are folders A, B, C, D and E in the image folder (C:\Image).
The value in Excel column A refers to a folder name in the image folder (the destination).
The value in Excel column B is PowerShell source that renames the files inside that folder.
When the value in column A matches a folder name, the PowerShell source in column B should be applied to that folder. In other words, I want to apply the source in column B to every file in folders A through E in one pass. How can I implement this in PowerShell? Can you help, please?

As commented, the code you use in the Excel file is the same for every subfolder mentioned. That is why I don't think you need the Excel file at all; it would only add overhead.
Why not do it like this instead?
Rename all files inside the (first level of) subfolders of the root folder:
$rootFolder = 'C:\Image'
Get-ChildItem -Path $rootFolder -Filter '*.jpg' -File -Recurse -Depth 1 |
    Group-Object -Property @{Expression = {$_.Directory.Name}} |
    ForEach-Object {
        $dir = $_.Name
        $nr  = 1
        foreach ($file in $_.Group) {
            $file | Rename-Item -NewName ('{0}_{1}.jpg' -f $dir, $nr++)
        }
    }
Or, if there are more folders inside C:\Image and you want to limit the renaming to certain subfolders:
$rootFolder = 'C:\Image'
$subfolders = 'A','B','C','D','E'  # array of folders to process
foreach ($dir in $subfolders) {
    $path = Join-Path -Path $rootFolder -ChildPath $dir
    Get-ChildItem -Path $path -Filter '*.jpg' -File |
        Group-Object -Property @{Expression = {$_.Directory.Name}} |
        ForEach-Object {
            $dir = $_.Name
            $nr  = 1
            foreach ($file in $_.Group) {
                $file | Rename-Item -NewName ('{0}_{1}.jpg' -f $dir, $nr++)
            }
        }
}
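In either variant, adding -WhatIf to the Rename-Item call lets you preview the new names without actually renaming anything.
For completeness: if the PowerShell source in column B really did differ per folder, you could export the sheet to CSV and drive the work from that mapping instead. This is only a minimal sketch; the export path and the headers FolderName and Script are assumptions, and running script text taken from a spreadsheet should only ever be done with trusted input:
$rootFolder = 'C:\Image'
# hypothetical CSV export of the worksheet, with columns 'FolderName' (column A) and 'Script' (column B)
$mapping = Import-Csv -Path 'C:\Image\mapping.csv'
foreach ($row in $mapping) {
    $folder = Join-Path -Path $rootFolder -ChildPath $row.FolderName
    if (Test-Path -Path $folder -PathType Container) {
        # turn the text from column B into a script block and run it inside the matching folder
        $script = [scriptblock]::Create($row.Script)
        Push-Location -Path $folder
        & $script
        Pop-Location
    }
}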

Related

Referring to Files as variables in multiple ForEach loops [PowerShell]

I am making some progress on my script that automatically updates links in Excel files without opening them. I have successfully made the function work on a single file with inputs of file name and text to replace. Now I am trying to scale this so that it does the same actions for all files in the script directory.
Here is how the script goes with comments on steps:
# This part will be responsible for fetching the week number to replace from the script directory name; currently I am testing with a pre-determined number
# $wk = Get-Item -Path $PWD | Select-Object -Property BaseName
# $wknn = "$wk".Substring(13,2) -as [int]
$wknn = 41
$wkold = $wknn - 1
$wkprev = $wknn - 2
$DefaultFiles = Get-ChildItem | Where-Object {$_.Name -like "*.xls*"}
ForEach ($File in $DefaultFiles)
{
    # Build ZIP file name
    $zipFile = $_ -ireplace [regex]::Escape(".xlsb"), ".zip"
    # Create temporary folder
    $parent = [System.IO.Path]::GetTempPath();
    [string] $guid = [System.Guid]::NewGuid();
    $tempFolder = Join-Path $parent $guid;
    New-Item -ItemType Directory -Path $tempFolder;
    # Rename file to ZIP
    Rename-Item -Path $_ -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    C:\7z\7za.exe x "$zipFile" -o"$tempFolder"
    # Replace old text with new text. First replace wk-1 with wk, then wk-2 with wk-1
    $fileNames = Get-ChildItem -Path $tempFolder -Recurse -Include *.rels
    foreach ($file in $fileNames)
    {
        (Get-Content -ErrorAction SilentlyContinue $file.FullName) |
            ForEach-Object { $_ -replace $wkold, $wknn } |
            ForEach-Object { $_ -replace $wkprev, $wkold } |
            Set-Content $file.FullName
    }
    # Changing working folder because 7-Zip option -w doesn't work
    Set-Location -Path $tempFolder
    # Update archive with new files. Not using Compress-Archive because it changes the ZIP format
    C:\7z\7za.exe u -r "$zipFile" *.*
    # Rename file back to XLSB
    Rename-Item -Path $zipFile -NewName $_
    # Move the final .xlsb file back to the script root
    Move-Item -Path $_ -Destination $PSScriptRoot
    # Set location to script root to start over
    Set-Location -Path $PSScriptRoot
}
I am running into problems with the foreach loop. I am unsure how to refer to the file name within the first loop at the 'Build ZIP file name' step, and how to refer to the output file when I want to move it back to the script root afterwards. I also suspect that nesting foreach loops may not be that simple and may require extra steps in the code, but since I am just starting out I don't have the experience and could not find a simple answer to my problem.
I would really appreciate some assistance with the syntax in my code :)
I would create a temporary folder outside the main loop and set the working directory to that folder. Then remove the folder and reset the working directory when all is done.
Also, there is no need to rename the finished zip file first and then move it back to its original location, because you can do that with the Move-Item cmdlet at the same time.
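One more note on the loop itself: inside foreach ($File in $DefaultFiles) { ... } the current item is $File, not $_. The automatic variable $_ only has a value inside pipeline script blocks such as ForEach-Object or Where-Object, which is why the 'Build ZIP file name' step comes up empty. A minimal illustration of the difference:
$files = Get-ChildItem -Filter '*.xlsb' -File
# foreach statement: refer to the named loop variable
foreach ($file in $files) {
    $zip = [System.IO.Path]::ChangeExtension($file.FullName, '.zip')
}
# ForEach-Object in a pipeline: refer to $_
$files | ForEach-Object {
    $zip = [System.IO.Path]::ChangeExtension($_.FullName, '.zip')
}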
Try:
$wknn = 41
$wkold = $wknn - 1
$wkprev = $wknn - 2
$7zipExe = 'C:\7z\7za.exe'    # path to 7za.exe
$sourcePath = $PSScriptRoot   # path to where the Excel files are
# Create temporary folder
$tempFolder = Join-Path -Path ([System.IO.Path]::GetTempPath()) -ChildPath ((New-Guid).Guid)
$null = New-Item -ItemType Directory -Path $tempFolder -Force
# Retrieve a collection of Excel files (FullNames only).
# If you ONLY want .xlsb files, set the Filter to '*.xlsb'
$excelFiles = (Get-ChildItem -Path $sourcePath -Filter '*.xls*' -File).FullName
# Changing working folder because 7-Zip option -w doesn't work
Set-Location -Path $tempFolder
foreach ($File in $excelFiles) {
    # Build ZIP file name
    $zipFile = [System.IO.Path]::ChangeExtension($File, '.zip')
    # Rename file to ZIP
    Rename-Item -Path $File -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    & $7zipExe x "$zipFile" -o"$tempFolder" | Out-Null
    # Replace old text with new text. First replace wk-1 with wk, then wk-2 with wk-1
    Get-ChildItem -Path $tempFolder -Recurse -Filter '*.rels' -File | ForEach-Object {
        (Get-Content -Path $_.FullName -Raw) -replace $wkold, $wknn -replace $wkprev, $wkold |
            Set-Content $_.FullName
    }
    # Update archive with the new files. Not using Compress-Archive because it changes the ZIP format
    & $7zipExe u -r "$zipFile" *.* | Out-Null
    # Rename and move the updated file back to the original location (overwrite)
    Move-Item -Path $zipFile -Destination $File -Force
    # Remove all files from the temporary folder to start fresh
    Remove-Item -Path "$tempFolder\*" -Recurse -Force
}
# Set location back to the script root
Set-Location -Path $PSScriptRoot
# Remove the temporary folder
Remove-Item -Path $tempFolder -Recurse -Force
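As an aside, the commented-out week-number detection at the top of the question is brittle, because "$wk" stringifies the whole object returned by Select-Object rather than just the folder name. A hedged sketch of an alternative, assuming the script folder name contains a two-digit week number (the naming pattern, e.g. 'Reports_WK41', is an assumption):
$folderName = Split-Path -Path $PSScriptRoot -Leaf
if ($folderName -match '(\d{2})') {
    $wknn = [int]$Matches[1]   # first two-digit group found in the folder name
}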

How can I copy a column value from Excel to a CSV file without using ComObject

I'm new to PowerShell. I have a number of Excel files (500+) with a column Animal Count that I would like to save in a new .csv file. I have code to do this using Excel COM objects.
I want to achieve the same without using COM objects. Could anyone help me achieve this?
Download the PSExcel module from
https://github.com/RamblingCookieMonster/PSExcel
and import it using Import-Module.
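(The exact location is an assumption; a typical import, either from a folder you downloaded the module to or from a path listed in $env:PSModulePath, might look like this.)
# if the module folder was copied somewhere arbitrary (hypothetical path):
Import-Module -Name 'C:\Downloads\PSExcel\PSExcel'
# or, if it was placed in a folder listed in $env:PSModulePath:
Import-Module -Name PSExcel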
Then use the following code:
$AnimalCount = @()
$Source = 'D:\Test'  # the path to where the Excel files are
ForEach ($File in Get-ChildItem -Path $Source -Filter '*.xlsx' -File) {
    $Excel = New-Excel -Path $File
    $Cell = ($Excel | Get-WorkSheet | % {$_.Cells | ? {$_.Text -eq "AnimalCount"}})
    $count = (($Excel | Get-WorkSheet -Name $Cell.Worksheet).Cells |
              ? {($_.Start.Row -eq $Cell.Start.Row) -and ($_.Start.Column -eq $Cell.Start.Column + 1)}).Text
    $AnimalCount += [PsCustomObject]@{'File' = $File.FullName; 'AnimalCount' = $count}
}
$AnimalCount | Format-Table -AutoSize
$AnimalCount | Export-Csv -Path 'D:\Test\AnimalCount.csv' -UseCulture -NoTypeInformation
The best thing here is that you do not need Excel to be installed on the machine that runs this script.

PowerShell: get info about files and export it

I am trying to write a script where I can choose a folder and PowerShell shows me the name, size, etc. of all the files in that folder. After that, PowerShell should export the information to an Excel table.
But I'm stuck and don't know what to do :C
Here is the code I tried to build:
Function Get-Folder($initialDirectory)
{
    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    $foldername = New-Object System.Windows.Forms.FolderBrowserDialog
    $foldername.Description = "Select a folder"
    $foldername.rootfolder = "MyComputer"
    if ($foldername.ShowDialog() -eq "OK")
    {
        $folder += $foldername.SelectedPath
    }
    return $folder
}
$a = Get-Folder
$folder = $a
Get-ChildItem -Path $folder | SELECT Name, @{Name="Size In KB";Expression={$_.Length / 1Kb}}, Attributes, LastAccessTime, @{n='Owner';e={(Get-Acl $_.FullName).Owner}} | Format-Table -AutoSize
Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";" -Append
As commented, Format-Table -AutoSize simply outputs the info as a table to the console. It returns nothing that can be exported, and because the Export-Csv line is a separate statement it receives no input at all, so there is nothing to write to the CSV file.
Doing it like this will create the CSV file and write the info in there:
Get-ChildItem -Path $folder |
    Select-Object Name,
                  @{Name="Size In KB";Expression={$_.Length / 1Kb}},
                  Attributes, LastAccessTime,
                  @{n='Owner';e={(Get-Acl $_.FullName).Owner}} |
    Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";"
This will not get you the info on screen. If you also want that, capture the result in a variable first:
$result = Get-ChildItem -Path $folder |
    Select-Object Name,
                  @{Name="Size In KB";Expression={$_.Length / 1Kb}},
                  Attributes, LastAccessTime,
                  @{n='Owner';e={(Get-Acl $_.FullName).Owner}}
# output on screen
$result | Format-Table -AutoSize
# write the CSV file
$result | Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";"
P.S. Judging by the title of this question, I think you only want info about files, not directories.
If that is the case, add the -File switch to the Get-ChildItem cmdlet (PowerShell 3 and up). For PowerShell versions below 3, use:
Get-ChildItem -Path $folder | Where-Object { !$_.PSIsContainer }
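A short sketch of how the files-only filter slots into the export pipeline above (the output path is simply the one already used in this answer):
# files only (PowerShell 3 and up); on older versions keep the Where-Object filter shown above
Get-ChildItem -Path $folder -File |
    Select-Object Name,
                  @{Name="Size In KB";Expression={$_.Length / 1Kb}},
                  Attributes, LastAccessTime,
                  @{n='Owner';e={(Get-Acl $_.FullName).Owner}} |
    Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";"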

Add column to merged CSV file

OK, here's what I have code-wise:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
(get-content $a) | set-content $b
This pulls the data from all the files into one merged file, but I need one additional item: I need to pull the name of each individual file and append it as the first column, for multiple files, several hundred at a time.
Not tested but something like this should do it:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
Get-ChildItem $a | % {
Import-Csv $_.Fullname | Add-Member -MemberType NoteProperty -Name 'File Name' -Value $_.Name
} | Export-Csv $b
Assuming the CSV files each have the same column headings, I would lean toward using Import-CSV instead of Get-Content so that you can work with the CSV contents as arrays of objects with named properties.
Then all you need to do is iterate through each item of the array and append a property containing the file path, which you can do using the Add-Member cmdlet. Once that's done, export the array of objects using the Export-CSV cmdlet.
$directory = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\"
$search = $directory + "*.csv"
$exportpath = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
$paths = get-childitem $search
$objectArrays = #()
$paths | %{
$filepath = $_.fullname;
$objectArray = Import-CSV $filepath;
$objectArray | %{
Add-Member -inputobject $_ -Name "SourceFile" -Value $filepath -MemberType NoteProperty};
$objectArrays += $objectArray}
$objectArrays | export-csv -path $exportpath -notype
This puts the SourceFile property as the last column in the outputted CSV file
OK, a simplification: search the target folder, pipe to a ForEach-Object loop (shorthand % used), capture the file name in a variable, import the CSV, add the source file using the Select-Object cmdlet, convert it back to CSV, end the loop, and pipe to the destination file.
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
GCI $a | %{$FileName=$_.Name;Import-CSV $_|Select #{l='SourceFile';e={$FileName}},*|ConvertTo-CSV -NoType} | set-content $b

Renaming many folders in PowerShell

I have over 1000 files that have to be renamed.
The folders and/or files are grouped by location, so the first four characters are the same for each file; there are four or five different locations. I need to delete the first few characters of each folder's name.
Example:
Old file: ABC_Doe, Jane
New file: Doe, Jane
Any suggestions as to the quickest way to carry this out?
I've tried all of the following:
1st Attempt
$a = Get-ChildItem C:\example
$b = Where-Object {$_.name -like “*ABC_*”}
$cmdlet_name = “Rename-Item”
$d = (cmdlet_name $a $b)
invoke-expression $d
2nd Attempt
$e = Get-ChildItem C:\example
$f = $e.TrimStart (“ABC_”)
3rd Attempt
Rename-Item -{$_.name -like “*ASD*”, “”}
Try this: get all child items (files only), remove the ABC_ prefix by replacing it (with nothing) and rename each file. To rename files in subdirectories, add the -Recurse switch to the Get-ChildItem command:
Get-ChildItem c:\example -Filter ABC_* | Where-Object {!$_.PSIsContainer} | Rename-Item -NewName { ($_.BaseName -replace '^ABC_') + $_.Extension }
UPDATE
Actually, this should work as well and is much shorter (no need to append the file extension, because renaming is performed on the full file name).
Get-ChildItem c:\example -Filter ABC_* | Where-Object {!$_.PSIsContainer} | Rename-Item -NewName { $_.Name -replace '^ABC_' }
get-childItem ABC_* | rename-item -newname { $_.name -replace 'ABC_','' }
Source: get-help rename-item -full
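If you specifically want to rename the folders themselves rather than the files inside them, you can filter to directories instead. A hedged sketch (PowerShell 3 and up, reusing the C:\example root and ABC_ prefix from above):
# rename only the folders whose names start with ABC_
Get-ChildItem -Path C:\example -Directory -Filter 'ABC_*' |
    Rename-Item -NewName { $_.Name -replace '^ABC_' }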
