I need some help with two problems in my script:
I have a script that moves all pictures (if there are any) from all the subfolder locations to a new subfolder under the "PRO" folder.
Problem 1:
I've turned on the -Verbose switch on Move-Item in the hope of writing its output to a log.txt file, but because of the parallel jobs I get errors saying "file is already being used by another process".
Problem 2:
Either I turn on the -Force switch on Move-Item and risk losing pictures when files with the same name already exist, or I find some way to rename them on the fly...
What I have so far:
The extensions I want to move:
$movefiles_foto = @("*.tif", "*.tiff", "*.gif", "*.jpeg", "*.jpg", "*.jif", "*.jfif", "*.jp2", "*.jpx", "*.j2k", "*.j2c", "*.fpx", "*.pcd", "*.png", "*.JPEG")
The moving scriptblock:
$movefilesfoto = {
    Param($extension, $loc)
    $archive = "FOTO"
    Get-ChildItem -Path $loc -Recurse -File -Filter $extension -Exclude $archive | ForEach-Object {
        if ($_.FullName.IndexOf('\PRO\') -gt 0) {
            $Destination = Join-Path -Path $_.FullName.Substring(0, $_.FullName.IndexOf('\PRO\') + 5) -ChildPath 'FOTO'
            if (-not (Test-Path -LiteralPath $Destination -PathType Container)) {
                New-Item -Type Directory -Path $Destination | Out-Null
            }
            $_ | Move-Item -Destination $Destination -Verbose
        }
        else {
            throw ("\PRO\ path not found in '$($_.FullName)'")
        }
    }
}
Starting the jobs:
Write-Host "move foto"
foreach ($foto_file in $movefiles_foto)
{
    Start-Job -ScriptBlock $movefilesfoto -ArgumentList $foto_file, $location
}
Get-Job | Wait-Job | Receive-Job | Out-File -FilePath $location\log.txt
My apologies if this is too many questions at once, but I wanted to avoid asking three separate questions that might be related.
Thanks in advance for any input.
UPDATE:
Verbose output to file is solved, so scratch problem 1!
The verbose messages claiming files were being moved from a folder into that same folder have disappeared and can no longer be reproduced. So that leaves problem 2, the renaming.
Problem 1:
An article on stream redirection covers this (full support was introduced in PowerShell v5).
Example:
Write-Verbose 'Example message!' -Verbose 4>&1 | Out-File C:\ex.txt -Force
Problem 2:
Function MoveFilesFoto
{
    Param([String[]]$Extensions, [String]$Location)
    $Archive = 'FOTO'
    Get-ChildItem -Path $Location -Recurse -File -Include $Extensions -Exclude $Archive |
        ForEach-Object {
            If ($_.FullName -cmatch '\\PRO\\')
            {
                # Destination is the 'FOTO' folder directly under the 'PRO' folder
                $Destination = Join-Path -Path $_.FullName.Substring(0, $_.FullName.IndexOf('\PRO\') + 5) -ChildPath $Archive
                If (!(Test-Path $Destination))
                {
                    New-Item -Type Directory -Path $Destination > $Null
                }
                Move-Item -Path $_.FullName -Destination $Destination -Verbose 4>&1 | Out-File $Destination\Log.txt -Append -Force
            }
            Else
            {
                Throw "Could not find \PRO\ in $($_.FullName)"
            }
        }
}
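The renaming part of problem 2 is still open: the function above will simply collide when a file with the same name already exists in the FOTO folder. A minimal sketch of renaming on the fly, assuming a numeric suffix such as 'name (1).jpg' is acceptable (that pattern is my assumption, not part of the original post); it would replace the single Move-Item line inside the ForEach-Object block:
# Sketch only: build a unique destination name before moving
$targetPath = Join-Path -Path $Destination -ChildPath $_.Name
if (Test-Path -LiteralPath $targetPath) {
    $base = [System.IO.Path]::GetFileNameWithoutExtension($_.Name)
    $ext  = $_.Extension
    $i = 1
    # Keep counting up until a free name is found, e.g. 'photo (1).jpg', 'photo (2).jpg', ...
    do {
        $targetPath = Join-Path -Path $Destination -ChildPath ('{0} ({1}){2}' -f $base, $i, $ext)
        $i++
    } while (Test-Path -LiteralPath $targetPath)
}
Move-Item -LiteralPath $_.FullName -Destination $targetPath -Verbose 4>&1 | Out-File $Destination\Log.txt -Append -Force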
I am struggling to get the last if statement to work.
I have a blob storage account which contains the directories mentioned and a certificate.
I want to import that certificate to the keyvault.
When I run the pipeline (which contains the script below), it only runs up to where I have put the Write-Host 'everything works up until here..' line.
Can someone please help me figure out why it won't work? I have tried splitting it into three separate if statements and removing the if statement entirely; nothing has worked.
param (
[string] $CertificateNames,
[string] $KeyVaultResourceId
)
# Split certificate names by comma or semi-colon
$certificateName = $CertificateNames.Replace(',', ';') -split ';' | ForEach-Object -Process { $_.Trim() } | Select-Object -First 1
# For wildcard certificates, Posh-ACME replaces * with ! in the directory name
$certificateName = $certificateName.Replace('*', '!')
# Set working directory
$workingDirectory = Join-Path -Path "." -ChildPath "pa"
# Set Posh-ACME working directory
$env:POSHACME_HOME = $workingDirectory
Import-Module -Name Posh-ACME -Force
# Resolve the details of the certificate
$currentServerName = ((Get-PAServer).location) -split "/" | Where-Object -FilterScript { $_ } | Select-Object -Skip 1 -First 1
$currentAccountName = (Get-PAAccount).id
# Determine paths to resources
$orderDirectoryPath = Join-Path -Path $workingDirectory -ChildPath $currentServerName | Join-Path -ChildPath $currentAccountName | Join-Path -ChildPath $certificateName
$orderDataPath = Join-Path -Path $orderDirectoryPath -ChildPath "order.json"
$pfxFilePath = Join-Path -Path $orderDirectoryPath -ChildPath "fullchain.pfx"
Write-Host 'everything works up until here.. then breaks'
# If we have an order and certificate available
if ((Test-Path -Path $orderDirectoryPath) -and (Test-Path -Path $orderDataPath) -and (Test-Path -Path $pfxFilePath)) {
Write-Host 'check paths are ok'
$pfxPass = (Get-PAOrder $certificateName).PfxPass
# Load PFX
$certificate = New-Object -TypeName System.Security.Cryptography.X509Certificates.X509Certificate2 -ArgumentList $pfxFilePath, $pfxPass, 'EphemeralKeySet'
# Get the current certificate from key vault (if any)
$azureKeyVaultCertificateName = $certificateName.Replace(".", "-").Replace("!", "wildcard")
$keyVaultResource = Get-AzResource -ResourceId $KeyVaultResourceId
$azureKeyVaultCertificate = Get-AzKeyVaultCertificate -VaultName $keyVaultResource.Name -Name $azureKeyVaultCertificateName -ErrorAction SilentlyContinue
Write-Host 'check if certificate is in kv'
# If we have a different certificate, import it
If (-not $azureKeyVaultCertificate -or $azureKeyVaultCertificate.Thumbprint -ne $certificate.Thumbprint) {
Import-AzKeyVaultCertificate -VaultName $keyVaultResource.Name -Name $azureKeyVaultCertificateName -FilePath $pfxFilePath -Password (ConvertTo-SecureString -String $pfxPass -AsPlainText -Force) | Out-Null
}
Write-Host 'check if upload is success'
}
When the pipeline is run, it breaks and there are no errors.
Resolved this: the issue was that the file paths didn't exist, so the if statement was checking against invalid paths.
As there were no errors, this was a bit harder to track down; instead, I removed the if statement and added Write-Host "test" lines to see where the code was breaking.
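For anyone hitting the same silent stop, a small diagnostic sketch (purely illustrative, not part of the original script) can print each path check separately so the missing path shows up in the pipeline log:
# Test each expected path individually and report the result
foreach ($p in @($orderDirectoryPath, $orderDataPath, $pfxFilePath)) {
    Write-Host ("{0} exists: {1}" -f $p, (Test-Path -Path $p))
}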
I am making some progress on my script that automatically updates links in Excel files without opening them. I have successfully made it work on a single file, with the file name and the text to replace as inputs. Now I am trying to scale this so that it performs the same actions for all files in the script directory.
Here is how the script goes with comments on steps:
# This part will be responsible for fetching the week number to replace from the script directory name; currently I am testing with a pre-determined number
# $wk = Get-Item -Path $PWD | Select-Object -Property BaseName
# $wknn = "$wk".Substring(13,2) -as [int]
$wknn = 41
$wkold = $wknn-1
$wkprev = $wknn-2
$DefaultFiles = Get-ChildItem | Where-Object {$_.Name -like "*.xls*"}
ForEach($File in $DefaultFiles)
{
# Build ZIP file name
$zipFile = $_ -ireplace [regex]::Escape(".xlsb"), ".zip"
# Create temporary folder
$parent = [System.IO.Path]::GetTempPath();
[string] $guid = [System.Guid]::NewGuid();
$tempFolder = Join-Path $parent $guid;
New-Item -ItemType Directory -Path $tempFolder;
# Rename file to ZIP
Rename-Item -Path $_ -NewName $zipFile
# Not using Expand-Archive because it changes the ZIP format
C:\7z\7za.exe x "$zipFile" -o"$tempFolder"
# Replace old text with new text. First replace wk-1 to wk and then wk-2 to wk-1
$fileNames = Get-ChildItem -Path $tempFolder -Recurse -Include *.rels
foreach ($file in $fileNames)
{
(Get-Content -ErrorAction SilentlyContinue $file.FullName) |
Foreach-Object { $_ -replace $wkold, $wknn } |
Foreach-Object { $_ -replace $wkprev, $wkold } |
Set-Content $file.FullName
}
# Changing working folder because 7Zip option -w doesn't work
Set-Location -Path $tempfolder
# Update archive with new files. Not using Compress-Archive because it changes the ZIP format
C:\7z\7za.exe u -r "$zipFile" *.*
# Rename file back to XLSB
Rename-Item -Path $zipFile -NewName $_
#Move the final .xlsb file back to the script root
move-Item -Path $_ -destination $PSScriptRoot
#Set location to script root to start over
Set-Location -Path $PSScriptRoot
}
}
I am running into problems with the foreach loop. I am unsure how to refer to the file name within the first loop at the 'Build ZIP file name' step, and how to refer to the output file when I want to move it to the script root afterwards. I also suspect that nesting foreach loops may not be as simple as it looks and may require extra steps in the code, but as I am just starting out I don't have the experience and could not find a simple answer to my problem.
I would really appreciate some assistance with the syntax in my code :)
I would create a temporary folder outside the main loop and set the working directory to that folder. Then remove the folder and reset the working directory when all is done.
Also, there is no need to rename the finished ZIP file first and then move it back to its original location, because you can do both at once with the Move-Item cmdlet.
Try:
$wknn = 41
$wkold = $wknn - 1
$wkprev = $wknn - 2
$7zipExe = 'C:\7z\7za.exe' # path to 7za.exe
$sourcePath = $PSScriptRoot # path to where the Excel files are
# Create temporary folder
$tempFolder = Join-Path -Path ([System.IO.Path]::GetTempPath()) -ChildPath ((New-Guid).Guid)
$null = New-Item -ItemType Directory -Path $tempFolder -Force
# retrieve a collection of Excel files (FullNames only).
# If you ONLY want .xlsb files, set the Filter to '*.xlsb'
$excelFiles = (Get-ChildItem -Path $sourcePath -Filter '*.xls*' -File).FullName
# Changing working folder because 7Zip option -w doesn't work
Set-Location -Path $tempFolder
foreach($File in $excelFiles) {
# Build ZIP file name
$zipFile = [System.IO.Path]::ChangeExtension($File, '.zip')
# Rename file to ZIP
Rename-Item -Path $File -NewName $zipFile
# Not using Expand-Archive because it changes the ZIP format
& $7zipExe x "$zipFile" -o"$tempFolder" | Out-Null
# Replace old text with new text. First replace wk-1 to wk and then wk-2 to wk-1
Get-ChildItem -Path $tempFolder -Recurse -Filter '*.rels' -File | ForEach-Object {
(Get-Content -Path $_.FullName -Raw) -replace $wkold, $wknn -replace $wkprev, $wkold |
Set-Content $_.FullName
}
# Update archive with new files. Not using Compress-Archive because it changes the ZIP format
& $7zipExe u -r "$zipFile" *.* | Out-Null
# Rename and Move the updated file back to the original location (overwrite)
Move-Item -Path $zipFile -Destination $File -Force
# remove all files from the temporary folder to start fresh
Remove-Item -Path "$tempfolder\*" -Recurse -Force
}
# Set location back to script root
Set-Location -Path $PSScriptRoot
# remove the temporary folder
Remove-Item -Path $tempFolder -Recurse -Force
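The commented-out week-number lines at the top of the question can also be made a little more robust. A minimal sketch, assuming the script folder's name contains a two-digit week number somewhere (the regex and the folder-name layout are assumptions on my part):
# Pull the first two-digit number out of the script folder's name, e.g. 'Report wk 41' -> 41
if ((Get-Item -Path $PSScriptRoot).BaseName -match '(\d{2})') {
    $wknn = [int]$Matches[1]
}
else {
    throw "Could not find a two-digit week number in the script folder name"
}
$wkold  = $wknn - 1
$wkprev = $wknn - 2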
I need a script to convert Excel files to UTF-8 CSV. I think it can be done in PowerShell, but I can't get it to work. Can you see where the error is? Thank you very much in advance.
$configFiles = Get-ChildItem "c:\HR\test"
foreach ($file in $configFiles) {
$a = -join ("c:\HR\test", "\", $file)
Get-Content $a | Set-Content -path -Encoding utf8 $a
}
You can use Doug Finke's awesome module to get this done: https://github.com/dfinke/ImportExcel
To install it:
Install-Module ImportExcel
Then you can do something like this. Note that Out-File has a few different encoding options for UTF8: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/out-file?view=powershell-5.1#parameters
param(
    $excelFilesDir = (Get-ChildItem $PSScriptRoot\excel_files),
    $csvFilesDir = "$PSScriptRoot\csv_files"
)
# NOTE: I had to supply the path to the module on my system. You may not have to.
Import-Module "<ABSOLUTE>:\<PATH>\<TO>\<MODULE>\ImportExcel" -Force
# Import-Module ImportExcel
if ((Test-Path -Path $csvFilesDir) -eq $false) {
    New-Item -Path $csvFilesDir -ItemType Directory -Force
}
foreach ($file in $excelFilesDir) {
    Import-Excel $file.FullName |
        ConvertTo-Csv -NoTypeInformation |
        Out-File -FilePath "$csvFilesDir\$($file.BaseName).csv" -Encoding utf8 -Force
}
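A hypothetical usage example (the script file name and the paths are my assumptions, not from the original answer): save the code above as Convert-ExcelToCsv.ps1 and either rely on the default folders next to the script or pass your own:
# Default: reads .\excel_files and writes UTF-8 CSV files to .\csv_files
.\Convert-ExcelToCsv.ps1

# Or point it at other locations (paths are illustrative)
.\Convert-ExcelToCsv.ps1 -excelFilesDir (Get-ChildItem 'C:\HR\test' -Filter '*.xls*') -csvFilesDir 'C:\HR\test\csv'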
I have a PowerShell script that works; it runs multiple queries against multiple servers, saves each output to a separate CSV file, and then merges them together into an Excel file.
$Servers = get-content -Path "Servers.txt"
$DatabaseName ="master"
#$credential = Get-Credential #Prompt for user credentials
$secpasswd = ConvertTo-SecureString "MyPassword" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ("sa", $secpasswd)
$QueriesFolder = "Queries\"
$ResultFolder = "Results\"
ForEach($Server in $Servers)
{
$DateTime = (Get-Date).tostring("yyyy-MM-dd")
ForEach ($filename in get-childitem -path $QueriesFolder -filter "*.sql" | sort-object {if (($i = $_.BaseName -as [int])) {$i} else {$_}} )
{
$oresults = invoke-sqlcmd -ServerInstance $Server -Database $DatabaseName -Credential $credential -InputFile $filename.fullname
write-host "Executing $filename on $Server"
$BaseNameOnly = Get-Item $filename.fullname | Select-Object -ExpandProperty BaseName
$oresults | export-csv $ResultFolder$BaseNameOnly.csv -NoTypeInformation -Force
}
$All_CSVs = get-childitem -path $ResultFolder -filter "*.csv" | sort-object {if (($i = $_.BaseName -as [int])) {$i} else {$_}}
$Count_CSVs = $All_CSVs.Count
Write-Host "Detected the following CSV files: ($Count_CSVs)"
Write-Host " "$All_CSVs.Name"`n"
$ExcelApp = New-Object -ComObject Excel.Application
$ExcelApp.SheetsInNewWorkbook = $All_CSVs.Count
$output = "C:\Users\FrancescoM\Desktop\CSV\Results\" + $Server + " $DateTime.xlsx"
if (Test-Path $output)
{
Remove-Item $output
Write-Host Removing: $output because it exists already
}
$xlsx = $ExcelApp.Workbooks.Add()
for($i=1;$i -le $Count_CSVs;$i++)
{
$worksheet = $xlsx.Worksheets.Item($i)
$worksheet.Name = $All_CSVs[$i-1].Name
$file = (Import-Csv $All_CSVs[$i-1].FullName)
$file | ConvertTo-Csv -Delimiter "`t" -NoTypeInformation | Clip
$worksheet.Cells.Item(1).PasteSpecial()|out-null
}
$xlsx.SaveAs($output)
Write-Host Creating: $output
$ExcelApp.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsx) | Out-Null;
Write-Host "Closing all worksheet"
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($ExcelApp) | Out-Null;
Write-Host "Closing Excel"
[System.GC]::Collect();
[System.GC]::WaitForPendingFinalizers()
Remove-Item "$ResultFolder\*" -Include *.csv
Write-Host "Cleaning all *.csv"
Start-Sleep -Seconds 3
}
In order to make this script more portable, I want all the paths mentioned in it to be stored in variables and then concatenated.
But as soon as I change:
$output = "C:\Users\FrancescoM\Desktop\CSV\Results\" + $Server + " $DateTime.xlsx"
into:
$output = $ResultFolder + $Server + " $DateTime.xlsx"
things get nasty and I receive the error:
Microsoft Excel cannot access the file 'C:\Users\FrancescoM\Documents\Results\0DC80000'.
There are several possible reasons:
• The file name or path does not exist.
• The file is being used by another program.
• The workbook you are trying to save has the same name as a currently open workbook.
At C:\Users\FrancescoM\Desktop\CSV\QueryLauncher.ps1:50 char:2
+ $xlsx.SaveAs($output)
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], COMException
+ FullyQualifiedErrorId : System.Runtime.InteropServices.COMException
I don't understand; I think I'm concatenating things correctly.
I also followed a related Stack Overflow post and restarted my computer after adding the "C:\Windows\SysWOW64\config\systemprofile\desktop" folder, but the problem isn't fixed.
How can a variable path mess things up with Excel?
Because you are not defining the full path in the $ResultFolder variable, the relative path is not resolved against your script's folder; Excel expands it against its own default save location (see P.S.2 below).
Just look at the path you want it to be:
"C:\Users\FrancescoM\Desktop\CSV\Results\" + $Server + " $DateTime.xlsx"
and the resulting path using the partial $ResultFolder variable:
C:\Users\FrancescoM\Documents\Results\0DC80000
Since you want the output file in a folder on your desktop, set the $output to
$output = Join-Path $([Environment]::GetFolderPath("Desktop")) "CSV\Results\$Server $DateTime.xlsx"
EDIT
From your last comment I understand that you want the output to be in a subfolder called "Results" that resides inside the folder the script itself is in.
In that case do this:
# get the folder this script is running from
$ScriptFolder = if ($PSScriptRoot) { $PSScriptRoot } else { Split-Path $MyInvocation.MyCommand.Path }
# the following paths are relative to the path this script is in
$QueriesFolder = Join-Path -Path $ScriptFolder -ChildPath 'Queries'
$ResultFolder = Join-Path -Path $ScriptFolder -ChildPath 'Results'
# make sure the 'Results' folder exists; create if not
if (!(Test-Path -Path $ResultFolder -PathType Container)) {
New-Item -Path $ResultFolder -ItemType Directory | Out-Null
}
Then, when it becomes time to save the xlsx file, create the full path and filename using:
$output = Join-Path -Path $ResultFolder -ChildPath "$Server $DateTime.xlsx"
$xlsx.SaveAs($output)
P.S. I advise using the Join-Path cmdlet or [System.IO.Path]::Combine() to combine file paths, instead of joining them together the way you do in this line: $oresults | Export-Csv $ResultFolder$BaseNameOnly.csv. The latter can lead to unforeseen path names if you ever forget to postfix the first path part with a backslash.
P.S.2 Excel has its own default output path, set in Tools -> Options -> General -> Default File Location, and has no idea of the relative path for the script. This is why you should save using a full path and file name.
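To illustrate the Join-Path advice (the folder and file names below are just examples, not taken from the script):
$ResultFolder = 'C:\Scripts\Results'    # note: no trailing backslash
$BaseNameOnly = 'query1'

# Plain string interpolation silently glues the parts together:
"$ResultFolder$BaseNameOnly.csv"                              # -> C:\Scripts\Resultsquery1.csv  (wrong)

# Join-Path always inserts exactly one separator:
Join-Path -Path $ResultFolder -ChildPath "$BaseNameOnly.csv"  # -> C:\Scripts\Results\query1.csv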
I have part of the code; at the moment the CSV file comes out empty. I need a command to specify the path/folders to look at. How do I modify this for that purpose?
Param(
[String]$path,
[String]$outfile = ".\outfile.csv"
)
$output = @()
ForEach ($item in (Get-ChildItem -Path $path -Recurse -Directory)) {
ForEach ($acl in ($item.GetAccessControl().Access)){
$output += $acl |
Add-Member `
-MemberType NoteProperty `
-Name 'Folder' `
-Value $item.FullName `
-PassThru
}
}
$output | Export-Csv -Path $outfile -NoTypeInformation
OK, let's do this. I've made it into a function and removed the OutFile part of it. If you want to output it to a file, pipe it to Export-Csv; if you want it saved as a variable, assign it to a variable. It's just simpler this way.
Function Get-RecursiveACLs {
    Param(
        [String]$Path = $(Throw "You must specify a path")
    )
    Get-ChildItem -Path $Path -Recurse -Directory | ForEach-Object {
        $PathName = $_.FullName
        # Emit each access rule decorated with the folder it belongs to
        $_.GetAccessControl().Access | ForEach-Object {
            Add-Member -InputObject $_ -NotePropertyName "Path" -NotePropertyValue $PathName -PassThru
        }
    }
}
Then it's a simple matter of storing it in a variable like:
$ACLList = Get-RecursiveACLs "C:\Example\Path"
Or piping it to output to a CSV if you would prefer:
Get-RecursiveACLs "C:\Example\Path" | Export-CSV "C:\Results.csv" -NoType
Put the function at the top of your script and call it as needed.