PowerShell script to batch replace links in Excel workbooks

For quite a while now, in my free time, I have been tackling a script that can batch replace external link addresses in multiple Excel files within the script's folder. I have learned that you can't change external links through the usual PowerShell-to-Excel interaction, as these values are forced to read-only. However, there is a clever way to bypass that: convert the Excel file to a .zip archive, read and change the files inside, and then rename it back to the Excel format.
Through learning and digging around the web, I have compiled this function, which should create a backup, rename the workbook to a .zip archive, replace the desired text inside, and then rename the file back afterwards.
'''
function Update-ExcelLinks($xlsxFile, $oldText, $newText) {
    # Build BAK file name
    $bakFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".bak"
    # Build ZIP file name
    $zipFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".zip"

    # Create temporary folder
    $parent = [System.IO.Path]::GetTempPath()
    [string] $guid = [System.Guid]::NewGuid()
    $tempFolder = Join-Path $parent $guid
    New-Item -ItemType Directory -Path $tempFolder

    # Uncomment the next line to create a backup before processing the workbook
    # Copy-Item $xlsxFile $bakFile

    # Rename file to ZIP
    Rename-Item -Path $xlsxFile -NewName $zipFile

    # Not using Expand-Archive because it changes the ZIP format
    C:\7z\7za.exe x "$zipFile" -o"$tempFolder"

    # Replace old text with new text in the XML and relationship parts
    $fileNames = Get-ChildItem -Path $tempFolder -Recurse -Include *.xml, *.bin.rels
    foreach ($file in $fileNames) {
        (Get-Content -ErrorAction SilentlyContinue $file.PSPath) |
            ForEach-Object { $_ -replace $oldText, $newText } |
            Set-Content $file.PSPath
    }

    # Change the working folder because the 7-Zip -w option doesn't work
    Set-Location -Path $tempFolder

    # Not using Compress-Archive because it changes the ZIP format
    C:\7z\7za.exe u -r "$zipFile" *.*

    # Rename file back to XLSB
    Rename-Item -Path $zipFile -NewName $xlsxFile
}
'''
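For reference, the intent is to call this function for every workbook in the script folder, roughly like this (a sketch of the batch usage; the wk28/wk29 values match the example described below):
'''
# Sketch: run the replacement against every .xlsb workbook next to the script
Get-ChildItem -Path $PSScriptRoot -Filter *.xlsb | ForEach-Object {
    Update-ExcelLinks -xlsxFile $_.FullName -oldText "wk28" -newText "wk29"
}
'''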
The problem is that the script successfully interacts with the desired file and renames it, but it refuses to touch the external link information inside the archive, in the 'Excel File.zip'\xl\externalLinks\_rels directory. The link information I am trying to replace is "/wk28/example_file_wk28.xlsb", which should become "/wk29/example_file_wk29.xlsb" by changing the wk28 string to wk29 for each external link, and so on. Does anybody have experience in this area? I am only starting my scripting adventure and can't quite diagnose the problem within this script.
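While diagnosing, one quick check (a debugging sketch, not a fix) is to confirm whether the extracted relationship parts under xl\externalLinks\_rels are actually being found and whether they still contain the old text after the replacement loop runs:
'''
# Diagnostic sketch: list the extracted external-link relationship parts and
# show which ones still contain the old text. Assumes $tempFolder and $oldText
# from the function above are in scope.
Get-ChildItem -Path (Join-Path $tempFolder 'xl\externalLinks\_rels') -Recurse |
    Select-String -Pattern $oldText -SimpleMatch -List |
    Select-Object Path, Line
'''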

Related

Save files to Azure fileshare in the sub directory

Below is the runbook code I am using to save the file to an Azure file share, but I am unable to save it to a subdirectory.
#Set the context using the storage account name and key
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$s = Get-AzureStorageShare "X1" -Context $context
$ErrorLogFileName = "Test.csv"
$LogItem = New-Item -ItemType File -Name $ErrorLogFileName
$_ | Out-File -FilePath $ErrorLogFileName -Append
Set-AzureStorageFileContent -Share $s -Source $ErrorLogFileName
Here I have a folder structure like X1/X2, but I am unable to get there and save Test.csv; in fact, I am only able to save it to the X1 folder on the Azure file share. Any idea?
Thanks in advance.
You can specify the -Path parameter for Set-AzureStorageFileContent.
For example, the file share is X1, and there is a folder X2 inside X1. Then you can use the command below:
Set-AzureStorageFileContent -Share "X1" -Source $ErrorLogFileName -Path "X2" -Context $context
By the way, the commands you're using are old; please consider using the latest Az PowerShell module instead. For example, you can use Set-AzStorageFileContent instead of Set-AzureStorageFileContent (if you switch to the Az module, replace all the old commands with their new equivalents).
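For completeness, an Az-module version of the upload might look like this (a sketch; the parameters are assumed to mirror the older cmdlets used above):
# Sketch using the newer Az.Storage cmdlets
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
Set-AzStorageFileContent -ShareName "X1" -Source $ErrorLogFileName -Path "X2/Test.csv" -Context $context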

Download multiple powershell files from Azure Storage container as a Zip

How to download all the PowerShell files from the Azure storage account container to a zip folder through PowerShell cmdlets?
As of now, the below cmdlet helps to download a specific blob by its name
$blob = Get-AzureStorageBlobContent -Container hemac -Blob "CreateAndDeploy-Eventhubs.ps1" -Context $ctx -Destination $f -Force
First, set a folder to download all the blobs into. Provide the full path to a directory you wish to use for the downloaded blobs:
$DestinationFolder = "<C:\DownloadedBlobs>"
Create the destination directory and download the blob
New-Item -Path $DestinationFolder -ItemType Directory -Force
$blob | Get-AzureStorageBlobContent -Destination $DestinationFolder
Now zip the entire folder.
$folderToZip = "C:\DownloadedBlobs"
$rootfolder = Split-Path -Path $folderToZip
$zipFile = Join-Path -Path $rootfolder -ChildPath "ZippedFile.zip"
Then you should use Compress-Archive (see the docs):
Compress-Archive -Path $folderToZip -DestinationPath $zipFile -Verbose
The zipped file will be in the same directory as the download folder
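Putting the pieces together for all the .ps1 blobs in the container (a sketch using the same older cmdlets as the question; the container name hemac and $ctx come from the example above):
# Sketch: download every .ps1 blob in the container, then zip the folder
$DestinationFolder = "C:\DownloadedBlobs"
New-Item -Path $DestinationFolder -ItemType Directory -Force | Out-Null
Get-AzureStorageBlob -Container hemac -Context $ctx |
    Where-Object { $_.Name -like "*.ps1" } |
    Get-AzureStorageBlobContent -Destination $DestinationFolder -Force
$zipFile = Join-Path -Path (Split-Path -Path $DestinationFolder) -ChildPath "ZippedFile.zip"
Compress-Archive -Path $DestinationFolder -DestinationPath $zipFile -Verbose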

How do I export the results into an Excel spreadsheet and highlight them in green once the files have been deleted from a csv list in Powershell?

I have a PowerShell script that removes files from a CSV list. However, I'm not sure how to export the results once the files have been deleted, and mark them in green in an Excel spreadsheet. How would I approach this? Below is my PowerShell script:
$files = Get-Content "C:\test\remove.csv"
foreach ($file in $files) {
    Remove-Item -Path $file -Force
}
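One possible approach (a sketch, not from the original thread: it drives Excel through COM, so it assumes Excel is installed on the machine, and the report path is just an example) is to record each file's outcome and colour the row green when the delete succeeded:
# Sketch: delete the listed files and write an .xlsx report, colouring
# successfully deleted rows green. Requires Excel (COM automation).
$files = Get-Content "C:\test\remove.csv"
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$wb = $excel.Workbooks.Add()
$ws = $wb.Worksheets.Item(1)
$ws.Cells.Item(1, 1) = "File"
$ws.Cells.Item(1, 2) = "Status"
$row = 2
foreach ($file in $files) {
    Remove-Item -Path $file -Force -ErrorAction SilentlyContinue
    $deleted = -not (Test-Path $file)
    $ws.Cells.Item($row, 1) = $file
    if ($deleted) {
        $ws.Cells.Item($row, 2) = "Deleted"
        # Green fill for rows whose file is confirmed gone
        $ws.Range("A$($row):B$($row)").Interior.ColorIndex = 4
    }
    else {
        $ws.Cells.Item($row, 2) = "Failed"
    }
    $row++
}
$wb.SaveAs("C:\test\RemovalReport.xlsx")   # example output path
$excel.Quit()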

Import CSV foreach loop process

I am working on an issue and I can't seem to get the syntax correct.
I have a directory containing a series of CSV files, each of which contains a list of virtual directories and paths from an old IIS 6 machine. I am recreating those on a new IIS 7.5 machine, and I am able to add them one directory at a time by going to the directory "iis:\sites\Atlanta" and running this command:
Import-Csv C:\Users\MIGXHZ700\Desktop\Atlanta.csv | Where-Object {$_.Path -match "\\"} | ForEach-Object {New-Item $_.Name -Type VirtualDirectory -physicalPath $_.Path}
For the life of me, I can't get the syntax right to run this in a script. I think it's just an issue with concatenation, but I'm not 100% sure. Here is where I am with the script:
$dirs = ls C:\Users\[blah]\Desktop\*.csv | foreach-object {
    Import-Csv $_ |
        Where-Object {$_.Path -match "\\"} |
        ForEach-Object {New-Item 'IIS:\Sites\'+$_.Name -Type VirtualDirectory -physicalPath $_.Path}
}
It also might be an issue with doing a ForEach inside of a ForEach?
Thanks in advance for any help.
'IIS:\Sites\'+$_.Name is not a valid argument to New-Item, because the -Path parameter takes a string argument, but that's an expression. It's an expression that evaluates to a string representing the path of the item you want to create, but you need to evaluate it by enclosing it in parentheses:
New-Item ('IIS:\Sites\' + $_.Name) -Type VirtualDirectory -PhysicalPath $_.Path
BTW, what's your intention for $dirs? It will be assigned the output of the New-Item command, which will be an array of DirectoryInfo objects (the same as what you'd get from $dirs = Get-ChildItem IIS:\Sites\ after creating all those directories). Is that what you want?
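Putting that fix back into the original loop, the script could look like this (a sketch; the desktop path placeholder is kept from the question). String interpolation, e.g. "IIS:\Sites\$($_.Name)", would work just as well as the parenthesized concatenation:
# Sketch: create the virtual directories from every CSV on the desktop
Get-ChildItem C:\Users\[blah]\Desktop\*.csv | ForEach-Object {
    Import-Csv $_ |
        Where-Object { $_.Path -match "\\" } |
        ForEach-Object { New-Item ('IIS:\Sites\' + $_.Name) -Type VirtualDirectory -PhysicalPath $_.Path }
}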

Search for a file in 4 different paths using PowerShell

I need to search in 4 different paths:
C:\Program Files\test1
C:\Program Files\test3
C:\Program Files (x86)\test6
D:\
I am using the following shell command:
Get-ChildItem -Path C:\ -Filter file.txt -Recurse | fl directory > C:\filereport.txt
Can you please help me use a similar command that would search all of the above paths and also not cut off the path?
In fact, when I export the results, some paths are cut off. I would need the line length to be longer, as anything beyond 107 characters is not shown on one line.
Thanks,
Graig
Try:
$p = "C:\Program Files\test1", "C:\Program Files\test3", "C:\Program Files (x86)\test6", "D:\"
Get-ChildItem -Path $p -Filter file.txt -Recurse | Select-Object Directory | Add-Content C:\filereport.txt
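If the Directory values are still being truncated at the console width, emitting the directory paths as plain strings bypasses the formatter entirely (a sketch building on the answer above):
Get-ChildItem -Path $p -Filter file.txt -Recurse |
    ForEach-Object { $_.Directory.FullName } |
    Set-Content C:\filereport.txt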
