A few weeks ago I had to remove password protection from Excel files that had been created by an application, and I had no passwords. Can this be done with PowerShell, using an XML transformation?
This is the solution I want to share with you. The script removes passwords and sheet protection from Excel files with PowerShell alone: no Excel application and no passwords are needed. It works only for the .xlsx file type, not for .xls. If you have ideas for improvement, let me know. Thank you.
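The whole approach relies on the fact that an .xlsx file is just a ZIP container of XML parts, which is easy to verify yourself before running the script (a quick sketch; the file names are only examples):
# Sketch: peek inside an .xlsx to see the XML parts the script edits below
# (xl/workbook.xml and xl/worksheets/sheet*.xml). The file names are examples.
Copy-Item 'C:\Temp\test.xlsx' 'C:\Temp\test.zip'
Expand-Archive -Path 'C:\Temp\test.zip' -DestinationPath 'C:\Temp\test_contents'
Get-ChildItem -Path 'C:\Temp\test_contents' -Recurse | Select-Object FullName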
Add-Type -AssemblyName System.IO.Compression
Add-Type -AssemblyName System.IO.Compression.FileSystem
#-----------------------------------------------------------------------
function Remove-Excel-WriteProtection {
<#
Removes all password and write protection from an existing Excel file
(workbook and worksheets).
No password is needed.

Input: path to the Excel file (must be in the newer .xlsx format)
Output: $true if successful
#>
#-----------------------------------------------------------------------
    param(
        [Parameter(Mandatory=$true)]
        [string]$filePathExcel
    )
    if( !(Test-Path -Path $filePathExcel) -or
        !(Split-Path -Path $filePathExcel -Leaf).EndsWith('xlsx') ) {
        return $false
    }
    $fileItem = Get-Item $filePathExcel
    $filePathZip = $fileItem.DirectoryName + '\' + $fileItem.BaseName + '.zip'
    $filePathTemp = $fileItem.DirectoryName + '\' + ([System.Guid]::NewGuid()).Guid
    # Unpack the workbook: rename to .zip and expand into a temporary folder
    Rename-Item -Path $filePathExcel -NewName $filePathZip -Force
    Expand-Archive -Path $filePathZip -DestinationPath $filePathTemp -Force
    $xml = New-Object System.Xml.XmlDocument
    $xml.PreserveWhitespace = $true
    # Reset workbook-level write protection (fileSharing element in workbook.xml)
    $workbookCollection = (Get-ChildItem -Path $filePathTemp -Filter 'workbook.xml' -Recurse -Force)
    foreach( $workbook in $workbookCollection ) {
        [void]$xml.RemoveAll()
        [void]$xml.Load($workbook.FullName)
        if( $xml.workbook.fileSharing.readOnlyRecommended -or $xml.workbook.fileSharing.reservationPassword ) {
            if( $xml.workbook.fileSharing.readOnlyRecommended ) {
                $xml.workbook.fileSharing.readOnlyRecommended = '0'
            }
            if( $xml.workbook.fileSharing.reservationPassword ) {
                $xml.workbook.fileSharing.reservationPassword = ''
            }
            [void]$xml.Save($workbook.FullName)
        }
    }
    # Remove worksheet protection (sheetProtection element in each sheet*.xml)
    $worksheetCollection = (Get-ChildItem -Path $filePathTemp -Filter 'sheet*.xml' -Recurse -Force)
    foreach( $worksheet in $worksheetCollection ) {
        [void]$xml.RemoveAll()
        [void]$xml.Load($worksheet.FullName)
        if( $xml.worksheet.sheetProtection ) {
            [void]$xml.worksheet.RemoveChild($xml.worksheet.sheetProtection)
            [void]$xml.Save($worksheet.FullName)
        }
    }
    # Repack the archive and restore the original file name
    Remove-Item -Path $filePathZip -Force
    Compress-Archive -Path ($filePathTemp + '\*') -DestinationPath $filePathZip -Force -CompressionLevel Optimal
    Remove-Item -Path $filePathTemp -Recurse -Force
    Rename-Item -Path $filePathZip -NewName $filePathExcel -Force
    return $true
}
# Remove all passwords for test.xlsx
$result = Remove-Excel-WriteProtection -filePathExcel 'C:\Users\YourName\Desktop\test.xlsx'
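If you need to process a whole folder of workbooks, the function can simply be called in a loop (a small usage sketch; the folder path is only an example):
# Sketch: run the function over every .xlsx file in a folder (path is an example)
Get-ChildItem -Path 'C:\Users\YourName\Desktop\Reports' -Filter '*.xlsx' |
    ForEach-Object { Remove-Excel-WriteProtection -filePathExcel $_.FullName }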
I have an odd problem. I have a script that works fine when I run it manually. I created a scheduled job in Windows to run the script automatically, and it works fine until the last stage of the script.
$deploymentfiles_mdm = Get-ChildItem 'D:\DeploymentTriggerApp\*'
Write-Host $deploymentfiles_mdm
$timestamp_app = Get-Date -Format o | ForEach-Object {$_ -replace ":", "."}
Write-Host $timestamp_app
$server = Get-Content 'C:\Users\Administrator\Desktop\Scripts\AutoDeployment\ProdMDMapps.txt'
$User = 'domain\user'
$SecurePassword = Get-Content C:\Users\Administrator\Desktop\Scripts\Password.txt | ConvertTo-SecureString
$UserCred = New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)
if (Test-Path -Path $deploymentfiles_mdm)
{
    do{
        try
        {
            $ServerSessions = New-PSSession -ComputerName $server -Credential $UserCred -ErrorAction Stop
            Write-Host ("$ServerSessions")
        }
        catch [Exception]
        {
            Write-Host("Credential is incorrect or password is expired. Either change credential and run CredentialEncryption.ps1 or communicate with dc admin to open expired password!")
        }
    }while(!$ServerSessions)
    Copy-Item "D:\Deployment_Files\*.zip" -ToSession $ServerSessions -Destination "D:\Deployment_Files\" -ErrorAction SilentlyContinue
    try{
        Invoke-Command -Session $ServerSessions -ScriptBlock {
            param($timestampApp)
            $appPath = Get-ChildItem 'D:\MDM\live\bin\'
            Expand-Archive -Path 'D:\Deployment_Files\*.zip' -DestinationPath 'D:\Deployment_Files\' -Force
            Remove-Item -Path 'D:\Deployment_Files\*.zip'
            $nodeProcess = Get-Process | Where-Object { $_.Name -eq "node"}
            if($nodeProcess -Or $appPath)
            {
                Get-Process | Where-Object { $_.Name -eq "node"} | Select-Object -First 1 | Stop-Process -Force
                New-Item -Path 'D:\Backups\' -Name $timestampApp -ItemType 'directory'
                Get-ChildItem -Path "D:\MDM\live\bin\" -Recurse | Move-Item -Destination "D:\Backups\$timestampApp\"
            }
            Copy-Item "D:\Deployment_Files\bin" -Destination "D:\MDM\live\" -Recurse -Force -ErrorAction SilentlyContinue
            Remove-Item "D:\Deployment_Files\*" -Recurse -Force
            Start-Job -ScriptBlock{ node D:\MDM\live\bin\main.js}
        } -ArgumentList $timestamp_app
    }
    catch
    {
        $_.Exception.Message
    }
}
Remove-Item D:\DeploymentTriggerApp\*
In the Start-Job -ScriptBlock{ node D:\MDM\live\bin\main.js} step the script can't start the node process, although it runs without any problem when I start it manually.
Any suggestions? (The node process needs to run as a background job; if there is an alternative command for that, I can try that as well.)
The line below solved my problem; the scheduled task apparently doesn't find node on its PATH, so the full path to node.exe is needed:
Start-Job -ScriptBlock{& 'C:\Program Files\nodejs\node.exe' D:\MDM\live\bin\main.js}
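If the Node.js install path can differ between machines, a variation (just a sketch; the fallback path is an assumption) is to resolve node.exe at run time so the job does not depend on the scheduled task's PATH:
# Sketch: resolve node.exe explicitly instead of relying on the scheduled task's PATH
$nodeExe = (Get-Command node.exe -ErrorAction SilentlyContinue).Source
if (-not $nodeExe) { $nodeExe = 'C:\Program Files\nodejs\node.exe' }   # assumed default install location
Start-Job -ScriptBlock { param($exe) & $exe 'D:\MDM\live\bin\main.js' } -ArgumentList $nodeExe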
Hello,
I am trying to create a PowerShell script that takes the document properties of a PowerPoint file and removes them, so that they can't cause any issues. I copied some code that does the same thing for Excel and Word and tried to change it over to PowerPoint, but it doesn't seem to work.
This is the code I've tried so far (forgive me, I'm not the most experienced with PowerShell):
$path = "c:\fso"
Add-Type -AssemblyName Microsoft.Office.Interop.PowerPoint
$PpRemoveDocType = "Microsoft.Office.Interop.Powerpoint.PpRemoveDocInfoType" -as [type]
$pointFiles = Get-ChildItem -Path $path -include *.pot, *.ppt, *.pps -recurse
$objPoint.visible = $false
$objPoint = New-Object -ComObject powerpoint.application
foreach($wb in $pointFiles)
{
    $workbook = $objPoint.workbooks.open($wb.fullname)
    "Removing document information from $wb"
    $workbook.RemoveDocumentInformation($PpRemoveDocType::xlRDIAll)
    $workbook.Save()
    $objPoint.Workbooks.close()
}
$objPoint.Quit()
This is the Excel code for reference and it works just fine
$path = "c:\fso"
Add-Type -AssemblyName Microsoft.Office.Interop.Excel
$xlRemoveDocType = "Microsoft.Office.Interop.Excel.XlRemoveDocInfoType" -as [type]
$excelFiles = Get-ChildItem -Path $path -include *.xls, *.xlsx -recurse
$objExcel = New-Object -ComObject excel.application
$objExcel.visible = $false
foreach($wb in $excelFiles)
{
    $workbook = $objExcel.workbooks.open($wb.fullname)
    "Removing document information from $wb"
    $workbook.RemoveDocumentInformation($xlRemoveDocType::xlRDIAll)
    $workbook.Save()
    $objExcel.Workbooks.close()
}
$objExcel.Quit()
Thank you for the help.
This has worked for me in the past:
function handlePowerpointFiles ($file) {
    Write-Host "Processing file: " $file.Fullname
    Add-Type -AssemblyName Microsoft.Office.Interop.Powerpoint
    $PpRemoveDocType = "Microsoft.Office.Interop.PowerPoint.PpRemoveDocInfoType" -as [type]
    $objpp = New-Object -ComObject Powerpoint.Application
    $doc = $objpp.Presentations.Open($file.FullName, $false, $null, $false)
    $doc.RemoveDocumentInformation($PpRemoveDocType::ppRDIAll)
    $doc.Save()
    $doc.Close()
    $objpp.Quit()
}
The reason I wrapped it in a function is that I wrote a small tool to handle all the major Microsoft Office file types and also perform some pre-checks, e.g. whether the file is locked or password protected. Essentially something along these lines (simplified):
$path = "C:\Documents"
$files = Get-ChildItem -Path $path -include *.doc, *.docx, *.xls, *.xlsx, *.ppt, *.pptx -Recurse
foreach ($fileEntry in $files) {
    if (CheckForFileLock $fileEntry.FullName) {
        Write-Error "File '$($fileEntry.FullName)' is locked" -ErrorAction Stop
    }
    if ((CheckForPasswordProtection $fileEntry) -eq $true) {
        Write-Error "File '$($fileEntry.FullName)' is password protected" -ErrorAction Stop
    }
    Switch ($fileEntry.Extension) {
        {$_ -in ".xls",".xlsx"} {
            handleExcelFiles $fileEntry
        }
        {$_ -in ".doc",".docx"} {
            handleWordFiles $fileEntry
        }
        {$_ -in ".ppt",".pptx"} {
            handlePowerpointFiles $fileEntry
        }
    }
}
You get the idea.
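CheckForFileLock and CheckForPasswordProtection aren't shown here; as a rough idea, a lock check can simply try to open the file exclusively (a minimal sketch; the function name is just the one assumed by the snippet above):
# Minimal sketch of a lock check: try to open the file for exclusive access
function CheckForFileLock ([string]$Path) {
    try {
        $stream = [System.IO.File]::Open($Path, 'Open', 'ReadWrite', 'None')
        $stream.Close()
        return $false   # the file could be opened exclusively, so it is not locked
    }
    catch [System.IO.IOException] {
        return $true    # another process is holding the file
    }
}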
Hi, I am trying to upload published files from Azure Git artifacts to FTP, but randomly I am getting the error below.
This only happens for files, not for any folder or subfolder.
"UploadFile" with "2" argument(s): "The Content-Type header cannot be set to a multipart type for this request.
All the files are available in the artifacts.
Observations and things I tried:
All files are present in the artifacts.
I get the error only for files (folders and subfolders are created successfully).
Stopped the Web App and then tried to upload.
Also tried a sleep between uploading two files.
After all of this, the result is the same.
Can anyone please help me out?
Get-ChildItem -Path $target_directory
# Upload files recursively
write-host "target_directory - $target_directory"
Set-Location $target_directory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $target_directory -Recurse
write-host 'Uploading started............................'
foreach ($file in $files)
{
    write-host "file -" + $file.FullName
    if($file.FullName -match "web.config" -or $file.FullName -match ".pdb" -or $file.FullName -match "roslyn" -or $file.FullName -match "obj\Debug"){
        write-host "ignoring " + $file.FullName
        continue
    }
    $relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
    $uri = New-Object System.Uri("$url/$relativepath")
    write-host "uri -" + $uri.AbsoluteUri
    write-host '--------'
    if($file.PSIsContainer)
    {
        write-host 'PSIsContainer - All dir/files within' $file
        get-childitem -path $file -Recurse
        try
        {
            $makeDirectory = [System.Net.WebRequest]::Create($uri);
            $makeDirectory.Credentials = New-Object System.Net.NetworkCredential($username,$password);
            $makeDirectory.Method = [System.Net.WebRequestMethods+FTP]::MakeDirectory;
            $makeDirectory.GetResponse();
            #folder created successfully
        }
        catch [Net.WebException]
        {
            try
            {
                #if there was an error returned, check if folder already existed on server
                $checkDirectory = [System.Net.WebRequest]::Create($uri);
                $checkDirectory.Credentials = New-Object System.Net.NetworkCredential($username,$password);
                $checkDirectory.Method = [System.Net.WebRequestMethods+FTP]::PrintWorkingDirectory;
                $response = $checkDirectory.GetResponse();
                $response.StatusDescription
                #folder already exists!
            }
            catch [Net.WebException]
            {
                #if the folder didn't exist, then it's probably a file perms issue, incorrect credentials, dodgy server name etc
            }
        }
        continue
    }
    try{
        write-host "Uploading to " $uri.AbsoluteUri " from " $file.FullName
        $webclient.UploadFile($uri, $file.FullName)
        start-sleep -Seconds 1
    }
    catch{
        ##[error]Error message
        Write-Host "##[error]Error in uploading " $file -ForegroundColor red
        Write-Host $_
        $disputedFiles = $disputedFiles + $file.FullName
        $_file = @{
            FileName = $file.FullName
            FTPPath = $uri.AbsoluteUri
        }
        $_o = New-Object psobject -Property $_file;
        $disputedFilesList = $disputedFilesList + $_o
    }
}
Write-Host "##[debug] Starting uploading the disputed files....."
foreach($file in $disputedFilesList){
    try{
        write-host "Uploading to " $file.FTPPath " from " $file.FileName
        $webclient.UploadFile($file.FTPPath, $file.FileName)
        start-sleep -Seconds 1
    }
    catch{
        write-host "##[error]Error(2) in uploading to " $file.FTPPath " from " $file.FileName
    }
}
Write-Host "##[debug] Ending uploading the disputed files....."
remove-item -path $target_directory\* -Recurse
get-childitem -path $target_directory
write-host "Directory Empty after cleanup"
$webclient.Dispose()
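One workaround that may be worth trying (a sketch only, not a confirmed fix for the Content-Type error) is to upload each file through an FTP request, similar to how the script already creates directories, instead of WebClient.UploadFile; the variable names are reused from the upload loop above:
# Sketch: per-file upload via FtpWebRequest instead of WebClient.UploadFile
$request = [System.Net.FtpWebRequest]::Create($uri)
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$request.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$request.UseBinary = $true
$bytes = [System.IO.File]::ReadAllBytes($file.FullName)
$stream = $request.GetRequestStream()
$stream.Write($bytes, 0, $bytes.Length)
$stream.Close()
$response = $request.GetResponse()
write-host $response.StatusDescription
$response.Close()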
I have written code that has to upload multiple files to an Azure web app using PowerShell.
I want to upload the folder saved in the $appdirectory variable.
$appdirectory="C:\scriptfolder\*"
$webappname="myapitestapp1"
$xml = [xml](Get-AzureRmWebAppPublishingProfile -Name $webappname -ResourceGroupName sibs -OutputFile null)
$xml = [xml]$xml
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse
foreach ($file in $files)
{
    $relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
    $uri = New-Object System.Uri("$url/$relativepath")
    if($file.PSIsContainer)
    {
        #$uri.AbsolutePath + "is Directory"
        $ftprequest = [System.Net.FtpWebRequest]::Create($uri);
        $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
        $ftprequest.UseBinary = $true
        $ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
        $response = $ftprequest.GetResponse();
        $response.StatusDescription
        continue
    }
    "Uploading to " + $uri.AbsoluteUri
    $webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
It copies and uploads the files from the subfolders, but it does not upload the files in the root directory. I want to upload all folders and files using PowerShell.
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
was wrong.
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace('\', '/')
This worked, and files were no longer being skipped.
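For illustration only (assuming the current location is the deployment folder): Resolve-Path -Relative prefixes every path under the current location with .\, and the corrected line keeps that prefix instead of stripping every occurrence of the .\ substring:
# Illustration (assumed folder layout): what Resolve-Path -Relative returns
Set-Location 'C:\scriptfolder'
Resolve-Path -Path 'C:\scriptfolder\web.config' -Relative    # .\web.config
Resolve-Path -Path 'C:\scriptfolder\bin\app.dll' -Relative   # .\bin\app.dll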
I'm working on a script that uses Get-ChildItem on another server, but I need to change it to use the credentials of a local account on that server. When I was using Active Directory, I saved the task in our scheduler with my AD login and it could reach the other server via the UNC path. We recently decided to switch to the local login there, and now I'm getting an error message when trying to use net use. Does anyone know a good way to do this with the UNC path instead? Or any idea why the following gives an error?
function GetSecureLogin(){
    $global:username = "stuff"
    $global:password = get-content C:\filename.txt | convertto-securestring
}
function Cleanup([string]$Drive) {
    try {
        $deleteTime = -42
        $now = Get-Date
        $limit = $now.AddDays($deleteTime)   # assumed: the cutoff date used below is derived from $deleteTime
        # this is saying: cannot find path '\\name.na.xxx.net\20xServerBackup\V' (name truncated)
        Get-ChildItem -Path $Drive -Recurse -Force |Where-Object {$_.LastWriteTime -lt $limit} | Remove-Item -Force
    }
    Catch{
        Write-Host "Failed"
    }
}
#####################start of script####################
$share = '\\name.na.xxx.net\20xServerBackup\'
$TheDrive = '\\name.na.xxx.net\20xServerBackup\VMs\'
$global:password = ""
$global:username = ""
GetSecureLogin
net use $share $global:password /USER:$global:username
[array]$DriveArray = @($TheDrive)
try{
    $i=0
    for ($i = $DriveArray.GetLowerBound(0); $i -le $DriveArray.GetUpperBound(0); $i++) {
        $tempDrv = $DriveArray[$i]
        Cleanup $tempDrv
    }
}
catch [Exception] {
    Write-Host $_.Exception.Message
}
As you can see, I started from the net use example at this link: powershell unc path cred. But it's not doing the trick for using credentials to access the other server.
I got it to work this way, with New-PSDrive as @robert.westerlund suggested above:
$DestPath = split-path "$Drive" -Parent # this gives the format without a slash at the end and makes PowerShell *very happy*
New-PSDrive -Name target -PSProvider FileSystem -Credential $global:cred -Root "$DestPath" | Out-Null
$temp1 = Get-ChildItem -Path target:\VMs\ -Recurse -Force | Where-Object { $_.LastWriteTime -lt $limit}
Get-ChildItem -Path $Drive -Recurse -Force | Where-Object { $_.LastWriteTime -lt $limit} | Remove-Item -Force
Remove-PSDrive target
I had to add the cred part like this too:
$global:cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $global:username, $global:password
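For completeness, the encrypted password file that GetSecureLogin reads (C:\filename.txt above) can be created once like this; note that ConvertFrom-SecureString uses DPAPI by default, so only the same user on the same machine can decrypt it later:
# Sketch: create the DPAPI-protected password file read by GetSecureLogin (run once, as the same user)
Read-Host -Prompt 'Password for the local account' -AsSecureString |
    ConvertFrom-SecureString |
    Set-Content -Path C:\filename.txt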