Hello,
I am trying to create a PowerShell script that takes the document properties of a PowerPoint file and removes them, so that no issues are caused by them. My approach is to copy some code that does the same thing for Excel and Word and just change the relevant parts over to PowerPoint, but it doesn't seem to want to work.
Here is the code I've tried so far (forgive me, I'm not the most experienced with PowerShell):
$path = "c:\fso"
Add-Type -AssemblyName Microsoft.Office.Interop.PowerPoint
$PpRemoveDocType = "Microsoft.Office.Interop.Powerpoint.PpRemoveDocInfoType" -as [type]
$pointFiles = Get-ChildItem -Path $path -include *.pot, *.ppt, *.pps -recurse
$objPoint.visible = $false
$objPoint = New-Object -ComObject powerpoint.application
foreach($wb in $pointFiles)
{
$workbook = $objPoint.workbooks.open($wb.fullname)
“Removing document information from $wb”
$workbook.RemoveDocumentInformation($PpRemoveDocType::xlRDIAll)
$workbook.Save()
$objPoint.Workbooks.close()
}
$objPoint.Quit()
This is the Excel code for reference, and it works just fine:
$path = "c:\fso"
Add-Type -AssemblyName Microsoft.Office.Interop.Excel
$xlRemoveDocType = "Microsoft.Office.Interop.Excel.XlRemoveDocInfoType" -as [type]
$excelFiles = Get-ChildItem -Path $path -include *.xls, *.xlsx -recurse
$objExcel = New-Object -ComObject excel.application
$objExcel.visible = $false
foreach($wb in $excelFiles)
{
$workbook = $objExcel.workbooks.open($wb.fullname)
“Removing document information from $wb”
$workbook.RemoveDocumentInformation($xlRemoveDocType::xlRDIAll)
$workbook.Save()
$objExcel.Workbooks.close()
}
$objExcel.Quit()
Thank you for the help.
This has worked for me in the past:
function handlePowerpointFiles ($file) {
Write-Host "Processing file: " $file.Fullname
Add-Type -AssemblyName Microsoft.Office.Interop.Powerpoint
$PpRemoveDocType = "Microsoft.Office.Interop.PowerPoint.PpRemoveDocInfoType" -as [type]
$objpp = New-Object -ComObject Powerpoint.Application
# Presentations.Open arguments: FileName, ReadOnly, Untitled, WithWindow
$doc = $objpp.Presentations.Open($file.FullName, $false, $null, $false)
$doc.RemoveDocumentInformation($PpRemoveDocType::ppRDIAll)
$doc.Save()
$doc.Close()
$objpp.Quit()
}
The reason I wrapped it in a function is that I wrote a small tool to handle all the major Microsoft Office file types, which also performs some pre-checks, e.g. whether the file is locked or password protected. Essentially something along the lines of (simplified):
$path = "C:\Documents"
$files = Get-ChildItem -Path $path -include *.doc, *.docx, *.xls, *.xlsx, *.ppt, *.pptx -Recurse
foreach ($fileEntry in $files) {
if (CheckForFileLock $fileEntry.FullName) {
Write-Error "File '$($fileEntry.FullName)' is locked" -ErrorAction Stop
}
if ((CheckForPasswordProtection $fileEntry) -eq $true) {
Write-Error "File '$($fileEntry.FullName)' is password protected" -ErrorAction Stop
}
Switch ($fileEntry.Extension){
{$_ -in ".xls",".xlsx"} {
handleExcelFiles $fileEntry
}
{$_ -in ".doc",".docx"} {
handleWordFiles $fileEntry
}
{$_ -in ".ppt",".pptx"} {
handlePowerpointFiles $fileEntry
}
}
}
You get the idea.
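The CheckForFileLock and CheckForPasswordProtection helpers aren't included in the snippet above. As a minimal sketch (just one way to do it, not necessarily how my tool implements it), the lock check can simply try to open the file for exclusive access:
# Hypothetical helper matching the call above: returns $true when the file
# cannot be opened exclusively, i.e. another process currently has it locked.
function CheckForFileLock ([string]$filePath) {
    try {
        $stream = [System.IO.File]::Open($filePath,
            [System.IO.FileMode]::Open,
            [System.IO.FileAccess]::ReadWrite,
            [System.IO.FileShare]::None)
        $stream.Close()
        return $false
    }
    catch {
        # Open failed, so assume the file is in use (or otherwise inaccessible).
        return $true
    }
}
The password check is more involved; for the OOXML formats (.docx/.xlsx/.pptx), one quick heuristic is that a password-encrypted file is stored as an OLE compound file rather than a plain ZIP, so it no longer starts with the 'PK' signature.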
Related
Hi, I am trying to upload published files from Azure Git artifacts to FTP, but I am randomly getting the error below.
This is only happening for files, not for any folders or subfolders.
"UploadFile" with "2" argument(s): "The Content-Type header cannot be set to a multipart type for this request."
All the files are available in the artifacts.
Observations & things tried:
All files are present in the artifacts
Getting the error only for files (folders & subfolders are created successfully)
Stopped the Web App and then tried to upload
Also tried a Sleep between uploading two files
After all of these, the result is the same.
Can anyone please help me out?
Get-ChildItem -Path $target_directory
# Upload files recursively
write-host "target_directory - $target_directory"
Set-Location $target_directory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $target_directory -Recurse
write-host 'Uploading started............................'
foreach ($file in $files)
{
write-host "file -" + $file.FullName
if($file.FullName -match "web.config" -or $file.FullName -match ".pdb" -or $file.FullName -match "roslyn" -or $file.FullName -match "obj\Debug"){
write-host "ignoring " + $file.FullName
continue
}
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
$uri = New-Object System.Uri("$url/$relativepath")
write-host "uri -" + $uri.AbsoluteUri
write-host '--------'
if($file.PSIsContainer)
{
write-host 'PSIsContainer - All dir/files within' $file
get-childitem -path $file -Recurse
try
{
$makeDirectory = [System.Net.WebRequest]::Create($uri);
$makeDirectory.Credentials = New-Object System.Net.NetworkCredential($username,$password);
$makeDirectory.Method = [System.Net.WebRequestMethods+FTP]::MakeDirectory;
$makeDirectory.GetResponse();
#folder created successfully
}
catch [Net.WebException]
{
try
{
#if there was an error returned, check if folder already existed on server
$checkDirectory = [System.Net.WebRequest]::Create($uri);
$checkDirectory.Credentials = New-Object System.Net.NetworkCredential($username,$password);
$checkDirectory.Method = [System.Net.WebRequestMethods+FTP]::PrintWorkingDirectory;
$response = $checkDirectory.GetResponse();
$response.StatusDescription
#folder already exists!
}
catch [Net.WebException]
{
#if the folder didn't exist, then it's probably a file perms issue, incorrect credentials, dodgy server name etc
}
}
continue
}
try{
write-host "Uploading to " $uri.AbsoluteUri " from " $file.FullName
$webclient.UploadFile($uri, $file.FullName)
start-sleep -Seconds 1
}
catch{
##[error]Error message
Write-Host "##[error]Error in uploading " $file -ForegroundColor red
Write-Host $_
$disputedFiles = $disputedFiles + $file.FullName
$_file = @{
FileName = $file.FullName
FTPPath = $uri.AbsoluteUri
}
$_o = New-Object psobject -Property $_file;
$disputedFilesList = $disputedFilesList + $_o
}
}
Write-Host "##[debug] Starting uploading the disputed files....."
foreach($file in $disputedFilesList){
try{
write-host "Uploading to " $file.FTPPath " from " $file.FileName
$webclient.UploadFile($file.FTPPath, $file.FileName)
start-sleep -Seconds 1
}
catch{
write-host "##[error]Error(2) in uploading to " $file.FTPPath " from " $file.FileName
}
}
Write-Host "##[debug] Ending uploading the disputed files....."
remove-item -path $target_directory\* -Recurse
get-childitem -path $target_directory
write-host "Directory Empty after cleanup"
$webclient.Dispose()
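One variation I'm considering, since the directory handling above already goes through FtpWebRequest, is pushing the file uploads through FtpWebRequest as well instead of WebClient.UploadFile, so the transfer is explicitly forced onto the FTP UploadFile method. This is only a sketch (the Upload-FtpFile name is mine, and it reuses the same $username, $password and URI values as the loop above), not a confirmed fix:
function Upload-FtpFile ([System.Uri]$uri, [string]$localPath, [string]$username, [string]$password) {
    # Explicit FTP upload of a single file via the FTP UploadFile (STOR) method.
    $request = [System.Net.FtpWebRequest]::Create($uri)
    $request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
    $request.UseBinary = $true
    $bytes = [System.IO.File]::ReadAllBytes($localPath)
    $stream = $request.GetRequestStream()
    try {
        $stream.Write($bytes, 0, $bytes.Length)
    }
    finally {
        $stream.Close()
    }
    $response = $request.GetResponse()
    $response.StatusDescription
    $response.Close()
}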
I have the following code that converts Excel sheets to CSV files. If the CSV files do not exist, or exist already but are not in use (e.g. opened in Excel), the script generates the CSV files successfully (overwriting them if they already exist)!
However, if the CSV file is open in Excel, I get a "Can't access csv file" error, which I have determined is because the file is in use by Excel. I know this is 100% the reason, because if I have the existing CSV file open in Notepad instead, the script still overwrites the CSV file and runs successfully.
So I tried implementing an automatic resolution, which is Get-Process 'exce[l]' | Stop-Process -Force, and although it does stop the process (closes Excel), I get yet another error:
Convert-ExcelSheetsToCsv : Failed to save csv! Path: 'C:\Users\Documents\Folder1\CSV_Files\COS.csv'. The remote
procedure call failed. (Exception from HRESULT: 0x800706BE)
Convert-ExcelSheetsToCsv : Failed to save csv! Path: 'C:\Users\Documents\Folder1\CSV_Files\.csv'. The RPC server is
unavailable. (Exception from HRESULT: 0x800706BA)
After some research, I disabled my Excel COM add-ins and ran the script again, but the exceptions still occurred...
Why is that?
$currentDir = $PSScriptRoot
$csvPATH = Join-Path -Path $currentDir -ChildPath CSV_Files
New-Item -ItemType Directory -Force -Path $csvPATH | out-null
function Convert-ExcelSheetsToCsv {
param(
[Parameter(Mandatory, ValueFromPipelineByPropertyName, Position=1)]
[ValidateNotNullOrEmpty()]
[Alias('FullName')]
[string]$Path,
[Parameter(Mandatory = $false, Position=0)]
[bool]$AppendFileName,
[Parameter(Mandatory = $false, Position=2)]
[bool]$ExcludeHiddenSheets,
[Parameter(Mandatory = $false, Position=3)]
[bool]$ExcludeHiddenColumns
)
Begin {
$excel = New-Object -ComObject Excel.Application -Property @{
Visible = $false
DisplayAlerts = $false
}
}
Process {
#$root = Split-Path -Path $Path
$filename = [System.IO.Path]::GetFileNameWithoutExtension($Path)
$workbook = $excel.Workbooks.Open($Path)
foreach ($worksheet in ($workbook.Worksheets | Where { <# $_.Visible -eq -1 #> $_.Name -ne 'Security' -and $_.Name -ne 'Notes' })) {
if($ExcludeHiddenColumns) {
$ColumnsCount = $worksheet.UsedRange.Columns.Count
for ($i=1; $i -le $ColumnsCount; $i++)
{
$column = $worksheet.Columns.Item($i).EntireColumn #$worksheet.sheets.columns.entirecolumn.hidden=$true
if ($column.hidden -eq $true)
{
$columnname = $column.cells.item(1,$i).value2
if ($worksheet.Visible -eq 0) #worksheet hidden
{
"`r`nHidden column [{0}] found in hidden [{1}] worksheet. Deleting..." -f $columnname, $($worksheet.name)
}
else {
"`r`nHidden column [{0}] found in [{1}] worksheet. Deleting..." -f $columnname, $($worksheet.name)
}
try {
$column.Delete() | out-null
"`r`nHidden column [{0}] was Deleted! Proceeding with Export to CSV operation...`r`n" -f $columnname
}
catch {
Write-Error -Message "`r`nFailed to Delete hidden column [$columnname] from [$($worksheet.name)] worksheet! $PSItem"
#$_ | Select *
}
#$i = $i - 1
}
}
}
if ($ExcludeHiddenSheets) {
if ($worksheet.Visible -eq -1) #worksheet visible
{
$ws = $worksheet
}
}
else {
$ws = $worksheet
}
if ($AppendFileName) {
$name = Join-Path -Path $csvPATH <# $root #> -ChildPath "${filename}_$($ws.Name).csv"
}
else {
$name = Join-Path -Path $csvPATH <# $root #> -ChildPath "$($ws.Name).csv"
}
try {
$ws.SaveAs($name, 6) #6 to ignore formatting and convert to pure text, otherwise, file could end up containing rubbish
}
catch {
if ($error[0].ToString().Contains("Cannot access"))
{
"`r`n'{0}' is currently in use.`r`n Attempting to override usage by trying to stop Excel process..." -f $name
try {
#Only 'excel' will be matched, but because a wildcard [] is used, not finding a match will not generate an error.
#https://stackoverflow.com/a/32475836/8397835
Get-Process 'exce[l]' | Stop-Process -Force
"`r`nExcel process stopped! Saving '{0}' ..." -f $name
$ws.SaveAs($name, 6)
}
catch {
Write-Error -Message "Failed to save csv! Path: '$name'. $PSItem"
}
}
else {
Write-Error -Message "Failed to save csv! Path: '$name'. $PSItem"
}
}
}
}
End {
$excel.Quit()
$null = [System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
}
}
Get-ChildItem -Path $currentDir -Filter *.xlsx | Convert-ExcelSheetsToCsv -AppendFileName 0 -ExcludeHiddenSheets 1 -ExcludeHiddenColumns 1 #0 for false, so that filename of excel file isnt appended, and only sheet names are the names of the csv files
That is because the Excel COM object you are working with ends up getting destroyed as well. The correct way to do this is to end the process PRIOR to instantiating the Excel object:
Begin {
Get-Process 'exce[l]' | Stop-Process -Force
$excel = New-Object -ComObject Excel.Application -Property @{ Visible = $false; DisplayAlerts = $false }
}
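The End block above already quits Excel and releases the COM object; on top of that, the usual garbage-collection nudge helps make sure no orphaned EXCEL.EXE lingers around to lock files on the next run. Nothing here is specific to this script, it's just the standard COM cleanup pattern:
End {
    $excel.Quit()
    $null = [System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
    # Encourage .NET to release the remaining COM wrappers so EXCEL.EXE can exit.
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()
}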
A few weeks ago, I had to remove password protection from Excel files which were created by an application. I had no passwords. Can this task be done with PowerShell, using XML transformation?
This is the solution I want to share with you. The PowerShell script removes password and sheet protections from Excel files. No Excel application and no passwords are needed. The script only works for the .xlsx file type, not for .xls. If you have ideas for improvement, let me know. Thank you.
Add-Type -AssemblyName System.IO.Compression
Add-Type -AssemblyName System.IO.Compression.FileSystem
#-----------------------------------------------------------------------
function Remove-Excel-WriteProtection {
<#
// Removes all password and write protection from an existing Excel file
// (workbook and worksheets).
// No password needed.
//
// Input: Path to Excel file (must be the newer xlsx format)
// Output: true if successful
#>
#-----------------------------------------------------------------------
param(
[Parameter(Mandatory=$true)]
[string]$filePathExcel
)
if( !(Test-Path -Path $filePathExcel) -or
!(Split-Path -Path $filePathExcel -Leaf).EndsWith('xlsx') ) {
return $false
}
$fileItem = Get-Item $filePathExcel
$filePathZip = $fileItem.DirectoryName + '\' + $fileItem.BaseName + '.zip'
$filePathTemp = $fileItem.DirectoryName + '\' + ([System.Guid]::NewGuid()).Guid
Rename-Item -Path $filePathExcel -NewName $filePathZip -Force
Expand-Archive -Path $filePathZip -DestinationPath $filePathTemp -Force
$xml = New-Object System.Xml.XmlDocument
$xml.PreserveWhitespace = $true
$workbookCollection = (Get-ChildItem -Path $filePathTemp -Filter 'workbook.xml' -Recurse -Force)
foreach( $workbook in $workbookCollection ) {
[void]$xml.RemoveAll()
[void]$xml.Load($workbook.FullName)
if( $xml.workbook.fileSharing.readOnlyRecommended -or $xml.workbook.fileSharing.reservationPassword ) {
if( $xml.workbook.fileSharing.readOnlyRecommended ) {
$xml.workbook.fileSharing.readOnlyRecommended = '0'
}
if( $xml.workbook.fileSharing.reservationPassword ) {
$xml.workbook.fileSharing.reservationPassword = ''
}
[void]$xml.Save($workbook.FullName)
}
}
$worksheetCollection = (Get-ChildItem -Path $filePathTemp -Filter 'sheet*.xml' -Recurse -Force)
foreach( $worksheet in $worksheetCollection ) {
[void]$xml.RemoveAll()
[void]$xml.Load($worksheet.FullName)
if( $xml.worksheet.sheetProtection ) {
[void]$xml.worksheet.RemoveChild($xml.worksheet.sheetProtection)
[void]$xml.Save($worksheet.FullName)
}
}
Remove-Item -Path $filePathZip -Force
Compress-Archive -Path ($filePathTemp + '\*') -DestinationPath $filePathZip -Force -CompressionLevel Optimal
Remove-Item -Path $filePathTemp -Recurse -Force
Rename-Item -Path $filePathZip -NewName $filePathExcel -Force
return $true
}
# Remove all passwords for test.xlsx
$result = Remove-Excel-WriteProtection -filePathExcel 'C:\Users\YourName\Desktop\test.xlsx'
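If you need to run this over a whole folder rather than a single file, it combines with Get-ChildItem easily enough; the folder path below is only an example, and the function simply returns $false for anything that isn't an existing .xlsx file:
# Example: strip protection from every .xlsx file under a folder.
Get-ChildItem -Path 'C:\Users\YourName\Desktop\Reports' -Filter '*.xlsx' -Recurse |
    ForEach-Object {
        $result = Remove-Excel-WriteProtection -filePathExcel $_.FullName
        "{0} -> {1}" -f $_.Name, $(if ($result) { 'protection removed' } else { 'skipped' })
    }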
I have written code that has to upload multiple files to an Azure web app using PowerShell.
I want to upload the folder saved in the $appdirectory variable.
$appdirectory="C:\scriptfolder\*"
$webappname="myapitestapp1"
$xml = [xml](Get-AzureRmWebAppPublishingProfile -Name $webappname -ResourceGroupName sibs -OutputFile null)
$xml = [xml]$xml
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse
foreach ($file in $files)
{
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
$uri = New-Object System.Uri("$url/$relativepath")
if($file.PSIsContainer)
{
#$uri.AbsolutePath + "is Directory"
$ftprequest = [System.Net.FtpWebRequest]::Create($uri);
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$response = $ftprequest.GetResponse();
$response.StatusDescription
continue
}
"Uploading to " + $uri.AbsoluteUri
$webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
It copies and uploads the files from the subfolders, but it does not upload the files in the root directory. I want to upload all folders and files using PowerShell.
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
was wrong.
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace('\', '/')
This worked, and files are no longer being skipped.
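If you want to see exactly what each file resolves to, a quick debugging sketch (it just reuses the $appdirectory and $url values from the script above) is to print the relative path and the resulting URI for every entry before uploading anything:
# Debug sketch: show how each file's relative path maps onto the FTP URI.
Set-Location $appdirectory
Get-ChildItem -Path $appdirectory -Recurse | ForEach-Object {
    $relativepath = (Resolve-Path -Path $_.FullName -Relative).Replace('\', '/')
    [pscustomobject]@{
        Local        = $_.FullName
        RelativePath = $relativepath
        Uri          = "$url/$relativepath"
    }
} | Format-Table -AutoSize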
I'm working on a script that uses Get-ChildItem on another server, but I need to change it so it uses the credentials of a local account on that server. When I was using Active Directory for this, I saved the task in our scheduler with my AD login, and it worked fine against the other server using the UNC path. But we recently decided to change it to the local login there, and now I'm getting an error message when trying to use net use. Does anyone know of a good way to do this with the UNC path instead? Or any idea why the following is giving an error message?
function GetSecureLogin(){
$global:username = "stuff"
$global:password = get-content C:\filename.txt | convertto-securestring
}
function Cleanup([string]$Drive) {
try {
$deleteTime = -42
$now = Get-Date
#this is saying cannot find path '\\name.na.xxx.net\20xServerBackup\V' (name truncated)
Get-ChildItem -Path $Drive -Recurse -Force |Where-Object {$_.LastWriteTime -lt $limit} | Remove-Item -Force
}
Catch{
Write-Host "Failed"
}
}
#####################start of script####################
$share = '\\name.na.xxx.net\20xServerBackup\'
$TheDrive = '\\name.na.xxx.net\20xServerBackup\VMs\'
$global:password = ""
$global:username = ""
GetSecureLogin
net use $share $global:password /USER:$global:username
[array]$DriveArray = @($TheDrive)
try{
$i=0
for ($i = $DriveArray.GetLowerBound(0); $i -le $DriveArray.GetUpperBound(0); $i++) {
$tempDrv = $DriveArray[$i]
Cleanup $tempDrv
}
}
catch [Exception] {
Write-Host $_.Exception.Message
}
As you can see, I started from the example at this link with net use, but it's not doing the trick for using credentials to access the other server: powershell unc path cred
I got it to work this way, with New-PSDrive as @robert.westerlund suggests above:
$DestPath = split-path "$Drive" -Parent #this gives the format without a slash at the end and makes PowerShell *very happy*
New-PSDrive -Name target -PSProvider FileSystem -Credential $global:cred -Root "$DestPath" | Out-Null
$temp1 = Get-ChildItem -Path target:\VMs\ -Recurse -Force | Where-Object { $_.LastWriteTime -lt $limit}
Get-ChildItem -Path $Drive -Recurse -Force | Where-Object { $_.LastWriteTime -lt $limit} | Remove-Item -Force
Remove-PSDrive target
I had to add the cred part like this too:
$global:cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $global:username, $global:password
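Put together, the credentialed cleanup ends up looking roughly like this. It's a simplified sketch rather than my exact final script; the share path and the -42 day cutoff are taken from the snippets above:
# Simplified sketch: build the credential, map the share, remove old files, clean up.
$global:username = "stuff"
$global:password = Get-Content C:\filename.txt | ConvertTo-SecureString
$global:cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $global:username, $global:password
$DestPath = '\\name.na.xxx.net\20xServerBackup'   # parent path, no trailing slash
$limit = (Get-Date).AddDays(-42)                  # age cutoff, reusing the -42 from $deleteTime
New-PSDrive -Name target -PSProvider FileSystem -Credential $global:cred -Root $DestPath | Out-Null
Get-ChildItem -Path target:\VMs\ -Recurse -Force |
    Where-Object { $_.LastWriteTime -lt $limit } |
    Remove-Item -Force
Remove-PSDrive -Name target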