Checking Who Has an Excel File Open in PowerShell

I'm using PowerShell 4.0 on Windows 7 64-bit. I want to see who has an Excel file open, or even whether the file is open at all.
For example, I have the Excel file "test" on network drive B:. I understand that if someone opens "test", Excel creates a lock file named "~$test.xls".
So far I have used Test-Path to verify that the lock file exists. I believe I can then use Get-Acl to find the owner of that file. Is there a simpler way to find out who has an Excel file open? Or will my workaround of checking the lock file's ownership work?
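For reference, here is a minimal sketch of the workaround described above (the path is hypothetical; note that Get-Acl reports the lock file's owner, which is usually, but not always, the account that opened the workbook):
$lock = 'B:\~$test.xls'   # hypothetical path to the (hidden) lock file
if (Test-Path -LiteralPath $lock) {
    (Get-Acl -LiteralPath $lock).Owner
} else {
    'No lock file - the workbook does not appear to be open.'
}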

I still use Netapi32-based functions for this.
Add-Type -TypeDefinition @"
using System;
using System.Runtime.InteropServices;

public class Netapi
{
    [DllImport("Netapi32.dll", CharSet = CharSet.Unicode)]
    public static extern int NetApiBufferFree(IntPtr buffer);

    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
    public struct FILE_INFO_3
    {
        public uint FileID;
        public uint Permissions;
        public uint NumLocks;
        [MarshalAs(UnmanagedType.LPWStr)] public string Path;
        [MarshalAs(UnmanagedType.LPWStr)] public string User;
    }

    [DllImport("Netapi32.dll", CharSet = CharSet.Unicode)]
    public static extern uint NetFileEnum(
        [In, MarshalAs(UnmanagedType.LPWStr)] string server,
        [In, MarshalAs(UnmanagedType.LPWStr)] string path,
        [In, MarshalAs(UnmanagedType.LPWStr)] string user,
        int level,
        out IntPtr bufptr,
        int prefmaxlen,
        ref Int32 entriesread,
        ref Int32 totalentries,
        ref Int32 resume_handle);
}
"@
function Get-OpenFiles
{
    [CmdletBinding()]
    param ( [string]$Server = "localhost",
            [string]$User = $null,
            [string]$Path = $null)

    $struct  = New-Object Netapi+FILE_INFO_3
    $buffer  = 0
    $entries = 0
    $total   = 0
    $handle  = 0
    $level   = 3   # selects the struct we want back, i.e. FILE_INFO_3

    $ret = [Netapi]::NetFileEnum($Server, $Path, $User, $level,
                                 [ref]$buffer, -1,
                                 [ref]$entries, [ref]$total,
                                 [ref]$handle)
    $files = @()
    if (!$ret)
    {
        $offset    = $buffer.ToInt64()
        $increment = [System.Runtime.InteropServices.Marshal]::SizeOf([System.Type]$struct.GetType())
        for ($i = 0; $i -lt $entries; $i++)
        {
            $ptr    = New-Object System.IntPtr -ArgumentList $offset
            $files += [System.Runtime.InteropServices.Marshal]::PtrToStructure($ptr, [System.Type]$struct.GetType())
            $offset = $ptr.ToInt64() + $increment
        }
    }
    else
    {
        Write-Output ([ComponentModel.Win32Exception][Int32]$ret).Message
        if ($ret -eq 1208)
        {
            # ERROR_EXTENDED_ERROR (1208) still allocates a buffer that must be freed
            [Void][Netapi]::NetApiBufferFree($buffer)
        }
    }
    $files
}
Then you can call Get-OpenFiles and pass a specific path:
Get-OpenFiles -Path C:\Temp\EXCEL.XLSX
FileID : 205
Permissions : 35
NumLocks : 0
Path : C:\Temp\EXCEL.XLSX
User : mickyb
Using Get-OpenFiles -Path C:\Temp works too:
FileID : 205
Permissions : 35
NumLocks : 0
Path : C:\Temp\EXCEL.XLSX
User : mickyb
FileID : 213
Permissions : 51
NumLocks : 0
Path : C:\Temp\~$Excel.xlsx
User : mickyb
You could also see if a specific user has files open:
Get-OpenFiles -User mickyb
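Since NetFileEnum queries the machine that hosts the share, you can also point the function at the file server itself. The path is matched as the server sees it, so use the share's local path rather than a mapped drive letter. A hypothetical example for the network-drive scenario in the question:
# Hypothetical server name and path; adjust to wherever drive B: actually maps.
Get-OpenFiles -Server fileserver01 -Path 'D:\Shares\test.xls'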

If you copy the contents of the lock file to the console, it should contain the name of the user who holds the lock. You may have to make a copy of the lock file first in order to read it. I'm not familiar with PowerShell, but I assume it has all the power of DOS batch files, so a technique similar to the one I wrote below could be recreated there.
Here is a batch file I have added to my SendTo folder; it lets me right-click an Excel file and shows me who has the file locked. I have tested this with .xlsx and .xlsm files.
@echo off
REM ===================================================================
REM Put this in your SendTo folder and it will let you right-click
REM on an .xlsx/.xlsm file and show you the user name in the lock file
REM
REM If an Excel file is locked, look to see if a hidden lock file exists. If
REM the file is found, make a local temp copy of it and display the contents which
REM should be the name of the user holding the lock.
REM ===================================================================
setlocal
set file="%1"
REM Make sure the file has a compatible extension.
if "%~x1"==".xlsx" goto :ExtensionIsValidExcel
if "%~x1"==".xlsm" goto :ExtensionIsValidExcel
echo.
echo "%~n1%~x1" is not a supported file type.
echo.
pause
exit
:ExtensionIsValidExcel
REM If an Excel file is locked, look to see if a hidden lock file exists. If
REM the file is found, make a local temp copy of it and display the contents which
REM should be the name of the user holding the lock.
IF EXIST %~dp1~$%~n1%~x1 (
ECHO f | Xcopy %~dp1~$%~n1%~x1 "%TEMP%\~temp.txt" /H /R /Y /F
attrib -h -s %TEMP%\~temp.txt
cls
ECHO.
ECHO The Excel file "%~n1%~x1" is locked by:
ECHO.
REM copy the file to the console to show the user name.
copy %TEMP%\~temp.txt con | Find /v "file(s) copied"
del %TEMP%\~temp.txt
) Else (
cls
ECHO.
ECHO The Excel file "%~n1%~x1" is not locked in a way this script was expecting.
)
ECHO.
ECHO.
pause

Related

readdir/File::Find::Rule is not reading a subdirectory and its contents in Perl

I have tried three different ways to read the contents of a folder, and none of them can identify a subdirectory in one particular setup.
The strange part is that when I recreate the folder structure locally, the subdirectory is identified and I get the file I am looking for with all three solutions.
The subdirectory is created on the fly, along with other files in the folder, in that setup. Every time, I am able to read all the files that are created, but not the subdirectory or its contents.
I tried the solutions below.
Solution 1:
use File::Find::Rule;

my $dir = '.';
my @subdirs = File::Find::Rule->directory->in($dir);
foreach (@subdirs) {
    print "Dir --> $_ \n";
}

my @list  = ("*.txt", "*.rex");
my @files = File::Find::Rule->file()->name(@list)->in(@subdirs);
foreach (@files) {
    print "File --> $_ \n";
}
It does not list the subdirectory. The subdirectory contains the file I am looking for, so I am not getting the files.
A variant of solution 1, which looks in the folder directly:
use Cwd;
use File::Find::Rule;

my $dir   = getcwd();
my @types = ("*.txt", "*.rex");
my @files = File::Find::Rule->file()->name(@types)->in($dir);
print join("\n", @files);
This also does not print the files; as far as I can see, it does not check the subdirectory that has the files.
Solution 2:
use Cwd;
use File::Find;

my $cwd = getcwd();

sub find_rex {
    my $f = $File::Find::name;
    if ($f =~ /rex$/) {
        print "$f \n";
    }
}

find(\&find_rex, $cwd);
Solution 3:
my @dirlist = ('.');   # current dir, or command line arguments

foreach (@dirlist) { check_dir($_); }
exit 0;

sub check_dir {
    my $dir = shift;
    print "Dir to search --> $dir \n";
    warn "cannot traverse directory $dir\n"
        unless (opendir D, $dir);
    my @files = map { $dir . '/' . $_ } grep { !m/^\.{1,2}$/ } readdir D;
    closedir D;
    foreach (@files) {
        if (-d $_) {
            check_dir($_);
        }
        elsif (-f $_) {
            if ($_ =~ /\.rex$/) {
                print "Filename --> $_ \n";
            }
        }
    }
}
All of these solutions worked locally and returned the contents of the subdirectory. I ensured the subdirectory had the same permissions locally when testing my code. The solutions work locally, but they do not work in the actual setup.
I have run out of ideas. I can see the subfolder, and the files in it that I need, when I list them in Linux. I have tried glob as well, but it does not work either.
More details: OS: SUSE Linux; Bash/tcsh shell.
Can anyone suggest something I can try? I am not sure whether it is a readdir problem or something else.
Has anyone faced this kind of strange problem? What could I be doing wrong?
The subfolder that does not get recognized is named something like this:
2018-07-29T22.57.52
This folder contains the files I am looking for, and neither File::Find nor readdir seems to be checking it.
Please let me know if I need to rephrase my question.

Possible to convert xlsx file to csv file with bash script? [duplicate]

How do you convert multiple xlsx files to csv files with a batch script?
Try in2csv!
Usage:
in2csv file.xlsx > file.csv
An alternative way of converting to csv is to use LibreOffice:
libreoffice --headless --convert-to csv *
Please be aware that this will only convert the first worksheet of your Excel file.
Get all the file items, filter them by suffix, and then use the Excel COM object from PowerShell to save the Excel files as csv files.
$excelApp = New-Object -ComObject Excel.Application
$excelApp.DisplayAlerts = $false
Get-ChildItem -File -Filter '*.xlsx' | ForEach-Object {
    $workbook = $excelApp.Workbooks.Open($_.FullName)
    $csvFilePath = $_.FullName -replace '\.xlsx$', '.csv'
    $workbook.SaveAs($csvFilePath, 6)   # 6 = xlCSV; avoids needing the interop assembly loaded
    $workbook.Close($false)
}
$excelApp.Quit()
You can find the complete sample here: How to convert Excel xlsx file to csv file in batch by PowerShell.
To follow up on the answer by user183038, here is a shell script to batch-convert all xlsx files to csv while preserving the file names. The xlsx2csv tool needs to be installed before running it.
for i in *.xlsx; do
    filename=$(basename "$i" .xlsx)
    outext=".csv"
    xlsx2csv "$i" "$filename$outext"
done
You need an external tool, for example: SoftInterface.com - Convert XLSX to CSV.
After installing it, you can use following command in your batch:
"c:\Program Files\Softinterface, Inc\Convert XLS\ConvertXLS.EXE" /S"C:\MyExcelFile.xlsx" /F51 /N"Sheet1" /T"C:\MyExcelFile.CSV" /C6 /M1 /V
This needs Excel installed, as it uses the Excel.Application COM object. Save this as a .bat file:
@if (@X)==(@Y) @end /* JScript comment
@echo off
cscript //E:JScript //nologo "%~f0" %*
exit /b %errorlevel%
@if (@X)==(@Y) @end JScript comment */

var ARGS = WScript.Arguments;
var xlCSV = 6;

var objExcel = WScript.CreateObject("Excel.Application");
var objWorkbook = objExcel.Workbooks.Open(ARGS.Item(0));
objExcel.DisplayAlerts = false;
objExcel.Visible = false;

var objWorksheet = objWorkbook.Worksheets(ARGS.Item(1));
objWorksheet.SaveAs(ARGS.Item(2), xlCSV);

objExcel.Quit();
It accepts three arguments - the absolute path to the xlsx file, the sheet name and the absolute path to the target csv file:
call toCsv.bat "%cd%\Book1.xlsx" Sheet1 "%cd%\csv.csv"
Adding to @marbel's answer (which is a great suggestion!), here's the script that worked for me in the Terminal on OS X El Capitan for batch conversion (since that's what the OP asked). I thought the for loop would be trivial, but it wasn't! (I had to change the extension by string manipulation, and it looks like the Mac's bash is a bit different too.)
for x in $(ls *.xlsx); do x1=${x%".xlsx"}; in2csv $x > $x1.csv; echo "$x1.csv done."; done
Notes:
${x%".xlsx"} is bash string manipulation, which clips .xlsx from the end of the string.
in2csv creates separate csv files (it doesn't overwrite the xlsx files).
The above won't work if the filenames contain whitespace. It's good to convert whitespace to underscores or something similar before running the script.

How to duplicate objects in a variable under different variable names in PowerShell

I would like to create a duplicate of the same objects under different variable names.
The objects I need are archive files from DotNetZip.
The following code is the full implementation:
[System.Reflection.Assembly]::LoadFrom($zipFileDirectory + "Ionic.Zip.dll")

$zipfile = [Ionic.Zip.ZipFile]::Read($zipfilename)

foreach ($file in $zipfile)
{
    $strSearchItem = [string]$file.FileName
    $strSearchItem = $strSearchItem.TrimEnd("/")
    $newfile = $file.PSObject.Copy()
    for ($i = 0; $i -lt $newfile.Count; $i++)
    {
        if ($strSearchItem -like $searchFolderName + "/*")
        {
            $newFile[$i].FileName = $newFile[$i].FileName.Replace($searchFolderName + "/", "")
            $newFile[$i].Extract($fileDestination, [Ionic.Zip.ExtractExistingFileAction]::OverWriteSilently)
        }
    }
}
$zipfile.Dispose()
For this purpose I need to be able to copy $file as a separate entity from $zipfile, or at least retain the original default value of $file (making it read-only doesn't seem viable). Is there any workaround for this?
Thanks in advance.
Maybe:
$newFile = $zipfile.PSObject.Copy()
In reply to @bdrc's comment, here is an example adding a comment to the zip:
PS>$zf=[ionic.zip.zipfile]::read("c:\temp\zip\test.zip")
PS>$zf.comment
PS>$zf2=$zf.psobject.copy()
PS>$zf2.comment="TEST COMMENT"
PS>$zf2.save("c:\temp\test2.zip")
When I open the original file with 7-Zip I don't see the comment, but I can see it in the new zip file...
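One caveat worth adding: PSObject.Copy() makes a shallow copy, so a value-typed property such as Comment forks cleanly, but collection properties on the copy still reference the same underlying objects. A quick way to check (a sketch, assuming DotNetZip's Entries collection):
$zf2 = $zf.psobject.copy()
# If this prints True, per-entry changes made through $zf2 also show up in $zf.
[object]::ReferenceEquals($zf.Entries, $zf2.Entries)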

How to know from a full name string whether a path is a folder or a file?

Sorry for the title; I'll try to explain better.
Suppose I run this command to get all the paths in a directory tree and redirect them to a text file:
gci e:\mytree -r | % {$_.fullname} | out-file e:\folderstructure.txt
Now I need to recreate this nested structure using the New-Item cmdlet.
Suppose now that I run this command:
gc e:\folderstructure.txt | % {
    [system.io.fileinfo]$info = $_
    write-host $info $info.extension
}
that produces this output:
E:\mytree\folder1
E:\mytree\folder2
E:\mytree\folder3
E:\mytree\file1.txt .txt
E:\mytree\file12.txt .txt
E:\mytree\folder1\folder.with.dots .dots
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt .txt
E:\mytree\folder3\file4.doc .doc
As you can see, folder.with.dots is a folder, but it is "seen" as a file (it gives me the .dots extension) because its name contains dots. If I don't know all the possible extensions of my files, is there any property that tells me whether an object is a container or not, so that I can use New-Item with the right switch (file or directory) to create it?
I hope you have understood my problem despite my English. Thanks in advance.
EDIT: update after JPBlanc's answer.
Thank you very much. :)
I was trying it this way:
gc e:\folderstructure.txt | % {
    [system.io.directoryinfo]$info = $_
    if ($info.psiscontainer) {
        write-host "$info is a folder"
    } else {
        write-host "$info is a file"
    }
}
and the output was:
E:\mytree\folder1 is a file
E:\mytree\folder2 is a file
E:\mytree\folder3 is a file
E:\mytree\file1.txt is a file
E:\mytree\file12.txt is a file
E:\mytree\folder1\folder.with.dots is a file
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt is a file
E:\mytree\folder3\file4.doc is a file
Following your advice:
gc e:\folderstructure.txt | % {
    [system.io.directoryinfo]$info = $_
    if ((get-item $info).psiscontainer) {
        write-host "$info is a folder"
    } else {
        write-host "$info is a file"
    }
}
E:\mytree\folder1 is a folder
E:\mytree\folder2 is a folder
E:\mytree\folder3 is a folder
E:\mytree\file1.txt is a file
E:\mytree\file12.txt is a file
E:\mytree\folder1\folder.with.dots is a folder
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt is a file
E:\mytree\folder3\file4.doc is a file
everything works fine. Now I can achieve my goal. Thanks again.
LAST EDIT.
I had another idea: check whether an object is a file or a folder before creating the text file. After some difficulties (for example, I soon discovered that I can't redirect Format-Table, which I was using to hide the table headers, to Export-Csv:
http://blogs.msdn.com/b/powershell/archive/2007/03/07/why-can-t-i-pipe-format-table-to-export-csv-and-get-something-useful.aspx )
I came up with this solution:
gci e:\mytree -r |
    select fullname, @{n='folder'; e={ switch ($_.psiscontainer) {
        true  {1}
        false {0}
    } } } |
    % { ($_.fullname, $_.folder) -join "," } |
    out-file e:\structure.txt
that gets me this output:
E:\mytree\folder1,1
E:\mytree\folder2,1
E:\mytree\folder3,1
E:\mytree\file1.txt,0
E:\mytree\file12.txt,0
E:\mytree\folder1\folder.with.dots,1
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt,0
E:\mytree\folder3\file4.doc,0
So I can easily split the two fields and use the New-Item cmdlet according to the object type.
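For completeness, a minimal sketch of the rebuild half of this plan (it assumes the structure.txt format shown above and recreates the tree at the same paths):
# Recreate the tree from structure.txt; each line is "fullname,1" or "fullname,0".
gc e:\structure.txt | % {
    $path, $isFolder = $_ -split ','
    $type = if ($isFolder -eq '1') { 'Directory' } else { 'File' }
    New-Item -Path $path -ItemType $type -Force | Out-Null
}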
Both the FileInfo type and the DirectoryInfo type have a PSIsContainer property that lets you see whether the object is a directory or not.
PS C:\temp> (Get-Item 'Hello world.exe').PSIsContainer
False
PS C:\temp> (Get-Item 'c:\temp').PSIsContainer
True
I modified your code slightly to this:
gc c:\scripts\files.txt | % {
    $item = Get-Item $_
    Write-Host $item.FullName $item.PSIsContainer
}
Now, my output looks like this:
C:\Scripts\mytree\folder1 True
C:\Scripts\mytree\folder2 True
C:\Scripts\mytree\file1.txt False
C:\Scripts\mytree\file2.txt False
C:\Scripts\mytree\folder.with.dots True
C:\Scripts\mytree\folder.with.dots\file inside folder with dots.txt False
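For paths that already exist on disk, Test-Path can answer the same question directly, without constructing an object first:
Test-Path -LiteralPath 'C:\Scripts\mytree\folder1' -PathType Container   # True for folders
Test-Path -LiteralPath 'C:\Scripts\mytree\file1.txt' -PathType Leaf      # True for files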

Automated script to zip IIS logs?

I'd like to write a script/batch file that will bunch up my daily IIS logs and zip them up by month.
For example, ex080801.log, which is in the format exyymmdd.log.
ex080801.log through ex080831.log get zipped up, and the log files are deleted.
The reason we do this is that on a heavy site a log file for one day can be 500 MB to 1 GB, so we zip the files up (which compresses them by 98%) and dump the real log file. We use WebTrends to analyze the log files, and it is capable of reading into a zip file.
Does anyone have any ideas on how to script this or would be willing to share some code?
You'll need a command-line tool to zip up the files. I recommend 7-Zip, which is free and easy to use. The self-contained command-line version (7za.exe) is the most portable choice.
Here's a two-line batch file that would zip the log files and delete them afterwards:
7za.exe a -tzip ex%1-logs.zip %2\ex%1*.log
del %2\ex%1*.log
The first parameter is the 4-digit year-and-month, and the second parameter is the path to the directory containing your logs. For example: ziplogs.bat 0808 c:\logs
It's possible to get more elaborate (i.e. searching the filenames to determine which months to archive). You might want to check out the Windows FINDSTR command for searching input text with regular expressions.
Here's my script, which basically adapts David's: it zips up last month's logs, moves the zip, and deletes the original log files. It can be adapted for Apache logs too.
The only problem with this is that you may need to edit the replace commands if your DOS date function outputs the day of the week.
You'll also need to install 7-Zip.
You can also download IISlogslite, but it compresses each day's file into a single zip file, which I didn't find useful. There is a VBScript floating around the web that does the same thing.
-------------------------------------------------------------------------------------
@echo on
:: Name - iislogzip.bat
:: Description - Server Log File Manager
::
:: History
:: Date         Author       Change
:: 27-Aug-2008  David Crow   Original (found on Stack Overflow)
:: 15-Oct-2008  AIMackenzie  Slimmed down commands
:: ========================================================
:: setup variables and parameters
:: ========================================================
:: generate date and time variables
set month=%DATE:~3,2%
set year=%DATE:~8,2%
::Get last month and check edge conditions
set /a lastmonth=%month%-1
if %lastmonth% equ 0 set /a year=%year%-1
if %lastmonth% equ 0 set lastmonth=12
if %lastmonth% lss 10 set lastmonth=0%lastmonth%
set yymm=%year%%lastmonth%
set logpath="C:\WINDOWS\system32\LogFiles"
set zippath="C:\Program Files\7-Zip\7z.exe"
set arcpath="C:\WINDOWS\system32\LogFiles\WUDF"
:: ========================================================
:: Change to log file path
:: ========================================================
cd /D %logpath%
:: ========================================================
:: zip last month's IIS log files, move zipped file to archive
:: then delete old logs
:: ========================================================
%zippath% a -tzip ex%yymm%-logs.zip %logpath%\ex%yymm%*.log
move "%logpath%\*.zip" "%arcpath%"
del %logpath%\ex%yymm%*.log
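If PowerShell is available, the locale-sensitive %DATE% parsing above can be avoided entirely, since Get-Date formats last month's prefix the same way on any system. A sketch using the same paths as the batch file:
# Locale-independent "yyMM" for last month (e.g. "0807" while in August 2008).
$yymm = (Get-Date).AddMonths(-1).ToString('yyMM')
$logpath = 'C:\WINDOWS\system32\LogFiles'
& 'C:\Program Files\7-Zip\7z.exe' a -tzip "$logpath\ex$yymm-logs.zip" "$logpath\ex$yymm*.log"
Remove-Item "$logpath\ex$yymm*.log"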
We use a script like the following. Gzip is from the Cygwin project. I'm sure you could modify the syntax to use a zip tool instead. The "skip" argument is the number of files not to archive off; we keep 11 days in the 'current' directory.
@echo off
setlocal
For /f "skip=11 delims=/" %%a in ('Dir D:\logs\W3SVC1\*.log /B /O:-N /T:C')do move "D:\logs\W3SVC1\%%a" "D:\logs\W3SVC1\old\%%a"
d:
cd "\logs\W3SVC1\old"
gzip -n *.log
Endlocal
exit
You can grab the command-line utilities package from DotNetZip to get tools for creating zips from scripts. There's a nice little tool called Zipit.exe that runs on the command line and adds files or directories to zip files. It is fast and efficient.
A better option might be to just do the zipping from within PowerShell.
function ZipUp-Files ( $directory )
{
    $children = get-childitem -path $directory
    foreach ($o in $children)
    {
        if ($o.Name -ne "TestResults" -and
            $o.Name -ne "obj" -and
            $o.Name -ne "bin" -and
            $o.Name -ne "tfs" -and
            $o.Name -ne "notused" -and
            $o.Name -ne "Release")
        {
            if ($o.PSIsContainer)
            {
                ZipUp-Files ( $o.FullName )
            }
            else
            {
                if ($o.Name -ne ".tfs-ignore" -and
                    !$o.Name.EndsWith(".cache") -and
                    !$o.Name.EndsWith(".zip") )
                {
                    Write-Output $o.FullName
                    $e = $zipfile.AddFile($o.FullName)
                }
            }
        }
    }
}

[System.Reflection.Assembly]::LoadFrom("c:\\\bin\\Ionic.Zip.dll");
$zipfile = new-object Ionic.Zip.ZipFile("zipsrc.zip");
ZipUp-Files "DotNetZip"
$zipfile.Save()
Borrowed zip function from http://blogs.msdn.com/daiken/archive/2007/02/12/compress-files-with-windows-powershell-then-package-a-windows-vista-sidebar-gadget.aspx
Here is a PowerShell answer that works wonders:
param([string]$Path = $(read-host "Enter the path"))

function New-Zip
{
    param([string]$zipfilename)
    set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $zipfilename).IsReadOnly = $false
}

function Add-Zip
{
    param([string]$zipfilename)
    if (-not (test-path($zipfilename)))
    {
        set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }
    $shellApplication = new-object -com shell.application
    $zipPackage = $shellApplication.NameSpace($zipfilename)
    foreach ($file in $input)
    {
        $zipPackage.CopyHere($file.FullName)
        Start-sleep -milliseconds 500
    }
}

$FilesToZip = dir $Path -recurse -include *.log

foreach ($file in $FilesToZip) {
    New-Zip $file.BaseName
    dir $($file.directoryname + "\" + $file.name) | Add-Zip $($file.directoryname + "\$($file.basename).zip")
    del $($file.directoryname + "\" + $file.name)
}
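On PowerShell 5.0 and later, the built-in Compress-Archive cmdlet does the same job without the Shell.Application COM trick and the sleep workaround. A sketch of the equivalent loop:
# Requires PowerShell 5.0+: zip each log beside itself, then delete the original.
dir $Path -recurse -include *.log | % {
    $zip = Join-Path $_.DirectoryName ($_.BaseName + '.zip')
    Compress-Archive -LiteralPath $_.FullName -DestinationPath $zip -Force
    del $_.FullName
}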
We use this PowerShell script: http://gallery.technet.microsoft.com/scriptcenter/31db73b4-746c-4d33-a0aa-7a79006317e6
It uses 7-Zip and verifies the files before deleting them.
Regex will do the trick... create a Perl/Python/PHP script to do the job for you.
I'm pretty sure Windows batch files can't do regex.
