How do you convert multiple xlsx files to csv files with a batch script?
Try in2csv!
Usage:
in2csv file.xlsx > file.csv
An alternative way of converting to CSV is to use LibreOffice:
libreoffice --headless --convert-to csv *
Please be aware that this will only convert the first worksheet of your Excel file.
Get all file items, filter them by suffix, and then use the Excel COM object from PowerShell to save the Excel files as CSV files.
$excelApp = New-Object -ComObject Excel.Application
$excelApp.DisplayAlerts = $false
Get-ChildItem -File -Filter '*.xlsx' | ForEach-Object {
$workbook = $excelApp.Workbooks.Open($_.FullName)
$csvFilePath = $_.FullName -replace "\.xlsx$", ".csv"
$workbook.SaveAs($csvFilePath, [Microsoft.Office.Interop.Excel.XlFileFormat]::xlCSV)
$workbook.Close()
}
$excelApp.Quit()
You can find the complete sample here: How to convert Excel xlsx file to csv file in batch by PowerShell
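Note that the snippet above exports only the active sheet of each workbook. If you need every worksheet, a variation along these lines should work; this is a rough, untested sketch using the same COM object (6 is the numeric value of xlCSV, and the _SheetName suffix on the output file names is just an example convention):
$excelApp = New-Object -ComObject Excel.Application
$excelApp.DisplayAlerts = $false
Get-ChildItem -File -Filter '*.xlsx' | ForEach-Object {
    $workbook = $excelApp.Workbooks.Open($_.FullName)
    foreach ($sheet in $workbook.Worksheets) {
        # e.g. report.xlsx with sheets Data and Summary -> report_Data.csv, report_Summary.csv
        $csvFilePath = $_.FullName -replace '\.xlsx$', ('_{0}.csv' -f $sheet.Name)
        $sheet.SaveAs($csvFilePath, 6)   # 6 = xlCSV
    }
    $workbook.Close($false)
}
$excelApp.Quit()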
To follow up on the answer by user183038, here is a shell script to batch convert all xlsx files to csv while preserving the base file names. The xlsx2csv tool needs to be installed prior to running.
for i in *.xlsx;
do
filename=$(basename "$i" .xlsx);
outext=".csv"
xlsx2csv "$i" "$filename$outext"
done
You need an external tool, for example: SoftInterface.com - Convert XLSX to CSV.
After installing it, you can use the following command in your batch:
"c:\Program Files\Softinterface, Inc\Convert XLS\ConvertXLS.EXE" /S"C:\MyExcelFile.xlsx" /F51 /N"Sheet1" /T"C:\MyExcelFile.CSV" /C6 /M1 /V
This needs Excel installed, as it uses the Excel.Application COM object. Save this as a .bat file:
@if (@X)==(@Y) @end /* JScript comment
@echo off
cscript //E:JScript //nologo "%~f0" %*
exit /b %errorlevel%
@if (@X)==(@Y) @end JScript comment */
var ARGS = WScript.Arguments;
var xlCSV = 6;
var objExcel = WScript.CreateObject("Excel.Application");
var objWorkbook = objExcel.Workbooks.Open(ARGS.Item(0));
objExcel.DisplayAlerts = false;
objExcel.Visible = false;
var objWorksheet = objWorkbook.Worksheets(ARGS.Item(1))
objWorksheet.SaveAs( ARGS.Item(2), xlCSV);
objExcel.Quit();
It accepts three arguments - the absolute path to the xlsx file, the sheet name and the absolute path to the target csv file:
call toCsv.bat "%cd%\Book1.xlsx" Sheet1 "%cd%\csv.csv"
Adding to @marbel's answer (which is a great suggestion!), here's the script that worked for me in the Terminal on OS X El Capitan, for batch conversion (since that's what the OP asked). I thought a for loop would be trivial, but it wasn't! (I had to change the extension by string manipulation, and it looks like the Mac's bash is a bit different too.)
for x in $(ls *.xlsx); do x1=${x%".xlsx"}; in2csv $x > $x1.csv; echo "$x1.csv done."; done
Note:
${x%".xlsx"} is bash string manipulation which clips .xlsx from the end of the string.
in2csv creates separate csv files (doesn't overwrite the xlsx's).
The above won't work if the filenames have white spaces in them. Good to convert white spaces to underscores or something, before running the script.
I have an .htm file with href="example.com/page" somewhere in its source code. How could I get the link between the quotes?
So far I have tried modifying this piece of code:
#echo off
setlocal EnableDelayedExpansion
set "str="
set "string=stuff href="example.com/page"end morestuff"
set string=!string:href=^
!
set string=!string:end=^
!
FOR /F skip^=1eol^= %%S in ("!string!") do if NOT DEFINED str set "str=%%S"
echo(!str!
pause > nul
However, on line 6 it appears that changing href to href=" breaks the code, and changing end to " also breaks something. I would like to know whether it is possible to fix this, or whether there is an alternative?
I believe you just want the example.com/page part:
#echo off
set "string=stuff href="example.com/page"end morestuff"
for /f tokens^=2delims^="" %%a in ("%string%") do set "substr=%%a"
echo %substr%
Reading from file as per comment:
#echo off
set "file=file.txt"
for /f tokens^=2delims^="" %%a in ('findstr /IRC:"href=" "%file%"') do set "substr=%%a"
echo %substr%
Seeing that you require reading from an HTML file, I would recommend using something a little more robust, like PowerShell.
Create a file with a .ps1 extension, paste in the content below, and make sure you put the path and filename of your file in $file_path, replacing file.txt:
$file_path = 'file.txt'
$rgx = '(?<=href\=").*?(?=">)'
select-string -Path $file_path -Pattern $rgx -AllMatches | % { $_.Matches } | % { $_.Value }
Now you can either run it from cmd:
powershell -File test_url.ps1
Or simply open the PowerShell CLI and run it directly from there:
.\test_url.ps1
We seem to be seeing more and more questions about executing awk on Excel spreadsheets so here is a Q/A on how to do that specific thing.
I have this information in an Excel spreadsheet "$D/staff.xlsx" (where "$D" is the path to my Desktop):
Name Position
Sue Manager
Bill Secretary
Pat Engineer
and I want to print the Position field for a given Name, e.g. output Secretary given the input Bill.
I can currently save as CSV from Excel to get:
$ cat "$D/staff.csv"
Name,Position
Sue,Manager
Bill,Secretary
Pat,Engineer
and then run:
$ awk -F, -v name="Bill" '$1==name{print $2}' "$D/staff.csv"
Secretary
but this is just a small part of a larger task and so I have to be able to do this automatically from a shell script without manually opening Excel to export the CSV file. How do I do that from a Windows PC running cygwin?
The combination of the following VBS and shell scripts creates a CSV file for each sheet in the Excel spreadsheet:
$ cat xls2csv.vbs
csv_format = 6
Dim strFilename
Dim objFSO
Set objFSO = CreateObject("scripting.filesystemobject")
strFilename = objFSO.GetAbsolutePathName(WScript.Arguments(0))
If objFSO.fileexists(strFilename) Then
Call Writefile(strFilename)
Else
wscript.echo "no such file!"
End If
Set objFSO = Nothing
Sub Writefile(ByVal strFilename)
Dim objExcel
Dim objWB
Dim objws
Set objExcel = CreateObject("Excel.Application")
Set objWB = objExcel.Workbooks.Open(strFilename)
For Each objws In objWB.Sheets
objws.Copy
objExcel.ActiveWorkbook.SaveAs objWB.Path & "\" & objws.Name & ".csv", csv_format
objExcel.ActiveWorkbook.Close False
Next
objWB.Close False
objExcel.Quit
Set objExcel = Nothing
End Sub
$ cat xls2csv
PATH="$HOME:$PATH"
# the original XLS input file path components
inXlsPath="$1"
inXlsDir=$(dirname "$inXlsPath")
xlsFile=$(basename "$inXlsPath")
xlsBase="${xlsFile%.*}"
# The tmp dir we'll copy the XLS to and run the tool on
# to get the CSVs generated
tmpXlsDir="/usr/tmp/${xlsBase}.$$"
tmpXlsPath="${tmpXlsDir}/${xlsFile}"
absXlsPath="C:/cygwin64/${tmpXlsPath}" # need an absolute path for VBS to work
mkdir -p "$tmpXlsDir"
trap 'rm -f "${tmpXlsDir}/${xlsFile}"; rmdir "$tmpXlsDir"; exit' 0
cp "$inXlsPath" "$tmpXlsDir"
cygstart "$HOME/xls2csv.vbs" "$absXlsPath"
printf "Waiting for \"${tmpXlsDir}/~\$${xlsFile}\" to be created:\n" >&2
while [ ! -f "${tmpXlsDir}/~\$${xlsFile}" ]
do
# VBS is done when this tmp file is created and later removed
printf "." >&2
sleep 1
done
printf " Done.\n" >&2
printf "Waiting for \"${tmpXlsDir}/~\$${xlsFile}\" to be removed:\n" >&2
while [ -f "${tmpXlsDir}/~\$${xlsFile}" ]
do
# VBS is done when this tmp file is removed
printf "." >&2
sleep 1
done
printf " Done.\n" >&2
numFiles=0
for file in "$tmpXlsDir"/*.csv
do
numFiles=$(( numFiles + 1 ))
done
if (( numFiles >= 1 ))
then
outCsvDir="${inXlsDir}/${xlsBase}.csvs"
mkdir -p "$outCsvDir"
mv "$tmpXlsDir"/*.csv "$outCsvDir"
fi
Now we execute the shell script, which internally calls cygstart to run the VBS script and generate the CSV files (one per sheet). They land in a subdirectory named after the Excel file, under the same directory where the Excel file exists (e.g. the Excel file staff.xlsx produces the CSV directory staff.csvs):
$ ./xls2csv "$D/staff.xlsx"
Waiting for "/usr/tmp/staff.2700/~$staff.xlsx" to be created:
.. Done.
Waiting for "/usr/tmp/staff.2700/~$staff.xlsx" to be removed:
. Done.
There is only one sheet with the default name Sheet1 in the target Excel file "$D/staff.xlsx" so the output of the above is a file "$D/staff.csvs/Sheet1.csv":
$ cat "$D/staff.csvs/Sheet1.csv"
Name,Position
Sue,Manager
Bill,Secretary
Pat,Engineer
$ awk -F, -v name="Bill" '$1==name{print $2}' "$D/staff.csvs/Sheet1.csv"
Secretary
Also see What's the most robust way to efficiently parse CSV using awk? for how to then operate on those CSVs.
See also https://stackoverflow.com/a/58879683/1745001 for how to do the opposite, i.e. call a cygwin bash command from a Windows batch file.
I'm using PowerShell 4.0 on Windows 7 64-bit. I want to see who has an Excel file open, or even whether the file is open at all.
Example: I have the Excel file "test" on network drive B. If one person opens "test", I understand that will create an Excel lock file that looks like "~$test.xls".
So far I have used Test-Path to verify that the Excel lock file exists. Then I believe I can use Get-Acl to find the owner of that file. Is there a simpler way to find out who has an Excel file open? Or will my workaround of checking the ownership of the lock file work?
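Roughly, the check I have in mind looks like this (the workbook path is just a placeholder):
$workbook = 'B:\test.xls'                       # placeholder path
$lockFile = Join-Path (Split-Path $workbook) ('~$' + (Split-Path $workbook -Leaf))
if (Test-Path -LiteralPath $lockFile) {
    # The lock file is hidden, but Test-Path and Get-Acl still see it.
    # The ACL owner is usually, though not always, the account that opened the workbook.
    $owner = (Get-Acl -LiteralPath $lockFile).Owner
    "$workbook appears to be open; lock file owner: $owner"
} else {
    "$workbook does not appear to be open (no lock file found)"
}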
I still use Netapi32 based functions for achieving this.
Add-Type -TypeDefinition #"
using System;
using System.Runtime.InteropServices;
public class Netapi
{
[DllImport("Netapi32.dll",CharSet=CharSet.Unicode)]
public static extern int NetApiBufferFree(IntPtr buffer);
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
public struct FILE_INFO_3
{
public uint FileID;
public uint Permissions;
public uint NumLocks;
[MarshalAs(UnmanagedType.LPWStr)] public string Path;
[MarshalAs(UnmanagedType.LPWStr)] public string User;
}
[DllImport("Netapi32.dll",CharSet=CharSet.Unicode)]
public static extern uint NetFileEnum(
[In,MarshalAs(UnmanagedType.LPWStr)] string server,
[In,MarshalAs(UnmanagedType.LPWStr)] string path,
[In,MarshalAs(UnmanagedType.LPWStr)] string user,
int level,
out IntPtr bufptr,
int prefmaxlen,
ref Int32 entriesread,
ref Int32 totalentries,
ref Int32 resume_handle);
}
"#
function Get-OpenFiles
{
[CmdletBinding()]
param ( [string]$Server = "localhost",
[string]$User = $null,
[string]$Path = $null)
$struct = New-Object netapi+FILE_INFO_3
$buffer = 0
$entries = 0
$total = 0
$handle = 0
$Level=3 # used to define the type of struct we want, i.e. FILE_INFO_3
$ret = [Netapi]::NetFileEnum($server, $path, $user, $level,
[ref]$buffer, -1,
[ref]$entries, [ref]$total,
[ref]$handle)
$files = @()
if (!$ret)
{
$offset = $buffer.ToInt64()
$increment = [System.Runtime.Interopservices.Marshal]::SizeOf([System.Type]$struct.GetType())
for ($i = 0; $i -lt $entries; $i++)
{
$ptr = New-Object system.Intptr -ArgumentList $offset
$files += [system.runtime.interopservices.marshal]::PtrToStructure($ptr, [System.Type]$struct.GetType())
$offset = $ptr.ToInt64()
$offset += $increment
}
}
else
{
Write-Output ([ComponentModel.Win32Exception][Int32]$ret).Message
if ($ret -eq 1208)
{
# Error Code labeled "Extended Error" requires the buffer to be freed
[Void][Netapi]::NetApiBufferFree($buffer)
}
}
$files
}
Then you can call Get-OpenFiles and pass a specific path name:
Get-OpenFiles -Path C:\Temp\EXCEL.XLSX
FileID : 205
Permissions : 35
NumLocks : 0
Path : C:\Temp\EXCEL.XLSX
User : mickyb
Using Get-OpenFiles -Path C:\Temp works too:
FileID : 205
Permissions : 35
NumLocks : 0
Path : C:\Temp\EXCEL.XLSX
User : mickyb
FileID : 213
Permissions : 51
NumLocks : 0
Path : C:\Temp\~$Excel.xlsx
User : mickyb
You could also see if a specific user has files open:
Get-OpenFiles -User mickyb
If you copy the contents of the lock file to the console it should contain the name of the user who has the file locked. You may have to make a copy of the lock file in order to read it. I'm not familiar with PowerShell but I assume it has all the power of DOS batch files and a technique similar to what I wrote below could be created.
Here is a batch file I have added to my SendTo folder which allows me to right-click on an Excel file and it will show me who has the file locked. I have tested this with .xlsx and .xlsm files.
#echo off
REM ===================================================================
REM Put this in your SendTo folder and it will let you right-click
REM on an .xlsx/.xlsm file and show you the user name in the lock file
REM
REM If an Excel file is locked, look to see if a hidden lock file exists. If
REM the file is found, make a local temp copy of it and display the contents which
REM should be the name of the user holding the lock.
REM ===================================================================
setlocal
set file="%1"
REM Make sure the file has a compatible extension.
if "%~x1"==".xlsx" goto :ExtensionIsValidExcel
if "%~x1"==".xlsm" goto :ExtensionIsValidExcel
echo.
echo "%~n1%~x1" is not a supported file type.
echo.
pause
exit
:ExtensionIsValidExcel
REM If an Excel file is locked, look to see if a hidden lock file exists. If
REM the file is found, make a local temp copy of it and display the contents which
REM should be the name of the user holding the lock.
IF EXIST %~dp1~$%~n1%~x1 (
ECHO f | Xcopy %~dp1~$%~n1%~x1 "%TEMP%\~temp.txt" /H /R /Y /F
attrib -h -s %TEMP%\~temp.txt
cls
ECHO.
ECHO The Excel file "%~n1%~x1" is locked by:
ECHO.
REM copy the file to the console to show the user name.
copy %TEMP%\~temp.txt con | Find /v "file(s) copied"
del %TEMP%\~temp.txt
) Else (
cls
ECHO.
ECHO The Excel file "%~n1%~x1" is not locked in a way this script was expecting.
)
ECHO.
ECHO.
pause
Sorry for the title. I'll try to explain better.
Let's suppose that I run this command in order to get all paths of a directory and I want to redirect them to a text file.
gci e:\mytree -r | % {$_.fullname} | out-file e:\folderstructure.txt
Now I need to recreate this nested structure using new-item cmdlet.
Let's suppose now that I run this command:
gc e:\folderstructure.txt | % {
[system.io.fileinfo]$info = $_
write-host $info $info.extension
}
that produces this output:
E:\mytree\folder1
E:\mytree\folder2
E:\mytree\folder3
E:\mytree\file1.txt .txt
E:\mytree\file12.txt .txt
E:\mytree\folder1\folder.with.dots .dots
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt .txt
E:\mytree\folder3\file4.doc .doc
As you can see, folder.with.dots is a folder, but "it's seen" as a file (it gives me the .dots extension) because it contains dots in its name. If I don't know all possible extensions of my files, is there any property that can tell me whether an object is a container or not, so that I can use new-item with the right switch (file or directory) to create it?
I hope you have understood my problem despite my English. Thanks in advance.
EDIT: Update after JPBlanc's answer
Thank you very much. :)
I was trying in this way:
gc e:\folderstructure.txt | % {
[system.io.directoryinfo]$info = $_
if ($info.psiscontainer) {
write-host "$info is a folder" }
else {
write-host "$info is a file" }
}
and the output was:
E:\mytree\folder1 is a file
E:\mytree\folder2 is a file
E:\mytree\folder3 is a file
E:\mytree\file1.txt is a file
E:\mytree\file12.txt is a file
E:\mytree\folder1\folder.with.dots is a file
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt is a file
E:\mytree\folder3\file4.doc is a file
Following your advice:
gc e:\folderstructure.txt | % {
[system.io.directoryinfo]$info = $_
if ((get-item $info).psiscontainer) {
write-host "$info is a folder" }
else {
write-host "$info is a file" }
}
E:\mytree\folder1 is a folder
E:\mytree\folder2 is a folder
E:\mytree\folder3 is a folder
E:\mytree\file1.txt is a file
E:\mytree\file12.txt is a file
E:\mytree\folder1\folder.with.dots is a folder
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt is a file
E:\mytree\folder3\file4.doc is a file
everything works fine. Now I can achieve my goal. Thanks again.
LAST EDIT.
I had another idea. I decided to check whether an object is a file or a folder before creating the txt file. After some difficulties (for example, I soon discovered that I can't pipe format-table, which I was using to hide the table headers, to export-csv:
http://blogs.msdn.com/b/powershell/archive/2007/03/07/why-can-t-i-pipe-format-table-to-export-csv-and-get-something-useful.aspx )
I came up with this solution:
gci e:\mytree -r |
select fullname, @{n='folder'; e={ switch ($_.psiscontainer) {
        true  {1}
        false {0}
    }}
} | % { ($_.fullname, $_.folder) -join "," } | out-file e:\structure.txt
that gets me this output:
E:\mytree\folder1,1
E:\mytree\folder2,1
E:\mytree\folder3,1
E:\mytree\file1.txt,0
E:\mytree\file12.txt,0
E:\mytree\folder1\folder.with.dots,1
E:\mytree\folder1\folder.with.dots\file inside folder with dots.txt,0
E:\mytree\folder3\file4.doc,0
So I can easily split the two fields and use the new-item cmdlet according to the object type.
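For example, something along these lines; it's a rough sketch that assumes the paths contain no commas and relies on gci -r listing each folder before its contents, so parent folders already exist by the time the files are created:
gc e:\structure.txt | % {
    $path, $flag = $_ -split ',', 2
    if ($flag -eq '1') {
        new-item -path $path -itemtype Directory -force | out-null
    } else {
        new-item -path $path -itemtype File -force | out-null
    }
}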
Both the FileInfo and DirectoryInfo types have a PSIsContainer property that lets you see whether the object is a directory or not.
PS C:\temp> (Get-Item 'Hello world.exe').PSIsContainer
False
PS C:\temp> (Get-Item 'c:\temp').PSIsContainer
True
I modified your code slightly to this:
gc c:\scripts\files.txt | % {
$item = Get-item $_
Write-Host $item.fullName $item.PSIscontainer
}
Now, my output looks like this:
C:\Scripts\mytree\folder1 True
C:\Scripts\mytree\folder2 True
C:\Scripts\mytree\file1.txt False
C:\Scripts\mytree\file2.txt False
C:\Scripts\mytree\folder.with.dots True
C:\Scripts\mytree\folder.with.dots\file inside folder with dots.txt False
I'd like to write a script/batch that will bunch up my daily IIS logs and zip them up by month.
ex080801.log, which is in the format exyymmdd.log
ex080801.log through ex080831.log get zipped up and the log files deleted.
The reason we do this is that on a heavy site a log file for one day could be 500 MB to 1 GB, so we zip them up (which compresses them by 98%) and dump the real log file. We use WebTrends to analyze the log files, and it is capable of reading log files from inside a zip file.
Does anyone have any ideas on how to script this or would be willing to share some code?
You'll need a command line tool to zip up the files. I recommend 7-Zip which is free and easy to use. The self-contained command line version (7za.exe) is the most portable choice.
Here's a two-line batch file that would zip the log files and delete them afterwards:
7za.exe a -tzip ex%1-logs.zip %2\ex%1*.log
del %2\ex%1*.log
The first parameter is the 4 digit year-and-month, and the second parameter is the path to the directory containing your logs. For example: ziplogs.bat 0808 c:\logs
It's possible to get more elaborate (i.e. searching the filenames to determine which months to archive). You might want to check out the Windows FINDSTR command for searching input text with regular expressions.
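If you'd rather not wrestle with FINDSTR, a short PowerShell sketch can list which months actually have logs to archive (this assumes the exyymmdd.log naming and a hypothetical C:\logs folder):
# List the distinct yymm prefixes present in the log folder (sketch)
Get-ChildItem 'C:\logs' -Filter 'ex*.log' |
    ForEach-Object { $_.BaseName.Substring(2, 4) } |   # yymm part of exyymmdd
    Sort-Object -Unique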
Here's my script, which basically adapts David's: it zips up last month's logs, moves them, and deletes the original log files. This can be adapted for Apache logs too.
The only problem with this is that you may need to edit the date-parsing commands if your DOS date format includes the day of the week.
You'll also need to install 7-Zip.
You can also download IISlogslite, but it compresses each day's file into a single zip file, which I didn't find useful. There is a VBScript floating around the web that does the same thing.
-------------------------------------------------------------------------------------
@echo on
:: Name - iislogzip.bat
:: Description - Server Log File Manager
::
:: History
:: Date Author Change
:: 27-Aug-2008 David Crow Original (found on stack overflow)
:: 15-Oct-2008 AIMackenzie Slimmed down commands
:: ========================================================
:: setup variables and parameters
:: ========================================================
:: generate date and time variables
set month=%DATE:~3,2%
set year=%DATE:~8,2%
::Get last month and check edge conditions
set /a lastmonth=%month%-1
if %lastmonth% equ 0 set /a year=%year%-1
if %lastmonth% equ 0 set lastmonth=12
if %lastmonth% lss 10 set lastmonth=0%lastmonth%
set yymm=%year%%lastmonth%
set logpath="C:\WINDOWS\system32\LogFiles"
set zippath="C:\Program Files\7-Zip\7z.exe"
set arcpath="C:\WINDOWS\system32\LogFiles\WUDF"
:: ========================================================
:: Change to log file path
:: ========================================================
cd /D %logpath%
:: ========================================================
:: zip last months IIS log files, move zipped file to archive
:: then delete old logs
:: ========================================================
%zippath% a -tzip ex%yymm%-logs.zip %logpath%\ex%yymm%*.log
move "%logpath%\*.zip" "%arcpath%"
del %logpath%\ex%yymm%*.log
We use a script like the following. Gzip is from the cygwin project. I'm sure you could modify the syntax to use a zip tool instead. The "skip" argument is the number of files to not archive off -- we keep 11 days in the 'current' directory.
@echo off
setlocal
For /f "skip=11 delims=/" %%a in ('Dir D:\logs\W3SVC1\*.log /B /O:-N /T:C')do move "D:\logs\W3SVC1\%%a" "D:\logs\W3SVC1\old\%%a"
d:
cd "\logs\W3SVC1\old"
gzip -n *.log
Endlocal
exit
You can grab the command-line utilities package from DotNetZip to get tools to create zips from scripts. There's a nice little tool called Zipit.exe that runs on the command line, adds files or directories to zip files. It is fast, efficient.
A better option might be to just do the zipping from within PowerShell.
function ZipUp-Files ( $directory )
{
$children = get-childitem -path $directory
foreach ($o in $children)
{
if ($o.Name -ne "TestResults" -and
$o.Name -ne "obj" -and
$o.Name -ne "bin" -and
$o.Name -ne "tfs" -and
$o.Name -ne "notused" -and
$o.Name -ne "Release")
{
if ($o.PSIsContainer)
{
ZipUp-Files ( $o.FullName )
}
else
{
if ($o.Name -ne ".tfs-ignore" -and
!$o.Name.EndsWith(".cache") -and
!$o.Name.EndsWith(".zip") )
{
Write-output $o.FullName
$e= $zipfile.AddFile($o.FullName)
}
}
}
}
}
[System.Reflection.Assembly]::LoadFrom("c:\bin\Ionic.Zip.dll");
$zipfile = new-object Ionic.Zip.ZipFile("zipsrc.zip");
ZipUp-Files "DotNetZip"
$zipfile.Save()
Borrowed zip function from http://blogs.msdn.com/daiken/archive/2007/02/12/compress-files-with-windows-powershell-then-package-a-windows-vista-sidebar-gadget.aspx
Here is a PowerShell answer that works wonders:
param([string]$Path = $(read-host "Enter the path"))
function New-Zip
{
param([string]$zipfilename)
set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipfilename).IsReadOnly = $false
}
function Add-Zip
{
param([string]$zipfilename)
if(-not (test-path($zipfilename)))
{
set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $zipfilename).IsReadOnly = $false
}
$shellApplication = new-object -com shell.application
$zipPackage = $shellApplication.NameSpace($zipfilename)
foreach($file in $input)
{
$zipPackage.CopyHere($file.FullName)
Start-sleep -milliseconds 500
}
}
$FilesToZip = dir $Path -recurse -include *.log
foreach ($file in $FilesToZip) {
New-Zip $file.BaseName
dir $($file.directoryname+"\"+$file.name) | Add-zip $($file.directoryname+"\$($file.basename).zip")
del $($file.directoryname+"\"+$file.name)
}
We use this PowerShell script: http://gallery.technet.microsoft.com/scriptcenter/31db73b4-746c-4d33-a0aa-7a79006317e6
It uses 7-Zip and verifies the files before deleting them.
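If you'd rather roll that logic yourself, the core of it is only a few lines. Here is a minimal sketch, assuming 7z.exe is on the PATH, the logs follow the exyymmdd.log naming, and the paths below are placeholders:
$yymm    = '0808'                                    # month to archive (placeholder)
$logDir  = 'C:\WINDOWS\system32\LogFiles\W3SVC1'     # log directory (placeholder)
$archive = Join-Path $logDir "ex$yymm-logs.zip"
& 7z a -tzip $archive (Join-Path $logDir "ex$yymm*.log")
& 7z t $archive                                      # verify the archive before deleting anything
if ($LASTEXITCODE -eq 0) {
    Remove-Item (Join-Path $logDir "ex$yymm*.log")
}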
Regex will do the trick... create a Perl/Python/PHP script to do the job for you.
I'm pretty sure Windows batch files can't do regex.