My macro will need to read through a very large directory of files and parse data from them. This directory is updated periodically so I am trying to optimize my program to only read files that were added since the last time the program was run.
So far using FileSystemObject it seems I can only read files alphabetically, if I'm not mistaken.
The best solution I have so far is to read all the files every time, create an array containing the file information, sort by DateModified, then open only the files I need. I'm curious to see if I can skip this step by reading files in order of DateModified.
Thanks in advance.
Shell does seem to be a good option here, although I haven't compared its performance against FSO. You could, for example, consider the forfiles command, which lets you retrieve files modified after a specified date.
Some sample code for that would be as follows:
Public Sub RunMe()
Dim fileNames As Collection
Dim path As String
Dim dat As Date
Dim file As Variant
'Set the path and 'find after' date.
path = "c:\user\documents"
dat = #1/1/2018#
'Fetch the files, setting mask as required.
'This example is fetching all .txt files.
Set fileNames = GetFilesModifiedAfter(path, dat, "*.txt")
'Process the list of files.
If Not fileNames Is Nothing Then
For Each file In fileNames
' ... do stuff here.
Debug.Print path & "\" & file
Next
End If
End Sub
Private Function GetFilesModifiedAfter( _
path As String, _
after As Date, _
Optional mask As String) As Collection
Dim cmd As String
Dim piped() As String
Dim result As Collection
Dim i As Long
'Build the command string.
'Date must be formatted as MM/DD/YYYY.
cmd = "cmd.exe /s /c forfiles /p " & _
"""" & path & """" & _
" /d +" & Format(after, "mm/dd/yyyy")
'Add the mask if passed-in.
If mask <> vbNullString Then cmd = cmd & " /m " & mask
'Execute the command and split the output on vbCrLf.
piped = Split(CreateObject("WScript.Shell").Exec(cmd).StdOut.ReadAll, vbCrLf)
'Leave if nothing is returned.
If UBound(piped) = -1 Then Exit Function
'Populate the result collection,
'and remove the leading and trailing inverted commas.
Set result = New Collection
For i = 0 To UBound(piped)
If Len(piped(i)) > 2 Then
result.Add Mid(piped(i), 2, Len(piped(i)) - 2)
End If
Next
'Return the result collection.
Set GetFilesModifiedAfter = result
End Function
Update
I've just done some testing and it seems FSO is quicker, certainly on folders containing fewer than 100 files. It'd be interesting to run this on really large folders (say a thousand files), as instinctively I feel Shell might have a performance advantage; a rough timing harness follows after the code below if you want to test on your own folders. However, for now, here's the FSO version:
Private Function GetFilesModifiedAfter2( _
path As String, _
after As Date, _
mask As String) As Collection
Dim fso As Object, file As Object
Dim result As Collection
'Instance of objects.
Set fso = CreateObject("Scripting.FileSystemObject")
Set result = New Collection
'Iterate the files and test date last modified property.
For Each file In fso.GetFolder(path & "\").Files
If file.Name Like mask And file.DateLastModified > after Then
result.Add file.Name
End If
Next
'Return the result collection.
Set GetFilesModifiedAfter2 = result
End Function
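For anyone who wants to repeat the comparison on their own machine, a minimal timing harness along these lines should do. The test path, date, and mask are placeholders, and it assumes both functions live in the same module since they are declared Private:
Sub CompareTimings()
    Dim t As Single
    Dim dummy As Collection
    Dim testPath As String
    testPath = "c:\user\documents"   'placeholder folder

    'Time the Shell/forfiles version.
    t = Timer
    Set dummy = GetFilesModifiedAfter(testPath, #1/1/2018#, "*.txt")
    Debug.Print "Shell/forfiles: " & Format(Timer - t, "0.000") & "s"

    'Time the FileSystemObject version.
    t = Timer
    Set dummy = GetFilesModifiedAfter2(testPath, #1/1/2018#, "*.txt")
    Debug.Print "FSO: " & Format(Timer - t, "0.000") & "s"
End Sub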
Related
I have a gazillion files in a folder; they have been uploaded there every 30 minutes for a few years now.
I have a macro to open and modify files with a last-modified date on or after a specified date; however, it checks every file in the folder to determine whether it satisfies the condition, which takes forever.
Is there a way to first sort the files in descending order by modified date and have the macro go from the top and just break once the specified date is reached?
Many thanks in advance!
If you need to keep all the files in one folder then something like this might be faster: you only need to check the modified date until you reach the first "too old" file.
Sub tester()
Const FOLDER As String = "C:\Tester\"
Dim arr, f
arr = AllFilesNewestFirst(FOLDER & "*")
For Each f In arr
If Len(f) > 0 Then
'process files until last modified is >10 days ago
If FileDateTime(FOLDER & f) < Date - 10 Then Exit For
'process this file
Debug.Print "Processing: " & f
End If
Next f
End Sub
'return an array of all files matching `pattern`, using dir
Function AllFilesNewestFirst(pattern)
Dim s As String
Dim oShell As Object
Dim oExec As Object, cmd
Set oShell = CreateObject("WScript.Shell")
cmd = "cmd /c dir """ & pattern & """ /A-D-H-S /b /o-d"
s = oShell.Exec(cmd).StdOut.readall()
AllFilesNewestFirst = Split(s, vbCrLf)
End Function
Perhaps use DateLastModified. If it's before/after a date, then you can run the rest of the macro:
Sub ShowFileAccessInfo(filespec)
Dim fs, f, s
Set fs = CreateObject("Scripting.FileSystemObject")
Set f = fs.GetFile(filespec)
s = UCase(filespec) & vbCrLf
s = s & "Created: " & f.DateCreated & vbCrLf
s = s & "Last Accessed: " & f.DateLastAccessed & vbCrLf
s = s & "Last Modified: " & f.DateLastModified
MsgBox s, 0, "File Access Info"
End Sub
Admittedly, it requires checking each file individually, but it would work.
Edit: Using the above, you could even create a quick function:
Function modified_date(filespec) As Date
Dim fs, f
Set fs = CreateObject("Scripting.FileSystemObject")
Set f = fs.GetFile(filespec)
modified_date = f.DateLastModified
End Function
Sub t()
Dim myFile As String
myFile = "C:\User\Documents\SomeFile.pdf"
Dim oldestDate As Date
oldestDate = #10/1/2010#
Dim fileModifiedDate As Date
fileModifiedDate = modified_date(myFile)
Debug.Print "File was last modified on: " & filemodified; Date
Debug.Print "You want to run this on files modified after: " & oldestDate
If oldestDate <= fileModifiedDate Then
Debug.Print "This file is modified AFTER your chosen date"
' Do stuff
End If
End Sub
OK, so this is going to get painful, but it should be faster once you get it working. We'll use the DIR command of the CMD shell and send the output to a file (a rough sketch follows after these steps).
Use the Shell VBA command to execute CMD with "dir /o:d /t:w *.* > mylisting.txt". This will create a date-sorted listing of your files (oldest first) and put it in a file called mylisting.txt.
Wait around until the shell command finishes. Maybe loop ten times, sleeping for a few seconds and then looking for your file to be created. Handle the error if the file doesn't show up.
Read the file into a new worksheet as text file.
Starting at the bottom of that new sheet, parse out the file name and the write date. Regular expression functions might be useful for this.
Execute your file-by-file logic.
Loop up to the top entry until the write date < your trigger date.
Clean up.
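Here is a minimal sketch of that approach, assuming a placeholder folder and a 10-day trigger date. To keep it short it uses dir /b (bare names, newest first) plus FileDateTime, rather than parsing names and write dates out of the full listing as the steps above describe:
Sub ListFilesByDateViaDir()
    Const FOLDER As String = "C:\Tester\"   'placeholder folder
    Dim listing As String, lineText As String
    Dim i As Long, f As Integer

    'Write a bare, newest-first listing of the folder to a temp file.
    listing = Environ$("TEMP") & "\mylisting.txt"
    Shell "cmd.exe /c dir """ & FOLDER & "*.*"" /a-d /o-d /t:w /b > """ & listing & """", vbHide

    'Wait (crudely) for the listing file to appear; bail out if it never does.
    For i = 1 To 10
        If Dir(listing) <> "" Then Exit For
        Application.Wait Now + TimeSerial(0, 0, 1)
    Next i
    If Dir(listing) = "" Then Err.Raise 53, , "Listing file was not created"

    'Read the listing; with /b each line is just a file name, newest first.
    f = FreeFile
    Open listing For Input As #f
    Do Until EOF(f)
        Line Input #f, lineText
        If Len(lineText) = 0 Then Exit Do
        'Stop at the first file older than the trigger date (10 days here).
        If FileDateTime(FOLDER & lineText) < Date - 10 Then Exit Do
        Debug.Print "Processing: " & lineText   'file-by-file logic goes here
    Loop
    Close #f
End Sub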
Hello Dear StackOverFlowers,
My question may be trivial, but I'm currently out of options after searching all afternoon.
Context: I have an Excel worksheet with 120 rows or so that I need to use to create files.
Data is structured as follows:
Column A contains the destination file names.
Column B has the corresponding data that needs to be written to each file.
Giving us the following general layout:
(screenshot: data file layout)
So, to get the data from column B written into each file named from column A, I wrote the following VBA snippet:
Option Explicit
Sub writeExportedMsgToXML()
' wrote that tiny script not to have to copy/paste 117 messages by hand to have ops put them back on Q
Dim currentRow As Integer
' modify to match your data row start and end
For currentRow = 2 To 11
Dim messageID As String
Dim messageitSelf As String
messageID = Trim(ActiveSheet.Range("A" & currentRow))
messageitSelf = ActiveSheet.Range("B" & currentRow)
Dim subDirectory As String
subDirectory = "xmls"
Dim filePath As String
filePath = ActiveWorkbook.Path & "\" & subDirectory & "\" & messageID & ".xml"
MsgBox (messageitSelf) ' for test purpose
Open filePath For Output As #1
Write #1, messageitSelf
Close #1
Next currentRow
End Sub
The script does mostly what it's intended to do, except (and this is the source of my question today) that it encloses the file content in double quotes, as you can see below:
(screenshot: file content enclosed in double quotes)
So, where a file named F1.xml should just contain <foo><bar>Baz</bar></foo>,
my script turns it into "<foo><bar>Baz</bar></foo>".
What I tried
Replacing the file-writing part with the following:
Dim objStream
Set objStream = CreateObject("ADODB.Stream")
objStream.Charset = "UTF-8"
Dim subDirectory As String
subDirectory = "xmls"
Dim filePath As String
filePath = ActiveWorkbook.Path & "\" & subDirectory & "\" & messageID & ".xml"
objStream.Open
objStream.WriteText messageitSelf
objStream.SaveToFile filePath
objStream.Close
With the same outcome.
Any clues on what I'm missing / doing wrong?
Should I declare messageitSelf as a different type ?
Any help would be appreciated :)
Thank you
Write# statements surround strings with double quotes:
Unlike the Print # statement, the Write # statement inserts commas between items and quotation marks around strings as they are written to the file.
Use Print# instead:
Dim fn As Long
fn = VBA.FreeFile
Open filePath For Output As #fn
Print #fn, messageitSelf
Close #fn
I've written some VBA code using file objects to go into a folder, search for particular files (CSV) that meet certain criteria (contain "HR" in filename and created within specified date range), and copy/paste information out of that file into a master file. The master file is typically a compilation of 250+ workbooks.
The macro works as it should, but it takes about 12 minutes to run, which is a bit excessive. I believe it takes so long to run because it is indexing a folder with 30,000+ files in it.
I've copied the relevant lines of my code below, if anyone is aware of any modifications I could make that would decrease the duration of my macro, I would really appreciate it. I'm relatively new to VBA and coding in general, so I'm learning as I go with these sorts of things! Thanks!
Dim FilePath As String
Dim FileName As String
Dim objFSO As FileSystemObject
Dim objFolder As Folder
Dim objFile As file
Dim fileDate As Date
Dim firstDate As Date
Dim secondDate As Date
'Defining the user-input variables
Worksheets("Sheet1").Activate
firstDate = Cells(2, "E").Value
secondDate = Cells(3, "E").Value
'FilePath to information, defining file objects
FilePath = "\\SRV-1\process\DUMP\"
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFolder = objFSO.GetFolder(FilePath)
'Going through Dump folder and finding high resolution files created within the date range
For Each objFile In objFolder.Files
'Checking to see if the file contains the string "HR", indicating high resolution.
If InStr(1, objFile.Name, "HR") Then GoTo Line1 Else GoTo Line3
Line1:
'Storing the file as a variable and checking its creation date
FileName = objFile.Name
OpenFile = FilePath & FileName
fileDate = FileDateTime(OpenFile)
'Checking to see if the file was created between the user input master roll start/end dates
If firstDate < fileDate And secondDate > fileDate Then GoTo Line2 Else GoTo Line3
Line2:
'Do stuff: open dump workbook, copy/paste certain range into main workbook, close dump workbook, next objFile
Line3:
Next objFile
This ought to show some improvement, considering the ratio of HR files to total files (250 / 30,000).
Using Dir Function, Minimize reliance on FileSystemObject
The idea here is to use the Dir function first to get a list of all file names that contain the "HR" substring, and only use the FileSystemObject against those files to get the timestamp information -- there's no use incurring the overhead of FSO on every file in that directory.
Then, we process only those files which match the "HR" criteria:
Sub usingDir()
Dim folderPath As String
Dim fileName As String
Dim filesToProcess As New Collection
Dim item As Variant
Dim fileDate As Date
Dim firstDate As Date
Dim secondDate As Date
'Defining the user-input variables
firstDate = Cells(2, "E").Value
secondDate = Cells(3, "E").Value
folderPath = "\\SRV-1\process\DUMP\"
' Gets a collection of files matching the "HR" criteria
fileName = Dir(folderPath)
Do While Not fileName = ""
If InStr(fileName, "HR") > 0 Then
'Only processing files with "HR"
filesToProcess.Add (folderPath & fileName)
End If
fileName = Dir
Loop
'Now we deal only with the "HR" files:
With CreateObject("Scripting.FileSystemObject")
For Each item In filesToProcess
' Check the date last modified
fileDate = .GetFile(item).DateLastModified ' modify as needed
If firstDate < fileDate And secondDate > fileDate Then
'
'
Debug.Print item
'your code to Do Stuff goes here
'
'
'
End If
Next
End With
End Sub
UPDATE: Without Using the FileSystemObject
This was nagging at me, and I figured there must be a way to get the timestamp information without relying on FileSystemObject. There is. We'll still use Dir to traverse the files, but now we'll eliminate any reference to FileSystemObject and replace with some fancy WinAPI function calls. Check out Chip Pearson's article here and download the .bas modules. You'll need the following two files imported to your VBProject:
modGetSetFileTimes
modTimeConversionFunctions
And then you can do something like this:
Option Explicit
Sub withoutFSO()
Dim folderPath As String
Dim FileName As String
Dim filesToProcess As New Collection
Dim item As Variant
Dim fileDate As Date
Dim firstDate As Date
Dim secondDate As Date
'Defining the user-input variables
firstDate = Cells(2, "E").Value
secondDate = Cells(3, "E").Value
folderPath = "\\Your\Path"
' Gets a collection of files matching the "HR" criteria and our Date range
FileName = Dir(folderPath)
Do While Not FileName = ""
'Only processing files with "HR"
If InStr(FileName, "HR") > 0 Then
' Only process files that meet our date criteria
fileDate = CDate(modGetSetFileTimes.GetFileDateTime(folderPath & FileName, FileDateLastModified))
If firstDate < fileDate And secondDate > fileDate Then
filesToProcess.Add (folderPath & FileName)
End If
End If
FileName = Dir
Loop
'Now we deal only with the matching files:
For Each item In filesToProcess
Debug.Print item
Debug.Print fileDate
'your code to Do Stuff goes here
'
'
'
Next
End Sub
This should be an improvement even over my original answer, and, if combined with a more efficient manner of retrieving data (i.e., using ADO instead of Workbooks.Open, if possible) then you should be very optimized.
Take a look at Power Query -- it's a Microsoft add-in for Excel 2010 and 2013, and built in to 2016. Setting up PQ to do this will be amazingly fast, and the 'script' is reusable! No VBA needed.
You can search and combine the multiple files on the specified criteria, and then merge or append to the new/master file, too. For efficiency, rather than processing each file individually, might I suggest gathering up all the data files (by your criteria), combining them into one table, and then using that table to merge/append to the new/master file.
Hope this helps...
In addition to using the Dir function instead of FileSystemObject, if you cannot automate PowerQuery, and all you need is the data and not the formatting, consider making a direct data connection to the source workbooks using ADODB.
Add a reference to Microsoft ActiveX Data Objects 6.1 Library (via Tools -> References...). There may be versions other than 6.1; choose the highest.
Then you can use something like the following code:
Dim fso As New Scripting.FileSystemObject
Dim filepath As Variant
For Each filepath In filesToProcess
' Check the date last modified
fileDate = fso.GetFile(filepath).DateLastModified ' modify as needed
If firstDate < fileDate And secondDate > fileDate Then
Dim connectionString As String
connectionString = _
"Provider=Microsoft.ACE.OLEDB.12.0;" & _
"Data Source=""" & filepath & """;" & _
"Extended Properties=""Excel 12.0;HDR=No"""
Dim worksheetName As String
worksheetName = "Sheet1"
' There can be multiple worksheets per workbook.
' If you are only interested in one worksheet per workbook, then fill in worksheetName somehow
' Otherwise, you will probably need an inner loop to iterate over all the worksheets
Dim sql As String
sql = _
"SELECT * " & _
"FROM [" & worksheetName & "$]"
Dim rs As New ADODB.Recordset
rs.Open sql, connectionString
destinationWorksheet.Range("A1").CopyFromRecordset rs
rs.Close
Set rs = Nothing
End If
Next
It takes a long time because on each iteration you pass the information to the main worksheet.
In this case it is better to use a multidimensional array to hold the information, and at the end of the process write the whole array to the main worksheet in one go.
I don't know what information you get from each worksheet, so I can't create a didactic example for you.
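As a generic illustration of that idea only (the sheet names, ranges, and sizes below are placeholders, not the asker's actual data), the pattern looks something like this:
Sub CopyViaArray()
    Dim buffer() As Variant
    Dim srcData As Variant
    Dim rowsOut As Long, r As Long, c As Long

    ReDim buffer(1 To 100000, 1 To 3)   'placeholder capacity: up to 100,000 rows x 3 columns

    'For each source workbook you open, pull its range into a Variant array in one read...
    srcData = ThisWorkbook.Worksheets("Sheet1").Range("A1:C50").Value   'placeholder source range
    '...and append it to the buffer instead of pasting it straight to the main sheet.
    For r = 1 To UBound(srcData, 1)
        rowsOut = rowsOut + 1
        For c = 1 To UBound(srcData, 2)
            buffer(rowsOut, c) = srcData(r, c)
        Next c
    Next r

    'After all source files are processed, write the buffer to the main sheet in one shot.
    ThisWorkbook.Worksheets("Main").Range("A2").Resize(rowsOut, UBound(buffer, 2)).Value = buffer
End Sub
The single Range write at the end is what saves the time; reading and writing cell-by-cell is what makes the original approach slow.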
I have an issue and I need your help. Here is the problem: inside a folder I have some Excel files that I have to open automatically in order to perform some operations. The files have the same name except for the file number, like this:
Folder name : Extraction_Files
Files name : - "System_Extraction_Supplier_1"
- "System_Extraction_Supplier_2"
- "System_Extraction_Supplier_3"
The number of files can change, so I used a Do While loop to count the number of files; the plan is then to use a For i = 1 To (number of files) loop to open all of them.
Please read my code. I know I used the wrong way to build the file name in the For loop, but I'm sharing it because I don't have another idea.
Here is my code :
Sub OpenFiles ()
Dim MainPath as String
Dim CommonPath as String
Dim Count As Integer
Dim i As Integer
' the main path is " C:\Desktop\Extraction_Files\System_Extraction_Supplier_i"
'with i = 1 to Count ( file number )
CommonPath = "C:\Desktop\Extraction_Files\System_Extraction_Supplier_*"
'counting automatically the file number
Filename = Dir ( CommonPath )
Do While Filename <> ""
Count = Count + 1
Filename = Dir ()
Loop
'the issue is below because this code generate a MsgBox showing a MainPath with the index i like this
'"C:\Desktop\Extraction_Files\System_Extraction_Supplier_i"
' so vba can not find the files
For i = 1 To count
MainPath = "C:\Desktop\Extraction_Files\System_Extraction_Supplier_" & "i"
MsgBox MainPath
Workbooks.Open MainPath
Next
End Sub
what is the best approach to this?
Why not count as you open them? You're already identifying them, so why not open each file as you go:
Sub OpenFiles()
Dim Filename As String
Dim CommonPath As String
Dim Count As Integer
CommonPath = "C:\Desktop\Extraction_Files\"
Filename = Dir(CommonPath & "System_Extraction_Supplier_*")
Do While Filename <> ""
MsgBox Filename
Workbooks.Open CommonPath & Filename
Count = Count + 1
Filename = Dir()
Loop
End Sub
PS. It might be worth adding .xl* or similar to the end of your search pattern to prevent Excel trying to open files that aren't Excel files:
Filename = Dir(CommonPath & "System_Extraction_Supplier_*.xl*")
If you want to open all subfolders of a specific folder whose names start with "System_Extraction_Supplier_", only one loop is needed:
Sub OpenFolders()
Dim path As String: path = "C:\Desktop\Extraction_Files\"
Dim fileStart As String: fileStart = "System_Extraction_Supplier_"
Dim Fso As Object
Dim objFolder As Object
Dim objSubFolder As Object
Set Fso = CreateObject("Scripting.FileSystemObject")
Set objFolder = Fso.GetFolder(path)
For Each objSubFolder In objFolder.subfolders
If InStr(1, objSubFolder.Name, fileStart) Then
Shell "explorer.exe " & objSubFolder, vbNormalFocus
Debug.Print objSubFolder.Name
End If
Next objSubFolder
End Sub
Folders in VBA are opened with the Shell "explorer.exe " command. The code opens every subfolder of "C:\Desktop\Extraction_Files\" which contains "System_Extraction_Supplier_" in its name. This check is done with If InStr(1, objSubFolder.Name, fileStart) Then.
I'm trying to have the program copy files whose names contain certain characters. The files to be copied should be dated between today's date and 100 days before today. My program runs, but nothing shows up in the new folder. I did make sure that the files fall between those dates. I don't get any error, so I have no idea what to fix. I have tried other methods; none of them work.
I tried to mix the code from http://www.rondebruin.nl/win/s3/win026.htm. I was playing around with it; only copy_folder() is working. I'm getting run-time error '53' (File not found) on Copy_Certain_Files_In_Folder(), and Copy_Files_Dates() gives me nothing as well.
Anyway, what is wrong with my code, and how can I incorporate FileExt into my code below? Thanks!
Sub CopyPasteFiles()
Dim FSO As Object
Dim FromPath As String
Dim ToPath As String
Dim Fdate As Date
Dim FileExt As String
Dim objFile As Object
Dim objFolder As Object
FromPath = "C:\Users\Run" '<< Change
ToPath = "C:\Users\Test" '<< Change
FileExt = "*BT.csv"
If Right(FromPath, 1) <> "\" Then
FromPath = FromPath & "\"
End If
Set FSO = CreateObject("scripting.filesystemobject")
If FSO.FolderExists(FromPath) = False Then
MsgBox FromPath & " doesn't exist"
Exit Sub
End If
If FSO.FolderExists(ToPath) = False Then
MsgBox ToPath & " doesn't exist"
Exit Sub
End If
For Each objFolder In FSO.GetFolder(FromPath).SubFolders
For Each objFile In objFolder.Files
Fdate = Int(objFile.DateCreated)
If Fdate >= Date And Fdate <= Format(DateAdd("d", -100, Date), "dd mmmm yyyy") Then
objFile.Copy ToPath
End If
Next objFile
Next objFolder
MsgBox "You can find the files from " & FromPath & " in " & ToPath
End Sub
Okay, I tried to add some comments to give you some direction.
The first problem you had is that you weren't doing anything with the root folder - you were trying to go right into the subfolders, which is probably why you said it "highlighted" the lines on the outer loop layer. (The highlighted line is the line that will be executed when you hit F8 next.)
What I did was break the copy action into another procedure so you can call it recursively on any subfolders. It's just one way to do it - there are other, probably simpler ways, but it's what came to mind for me as I'm kind of used to digging in folders and recordsets recursively this way.
Another problem you had was your method of comparing dates. The .DateCreated property comes in with both date and time. You can directly compare it to the Now() function, which also returns date and time - but if you compare it to the Date() function you won't get what you expect, because Date() carries no time component.
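For instance (a tiny illustration only - the file path is a made-up placeholder):
Sub DateComparisonDemo()
    Dim fso As Object, created As Date
    Set fso = CreateObject("Scripting.FileSystemObject")
    created = fso.GetFile("C:\Users\Run\example_BT.csv").DateCreated   'placeholder file

    'DateCreated carries a time-of-day component, so compare against Now (date + time)...
    Debug.Print created <= Now And created >= DateAdd("d", -100, Now)

    '...or strip the time first if you want to compare against Date (date only).
    Debug.Print Int(created) <= Date And Int(created) >= Date - 100
End Sub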
I wasn't sure what you were trying to do with the file extension bit. I assumed you wanted to use it as a filter, so that's what I did with it.
A few notes:
You currently are telling the user at the end that "you can find the files from " but you are not checking whether that is true. You may want to add a check after the .Copy action and then add your results to an array or something, so you can show the user a list of files that successfully copied and files that did not. When I was testing, I created the folders you had in my Users directory, and I got an error about not having the required permissions when trying to copy.
Right now the From path, To path, and extension filter are all hard-coded. If you plan to distribute this or will be using it yourself in multiple locations, you could use the BrowseForFolder method to present the user with a folder browser dialog and allow them to select the From and To folders. You could also use InputBox to get a filter from the user. Just a thought.
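As a quick sketch of that last idea, using Excel's built-in FileDialog rather than the Shell BrowseForFolder method mentioned above (either works; this one needs no extra references):
'Returns a folder path chosen by the user, or an empty string if they cancel.
Private Function PickFolder(ByVal prompt As String) As String
    With Application.FileDialog(msoFileDialogFolderPicker)
        .Title = prompt
        .AllowMultiSelect = False
        If .Show = -1 Then PickFolder = .SelectedItems(1)
    End With
End Function
The calling code could then do strFrom = PickFolder("Select the From folder") and exit early if the result is empty.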
Anyways, here is what I did with your code. I changed the variable names to my naming convention simply because that is what I'm used to - you can change them however you want.
Option Explicit
Public Sub CopyPasteFiles()
'Declare variables
Dim SRfso As Scripting.FileSystemObject
Dim strFrom As String
Dim strTO As String
Dim strExtFilter As String
Dim SRfolderA As Scripting.Folder
Dim SRfolderB As Scripting.Folder
'Are you always going to hardcode these or do you want to be able to browse for a folder?
strFrom = "C:\Users\Run" '<< Change
strTO = "C:\Users\Test" '<< Change
'I'm not sure what your intent is with this - I assumed you wanted to filter by file extension.
strExtFilter = "*BT.CSV"
'Prep the folder path
If Right(strFrom, 1) <> "\" Then
strFrom = strFrom & "\"
End If
'Intialize the FileSystemObject
Set SRfso = New Scripting.FileSystemObject
'Verify input and output folders exist. Inform user if they don't.
If SRfso.FolderExists(strFrom) = False Then
MsgBox strFrom & " doesn't exist"
Exit Sub
End If
If SRfso.FolderExists(strTO) = False Then
MsgBox strTO & " doesn't exist"
Exit Sub
End If
'Get the input folder using the FileSystemObject
Set SRfolderA = SRfso.GetFolder(strFrom)
'Call the routine that copies the files
MoveTheFiles SRfolderIN:=SRfolderA, strFolderOUT:=strTO, strExtFilter:=strExtFilter
'Inform the user where they can find the files. CAUTION: You may be misinforming the user.
MsgBox "You can find the files from " & strFrom & " in " & strTO
End Sub
Private Sub MoveTheFiles(ByRef SRfolderIN As Scripting.Folder, _
ByRef strFolderOUT As String, _
Optional ByRef strExtFilter As String = "*.*", _
Optional ByRef blnSUBFOLDERS As Boolean = True)
'This routine copies the files. It requires two arguments. First, it requires the root folder as a folder object from the Scripting library.
'Second, it requires the output path as a string. There are two optional arguments. The first allows you
'to use a text filter as a string. The second is a boolean that tells us whether or not to move files in subfolders - the default is true.
'Declare variables
Dim SRfileA As Scripting.File
Dim SRfolderCol As Scripting.Folders
Dim SRfolderA As Scripting.Folder
Dim datCreated As Date
Dim lngFX As Long
Dim blnResult As Boolean
'Find the file extension in the filter
lngFX = InStrRev(strExtFilter, ".", , vbTextCompare)
'Move the files from the root folder
For Each SRfileA In SRfolderIN.Files
'Only work with files that contain the filter criteria
If UCase(Mid(SRfileA.Name, InStrRev(SRfileA.Name, ".", , vbTextCompare) - (Len(strExtFilter) - lngFX) + 1, Len(strExtFilter))) Like UCase(strExtFilter) Then
'Only work with files that were created within the last 100 days
datCreated = SRfileA.DateCreated
If datCreated <= Now And (datCreated >= DateAdd("d", -100, Now())) Then
SRfileA.Copy strFolderOUT
End If
End If
Next
'Check if the calling procedure indicated we are supposed to move subfolder files as well
If blnSUBFOLDERS Then
'Check that we have subfolders to work with
Set SRfolderCol = SRfolderIN.SubFolders
If SRfolderCol.Count > 0 Then
For Each SRfolderA In SRfolderIN.SubFolders
MoveTheFiles SRfolderIN:=SRfolderA, strFolderOUT:=strFolderOUT, strExtFilter:=strExtFilter, blnSUBFOLDERS:=blnSUBFOLDERS
Next
End If
End If
End Sub