If folder exists based on string, copy files to existing folder

I have a series of folders that I individually download via FTP on a daily basis.
Folder names are formatted as follows:
[DATE] [ID NO] [ID NAME]
Example:
W:\20150101 G0101 Building 1
W:\20150101 G0102 Building 2
W:\20150101 G0103 Building 3
[ID NO] and [ID NAME] are always the same. [DATE] changes each day.
I am trying to write a batch file that, each day, copies the contents of each FTP folder containing a known string to the equivalent local folder. Once the contents of each FTP folder are copied, the FTP folders are removed.
For example:
Always copy the folder containing the string "G0101" on FTP drive "W:\" to "C:\Building 1"
Always copy the folder containing the string "G0102" on FTP drive "W:\" to "C:\Building 2"
I have been playing around with IF EXIST but I can’t quite get the syntax right. Any assistance would be most appreciated!
IF EXIST "C:\00\*G0101*" XCOPY "C:\00\*G0101*\*.*" "C:\00\Building 1"
Regards
Martin :-)

You can only include a wildcard in the last element of a path, so you need to enumerate the source folders and run the xcopy operation for each of them:
for /d %%a in ("c:\00\*G0101*") do xcopy "%%~fa\*.*" "c:\00\Building 1"
The for /d loop enumerates the folders matching the wildcard; for each one, a reference is stored in the replaceable parameter %%a and the code in the do clause is executed. %%~fa is simply that reference expanded to the folder's full path.
The code is written to be used inside a batch file. For use directly on the command line, replace the double percent signs with single ones.
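If you ever need the same "find folders containing a string, copy their contents, then delete them" logic outside of cmd, here is a minimal cross-platform sketch in Python. It is an illustration of the idea, not the batch answer itself; the demo folder names are the example names from the question, and `copy_matching` is a hypothetical helper name.

```python
import glob
import os
import shutil
import tempfile

def copy_matching(src_root, pattern, dest):
    """Copy the contents of every folder directly under src_root whose
    name contains `pattern` into dest, then remove the source folder
    (the question deletes the FTP folders after copying)."""
    os.makedirs(dest, exist_ok=True)
    for folder in glob.glob(os.path.join(src_root, "*" + pattern + "*")):
        if os.path.isdir(folder):
            for name in os.listdir(folder):
                shutil.copy2(os.path.join(folder, name), dest)
            shutil.rmtree(folder)

# Demo with a throwaway directory standing in for W:\
root = tempfile.mkdtemp()
src = os.path.join(root, "20150101 G0101 Building 1")
os.makedirs(src)
open(os.path.join(src, "report.txt"), "w").close()
copy_matching(root, "G0101", os.path.join(root, "Building 1"))
print(os.listdir(os.path.join(root, "Building 1")))  # ['report.txt']
print(os.path.exists(src))  # False
```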

Related

Copy files in subdirs to azure storage with ADF

I have a folder structure like this:
folder1/folder2
    /YearNumber1
        /monthYear1
            /somefile.csv, tbFiles.csv
        /monthYear2
            /somefile2.csv, tbFiles2.csv
        ...(many folders as above)
    /YearNumber2
        /monthYear11
            /somefileXXYYZz.csv, otherFile.csv
        /monthYear12
            /someFileRandom.csv, dedFile.csv
        ...(many folders as above)
Source:
Binary, linked via fileshare linked service
Destination:
Binary, on azure blob storage
I don't want to retain the structure, just need to copy all csv files.
Using CopyActivity:
Wildcard Path: #concat('folder1/folder2/','*/','*/',) / '*.csv'
with recursive
But it copies nothing, 0 Bytes.
You can use the below options in the CopyActivity Source Setting:
1. File path type: wildcard paths
wildcardFolderPath - The folder path with wildcard characters under the file system configured in the dataset, used to filter source folders.
wildcardFileName - The file name with wildcard characters under your file system + folderPath/wildcardFolderPath, used to filter source files.
Allowed wildcards are: * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual folder name has a wildcard or this escape character inside.
See the official MS docs for more examples in Folder and file filter examples.
2. recursive - When set to true, the data is read recursively from the subfolders.
Example:
If there are only .csv files in your source directories, you can simply specify wildcardFileName as just *.
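As a quick sanity check of the wildcard path itself, a glob with the same shape selects exactly the files in question. This Python sketch only simulates the matching (it is not ADF); the folder names are the ones from the question, and note that Python's glob treats ? as exactly one character, whereas ADF documents it as zero or one.

```python
from pathlib import Path
import tempfile

# Recreate a slice of the question's folder tree in a temp directory
root = Path(tempfile.mkdtemp())
for sub in ["YearNumber1/monthYear1", "YearNumber1/monthYear2"]:
    d = root / "folder1" / "folder2" / sub
    d.mkdir(parents=True)
    (d / "notes.txt").touch()  # a non-csv file that should NOT match
(root / "folder1/folder2/YearNumber1/monthYear1/somefile.csv").touch()
(root / "folder1/folder2/YearNumber1/monthYear2/somefile2.csv").touch()

# wildcardFolderPath 'folder1/folder2/*/*' plus wildcardFileName '*.csv'
matches = sorted(p.name for p in root.glob("folder1/folder2/*/*/*.csv"))
print(matches)  # ['somefile.csv', 'somefile2.csv']
```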

Copy a set of files using ADF

I have 10 files in a folder and want to move 4 of them in a different location.
I tried two approaches to achieve this:
using a Lookup to retrieve the filenames from a JSON file, then feeding it to a ForEach iterator
using Get Metadata to get the file names from the source folder and then adding an If Condition inside a ForEach to copy the files.
But in both cases, all the files in the source folder get copied.
Any help would be appreciated.
Thanks!!
There are three ways you can consider for selecting your files, depending on the requirement or blockers.
Checkout official MS doc: Copy activity properties
1. Dynamic content for FilePath property in Source Dataset.
2. You can use Wildcard character in the source folder and file path in the source Dataset.
Allowed wildcards are: * (matches zero or more characters) and ? (matches zero or single character); use ^ to escape if your actual folder name has wildcard or this escape char inside. See more examples in Folder and file filter examples.
3. List of Files
Point to a text file that includes a list of files you want to copy, one file per line, which is the relative path to the path configured in the dataset. When using this option, do not specify file name in dataset. See more examples in File list examples.
Example:
Parameterize the source dataset and set the source file name to the one that passes the expression evaluation in the If Condition activity.
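The "List of Files" option (option 3) is the most direct fit for "copy 4 of 10 files". This Python sketch simulates its behavior outside ADF, under the question's assumption of ten files with four to move; the file names are invented for the demo.

```python
import os
import shutil
import tempfile

root = tempfile.mkdtemp()
src = os.path.join(root, "source")
dst = os.path.join(root, "dest")
os.makedirs(src)
os.makedirs(dst)
for i in range(10):
    open(os.path.join(src, "file%d.txt" % i), "w").close()

# A text file naming only the four files to copy, one relative path
# per line (ADF's "List of files" option works the same way)
listing = os.path.join(root, "filelist.txt")
with open(listing, "w") as f:
    f.write("\n".join("file%d.txt" % i for i in range(4)))

# Copy only what the list names, ignoring the other six files
with open(listing) as f:
    for line in f:
        rel = line.strip()
        if rel:
            shutil.copy2(os.path.join(src, rel), dst)

print(sorted(os.listdir(dst)))  # ['file0.txt', 'file1.txt', 'file2.txt', 'file3.txt']
```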

ADF Azure Data-Factory loop over folder syntax - wilcard?

I'm trying to loop over different country folders that each have a fixed subfolder named survey (i.e. Spain/survey, USA/survey).
Where and how do I need to define a wildcard/parameter for the countries so I can loop over all the files in the survey folders?
What is the right wildcard syntax (the equivalent of LIKE 'survey%' in SQL)?
I tried several ways to define it with no success and would be happy to get some help on this. Thanks!
If the list of paths is static, you can create a parameter, or store the paths in a SQL database and retrieve them with a Lookup activity.
Pass the output to a ForEach activity, and within the ForEach use a Copy activity.
You can parameterize the input dataset to get the file paths, so you need not think about wildcard characters at all but can use the actual paths themselves.
Hope this is helpful.
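For the wildcard syntax the question asks about, the glob equivalent of LIKE 'survey%' over "any country folder, fixed survey subfolder" is */survey/*. A small Python sketch of that matching, with the question's Spain/USA examples standing in for the real folders:

```python
from pathlib import Path
import tempfile

root = Path(tempfile.mkdtemp())
for country in ["Spain", "USA"]:
    d = root / country / "survey"
    d.mkdir(parents=True)
    (d / "answers.csv").touch()  # hypothetical survey file

# Glob equivalent of "any country folder, fixed 'survey' subfolder"
found = sorted(p.relative_to(root).as_posix() for p in root.glob("*/survey/*"))
print(found)  # ['Spain/survey/answers.csv', 'USA/survey/answers.csv']
```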

pentaho create archive folder with MM-YYYY

I would like to archive every file in a folder by putting it in another archive folder with a name like this: "Archive/myfolder-06-2014"
My problem is how to retrieve the current month and year and then how to create a folder (if it does not already exist) with these data.
This solution may be a little awkward (due to the required fuss), but it seems to work. The idea is to precompute the target filename in a separate transformation and store it as a system variable (TARGET_ZIP_FILENAME):
The following diagrams show the settings of selected components.
Get the current time...
Provide the pattern of the target filename as a string constant...
Extract the month and year as formatted integers...
Replace the month in the pattern (the year will work equivalently)
Set the resulting filename as a system variable
The main job will call the transformation and use the system variable as the zip target filename.
Also you have to make sure that the setting Create Parent folder is active:
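If all you need is the two core steps, building the "Archive/myfolder-MM-YYYY" name from the current date and creating the folder only if it is missing, they look like this as a standalone Python sketch (the `archive_folder` helper name is invented; the pattern is the one from the question):

```python
import os
import tempfile
from datetime import date

def archive_folder(base, name="myfolder", today=None):
    """Build '<base>/<name>-MM-YYYY' for the given date and create the
    folder if it does not already exist."""
    today = today or date.today()
    path = os.path.join(base, "%s-%02d-%d" % (name, today.month, today.year))
    os.makedirs(path, exist_ok=True)  # no error if it is already there
    return path

# Demo pinned to June 2014 so the result matches the question's example
base = tempfile.mkdtemp()
p = archive_folder(base, today=date(2014, 6, 15))
print(os.path.basename(p))  # myfolder-06-2014
print(os.path.isdir(p))     # True
```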

Using log parser to parse lot of logs in different folders

I recently started to use Log Parser with visual interface.
The logs that I want to parse come from IIS, and they are related to SharePoint. For example, I want to know how many people were visiting particular web pages, etc.
It seems that IIS creates logs in different folders (I don't know why), and every day there is a new log file in a different folder.
So my question is: is it possible to access all those files across the different folders?
I know you can use the FROM clause and list the different folders, but that is too cumbersome, especially if new folders are added in the future. The goal is to create one script which would be executed.
So for example, in a folder named LogFiles, I have folders folder1, folder2, folder3, folder4, etc., and in each folder there are log files log1, log2, log3, logN, etc.
So my query should be like this: SELECT * FROM path/LogFiles/*/*.log, but Log Parser doesn't accept it, so how can I achieve this?
You can use the -recurse option when calling logparser.
For example:
logparser file:"query.sql" -i:IISW3C -o:CSV -recurse
where query.sql contains:
select *
from .\Logs\*.log
and in my current directory, there is a directory called "Logs" that contains multiple sub-directories, each containing log files. Such as:
\Logs\server1\W3SVC1
\Logs\server1\W3SVC2
\Logs\server2\W3SVC1
\Logs\server2\W3SVC2
etc.
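If you would rather collect the recursive file list yourself (for example, to build a FROM clause), the equivalent walk is short in Python. This sketch recreates the directory layout above in a temp folder; the log file name is invented for the demo.

```python
from pathlib import Path
import tempfile

root = Path(tempfile.mkdtemp())
for sub in ["server1/W3SVC1", "server1/W3SVC2", "server2/W3SVC1"]:
    d = root / "Logs" / sub
    d.mkdir(parents=True)
    (d / "u_ex150101.log").touch()

# Equivalent of 'FROM .\Logs\*.log' with -recurse: every *.log below Logs
logs = sorted(p.relative_to(root).as_posix() for p in (root / "Logs").rglob("*.log"))
print(len(logs))  # 3
```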
You can merge the logs and then query the merged log.
What I had to do is:
LogParser.exe -i:w3c "select * into E:\logs\merged.log from E:\logs\WEB35\*, E:\logs\WEB36\*, E:\logs\WEB37\*" -o:w3c
I prefer PowerShell, like this:
Select-String C:\Logs\diag\*.log -pattern "/sites/Very" | ?{$_.Line -match "Important"}
LogParser's help does not list the -recurse option, so I'm not sure if it's still supported. However, this is what I did to get around it:
Let's say you use the following command to execute logparser -
logparser "SELECT * INTO test.csv FROM 'D:\samplelog\test.log'" -i:COM -iProgID:Sample.LogParser.Scriptlet -o:CSV
Then simply create a batch script to "recurse" through the folder structure and parse all the files in it. The batch script looks like this:
@echo off
rem Recurse through the current directory tree and parse every file;
rem %%~nxa expands to the file's name and extension for the output csv
for /r %%a in (*) do (
    logparser "SELECT * INTO '%%~nxa.csv' FROM '%%a'" -i:COM -iProgID:Sample.LogParser.Scriptlet
)
Execute it from the path where the log files that need to be parsed are located.
This can further be customized to collect all parsed output in one file using the append (>>) operator.
Hope this helps.
Check out https://stackoverflow.com/a/31024196/4502867, which uses PowerShell to recursively get file items in subdirectories and parse them.
