Write to file name based on logger - log4net

I want to write to a log file named after the logger I create.
For example, if I call LogManager.GetLogger("MyClass"), I want it to log to MyClass.log.
Is this possible?
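One common workaround (an assumption on my part, not from the post) is a process-wide property rather than a true per-logger file: set a GlobalContext property before configuring log4net and reference it from a PatternString in the appender's file element. A sketch, assuming log4net's XML configuration; it yields one file per property value, not automatically one file per logger:

```xml
<!-- Sketch: file name driven by a runtime property. In code, set
     log4net.GlobalContext.Properties["LogName"] = "MyClass";
     BEFORE XmlConfigurator.Configure() runs. -->
<appender name="PerNameFileAppender" type="log4net.Appender.FileAppender">
  <file type="log4net.Util.PatternString" value="%property{LogName}.log" />
  <appendToFile value="true" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date %level %logger - %message%newline" />
  </layout>
</appender>
```

For genuinely one file per logger you would need a separate appender per logger (or a custom appender), since log4net resolves the file name when the appender is configured, not per log call.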

Related

Azure Data Factory: output dataset file name from input dataset folder name

I'm trying to solve the following scenario in Azure Data Factory:
I have a large number of folders in Azure Blob Storage. Each folder contains a varying number of files in Parquet format. The folder name contains the date the data in that folder was generated, something like DATE=2021-01-01. I need to filter the files and save them into another container in delimited format, and each file's name should include the date from its source folder name.
So when my input looks something like this...
DATE=2021-01-01/
data-file-001.parquet
data-file-002.parquet
data-file-003.parquet
DATE=2021-01-02/
data-file-001.parquet
data-file-002.parquet
...my output should look something like this:
output-data/
data_2021-01-01_1.csv
data_2021-01-01_2.csv
data_2021-01-01_3.csv
data_2021-01-02_1.csv
data_2021-01-02_2.csv
Reading files from subfolders, filtering them, and saving them is easy. The problems start when I try to set the output dataset's file name dynamically. I can get the folder names using a Get Metadata activity and then set them into variables with a ForEach activity. However, I can't figure out how to use this variable in the filtering data flow's sink dataset.
Update:
My Get Metadata1 activity takes the container as its input (screenshots of the configuration and debug output omitted).
I think I've found the solution; I'll use CSV files as an example.
My input looks something like this:
container:input
2021-01-01/
data-file-001.csv
data-file-002.csv
data-file-003.csv
2021-01-02/
data-file-001.csv
data-file-002.csv
Use the Get Metadata1 activity to get the folder list, then use the ForEach1 activity to iterate over it (debug output screenshot omitted).
Inside the ForEach1 activity, use a data flow to move the data.
Set the source dataset to the container and declare a parameter FolderName.
Then add the dynamic content #dataset().FolderName to the source dataset.
Back to the ForEach1 activity, we can add dynamic content #item().name to parameter FolderName.
Enter File_Name in the source's 'Column to store file name' setting. It stores each file's path as a column, e.g. /2021-01-01/data-file-001.csv.
Then we can process this column to get the file name we want via DerivedColumn1.
Add the expression concat('data_',substring(File_Name,2,10),'_',split(File_Name,'-')[5]).
In the sink's Settings, select 'Name file as column data' and choose the File_Name column.
That's all.
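The derived-column expression above can be mirrored in plain Python to see what it computes (derive_output_name is a hypothetical helper; ADF's substring and split indexing are 1-based, so the indices shift by one in Python):

```python
def derive_output_name(file_name: str) -> str:
    # Mirrors: concat('data_', substring(File_Name,2,10), '_', split(File_Name,'-')[5])
    # ADF substring(str, start, length) is 1-based; Python slicing is 0-based.
    date_part = file_name[1:11]          # substring(File_Name, 2, 10) -> "2021-01-01"
    # ADF split(...)[5] is 1-based, so it maps to Python index 4.
    last_part = file_name.split('-')[4]  # "001.csv"
    return 'data_' + date_part + '_' + last_part

print(derive_output_name('/2021-01-01/data-file-001.csv'))
# → data_2021-01-01_001.csv
```

Note the expression keeps the source file's full numeric suffix and extension, so the sink names come out as data_2021-01-01_001.csv rather than data_2021-01-01_1.csv.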

Data Factory Data Flow sink file name

I have a data flow that merges multiple pipe-delimited files into one file and stores it in an Azure Blob container. I'm using a file pattern for the output file name: concat('myFile' + toString(currentDate('PST')), '.txt').
How can I grab the file name that's generated after the data flow completes? I have other activities that log the file name into a database, but I'm not able to figure out how to get the file name.
I tried #{activity('Data flow1').output.filePattern} but it didn't help.
Thank you
You can use a Get Metadata activity to get the file name that is generated after the data flow.
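Since the pattern is built only from the current date, another option (my assumption, not from the answer) is to recompute the same name in the logging step. A rough Python sketch, assuming toString(currentDate('PST')) renders the date as yyyy-MM-dd (check this against an actual output file):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def sink_file_name(now=None):
    # Recompute the sink's pattern: concat('myFile' + toString(currentDate('PST')), '.txt')
    now = now or datetime.now(ZoneInfo('America/Los_Angeles'))
    return 'myFile' + now.strftime('%Y-%m-%d') + '.txt'

print(sink_file_name())
```

Inside the pipeline itself, the equivalent would be rebuilding the name with expression functions such as concat, convertFromUtc, and formatDateTime.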

Handling logs and writing to a file in python?

I have a module named acms containing a number of Python files. main.py calls into the other files. I have added logging to those files, and the logs are displayed on the console, but I also want to write them to a file called all.log. I tried setting log levels and a logger in a file called log.py, but I didn't get the expected format. Since I'm new to Python, I'm having difficulty handling logs.
Use the logging module with logger = logging.getLogger(__name__). Each module will then use the correct logger with the options you have set up.
See the thinkpad-scripts project for its logging setup; the logging cookbook also has a section on logging to multiple locations.
We use the following to log to the console and the syslog:
import logging
import logging.handlers
import os

syslog_format = '%(name)s: %(levelname)s %(message)s'  # illustrative; use your own format

# Use /dev/log when available, otherwise SysLogHandler's default address.
kwargs = {}
dev_log = '/dev/log'
if os.path.exists(dev_log):
    kwargs['address'] = dev_log
syslog = logging.handlers.SysLogHandler(**kwargs)
syslog.setLevel(logging.DEBUG)
formatter = logging.Formatter(syslog_format)
syslog.setFormatter(formatter)
logging.getLogger('').addHandler(syslog)
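The question's actual goal, writing every module's logs to all.log as well as the console, can be sketched with stdlib handlers alone. The file name all.log comes from the question; the acms.example logger name is illustrative:

```python
import logging

# Configure the root logger once, e.g. in main.py; every module's
# logging.getLogger(__name__) logger inherits these handlers.
root = logging.getLogger()
root.setLevel(logging.DEBUG)

formatter = logging.Formatter('%(asctime)s %(name)s %(levelname)s: %(message)s')

console = logging.StreamHandler()       # console output
console.setFormatter(formatter)
root.addHandler(console)

file_handler = logging.FileHandler('all.log')  # shared log file
file_handler.setFormatter(formatter)
root.addHandler(file_handler)

# In each module: logger = logging.getLogger(__name__)
logging.getLogger('acms.example').info('written to console and all.log')
```

Because the handlers live on the root logger, no per-module configuration is needed beyond getLogger(__name__).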

How can I write my properties file if I want to get a log file every hour using log4j?

I have my properties file working, but what should I do if I want to put the log file in a folder related to the date?
For example, today is 2015-12-29 and at 10:30 I started my Java project with the following log4j.properties:
log4j.appender.inforlog=org.apache.log4j.DailyRollingFileAppender
log4j.appender.inforlog.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.inforlog.File=D:/inforLogs/2015/12/searchrecord
When it reaches 11:00, there will be a log file named searchrecord.2015-12-29-10 in D:/inforLogs/2015/12/. When it reaches 2016-01-01, the log file will also be placed in D:/inforLogs/2015/12/, but I want it in D:/inforLogs/2016/01/. How do I write the properties file to achieve this?
I have resolved the problem myself. DailyRollingFileAppender builds the active file path by appending the formatted DatePattern to File, so moving the year and month into the DatePattern makes the folder track the date. Here is the properties file:
log4j.appender.inforlog.DatePattern='s/'yyyy'/'MM'/searchrecord-'dd'_'HH'.log'
log4j.appender.inforlog.File=D:/inforLog

Rename the file and the name of the class inside in the same time

For PHP files, is there any plugin that renames the file and the class inside it at the same time?