Azure: create filename from blob name in NLog.config

I am a newbie to Azure and just digging into my first task. We are creating log files for error logs.
I want to create 4 different files per day, each holding 6 hours of logs, starting from the beginning of the day. Please find my nlog.config code below:
<target type="AzureBlobStorage"
name="Trace-BlobStorageLogger"
layout=""
connectionString=""
container=""
blobName="nlog-storage-trace-test-${date:format=dd-MM-yyyy}.txt"/>
Right now it generates one file for the whole day, but once the blob's storage capacity is full, no further logs are written.
We want to divide the day into 4 files of 6 hours each, named something like:
nlog-storage-trace-test-10-06-2020-0000-0600.txt
nlog-storage-trace-test-10-06-2020-0600-1200.txt
and so on.
What change is needed in blobName in the target tag of the nlog.config file, or any other change, to fulfill this requirement?
Thanks

The "new" NLog.Extensions.AzureBlobStorage will reduce the number of writes, so it stay below 50000 file-operations per day:
https://www.nuget.org/packages/NLog.Extensions.AzureBlobStorage/
But if you want the filename to "expire" every 6 hours, then I guess you can use cachedSeconds to do this:
<target type="AzureBlobStorage"
name="Trace-BlobStorageLogger"
blobName="nlog-storage-trace-test-${date:format=dd-MM-yyyy_hhmm:cachedSeconds=21600}.txt"/>
Note that cachedSeconds refreshes the value 21600 seconds after it was first rendered, so the boundaries may not line up exactly with 00:00/06:00/12:00/18:00. Alternatively, you can write and register your own custom NLog LayoutRenderer that computes the exact six-hour window.

Related

Rotate logfiles on an hourly basis by appending date and hour

I want to implement a log rotation option in Linux. I have a *.trc file where all the logs are written, and I want a new log file to be created every hour.
I have done some analysis and got to know about the logrotate option, where we add the rotation details for a specific file in the logrotate.conf file.
I want to know if there is an option without using logrotate: rotating the logfiles on an hourly basis by appending date and hour information to the log file name and creating new files based on the current hour.
I'm looking for suggestions on how to implement log rotation using the second option specified above.
Any details on the above would be really helpful.
If you have control over the process that creates the logs, you could just timestamp the file at the moment of creation. This removes the need to rename the log.
Before you write each line, check the time; if one hour has passed since the file was created, close the current file and open a new one with a new timestamp.
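A minimal sketch of that approach, assuming a POSIX shell and that the process writes its log lines to stdout (the app-*.trc naming and script name are just examples):
#!/bin/sh
# Append each incoming line to a file named for the current date and hour;
# a new file starts automatically when the hour rolls over.
while IFS= read -r line; do
    printf '%s\n' "$line" >> "app-$(date +%Y%m%d-%H).trc"
done
You would then pipe the process into it, e.g. yourprocess 2>&1 | ./hourlylog.sh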
If you do not have control over the process, you can pipe the output of your process (stdout, stderr) to multilog, a binary that is part of the daemontools package in most Linux distros.
https://cr.yp.to/daemontools/multilog.html

Filter recent files in Logic Apps' SFTP "when files are added or modified" trigger

I have this Logic App that connects to an SFTP server and it's triggered by the "files are added or modified" trigger. It's set to run every 10 minutes, looking for new/modified files and copying them to an Azure storage account.
The problem is that this SFTP server path is set to overwrite a set of files every X minutes (I have no control over this) and so, pretty often the Logic App overlaps with the update process of these files and downloads files that are still being written. The result is corrupted files.
Is there a way to add a filter to the When files are added or modified (properties only) so that it only takes into consideration files with a modified date of, at least, 1 minute old?
That way, files that are currently being written won't be added to the list of files to download. The next run of the Logic App would then fetch these ignored files, and so on.
UPDATE
I've found a Trigger Conditions option in the trigger's settings, but I can't find any documentation about it.
From testing the trigger "When files are added or modified", it seems we cannot add a filter in the trigger itself to skip records modified less than 1 minute ago. We can only get the list of files' LastModified datetimes, loop over them, and use an "If" condition to decide whether to download each file.
Update:
The expression used in the "If" condition is:
sub(ticks(utcNow()), ticks(triggerBody()?['LastModified']))
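Since a tick is 100 nanoseconds, one minute equals 600000000 ticks, so a hypothetical "If" condition testing for "at least 1 minute old" could (as an untested sketch) compare the result against that threshold:
@greaterOrEquals(sub(ticks(utcNow()), ticks(triggerBody()?['LastModified'])), 600000000)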
Update workaround
Is it possible to add a "Delay" action when the last modified time is less than 1 minute old? For example, if the last modified time is less than 60 seconds ago, use "Delay" to wait 5 minutes until the overwrite operation completes, then do the download.
I checked the sample @equals(triggers().code, 'InternalServerError'); it uses the condition functions from the Logical comparison functions, so the key is to make sure the property you want to filter on is available in the trigger or triggerBody, or you will get an error.
So I changed the expression to something like @greater(triggerBody().LastModified, '2020-04-20T11:23:00Z'); with this, files modified before 2020-04-20T11:23:00Z do not trigger the flow.
You could also use other functions such as less, greaterOrEquals, etc. from the Logical comparison functions.
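Combining these, a trigger condition that only fires for files whose LastModified is at least 1 minute in the past might look like the following (an untested sketch; less, ticks, and addMinutes are standard workflow expression functions):
@less(ticks(triggerBody()?['LastModified']), ticks(addMinutes(utcNow(), -1)))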

Is it possible to archive logs based on size and on time using NLog

I am using NLog.
I would like to have a size-based file archival and a time-based file archival.
Meaning, every time the log file exceeds 10 MB a new log file is created. Also, every day a new log file is created.
It is clear how to do each of the above separately (https://github.com/NLog/NLog/wiki/FileTarget-Archive-Examples) but not how to use them in tandem.
Without any details of the expected fileName layout, this will work just fine in NLog 4.5 (and newer):
<target type="file" name="logfile" fileName="App-${shortdate}.log" archiveAboveSize="1000000" maxArchiveFiles="30" />
It will produce the following filenames (newest first):
App-20200216.log
App-20200216.2.log
App-20200216.1.log
App-20200215.log
App-20200214.log
App-20200214.1.log
See also: https://github.com/NLog/NLog/wiki/File-target#archive-old-log-files

How to retrieve files generated in the past 120 minutes in Linux and move them to another location

For one of my projects, I have a challenge where I need to take all the reports generated in a certain path, and I want this to be an automated process in Linux. I know how to get the names of the files that have been updated in the past 120 minutes, but not the files themselves. My requirements are:
1. Take the files that have been updated in the past 120 minutes from the path
/source/folder/which/contains/files
2. Run some business logic on these files, which I can take care of
3. Move these files to
/destination/folder/where/files/should/go
I know how to achieve #2 and #3, but I'm not sure about #1. Can someone help me achieve this?
Thanks in advance.
Write a shell script; sample below. I haven't provided the commands to get the actual list of file names, as you said you know how to do that.
#!/bin/sh
files=<my file list>
# copy each file in the list to the destination
for file in $files; do
    cp "$file" <destination_directory>
done
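For step #1, here is a sketch using find's -mmin test (this assumes GNU find and that the filenames contain no newlines):
#!/bin/sh
# Select regular files modified within the last 120 minutes,
# run the business logic on each, then move it to the destination.
find /source/folder/which/contains/files -type f -mmin -120 |
while IFS= read -r file; do
    # ... business logic on "$file" goes here ...
    mv "$file" /destination/folder/where/files/should/go/
done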

NLog Rollover Configuration

Is there a configuration in NLog that accomplishes the following?
1) A new log file should be created when the current file exceeds a particular size, e.g. 5 MB.
2) The old log files should be deleted after a configured amount of time, e.g. 1 day.
You can find the answer to your question (and examples) on this page:
Size-based file archival - log files can be automatically archived by moving them to another location after reaching a certain size, and
Time-based file archival - log files can also be automatically archived based on time.
Try using the second one to roll the log file every day. Then you can keep a maximum number of archived files, as in the sketch below.
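As a sketch, both can be combined in one target. archiveAboveSize is in bytes (5242880 ≈ 5 MB), and maxArchiveDays requires NLog 4.7 or newer, so verify the attribute set against your NLog version:
<target type="file" name="logfile"
        fileName="App.log"
        archiveFileName="App.{#}.log"
        archiveNumbering="DateAndSequence"
        archiveEvery="Day"
        archiveAboveSize="5242880"
        maxArchiveDays="1" />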
