Can I force the archiving of NLog files?

I've made a page on my website where the admin can see all the error files.
I've set the archiving to 'day', so for every day where an error occurred I have a file called error.yyyyMMdd.txt (in a subfolder called archives), and for today I have the file 'error.txt'.
The problem is that after a few days without errors, 'error.txt' is not touched, so it is not from today but from, let's say, 5 days ago, and the archives subfolder has no error file for that day.
Is there a way to 'force' NLog to perform its archiving and thereby create the archive file?

This isn't possible (yet).
If this gets added, it will be added as a method on the FileTarget class.
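For reference, the daily-archiving setup the question describes corresponds to a FileTarget configured along these lines (a sketch; the file and folder names follow the question, and the attributes are standard NLog file-target options):

<target xsi:type="File" name="errorfile"
        fileName="error.txt"
        archiveFileName="archives/error.{#}.txt"
        archiveNumbering="Date"
        archiveDateFormat="yyyyMMdd"
        archiveEvery="Day" />

NLog only evaluates the archive condition when it writes to the target, which is why a run of error-free days leaves a stale error.txt behind with no matching archive file.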

Rotate logfiles on an hourly basis by appending date and hour

I want to implement a log rotation option on Linux. I have a *.trc file to which all the logs are written, and I want a new log file to be created every hour.
From my analysis I learned about the logrotate option, where we put the rotation details for a specific file in the logrotate.conf file.
I want to know if there is an option that does not use logrotate: rotating the log files on an hourly basis, i.e. appending date and hour information to the log file name and creating new files based on the current hour.
I'm looking for suggestions on how to implement log rotation using the second option specified above.
Any details on the above would be really helpful.
If you have control over the process that creates the logs, you could just timestamp the file at the moment of creation. This will remove the need to rename the log.
Before you write each line, check the time; if an hour has passed since the file was created, close the current file and open a new one with a new timestamp.
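A minimal shell sketch of this idea (the file name pattern is illustrative):

# append each incoming line to a file stamped with the current hour;
# a new file starts automatically when the hour rolls over
while IFS= read -r line; do
    printf '%s\n' "$line" >> "trace.$(date +%Y%m%d%H).trc"
done

This reopens the file on every write instead of tracking creation time, which keeps the script trivial at the cost of one open per line.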
If you do not have control over the process, you can pipe its output (stdout and stderr) to multilog, a binary that is part of the daemontools package in most Linux distros.
https://cr.yp.to/daemontools/multilog.html
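Typical usage looks like this (assuming daemontools is installed; the size and count values are illustrative):

./myprocess 2>&1 | multilog t s16777215 n24 ./log

Here t prepends a timestamp to each line, s sets the maximum size of each log file in bytes, n sets how many rotated files to keep, and ./log is the directory multilog writes into.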

Filter recent files in Logic Apps' SFTP when files are added/modified trigger

I have this Logic App that connects to an SFTP server and it's triggered by the "files are added or modified" trigger. It's set to run every 10 minutes, looking for new/modified files and copying them to an Azure storage account.
The problem is that this SFTP server path is set to overwrite a set of files every X minutes (I have no control over this) and so, pretty often the Logic App overlaps with the update process of these files and downloads files that are still being written. The result is corrupted files.
Is there a way to add a filter to the "When files are added or modified (properties only)" trigger so that it only takes into consideration files whose modified date is at least 1 minute old?
That way, files that are currently being written won't be added to the list of files to download. The next run of the Logic App would then fetch these ignored files, and so on.
UPDATE
I've found a Trigger Conditions option in the trigger's settings, but I can't find any documentation about it.
From testing the "When files are added or modified" trigger, it seems we cannot add a filter in the trigger itself to pick up only the records modified at least 1 minute ago. We can only get the list of files, loop over their LastModified datetimes, and use an "If" condition to judge whether we should download each one.
Update:
The expression for the age check is:
sub(ticks(utcNow()), ticks(triggerBody()?['LastModified']))
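Since ticks are 100-nanosecond units, one minute is 600,000,000 ticks, so the "older than one minute" test in the If condition could look like this (a sketch):

greater(sub(ticks(utcNow()), ticks(triggerBody()?['LastModified'])), 600000000)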
Update workaround
Is it possible to add a "Delay" action when the last modified time is less than 1 minute old? For example, if the file was modified less than 60 seconds ago, use "Delay" to wait 5 minutes until the overwrite operation completes, then do the download.
I checked the sample @equals(triggers().code, 'InternalServerError'); it uses the condition functions from the Logical comparison functions, and the key point is to make sure the property you want to filter on exists in the trigger or triggerBody, or the condition will fail with an error.
So I changed the expression to something like @greater(triggerBody().LastModified, '2020-04-20T11:23:00Z'); with this, files modified before 2020-04-20T11:23:00Z do not trigger the flow.
You could also use other functions like less, greaterOrEquals, etc. from the Logical comparison functions.
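Putting the two together, a dynamic Trigger Condition that fires only for files at least one minute old could reuse the same tick arithmetic as above (a sketch, not verified against every connector's trigger payload):

@greater(sub(ticks(utcNow()), ticks(triggerBody()?['LastModified'])), 600000000)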

Quartz Scheduling to delete files

I am using the file component with a quartz scheduler argument in order to pull some files from a given directory every hour. Then I transform the data from the files and move the content to other files in another directory. After that I move the input files to an archive directory. When a file is moved to this directory it should stay there only a week and then be deleted automatically. The problem is that I'm not really sure how to start a new cron job for this, because I don't know when any of the files is moved to the archive directory. Maybe it is something really trivial, but I am pretty new to Camel and I don't know the solution. Thank you in advance.
Use option "filterFile"
Every file has a modified timestamp, and you can use this timestamp to filter files that are older than 1 week. The file component has an option filterFile:
filterFile=${date:file:yyyyMMdd}<${date:now-7d:yyyyMMdd}
The evaluation above uses the file language: ${date:file:yyyyMMdd} denotes the modified timestamp of the file in the form (year)(month)(day), and ${date:now-7d:yyyyMMdd} denotes the current time minus 7 days in the same form.
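Put together as a route, a weekly-cleanup sketch might look like this (the directory name follows the question; the polling interval is left at the component default):

import org.apache.camel.builder.RouteBuilder;

public class ArchiveCleanupRoute extends RouteBuilder {
    @Override
    public void configure() {
        // poll the archive directory; only files modified more than 7 days ago
        // pass filterFile, and delete=true removes each matched file after routing
        from("file:archive?delete=true"
                + "&filterFile=${date:file:yyyyMMdd}<${date:now-7d:yyyyMMdd}")
            .log("Deleting aged archive file: ${file:name}");
    }
}

Because the filter runs on every poll, there is no need to know when a file entered the archive directory; it is simply picked up once its modified timestamp is old enough.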

Uploading delta changes to perforce and label

Every month I need to sync a big chunk of files from git, upload it to Perforce, and create a p4 label with all the uploaded files; from p4 I do builds. This is odd, but we cannot change it now. So far I upload the files to a new directory in p4 every time and create a label with all uploaded files, so I get a clean label for "this month's build", even though most of the files didn't change compared to last month's upload. This happens through a shell script. Is there a way to upload only the delta changes and create a label with the old files (not uploaded, since they didn't change) plus the changed/brand-new files? Any high-level directions will help, thanks.
Do I understand correctly that "I am uploading the files to a new directory in p4 every time" means that, each time you perform a 'p4 submit', the files in your changelist actually have different file names, not just newer revisions of the same files?
That is, if you did 'p4 describe' of the first changelist you submitted, would it look like:
//depot/dir/r1/A.c#1
//depot/dir/r1/B.c#1
//depot/dir/r1/C.c#1
while the next time you ran your tool, and did a 'p4 describe' of the changelist, it would look like:
//depot/dir/r2/A.c#1
//depot/dir/r2/B.c#1
//depot/dir/r2/C.c#1
If that is what you are doing, you could instead copy the files to the same location each time, so that the first time your tool ran you got:
//depot/dir/A.c#1
//depot/dir/B.c#1
//depot/dir/C.c#1
and the second time your tool ran you got:
//depot/dir/A.c#2
//depot/dir/B.c#2
//depot/dir/C.c#2
If you did things that way, then you could use the 'revertunchanged' feature of Perforce, and the files that were unchanged from the previous run of your tool would not be submitted, and the revision number would not change, so that over time you might get:
//depot/dir/A.c#7
//depot/dir/B.c#4
//depot/dir/C.c#19
Each time you run your tool, you can still create a label, because the label includes not just the file name but also the revision number, so the first label you created might include:
//depot/dir/A.c#1
//depot/dir/B.c#1
//depot/dir/C.c#1
while the 19th label you created might include:
//depot/dir/A.c#7
//depot/dir/B.c#4
//depot/dir/C.c#19
Hopefully this will get you pointed in a direction that is more suited to your goals.
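A shell sketch of that flow (the depot path, label name, and description are illustrative; reconcile, the revertunchanged submit option, and tag are standard p4 commands):

# detect adds, edits, and deletes by comparing the workspace to the depot
p4 reconcile //depot/dir/...
# submit, reverting any opened files whose content did not actually change
p4 submit -f revertunchanged -d "monthly import"
# label the head revisions of everything under the directory
p4 tag -l monthly-build //depot/dir/...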

NLOG Rollover Configuration

Is there a configuration in NLog that accomplishes the following?
1) A new log file is created when the current file exceeds a particular size, e.g. 5 MB.
2) Old log files are deleted after a configured time period, e.g. 1 day.
You can find the answer to your question (and examples) on these pages:
Size-based file archival - log files can be automatically archived by moving them to another location after reaching a certain size, and
Time-based file archival - log files can also be automatically archived based on time.
Try the second one and roll the log files every day. Then you can keep a maximum number of archived files.
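A single FileTarget can combine both (a sketch; the file names are illustrative, archiveAboveSize is in bytes, and note that NLog prunes by archive count rather than by age, so with daily archiving maxArchiveFiles="1" approximates a one-day retention):

<target xsi:type="File" name="logfile"
        fileName="logs/app.txt"
        archiveFileName="logs/archive/app.{#}.txt"
        archiveAboveSize="5242880"
        archiveEvery="Day"
        archiveNumbering="Rolling"
        maxArchiveFiles="1" />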
