I am trying to rename a file in multiple directories
I have a file called slave.log in multiple directories named slave1, slave2, ... slave17. Daily log rotation creates a new file with the current date in its name, even though the file actually contains the previous day's data. I want to rename those files with the previous day's date instead.
I have written a shell script that works fine, but the problem is that I need to pass the path as a parameter, and since I have 17 directories I can't schedule 17 cron entries. I have only basic knowledge of scripting. Please help me with the best solution for this scenario.
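One cron entry can cover all the directories if the script loops over them itself. Here is a minimal sketch, assuming the slaveN directories are siblings under a common base directory and the rotated file is named slave.log.YYYY-MM-DD (both assumptions; adjust to your actual layout), and assuming GNU date for "date -d yesterday":

```shell
# Sketch: rename today's rotated log to yesterday's date in every slaveN dir.
# The base path, directory pattern, and file-name format are assumptions.
rename_rotated_logs() {
    base=$1
    today=$(date +%F)
    yesterday=$(date -d yesterday +%F)
    for dir in "$base"/slave*; do
        if [ -f "$dir/slave.log.$today" ]; then
            mv "$dir/slave.log.$today" "$dir/slave.log.$yesterday"
        fi
    done
}
```

A single crontab line such as `0 1 * * * /path/to/rename_logs.sh /path/to/base` would then handle all 17 directories in one run.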
I have data folders created on a daily basis in the data lake. The folder path is built dynamically from JSON.
Source Folder Structure
SAPBW/Master/Text
Destination Folder Structure
SAP_BW/Master/Text/2019/09/25
SAP_BW/Master/Text/2019/09/26
SAP_BW/Master/Text/2019/09/27
..
..
..
SAP_BW/Master/Text/2019/10/05
SAP_BW/Master/Text/2019/10/06
SAP_BW/Master/Text/2019/10/07
..
..
SAP_BW/Master/Text/2019/10/15
SAP_BW/Master/Text/2019/10/16
SAP_BW/Master/Text/2019/10/17
I want to delete the folders created more than five days ago for each SinkTableName folder.
So, in Data Factory, I have built the folder path in a ForEach loop as:
@concat(item().DestinationPath,item().SinkTableName,'/',item().LoadTypeName,'/',formatDateTime(adddays(utcnow(),-5),item().LoadIntervalFormat),'/')
I need the syntax to delete the files in each folder based on the JSON. I am unable to find a way to delete folder by folder and to set up the Delete activity against dates more than five days before now.
I see that you are doing a concatenation, which I think is the way to go. But you are using the expression formatDateTime(adddays(utcnow(),-5)), which will give you something like 2019-10-15T08:23:18.9482579Z, which I don't think is what you want. I suggest trying @formatDateTime(adddays(utcnow(),-5),'yyyy/MM/dd') instead. Let me know how it goes.
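Putting that format together with the concatenation from the question, the folder path for the Delete activity might look like the following. Treat this as a sketch: the property names DestinationPath, SinkTableName, and LoadTypeName come from the question's JSON, and the 'yyyy/MM/dd' format assumes the year/month/day folder layout shown above.

```
@concat(item().DestinationPath, item().SinkTableName, '/', item().LoadTypeName, '/',
        formatDateTime(adddays(utcnow(), -5), 'yyyy/MM/dd'), '/')
```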
I am using the file component with a Quartz scheduler argument in order to pull some files from a given directory every hour. Then I transform the data from the files and move the content to other files in another directory. After that I move the input files to an archive directory. When a file is moved to this directory it should stay there only a week and then be deleted automatically. The problem is that I'm not really sure how I can start a new cron job, because I don't really know when any of the files is moved to that archive directory. Maybe it's something really trivial, but I am pretty new to Camel and I don't know the solution. Thank you in advance.
Use the option "filterFile"
Every file has a modified timestamp, and you can use this timestamp to filter files that are older than one week. The file component has an option filterFile:
filterFile=${date:file:yyyyMMdd}<${date:now-7d:yyyyMMdd}
The evaluation above comes from the File language: ${date:file:yyyyMMdd} denotes the modified timestamp of the file in the form (year)(month)(day), and ${date:now-7d:yyyyMMdd} denotes the current time minus 7 days in the same form.
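In route terms the option sits on the consuming file endpoint. A sketch (the archive directory name and the pairing with delete=true are assumptions, not from the question):

```
from("file:archive?filterFile=${date:file:yyyyMMdd}<${date:now-7d:yyyyMMdd}&delete=true")
    .log("removing aged file ${file:name}");
```

With delete=true, each file the filter matches (i.e. older than a week) is removed once the route consumes it, so no separate cron job is needed.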
I have multiple zip files in a folder with names like the following:
"abc.zip-20181002084936558425"
How can I rename all of them with one command to get results like this:
"abc-20181002084936558425.zip"
I want the timestamp before the extension for multiple file names. Every file has a different timestamp, so the renaming should take that into account. Can I rename multiple files like this with a single command?
Provided your files really all follow the same naming convention and you are in the right directory:
for i in *.zip-*; do newName=${i//.zip/}; mv "$i" "$newName.zip"; done
should do the trick. (The variables are quoted so names with spaces don't break the mv.)
I have a process that periodically gets files from a server and copies them with SFTP to a local directory. It should not overwrite a file if it already exists. I know that with something like Winston I can automatically rotate a log file when it fills up, but in this case I need similar functionality to rotate files if they already exist.
An example:
The routine copies a remote file called testfile.txt to a local directory. The next time it runs, the same remote file is found and copied. But now I want to rename the first testfile.txt to testfile.txt.0 so it's not overwritten. And so on: after a while I'd have a directory of files named testfile.txt.N plus the most recent testfile.txt.
What you can do is append the date and time to the file name; that gives every file a unique name and also helps you archive it.
For example, your test.txt can become either 20170202_181921_test.txt or test_20170202_181921.txt.
You can use a JavaScript Date object to get the date and time.
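As a sketch of that Date approach (the helper name timestampedName and the yyyyMMdd_HHmmss format are illustrative choices, not from the question):

```javascript
// Sketch: build a timestamped variant of a file name using the Date API.
function timestampedName(name) {
  const d = new Date();
  const pad = (n) => String(n).padStart(2, '0');
  const stamp = `${d.getFullYear()}${pad(d.getMonth() + 1)}${pad(d.getDate())}` +
                `_${pad(d.getHours())}${pad(d.getMinutes())}${pad(d.getSeconds())}`;
  const dot = name.lastIndexOf('.');
  // Insert the stamp before the extension, or append it if there is none.
  return dot === -1 ? `${name}_${stamp}` : `${name.slice(0, dot)}_${stamp}${name.slice(dot)}`;
}
```

For example, `timestampedName('test.txt')` yields something like `test_20170202_181921.txt`.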
P.S. Show your code for downloading the files so that I can add more to this answer.
For one of my projects I have a certain challenge: I need to take all the reports generated under a certain path, and I want this to be an automated process on Linux. I know how to get the names of files that have been updated in the past 120 minutes, but not the files themselves. My requirements are as follows:
1. Take the files that have been updated in the past 120 minutes from the path
/source/folder/which/contains/files
2. Do some business logic on these generated files, which I can take care of
3. Move these files to
/destination/folder/where/files/should/go
I know how to achieve #2 and #3 but am not sure about #1. Can someone help me achieve this?
Thanks in advance.
Write a shell script; a sample is below. I haven't provided the commands to get the actual list of file names, as you said you know how to do that.
#!/bin/sh
files="<my file list>"
for file in $files; do
    cp "$file" <destination_directory>
done
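For step #1 itself, find(1) with -mmin can produce exactly the files modified in the last 120 minutes. A sketch (the helper name copy_recent is made up; the paths in the usage comment are the ones from the question):

```shell
# Sketch: copy files modified within the last N minutes from src to dst.
copy_recent() {
    src=$1; dst=$2; mins=$3
    # -mmin -N matches files whose data was modified less than N minutes ago.
    find "$src" -type f -mmin "-$mins" -exec cp {} "$dst/" \;
}

# e.g. copy_recent /source/folder/which/contains/files \
#                  /destination/folder/where/files/should/go 120
```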