I have a set of old RRD files that I need to convert to XML files and then back to RRD files on a new server. To create XML files from the whole directory, I used the following simple bash script.
#!/bin/bash
cd /varcacti/rra
for i in ./traffic_in*; do
    /usr/local/rrdtool/bin/rrdtool dump "$i" /home/newuser/rrd_dump_xml/"$i".xml
done
This produces a set of XML files, but the extension is not what I actually need. It produces
traffic_in_1111.rrd >>> traffic_in_1111.rrd.xml
but I need traffic_in_1111.rrd >>> traffic_in_1111.xml. Can someone help me modify my code?
You want
/home/newuser/rrd_dump_xml/"${i%.rrd}".xml
#..........................^^^^^^^^^^^
which removes the .rrd extension from the end of the string.
You might want to be more specific with the files you're looping over:
for i in ./traffic_in*.rrd
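Putting both changes together, a minimal version of the corrected script (using the paths from your question) would be:
#!/bin/bash
cd /varcacti/rra || exit 1
for i in ./traffic_in*.rrd; do
    # strip the .rrd suffix before appending .xml
    /usr/local/rrdtool/bin/rrdtool dump "$i" /home/newuser/rrd_dump_xml/"${i%.rrd}".xml
done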
I am writing a WinSCP script in VBA to synchronize certain files from remote to local.
The code I am using is
""synchronize -filemask=""""*.xlsx"""" local C:\Users\xx\Desktop /JrnlDetailSFTPDirect""
There are three xlsx files: 14.xlsx, 12.xlsx, and 13.xlsx. However, it seems like the script runs through all the files on the remote even though it does not synchronize them. Besides, one folder under /JrnlDetailSFTPDirect is also downloaded from the remote, which is not expected.
Is it possible to avoid looping through all the files and instead select just those three files and download them?
Thanks
There are separate masks for files and folders.
To exclude all folders, use the */ exclude mask:
synchronize -filemask="*.xlsx|*/" local C:\Users\xx\Desktop /JrnlDetailSFTPDirect
See How do I transfer (or synchronize) directory non-recursively?
I cannot tell anything regarding the other problem, as you didn't show us the names of the files. Ideally, append a session log file to your question. Use the /log switch like:
winscp.com /log=c:\writablepath\winscp.log /command ...
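For example, a complete invocation could look like this (the hostname and credentials are placeholders, not taken from your question; note the doubled inner quotes needed inside a quoted command):
winscp.com /log=c:\writablepath\winscp.log /command ^
    "open sftp://user:password@example.com/" ^
    "synchronize -filemask=""*.xlsx|*/"" local C:\Users\xx\Desktop /JrnlDetailSFTPDirect" ^
    "exit"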
I wonder if you can configure logstash in the following way:
Background Info:
Every day an XML file gets pushed to my server, which should be parsed.
To indicate that the transfer is complete, an empty .ctl (custom) file is afterwards transferred to the same folder.
Both files follow the naming scheme feedback_{year}{yearday}_UTC{hoursminutesseconds}_51.{extension} (e.g. feedback_16002_UTC235953_51.xml), so they share the same base name, but one is the .xml and the other the .ctl file.
Question:
Is there a way to configure logstash to wait to parse the XML file until the corresponding .ctl file is present?
EDIT:
Is there maybe a way to achieve that with filebeat?
EDIT2:
It would also be enough to configure logstash so that it waits x minutes before starting to process a new file, if that is easier.
Thanks for any help in advance
Your problem is that you don't want to start the parser before the file transfer has completed. So, why not push the data to a file (file-complete.xml) when you find your flag file (empty.ctl)?
Here is a possible logic for a script that runs via crontab:
if empty.ctl exists:
Clear file-complete.xml
Add the content of file.xml to file-complete.xml.
Remove empty.ctl
This way, you'd only need to parse the data from file-complete.xml. I think this is simpler to debug and configure.
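As a minimal sketch (the directory and the file-complete.xml name are assumptions; adjust them to your setup), the cron script could look like:
#!/bin/bash
# Runs periodically from crontab; processes each xml once its .ctl flag exists.
cd /path/to/feedback/folder || exit 1
for ctl in *.ctl; do
    [ -e "$ctl" ] || exit 0              # glob matched nothing: no flag file yet
    xml="${ctl%.ctl}.xml"                # the matching xml shares the base name
    if [ -f "$xml" ]; then
        cat "$xml" > file-complete.xml   # clear file-complete.xml and refill it
        rm "$ctl"                        # consume the flag file
    fi
done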
Hope it helps,
Hellooo.
So I want to make a script for my girlfriend that uses an external file to append words to a URL, then downloads each link and iterates.
The awkward thing is she doesn't want to tell me too much (I suspect the result of using the script will be for my benefit :P), so I'm not certain about the function; I'm kind of guessing.
The aim is for the script to contain a base URL and iterate through an external file that contains a list of words, appending each word to the link and then opening that link. Then iterate, append, open, and so on.
Can someone help me out a bit with this? I'm a bit new to scripting.
Should I set up an external file to hold the base URL and then refer to that as well?
I'm thinking something along the lines of:
url=$(grep * url.txt)
for i in $(cat file.txt);
do
>> $url
wget $url
done
What and how much do I need to change and add?
Thanks for any help.
I have a file named source which has the below content in it:
which-2.16.tar.gz
which-2.17.tar.gz
which-2.21.tar.gz
I wrote a script named downloader with the below content :
#!/bin/bash
url="http://ftp.gnu.org/gnu/which" #source url
while read line
do
wget "$url/$line" #download url = source url + file name from the file
done <source #feeding filenames from the source file.
Running downloader will download the files listed in the source file from the FTP site given in url. Voila!!
I guess you could employ a similar concept.
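Adapted to your description, a sketch could look like this (the names url.txt and file.txt are assumptions based on your question):
#!/bin/bash
url=$(head -n 1 url.txt)   # base URL stored on the first line of url.txt
while read -r word; do
    wget "$url$word"       # append each word to the base URL and download (add a / if needed)
done < file.txt            # one word per line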
I am trying Log4View for the first time, so my question is basic-level. Is it possible to point Log4View at a folder which has zipped log files in it? And if yes, how can I configure Log4View correctly? I tried, but can't find an example of how to do that.
c:\Folder1\zippedLogfiles001.zip
c:\Folder1\zippedLogfiles002.zip
c:\Folder1\zippedLogfiles003.zip
... (up to 300 log files)
I heard that Log4View can read automatically from a folder, so the log files wouldn't need to be unzipped manually.
Log4View doesn't work with zipped log files.
To open a log file with Log4View, it has to be a regular, uncompressed text or xml file.
Ulrich
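(A possible workaround, not a Log4View feature: decompress the archives first and open the extracted files, e.g. with a small shell loop; the Unix-style paths below are assumptions for a Git Bash/Cygwin environment:)
mkdir -p /c/Folder1/unzipped
for z in /c/Folder1/zippedLogfiles*.zip; do
    unzip -o "$z" -d /c/Folder1/unzipped   # extract each archive for Log4View to open
done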
For one of my projects, I have a challenge: I need to take all the reports generated in a certain path, and I want this to be an automated process on Linux. I know how to get the names of files that have been updated in the past 120 minutes, but not how to work with the files themselves. My requirements are:
1. Take the files that have been updated in the past 120 mins from the path
/source/folder/which/contains/files
2. Run some business logic on these generated files, which I can take care of
3. Move these files to
/destination/folder/where/files/should/go
I know how to achieve #2 and #3, but I'm not sure about #1. Can someone help me with this?
Thanks in Advance.
Write a shell script. Sample below. I haven't provided the commands to get the actual list of file names as you said you know how to do that.
#!/bin/sh
files=<my file list>
for file in $files; do
    cp "$file" <destination_directory>
done
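For step #1, one common approach is find with its -mmin test; here is a sketch using the placeholder paths from the question:
#!/bin/bash
src=/source/folder/which/contains/files
dst=/destination/folder/where/files/should/go

# -mmin -120 matches files modified within the last 120 minutes
find "$src" -maxdepth 1 -type f -mmin -120 | while read -r file; do
    # ... business logic on "$file" goes here (step #2) ...
    mv "$file" "$dst"   # step #3
done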