Web implementation of "tail -f filename"?

I have a log file and want to create a webpage (possibly in Python, but not strictly) that works much like the Unix "tail -f filename" command: it shows new log lines as they are written to the file.
So the user will continuously see the log right in their browser.
How would you implement this?

Tailon is a Python webapp that, among other things, provides tail -f-like functionality. In addition, wtee (a sister project of Tailon) can make everything on its stdin viewable in the browser; its use is identical to the Unix tee command: tail -f filename | wtee

I implemented this using jquery (.ajax) and php (json).
The flow is essentially as follows:
user calls an html page on their browser
the html page contains an initial jQuery .ajax call to a remote PHP script on the server that performs the required function, in this case retrieving a few of the last lines of the file being 'tailed'
if no new lines are available, the PHP script just loops (while the ajax caller waits, i.e. long polling), and can be configured to time out if necessary (returning an appropriate value back to the ajax calling function on the client)
when new lines are detected by the PHP script, they are wrapped in a JSON response and sent back to the ajax calling function in the browser, which then appends them to the existing content of the page.
The JavaScript function then recursively makes the same ajax call, effectively sitting in an infinite loop.
In my specific implementation, I did the following:
both the ajax call on the client AND the PHP script on the server have timeouts, to handle (for example) broken connections gracefully; this also ensures the ajax call does not wait forever.
the ajax call passes a line number back to the server as a reference, telling it the last line number the client received, so the server knows which lines to return. Initially the value is zero, and the server will immediately return the last 10 lines of the file.
when the PHP script is called, it uses the client's last line number to do a quick check on the file; if new lines have already been added it returns them immediately, and if not it sits in a loop (sleeping 1 second) and checks the file's ctime (or mtime) to detect when new lines are written. This is more efficient than counting the lines in the file (which could be huge) every second.
See my longpolling/realtime tail implementation using jquery and php here:
https://github.com/richardvk/web_file_tail
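For anyone who would rather prototype the server side in Python than PHP, here is a minimal sketch of the same long-polling flow using only the standard library; the log file path, port, and poll limit are assumptions, not part of the implementation linked above.

# Minimal long-polling "tail" endpoint (sketch; path, port, and limits are placeholders).
import json
import os
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

LOG_FILE = "/var/log/myapp.log"   # hypothetical log file
POLL_LIMIT = 30                   # give up after ~30 s so the client simply re-polls

class TailHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        qs = parse_qs(urlparse(self.path).query)
        last_line = int(qs.get("line", ["0"])[0])   # last line number the client has seen

        lines, new_last = self._read_new(last_line)
        deadline = time.time() + POLL_LIMIT
        last_mtime = os.path.getmtime(LOG_FILE)
        # Long-poll: wait until the file's mtime changes or the poll limit is reached.
        while not lines and time.time() < deadline:
            time.sleep(1)
            mtime = os.path.getmtime(LOG_FILE)
            if mtime != last_mtime:
                last_mtime = mtime
                lines, new_last = self._read_new(last_line)

        body = json.dumps({"line": new_last, "lines": lines}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def _read_new(self, last_line):
        with open(LOG_FILE) as f:
            all_lines = f.read().splitlines()
        if last_line == 0:
            return all_lines[-10:], len(all_lines)    # first call: return the last 10 lines
        return all_lines[last_line:], len(all_lines)  # everything after the last seen line

if __name__ == "__main__":
    HTTPServer(("", 8000), TailHandler).serve_forever()

The client-side JavaScript would pass the returned line value back on its next call, exactly as in the flow described above.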

Scullog has the capability of sharing a local drive to the browser. It streams the log file to the browser via Socket.IO. It runs on any platform, such as Windows/Linux/Mac, and it can run as a service or in standalone mode.

You read the file and print the last lines to the page. You might also use a GET variable to define the number of rows to output, using ?n=x where x is the number of lines.
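A minimal sketch of that idea as a Python CGI script (the log path and the default of 10 lines are assumptions):

# Print the last n lines of a log file, where n comes from ?n=x (sketch).
import os
from collections import deque

LOG_FILE = "/var/log/myapp.log"                      # hypothetical path
query = os.environ.get("QUERY_STRING", "")           # e.g. "n=25"
params = dict(p.split("=", 1) for p in query.split("&") if "=" in p)
n = int(params.get("n", 10))                         # default to 10 lines

with open(LOG_FILE) as f:
    last_lines = deque(f, maxlen=n)                  # keeps only the last n lines in memory

print("Content-Type: text/plain\n")
for line in last_lines:
    print(line, end="")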

Related

Generate hash of newly downloaded file

I'd like my bash script to perform an action every time a new file is downloaded to /Downloads (generate a hash of the downloaded file and send it to an API). So far I've been trying to make use of "inotify-tools", but it only works for newly created files and that won't do.
Script should work like this:
I download a file via browser (normal way)
Script notices new file and is executed automatically
Thanks in advance for help :D
You can use /etc/crontab to check the ~/Downloads folder at startup and every n minutes. The script that runs every nth minute can do either of the following:
Keep a count of the files. If the count decreases, the script updates its cache. If the count increases, it gets the most recently created (or modified) file and sends that file's hash to the API via curl.
Keep the names of the files. If a file no longer exists, the script updates the cache of file names. If a new file appears, it hashes it and sends the hash to the API via curl.
You can keep the cache of files under /tmp.
If you can provide an example scenario I can write a simple script.
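For illustration, here is a rough sketch of the second approach (a cached set of file names) in Python; the cache path and API endpoint are placeholders, and a cron entry such as */5 * * * * would run it every five minutes:

# Hash any new files in ~/Downloads and POST the hash to an API (sketch).
import hashlib
import json
import os
import urllib.request

DOWNLOADS = os.path.expanduser("~/Downloads")
CACHE = "/tmp/downloads_cache.json"            # cache of known file names
API_URL = "https://example.com/api/hashes"     # placeholder endpoint

known = set()
if os.path.exists(CACHE):
    with open(CACHE) as f:
        known = set(json.load(f))

current = set(os.listdir(DOWNLOADS))

for name in current - known:                   # files that appeared since the last run
    path = os.path.join(DOWNLOADS, name)
    if not os.path.isfile(path):
        continue
    with open(path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()
    req = urllib.request.Request(
        API_URL,
        data=json.dumps({"file": name, "sha256": digest}).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)                # POST the hash to the API

# Rewrite the cache so removed files are forgotten and new ones remembered.
with open(CACHE, "w") as f:
    json.dump(sorted(current), f)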

Insert data into database using sqlldr in ColdFusion

I have a CSV file that I got from a website. I need to upload that same CSV file into my database using SQLLDR in ColdFusion. For some reason I'm not able to insert the data into database.
Below is my code. Using this code I was not able to insert the data into the database from the CSV file. It works fine from a batch file, but it is not working using cfexecute. By that I mean, I'm getting a blank screen: no errors, no exceptions, nothing. I checked the logs but did not find any errors there either. The only thing I can see is that the data is not inserted into the database.
FYI, we are using Linux environment, so the path is slightly different.
<cfset CTLPATH="/home/mosuser/apps/nodal/ctl">
<cfset LOGPATH="/home/mosuser/apps/nodal/logs">
<cfexecute name="/opt/oracle/product/12.1.0/client_1/bin/sqlldr"
arguments="userid/password#Sid control=#CTLPATH#/mpimReport.ctl
log=#LOGPATH#/#PathfileName#_load.log data=#filelist##PathfileName#.csv
bad=#LOGPATH#/#PathfileName#_error.txt">
</cfexecute>
Update:
As suggested, dumping the error variable qryerr showed:
Message 2100 not found; No message file for product=RDBMS,
facility=ULMessage 2100 not found; No message file for product=RDBMS,
facility=UL
Add a few parameters to your <cfexecute> call.
timeout - on the order of the number of seconds you expect the process to take
variable - the name of the variable to hold the STDOUT output of sqlldr
errorVariable - name of the variable to hold the STDERR output of sqlldr
After doing that you can get the output of the call and inspect it for any error messages and other info.
Adding the timeout is the crucial step - this makes cfexecute block until either the program terminates or the timeout is reached. Without a timeout, ColdFusion simply kicks off the process and immediately continues executing the rest of the current page.
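Those three attributes essentially make ColdFusion run the command, wait for it, and hand you both output streams. If you want to verify the sqlldr invocation itself outside ColdFusion, a small wrapper along these lines does the same thing (Python sketch; every path and the connect string below are placeholders, not your actual values):

# Run sqlldr, wait for it, and capture stdout/stderr (sketch; all values are placeholders).
import subprocess

cmd = [
    "/opt/oracle/product/12.1.0/client_1/bin/sqlldr",
    "user/password@SID",
    "control=/home/mosuser/apps/nodal/ctl/mpimReport.ctl",
    "log=/home/mosuser/apps/nodal/logs/report_load.log",
    "data=/path/to/report.csv",
    "bad=/home/mosuser/apps/nodal/logs/report_error.txt",
]

result = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
print("exit code:", result.returncode)   # non-zero usually means a load problem
print("stdout:", result.stdout)          # roughly what 'variable' would hold in cfexecute
print("stderr:", result.stderr)          # roughly what 'errorVariable' would hold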

How to call a bash script automatically when directory contents change

My goal is to run a bash script automatically whenever any new file is added to a particular directory or any subdirectory of that particular directory.
Detail Scenario:
I am creating an automated process for file submission from teachers to students and vice versa. The sender will upload a file, and it will be stored inside the Uploads directory on the LAMP server in a format such as "name_course-name_filename.pdf". I want some method so that whenever a file is stored inside the Uploads folder, a script is called at the same time and sends that file to the list of receivers.
From the database I can find the list of receivers for that particular course and student.
My only concern is how to call a script automatically and make it work on each individual file whenever the contents of the directory change. Cron would do it at intervals, but that is not real-time.
Linux provides a nice mechanism for this purpose called inotify. inotify is mostly available as a C API, but shell utilities have been developed as well. You should use inotifywait from inotify-tools (the package name in Debian) for this. Here is a basic example:
#!/bin/bash
directory="/tmp" # or whatever you are interested in
inotifywait -m -e create "$directory" |
while read folder eventlist eventfile
do
echo "the following events happened in folder $folder:"
echo "$eventlist $eventfile"
done
Update:
If the problem gets more complicated, for example if you have to monitor recursive, dynamic directory structures, you should have a look at incron. It's a cron-like daemon which executes scripts on certain events, but the events are file system events rather than timer events.
There is another option to 'inotifywait':
-d --daemon
Same as --monitor, except run in the background logging events to a file
that must be specified by --outfile. Implies --syslog.
For completeness:
-m --monitor
Instead of exiting after receiving a single event, execute indefinitely.
The default behaviour is to exit after the first event occurs.
Within the do-done block of your 'while' statement, you might parse each event report for interesting details then use 'case-esac' to take action based on each event that you care about.
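If you would rather do that dispatching outside the shell, the same idea can be sketched in Python by reading inotifywait's output line by line (this assumes inotify-tools is installed; the watched directory and the actions taken are placeholders):

# Dispatch on inotifywait events, one branch per event type (sketch).
import subprocess

directory = "/tmp"  # placeholder; the directory you want to watch

proc = subprocess.Popen(
    ["inotifywait", "-m", "-e", "create", "-e", "delete", "-e", "modify", directory],
    stdout=subprocess.PIPE,
    text=True,
)

for line in proc.stdout:
    # inotifywait -m prints: "<watched-path> <EVENT[,EVENT]> <filename>"
    parts = line.split(None, 2)
    if len(parts) < 3:
        continue
    folder, events, filename = parts
    filename = filename.strip()
    for event in events.split(","):
        if event == "CREATE":
            print(f"new file {filename} in {folder}")    # e.g. hash it, mail it, ...
        elif event == "DELETE":
            print(f"{filename} removed from {folder}")
        elif event == "MODIFY":
            print(f"{filename} changed in {folder}")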
For something that you plan to rely on for your operations, you might also consider replacing the hard-coded '$directory' with some sort of configuration file. Such a file might include the path and filename, the interesting events for that path and file, and a script to run when those events happened.
The script might take the list of events as parameters and then 'case-esac' again.
Just one man's ramblins,
~~~ 8d;-Dan

File reading error

I have two methods: one for image comparison and one for file reading. What I do is call a process that compares images between two folders and creates a logfile. The second method then reads that logfile and parses the data.
But when I call the second method, it says the file does not exist; this is because the exe takes a little time to create the logfile.
I have used Thread.Sleep() but it still doesn't work, and I can't use the File.Exists check either, because if the file does not exist yet it will skip that method/file, which I don't want.
You can use Process.WaitForExit to wait until your first process has finished; then you know your log file will be complete and will exist when calling your second method.

GPSD simple queries

I need some information from my GPSD server running on my NTP master server.
The number of satellites it is seeing
Which satellites it is using for the position fix (maybe also the SNR)
Which satellites it is seeing, since there are a lot of them (is this possible?)
I am going to output this to PHP, so it must be simple.
The GPSD source contains the file gpsd.php, which can deliver the current position and satellite info ("skyview") either as a finished HTML page or as a JSON string. So you need to make sure a web server with PHP support runs on your master server, and then you can call http://ntp-server/path/to/gpsd.php to get it. Append ?op=json to the URL to return the JSON result.
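If you want to consume that JSON from a script rather than a browser, a small Python sketch can fetch and summarize it; the URL below is a placeholder, and the field names assume gpsd's usual SKY report keys ('satellites', 'used', 'ss', 'PRN'), so adjust them to whatever gpsd.php actually returns on your install:

# Fetch gpsd.php's JSON output and summarize the satellite view (sketch).
import json
import urllib.request

URL = "http://ntp-server/path/to/gpsd.php?op=json"   # placeholder host/path

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

# gpsd's SKY report lists satellites with 'used' (bool) and 'ss' (signal strength).
sats = data.get("satellites", [])
used = [s for s in sats if s.get("used")]

print(f"satellites seen: {len(sats)}")
print(f"satellites used in fix: {len(used)}")
for s in used:
    print(f"  PRN {s.get('PRN')}: SNR {s.get('ss')} dB")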
You can get just the php file here: https://github.com/yazug/gpsd/raw/master/gpsd.php
Beat Bolli: I think you meant this one: https://github.com/yazug/gpsd/raw/master/gpsd.php.in (they have renamed it)
It suggests using the ?poll; function, but it hangs for me when I try to read the response...
