Is there a way to view the console of CruiseControl.NET remotely?

Is there a way to remotely view the terminal output of CruiseControl.NET? At present, I am running the following command from a Git Bash instance against the terminal log file, which sits on a Windows share:
tail -f <filename>
This somewhat works (and is really nice in conjunction with "grep -v" to filter unwanted lines out of the output), but the terminal picks up new output slowly and sometimes misses new lines written to the console output file. Is there a plugin or built-in way to hook into CCNet and remotely view the console output without having to monitor a file over a Windows share?

I believe you could take a look at log4net appenders, since CCNet uses this logging library to write its output to file, and it's configurable through the config file (for the service or the console version, respectively).
There are many different appenders in log4net:
https://logging.apache.org/log4net/release/sdk/log4net.Appender.html
Hopefully one of them will be better suited to your needs (I can't recommend a specific one; I haven't used log4net much).
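For example (just a sketch, not a tested setup): if the log4net section of the CCNet config were extended with a UdpAppender, you could watch the log stream from another machine instead of tailing a file over the share. A minimal Node.js listener might look like this (port 8787 is an arbitrary choice that would have to match the appender's RemotePort):
const dgram = require('dgram');
// Listen for UDP packets sent by a (hypothetical) log4net UdpAppender and echo them to the terminal.
const socket = dgram.createSocket('udp4');
socket.on('message', (msg, rinfo) => {
  process.stdout.write(`[${rinfo.address}] ${msg.toString()}`);
});
socket.bind(8787, () => console.log('Listening for log packets on UDP 8787'));
You could still pipe that output through "grep -v" locally to filter out unwanted lines.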

Where to save CLI application log file that will work cross-platform?

I have an application that saves error logs into the same directory the script was run from. This is bad because it scatters error.log files in random places. So I want to save the file into /var/log on GNU/Linux, and I would like the same for Windows and macOS (and probably any other OS where Node.js runs).
Is there a cross-platform way to get a log directory, so I can write logs there?
Also, how do I work with that directory if it's owned by root? Or is there a better place to save Node.js application logs?
The only cross-platform directory Node.js gives me is the temp directory, using:
const log_filename = path.join(os.tmpdir(), 'lips.error.log');
What is the best place to save logs where users can look them up? The script's directory and the /tmp directory are not good options in my opinion. What other options are there for saving log files?
What is the usual place to save log files, cross-platform, for a CLI application?
That's part of the fun -- there is no single directory to store logs, so you'll need to handle each supported OS separately.
Linux has options, but it would be good to go the XDG specification way, as it's a reasonable standard that keeps your output out of the system logs and makes it simple to isolate the application's logs.
macOS can use the ~/Library/Logs folder.
Windows can use the AppData folder, usually AppData/Local.
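Putting those conventions together, a minimal Node.js sketch might look like the following (the exact fallback paths are my assumptions based on the answer above, not something any OS mandates; getLogDir is a hypothetical helper, and 'lips' is the app name from the question):
const os = require('os');
const path = require('path');
const fs = require('fs');

// Hypothetical helper: choose a per-user log directory by platform.
function getLogDir(appName) {
  if (process.platform === 'win32') {
    // Windows: %LOCALAPPDATA%\<appName>\Logs
    const base = process.env.LOCALAPPDATA || path.join(os.homedir(), 'AppData', 'Local');
    return path.join(base, appName, 'Logs');
  }
  if (process.platform === 'darwin') {
    // macOS: ~/Library/Logs/<appName>
    return path.join(os.homedir(), 'Library', 'Logs', appName);
  }
  // Linux and others: XDG state directory (~/.local/state/<appName>) for user-level logs
  const base = process.env.XDG_STATE_HOME || path.join(os.homedir(), '.local', 'state');
  return path.join(base, appName);
}

const logDir = getLogDir('lips');
fs.mkdirSync(logDir, { recursive: true }); // create the directory if it doesn't exist yet
const log_filename = path.join(logDir, 'error.log');
Writing to a per-user directory like this also sidesteps the root-ownership problem of /var/log.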

Node.js logging has an issue when log files are rolled by the OS

I have a Node.js application running on a Linux system that uses the log4js logging library. Log files are rolled daily by the Linux log rotation system. The question is: does anybody know how to force log4js to recreate and use the original log file when it gets renamed like this?
Note that log4js continues writing into the renamed file (it seems the stream's file descriptor is kept open).
Please note also that I don't want to use log4js DaylyRollingFileAppender because my log folder contains log files generated by various languages (Java, Python, JavaScript, Bash...).
I assume that you use logrotate.
You can try using the 'reload' directive in the logrotate config file (see for details: http://www.linuxcommand.org/man_pages/logrotate8.html).
It seems that log4js can handle SIGHUP. (Issue: https://github.com/nomiddlename/log4js-node/issues/343, Pull request: https://github.com/nomiddlename/log4js-node/pull/403)
A secondary solution is to configure it to use the 'copytruncate' directive, but that has trade-offs:
https://serverfault.com/questions/688658/rsyslog-with-logrotate-reload-rsyslog-vs-copytruncate
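If the log4js version in use doesn't reopen its files on SIGHUP by itself, a rough workaround (a sketch only; 'log4js.json' is a hypothetical config file, and the signal would come from a logrotate postrotate script running "kill -HUP" on the Node process) is to reconfigure log4js when the signal arrives:
const log4js = require('log4js');
// Hypothetical JSON config file containing a plain file appender.
log4js.configure('log4js.json');

// Re-open the log file after logrotate has renamed it and sent us SIGHUP.
process.on('SIGHUP', () => {
  log4js.shutdown(() => log4js.configure('log4js.json'));
});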

Listing which files Terminal opens at load time

When I open Terminal I would like to know which files are being loaded (e.g. .bashrc, .bash_profile, etc.). Although I'm doing this in OS X, I guess it's also relevant to logging in via SSH to a Linux server.
Is there any way I can log this or otherwise check which files are loaded?
You could use the fs_usage tool, which "presents an ongoing display of system call usage information pertaining to filesystem activity".
Alternatively, there is the opensnoop tool which, as its description states, "tracks file opens. As a process issues a file open, details such as UID, PID and pathname are printed out".

Run a text editor such as Vim or Nano from Groovy

We are using Groovy to build our Java applications. The goal of the Groovy script is to check out from SVN, run mvn, download release notes from Jira, allow the user to edit the release notes, then upload everything to S3 for public consumption.
My question concerns the editing of the release notes. I would like Groovy to run vim on a txt file, let the user edit the file and quit vim, and on vim exit I would like the script to continue along. This needs to run on a headless Linux server. I have it working locally on my Mac using the open command, but we are moving our builds to a central AWS Linux server.
Something like this is what I'm looking for:
println "Downloading release notes..."
"vi RELEASE-NOTES.txt".execute().waitFor()
println "Uploading the edited release notes here"
FWIW the solution I am using on Mac OS is:
"open -eW RELEASE-NOTES.txt".execute().waitFor();
It opens TextEdit and allows me to edit and save the file; on quitting TextEdit the script moves on and uploads my edited file.
I think the VI solution fails because you no longer have a console (but you weren't very specific about this).
I'm embarrassed to say I don't know how to fix this on a Unix machine; on DOS I would use "command /c vi" to run it in a new command shell, but even then it might be a trick to give that shell a console for you to type into (on Windows it would open up a new window).
Something like what you said MIGHT work if you piped the user input/output to the app's stdin/stdout, but I bet some Linux guru knows a better way.
The problem is that Groovy owns the console allocated to user I/O. If you could put Groovy in the background (the equivalent of Ctrl-Z) or switch to another console (perhaps using screen or Linux's multiple virtual consoles) you might be able to pull it off, but I probably wouldn't try it myself.
Redirecting I/O might be a better bet, or just go find a non-GUI text editor written in Java and integrate it (that might be your best bet).
It doesn't seem possible, as per this answer, because Java's way of handling processes is just piping streams of bytes, which has nothing to do with attaching a tty. A possible solution is to open a new window with xterm. Since you are on a headless Linux box, I'm not sure this will work. Are you connecting via SSH? What about connecting to the server using ssh -X and then running the command? (Or does "headless" imply that -X won't work? :-) )
println "Downloading release notes..."
['xterm', '-e', 'vi RELEASE_NOTES.txt'].execute().waitFor()
println "Uploading the edited release notes here"
And... have you thought about writing such a script in shell? Seems a bit more suited, IMO.
I think @Bill K's suggestion would be great, but after some quick googling I couldn't find any Java CLI editor.

CentOS/auditd: file creation at directory to trigger a script

I need to audit a directory and call a script with the file path as a parameter whenever a file is created there. Reading the auditctl man page, I can't find a way to do it.
There are references on the web to inotify or iwatch, which should do what I need, but I'd rather use the standard auditd functionality than install extra software.
If it really isn't possible to use auditd to track file creation and call the script for that file, a short sample iwatch/inotify command to do the trick will be appreciated and accepted.
For the CentOS environment, the pyinotify module was used; it handles directory watches pretty well and triggers the desired scripts.
Unfortunately, I wasn't able to find a solution using pure auditd.
A list of examples of how to use pyinotify is here.
