I'm trying to use Seq, a log management tool that is primarily supported in the .NET ecosystem.
There are also tools like seqcli for sending logs to a Seq server, as shown here:
https://docs.datalust.co/docs/
The thing is, I'm using a Spring Boot app, and following the docs I have GELF and Seq deployed as Docker containers on a remote server. Everything runs on Linux.
I managed to send some logs from a file using this command:
./seqcli ingest ../spring-boot-*.log
and I can see them on the remote server, but I'm not able to send logs in real time. The docs say that I can send logs in real time from STDIN, but give no more details about it, and I have no idea how to achieve this.
https://docs.datalust.co/docs/command-line-client#extraction-patterns
Any suggestions?
I dug a little deeper into the docs and found this:
tail -c 0 -F /var/log/syslog | seqcli ingest
which I converted to this:
tail -c 0 -F ../spring-boot-app.log | ./seqcli ingest
Here, -c 0 makes tail start at the current end of the file (so only new lines are forwarded) and -F keeps following the file even if it is rotated or recreated.
If someone runs into the same problem, there is more detail here:
https://blog.datalust.co/parsing-plain-text-logs-with-seqcli/
Related
I'm familiar with Node.js and know just a little about web programming. One of my Node CLI apps needs to let the user see the running program's logs. I'd like her to be able to open a separate browser window, point it at something like http://localhost:12345, and watch the live log keep scrolling on the page without any further interaction.
Is there a simple way to build this kind of application? I know how to program RESTful services, but I'm not sure whether that helps.
If I understood your question correctly, you are trying to show live server-side logs to the user. To do that, you will have to tail the log file and pipe its output to the response, or pipe stdout directly (if you're not writing logs to a file) to the response over a socket.io connection. socket.io is a way of pushing live updates to users without the browser having to poll with HTTP requests every time. You can see an example here.
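For example, something along these lines could serve as a starting point (a rough sketch only, assuming express and socket.io are installed, and using ./app.log as a placeholder path for wherever the CLI app writes its log):

// server.js - tail a log file and push new lines to the browser (sketch)
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');
const { spawn } = require('child_process');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// Serve a tiny page that appends incoming log lines and auto-scrolls
app.get('/', (req, res) => {
  res.send('<script src="/socket.io/socket.io.js"></script>' +
    '<pre id="log"></pre>' +
    '<script>const s = io(); s.on("log", l => {' +
    '  document.getElementById("log").textContent += l;' +
    '  window.scrollTo(0, document.body.scrollHeight);' +
    '});</script>');
});

// Tail the log file and broadcast every new chunk to all connected browsers
const tail = spawn('tail', ['-F', './app.log']);
tail.stdout.on('data', chunk => io.emit('log', chunk.toString()));

server.listen(12345, () => console.log('Open http://localhost:12345'));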
TL;DR: how do I attach a Docker container's bash to a Node.js stream?
I need to implement a relay between a Docker container's bash and the end user. The app is a remote compile/run service for C/C++, Python, and JS; some references are repl.it and cpp.sh. To accomplish that, my plan is to:
Instantiate an Ubuntu Docker container with the requirements for compiling and running the user's code.
Run some bash commands to compile/run the user's code.
And finally, relay the output from the bash console back to the user.
I've found some repos with interesting code: compilebox, dockerode and docker-api.
The 1st does the task using containers and some promise/async black magic to compile, pipe the output to a file, and send it to the user over HTTP (GET/POST). My problem with this one is that I need to establish a shell-like environment for my user. My goal is to bring a bash window into the browser.
The 2nd and 3rd implement APIs based on the official HTTP Docker Engine API (I looked at v1.24 because it has an overview for laymen like me). Both have examples of some sort of IO stream between two entities, like the duplex stream, but because of an implementation mistake the IO doesn't work properly (Issue#455).
So my problem is: how do I attach a Docker container's bash to a Node.js stream, so that everything the user types into the app in the browser is sent via HTTP to the bash in the container, and the output is sent back the same way?
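To make the goal concrete, here is roughly the shape I imagine, based on dockerode's attach API (a sketch of the idea only, wired to the local terminal instead of a browser; this duplex piping is exactly the part I can't get to behave):

// Sketch: attach an interactive bash in an Ubuntu container to Node streams
const Docker = require('dockerode');
const docker = new Docker({ socketPath: '/var/run/docker.sock' });

docker.createContainer({
  Image: 'ubuntu',
  Cmd: ['/bin/bash'],
  Tty: true,
  OpenStdin: true
}, (err, container) => {
  if (err) throw err;
  container.attach({ stream: true, stdin: true, stdout: true, stderr: true }, (err, stream) => {
    if (err) throw err;
    // With Tty: true the attach stream is a single duplex stream:
    // stdin goes in, bash output comes back out.
    process.stdin.pipe(stream);   // in the real app this would be data arriving from the browser
    stream.pipe(process.stdout);  // ...and this would be relayed back to the browser
    container.start();
  });
});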
A little bit of background
Below you will find a diagram of the relationships between the different components of a Node app I'm currently working on. Here is the link on GitHub. It is an application that I use to archive videos of strong journalistic importance; between the moment I watch them and the moment I get the time to use them for my reports, they are usually removed from YouTube. By archiving them, this information is no longer lost.
What I'm trying to achieve in plain English
download_one_with_pytube.py is basically a piece of code that downloads a video given an ID, and it reports its download progress by printing the percentage to the console.
What I'm trying to achieve in terms of output piping
Here is a pseudo-shell set of piped commands:
Array of video IDs | for each ID | python download(video ID) | print progress | response.send(progress)
The difficulty I have is actually spawning the Python code while passing it the video ID dynamically, and then piping the progress all the way back to the server's response.
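In Node terms, what I picture is something like this (a sketch of the intent only, not code from my repo; streamDownloadProgress is a made-up name and res stands for the Express response):

const { spawn } = require('child_process');

function streamDownloadProgress(videoId, res) {
  // Spawn the Python script, passing the video ID as an argument
  const py = spawn('python', ['download_one_with_pytube.py', videoId]);

  // Forward every progress line the script prints to the HTTP response
  py.stdout.on('data', chunk => res.write(chunk));
  py.stderr.on('data', chunk => console.error(chunk.toString()));

  // Close the response once the script finishes
  py.on('close', code => res.end('done (exit code ' + code + ')\n'));
}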
Resources I've consulted & Stuff I tried
I tried, the whole day yesterday, without success, to implement my own classes inheriting from EventEmitter, and even my own duplex stream class, to pipe that output all the way back to my Express web server so that progress can be served to the browser (a sketch of roughly what I was attempting follows the resources list below).
Advanced Node.js | Working with Streams : Implementing Readable and Writable Streams
Util | Node.js v9.3.0 Documentation
How to create duplex streams with Node.js - a Nodejs programming tutorial for web developers | CodeWinds
class Source extends Readable
Pipe a stream to a child process · Issue #4374 · nodejs/node-v0.x-archive
Developers - Pipe a stream to a child process
Deferred process.exit() to allow STDOUT pipe to flush by denebolar · Pull Request #1408 · jsdoc3/jsdoc
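For reference, this is roughly the shape of the EventEmitter approach I was attempting (a sketch of the idea only, not my actual code; DownloadProgress is a made-up name):

const EventEmitter = require('events');
const { spawn } = require('child_process');

// Emits a 'progress' event for every chunk the Python script prints
class DownloadProgress extends EventEmitter {
  start(videoId) {
    const py = spawn('python', ['download_one_with_pytube.py', videoId]);
    py.stdout.on('data', chunk => this.emit('progress', chunk.toString()));
    py.on('close', code => this.emit('done', code));
    return this;
  }
}

// The Express handler would subscribe and write events into the response:
// app.get('/startdownload', (req, res) => {
//   new DownloadProgress()
//     .start(req.query.id)
//     .on('progress', line => res.write(line))
//     .on('done', () => res.end());
// });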
The problem
I think the problem is that I get confused about the direction the pipes should take.
What I've managed so far
All I've managed to do is 'pipe' the output of the Python script back to downloadVideos.js.
How the app is structured
Through Express (server.js in the diagram), I exposed my Node app (running under a forever daemon) so that devices on the same LAN as the server can access [server IP address]:3333/startdownload and trigger the app's execution.
Looking at concrete lines of code in my repo
How can I pipe the output of this console.log here all the way back to the server at this line of code here?
A simple working example using Node's included http
I've got a GIST here of an http server running that illustrates what I'm trying to achieve. However, because my app's architecture is more real-world than this simple example, there are several files and require statements between the output I'm trying to pipe and the res.send statement.
Conclusion
I really appreciate any help anyone can provide me on this.
We could code together live using Cloud9 shared workspaces, which would make this process easier.
Here is the link to the application, but I suppose I would have to send an invite for it to be accessible.
I'd like to have a process that captures both access and error logs, without the logs being written to disk. I'd like to use Node.js's process.stdin to read the logs. Any idea whether nginx can be set up to stream the logs to another process instead of to disk?
No, that's not possible, and there's a Trac ticket about it here: https://trac.nginx.org/nginx/ticket/73
However, as noted in the comments on that ticket, you can easily pipe the logs from the file using tail -F /path/to/access/log | your-node-script.js. Please note that this will still write to disk and then read, so consider the IOPS usage.
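On the receiving end of that pipe, your-node-script.js could be as simple as this (a minimal sketch that just reads log lines from stdin):

// your-node-script.js - consumes log lines piped in by `tail -F ... |`
const readline = require('readline');

const rl = readline.createInterface({ input: process.stdin });

rl.on('line', line => {
  // Process each access/error log line here (parse, forward, aggregate, ...)
  console.log('got log line:', line);
});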
Another option is to send nginx's logs to a Node application that acts as a syslog server. Doing that on the nginx side is quite trivial (see: http://nginx.org/en/docs/syslog.html). You will then need to create a simple Node.js server that listens on UDP port 514 and processes the log. See an example in the highlighted lines here: https://github.com/cconstantine/syslog-node/blob/e243e2ae7ddc8ef9214ba3450a8808742e53d37b/server.js#L178-L200
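A bare-bones listener can be sketched with Node's built-in dgram module (sketch only; binding to port 514 normally requires elevated privileges, and parsing the syslog payload is up to you):

// syslog-listener.js - minimal UDP syslog receiver (sketch)
const dgram = require('dgram');

const server = dgram.createSocket('udp4');

server.on('message', (msg, rinfo) => {
  // msg is the raw syslog packet sent by nginx
  console.log('from ' + rinfo.address + ': ' + msg.toString());
});

server.on('listening', () => {
  console.log('syslog listener on UDP 514');
});

// Binding to 514 typically needs root (or an equivalent capability)
server.bind(514);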
I am new to Logstash and have been reading about it for a couple of days. Like most people, I am trying to set up a centralized logging system, store the data in Elasticsearch, and later use Kibana to visualize it. My application is deployed on many servers, so I need to fetch logs from all of them. Installing logstash-forwarder on all those machines and configuring them seems like a very tedious task (I will do it if this is the only way). Is there a way for Logstash to access those logs by specifying a URL to the logs somewhere in the conf file, instead of logstash-forwarders shipping them to Logstash? FYI, my application is deployed on Tomcat and the logs are accessible via the URL http://:8080/application_name/error.log.
Not directly, but there are a few close workarounds. The idea is to create a program/script that uses curl (or its equivalent) to effectively perform a "tail -f" of the remote log file, and then run that output into Logstash.
Here's a bash script that does the trick:
url-tail.sh
This bash script monitors a URL for changes and prints its tail to standard output. It acts like the Linux "tail -f" command. It can be helpful for tailing logs that are accessible over HTTP.
https://github.com/maksim07/url-tail
Another similar one is here:
https://gist.github.com/habibutsu/5420781
There are others out there, written in PHP or Java: Tail a text file on a web server via HTTP
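If you'd rather keep it in one runtime, the same polling idea can be sketched in Node.js as well (a rough sketch; it assumes the server hosting error.log honours HTTP Range requests, which not every setup does, and the host in LOG_URL is a placeholder):

// http-tail.js - poll a remote log URL and print only the new bytes (sketch)
const http = require('http');

const LOG_URL = 'http://your-tomcat-host:8080/application_name/error.log'; // placeholder host
let offset = 0;

function poll() {
  const req = http.get(LOG_URL, { headers: { Range: 'bytes=' + offset + '-' } }, res => {
    if (res.statusCode === 206) {
      // 206 Partial Content: only the bytes we have not seen yet
      res.on('data', chunk => {
        offset += chunk.length;
        process.stdout.write(chunk); // this is what you would feed into Logstash
      });
    } else {
      // 416 (nothing new) or a server that ignores Range: just drain the body
      res.resume();
    }
  });
  req.on('error', err => console.error(err.message));
}

setInterval(poll, 5000); // re-check every 5 seconds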
Once you have that running, the question is how to get its output into Logstash. You could:
Pipe it into stdin and use the stdin input
Have it append to a file and use the file input
Use the pipe input to run the command itself and read from the command's stdout
The devil is in the details, though, particularly with Logstash, so you'll need to experiment, but this approach should work for you.