Sending Python App logs directly to SOLR - python-3.x

I am currently using Python logging with a file handler to write the logs. However, I was wondering if there is a way to send the logs straight to an external system such as SOLR? I see there is a way of sending them to logstash, but I would prefer to send them straight to SOLR. Is there any way to do that?

Related

How to use Papertrail (send logs) in NestJS?

Is there any way to use Papertrail in a NestJS application? I added loggers like this:
https://docs.nestjs.com/techniques/logger
Now I want to send these logs to Papertrail. Is there any way to do that?
I found an example for Node.js, but I couldn't find any example for NestJS.
You could send them to a SQL or NoSQL database; ultimately, you just want to store a log record somewhere.
Or you could keep them on your local server by creating your own log file.
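If you go with the "own log file" option, a rough sketch of a custom NestJS logger could look like the following (FileLogger and the file path are made-up names for illustration; it relies only on the LoggerService interface from @nestjs/common). Forwarding that file to Papertrail afterwards, with whatever agent or transport you prefer, is a separate step not shown here.

// file-logger.ts - a minimal file-based logger (sketch, names are placeholders)
import { LoggerService } from '@nestjs/common';
import { appendFileSync } from 'fs';

export class FileLogger implements LoggerService {
  constructor(private readonly path: string = 'app.log') {}

  private write(level: string, message: any, context?: string) {
    const prefix = context ? ` (${context})` : '';
    appendFileSync(this.path, `${new Date().toISOString()} [${level}]${prefix} ${message}\n`);
  }

  log(message: any, context?: string) { this.write('LOG', message, context); }
  warn(message: any, context?: string) { this.write('WARN', message, context); }
  debug(message: any, context?: string) { this.write('DEBUG', message, context); }
  verbose(message: any, context?: string) { this.write('VERBOSE', message, context); }
  error(message: any, trace?: string, context?: string) {
    this.write('ERROR', message, context);
    if (trace) appendFileSync(this.path, trace + '\n');
  }
}

// main.ts - tell Nest to use it for everything the application logs
// const app = await NestFactory.create(AppModule, { logger: new FileLogger() });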

How to write a simple Node.js app that takes dump output to the browser?

I am familiar with Node.js and know just a little about web programming. One of my CLI Node apps needs to let the user see the running program's logs. I hope she can open a separate browser window, point it to something like http://localhost:12345, and get a live log that keeps scrolling in the page without any human interaction.
Is there a simple way to build this kind of application? I know RESTful programming, but I'm not sure if it helps.
If I understood your question correctly, you are trying to show live server-side logs to the user. For that you will have to tail the log file and pipe the output to the response, or pipe stdout (if you're not writing logs to a file) to the response over a socket.io connection. socket.io is a way of providing live updates to users without them having to send an HTTP request every time. You can see an example here.
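To make the "tail the file and push it over socket.io" idea concrete, here is a rough, self-contained sketch (the file name, log path and port are placeholders, and it assumes the socket.io package from npm). It serves a single page and emits only the newly appended bytes of the log file to every connected browser:

// live-log.ts - placeholder name; serve a page on :12345 and stream new log lines to it
import { createServer } from 'http';
import { statSync, watch, createReadStream } from 'fs';
import { Server } from 'socket.io';

const LOG_FILE = 'app.log'; // placeholder path

const page = `<html><body><pre id="log"></pre>
<script src="/socket.io/socket.io.js"></script>
<script>
  const socket = io();
  socket.on('log', chunk => {
    document.getElementById('log').textContent += chunk;
    window.scrollTo(0, document.body.scrollHeight); // keep the newest lines in view
  });
</script></body></html>`;

const httpServer = createServer((_req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end(page);
});
const io = new Server(httpServer); // also serves /socket.io/socket.io.js for the page above

let offset = statSync(LOG_FILE).size; // start at the end of the file, like `tail -f`

watch(LOG_FILE, () => {
  const size = statSync(LOG_FILE).size;
  if (size <= offset) { offset = size; return; } // file truncated or nothing new yet
  const stream = createReadStream(LOG_FILE, { start: offset, end: size - 1, encoding: 'utf8' });
  stream.on('data', chunk => io.emit('log', chunk)); // push only the appended bytes
  offset = size;
});

httpServer.listen(12345, () => console.log('open http://localhost:12345'));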

Send log to a specific Graylog Index via Nxlog configuration

I am currently using nxlog to send the server logs to a graylog2 server and all the messages are going to the default index in Graylog. I am trying to send the messages to a particular index which should be configurable from the nxlog conf file.
We cannot achieve this via the Nxlog configuration alone. The problem can be solved by using the Streams functionality provided by Graylog (http://docs.graylog.org/en/2.4/pages/streams.html). We can create a stream with a rule that matches the input source and then route those logs to a particular index, which is configured when creating the stream.

Where to find what queries are hitting to gremlin server via gremlin-javascript

I am using the gremlin-javascript module for Node.js to query a Titan database. Everything is working fine, but I want to monitor what is actually hitting the Gremlin server, plus anything else I can learn about each query. I already checked the gremlin-server log in the logs folder inside the Titan folder, but I cannot find anything of use in those logs. Any help in this regard will be extremely useful. Thanks.
For a client-side solution with gremlin-javascript, there is currently no quick and easy way to log outgoing queries or protocol messages sent to Gremlin Server to the console.
You could either:
Implement your own function that wraps calls to the Gremlin client methods you use (typically client.execute()) and logs their arguments. If you're using Node.js v6+, this could be a nice use case for an ES2015 Proxy object. This is the safest, non-intrusive approach (see the sketch at the end of this answer).
Monkeypatch the client.prototype.messageStream method, and log its parameters. As of v2.3.2, this low-level method gets called whether you're doing client.execute() or client.stream(). This is riskier and trickier.
Quick and dirty: edit the source code in ./node_modules/gremlin/lib/GremlinClient.js and add this after line 405 (prototype.messageStream definition):
console.log('query:', script);
console.log('params:', bindings);
There's currently an open issue about logging incoming messages, but this could be developed to include outgoing messages as well (queries with parameters, down to protocol messages).
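To make the first, non-intrusive option concrete, here is a minimal sketch of such a wrapper (withQueryLogging is a made-up name; the execute()/stream() method names come from the answer above, and the createClient() call in the usage comment reflects the v2.x gremlin package, so adjust for your version):

// query-logging.ts - wrap the client in an ES2015 Proxy and log arguments of query methods
function withQueryLogging<T extends object>(client: T, methods: string[] = ['execute', 'stream']): T {
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value === 'function' && methods.includes(String(prop))) {
        return (...args: unknown[]) => {
          console.log(`gremlin ${String(prop)}:`, ...args); // script, bindings, then callback
          return value.apply(target, args);
        };
      }
      return typeof value === 'function' ? value.bind(target) : value;
    },
  });
}

// hypothetical usage with the old v2.x client:
// const gremlin = require('gremlin');
// const client = withQueryLogging(gremlin.createClient(8182, 'localhost'));
// client.execute('g.V(vid)', { vid: 1 }, (err, results) => { /* ... */ });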

Can logstash read directly from remote log?

I am new to logstash and have been reading about it for a couple of days. Like most people, I am trying to set up a centralized logging system, store the data in elasticsearch, and later use kibana to visualize it. My application is deployed on many servers, so I need to fetch logs from all of them. Installing logstash-forwarder on all those machines and configuring them seems a very tedious task (I will do it if this is the only way). Is there a way for logstash to access those logs by mentioning the URL to the logs somewhere in the conf file, instead of logstash forwarders forwarding them to logstash? FYI, my application is deployed on Tomcat and the logs are accessible via the URL http://:8080/application_name/error.log.
Not directly, but there are a few close workarounds. The idea is to create a program/script that will use curl (or its equivalent) to effectively perform a "tail -f" of the remote log file, and then run that output into logstash.
Here's a bash script that does the trick:
url-tail.sh
This bash script monitors a URL for changes and prints its tail to standard output. It acts like the Linux "tail -f" command. It can be helpful for tailing logs that are accessible over HTTP.
https://github.com/maksim07/url-tail
Another similar one is here:
https://gist.github.com/habibutsu/5420781
There are others out there, written in PHP or Java: Tail a text file on a web server via HTTP
Once you have that running, the question is how to get its output into logstash. You could:
Pipe it into stdin and use the stdin input
Have it append to a file and use the file input
Use the pipe input to run the command itself and read from the stdout of the command
The devil is in the details though, particularly with logstash, so you'll need to experiment, but this approach should work for you.
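For illustration, here is a rough Node/TypeScript take on the same idea as the scripts above (the URL and polling interval are placeholders; it assumes Node 18+ for the global fetch and a web server that honours Range headers). It polls the remote log and writes only the newly appended bytes to stdout, which you can then feed into logstash via the stdin or pipe input:

// url-tail.ts - placeholder name; "tail -f" a log file that is only reachable over HTTP
const LOG_URL = 'http://example-host:8080/application_name/error.log'; // placeholder
const INTERVAL_MS = 5000;

let offset = 0; // how many bytes of the remote file have already been written out

async function poll(): Promise<void> {
  try {
    const res = await fetch(LOG_URL, { headers: { Range: `bytes=${offset}-` } });
    if (res.status === 206) {
      // partial content: exactly the bytes appended since the last poll
      const chunk = Buffer.from(await res.arrayBuffer());
      process.stdout.write(chunk);
      offset += chunk.length;
    } else if (res.status === 200) {
      // the server ignored the Range header: keep only the bytes we have not seen yet
      const body = Buffer.from(await res.arrayBuffer());
      if (body.length > offset) {
        process.stdout.write(body.subarray(offset));
        offset = body.length;
      }
    }
    // a 416 response simply means nothing new has been appended yet
  } catch (err) {
    console.error('poll failed:', err); // goes to stderr, so it does not pollute the log stream
  }
}

setInterval(poll, INTERVAL_MS);

Run it with ts-node (or compile it first) and either pipe its stdout into a logstash process that uses the stdin input, or point the pipe input's command at it.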
