Impossible to use grok date match correctly - logstash
I have this message:
2016/02/22 08:40:10 [error] 2127#0: *193 open()
"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg"
failed (2: No such file or directory), client: 192.168.144.95, server:
api.magritte.arte.tv, request: "GET
/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg
HTTP/1.1", host: "api.magritte.arte.tv", referrer:
"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos"
And I parse it this way:
grok {
match => { "message" => "(?<timestamp>%{YEAR}/%{MONTHNUM2}/%{MONTHDAY} %{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER:tid}:( \*%{NUMBER:cid})? %{GREEDYDATA:errormessage}(?:, client: (?<client>%{IP}|%{HOSTNAME}))(?:, server: %{IPORHOST:server})(?:, request: %{QS:request})?(?:, upstream: \"%{URI:upstream}\")?(?:, host: %{QS:host})?(?:, referrer: \"%{URI:referrer}\")?"}
}
date {
match => [ "timestamp" , "YYYY/MM/dd HH:mm:ss" ]
}
When a new message arrives, the following behaviour occurs:
message sent to RabbitMQ: OK
message read from RabbitMQ: OK
problem when Logstash reads the message:
"reason"=>"failed to parse [timestamp]",
"caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid
format: \"2016/02/22 08:40:10\" is malformed at \"/02/22
08:40:10\""}}}}, :level=>:warn}
But I have no idea where my error is. Using http://grokconstructor.appspot.com/do/match#result, everything seems OK.
The full log from Logstash is:
{:timestamp=>"2016-02-22T08:43:29.968000+0100", :message=>"Failed action. ", :status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2016.02.22", :_type=>"nginx_error", :_routing=>nil}, #<LogStash::Event:0x75f8f9a0 #metadata_accessors=#<LogStash::Util::Accessors:0x402f1514 #store={}, #lut={}>, #cancelled=false, #data={"message"=>"2016/02/22 08:40:10 [error] 2127#0: *193 open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory), client: 192.168.144.95, server: api.magritte.arte.tv, request: \"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\", host: \"api.magritte.arte.tv\", referrer: \"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos\"", "#version"=>"1", "#timestamp"=>"2016-02-22T07:40:10.000Z", "path"=>"/var/log/nginx/api.magritte.arte.tv_error.log", "host"=>["magritte.arte.tv", "\"api.magritte.arte.tv\""], "type"=>"nginx_error", "application"=>"api", "timestamp"=>"2016/02/22 08:40:10", "severity"=>"error", "pid"=>2127, "tid"=>0, "cid"=>193, "errormessage"=>"open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory)", "client"=>"192.168.144.95", "server"=>"api.magritte.arte.tv", "request"=>"\"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\"", "referrer"=>"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos"}, #metadata={}, #accessors=#<LogStash::Util::Accessors:0x27ca0e3f #store={"message"=>"2016/02/22 08:40:10 [error] 2127#0: *193 open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory), client: 192.168.144.95, server: api.magritte.arte.tv, request: \"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\", host: \"api.magritte.arte.tv\", referrer: \"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos\"", "#version"=>"1", "#timestamp"=>"2016-02-22T07:40:10.000Z", "path"=>"/var/log/nginx/api.magritte.arte.tv_error.log", "host"=>["magritte.arte.tv", "\"api.magritte.arte.tv\""], "type"=>"nginx_error", "application"=>"api", "timestamp"=>"2016/02/22 08:40:10", "severity"=>"error", "pid"=>2127, "tid"=>0, "cid"=>193, "errormessage"=>"open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory)", "client"=>"192.168.144.95", "server"=>"api.magritte.arte.tv", "request"=>"\"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\"", "referrer"=>"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos"}, #lut={"type"=>[{"message"=>"2016/02/22 08:40:10 [error] 2127#0: 
*193 open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory), client: 192.168.144.95, server: api.magritte.arte.tv, request: \"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\", host: \"api.magritte.arte.tv\", referrer: \"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos\"", "#version"=>"1", "#timestamp"=>"2016-02-22T07:40:10.000Z", "path"=>"/var/log/nginx/api.magritte.arte.tv_error.log", "host"=>["magritte.arte.tv", "\"api.magritte.arte.tv\""], "type"=>"nginx_error", "application"=>"api", "timestamp"=>"2016/02/22 08:40:10", "severity"=>"error", "pid"=>2127, "tid"=>0, "cid"=>193, "errormessage"=>"open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory)", "client"=>"192.168.144.95", "server"=>"api.magritte.arte.tv", "request"=>"\"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\"", "referrer"=>"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos"}, "type"], "[type]"=>[{"message"=>"2016/02/22 08:40:10 [error] 2127#0: *193 open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory), client: 192.168.144.95, server: api.magritte.arte.tv, request: \"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\", host: \"api.magritte.arte.tv\", referrer: \"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos\"", "#version"=>"1", "#timestamp"=>"2016-02-22T07:40:10.000Z", "path"=>"/var/log/nginx/api.magritte.arte.tv_error.log", "host"=>["magritte.arte.tv", "\"api.magritte.arte.tv\""], "type"=>"nginx_error", "application"=>"api", "timestamp"=>"2016/02/22 08:40:10", "severity"=>"error", "pid"=>2127, "tid"=>0, "cid"=>193, "errormessage"=>"open() \"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\" failed (2: No such file or directory)", "client"=>"192.168.144.95", "server"=>"api.magritte.arte.tv", "request"=>"\"GET /static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg HTTP/1.1\"", "referrer"=>"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos"}, "type"]}>>], :response=>{"create"=>{"_index"=>"logstash-2016.02.22", "_type"=>"nginx_error", "_id"=>"AVMH7uSoo1ZDC2Pzezhl", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2016/02/22 08:40:10\" is malformed at \"/02/22 08:40:10\""}}}}, :level=>:warn}
I think it's a quoting issue ...
Message in the nginx logfile:
2016/02/22 08:40:10 [error] 2127#0: *193
open()
"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg"
failed (2: No such file or directory), client: 192.168.144.95, server:
api.magritte.arte.tv, request: "GET
/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg
HTTP/1.1", host: "api.magritte.arte.tv", referrer:
"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos"
This is the message I use for my grok parsing (Logstash, source side).
RabbitMQ message payload:
{"message":"2016/02/22 08:40:10 [error] 2127#0: *193 open()
\"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\"
failed (2: No such file or directory), client: 192.168.144.95, server:
api.magritte.arte.tv, request: \"GET
/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg
HTTP/1.1\", host: \"api.magritte.arte.tv\", referrer:
\"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos\"","#version":"1","#timestamp":"2016-02-22T07:40:10.000Z","path":"/var/log/nginx/api.magritte.arte.tv_error.log","host":["magritte.arte.tv","\"api.magritte.arte.tv\""],"type":"nginx_error","application":"api","timestamp":"2016/02/22
08:40:10","severity":"error","pid":2127,"tid":0,"cid":193,"errormessage":"open()
\"/etc/nginx/nginx/html/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg\"
failed (2: No such file or
directory)","client":"192.168.144.95","server":"api.magritte.arte.tv","request":"\"GET
/static-cdn.arte.tv/resize-preprod/nQa5oWnNDknADSxe0mPEMd5McUA=/940x530/smart/default/prog_img/IMG_APIOS/051000/051700/051757-001_1137283_32_202.jpg
HTTP/1.1\"","referrer":"https://api.magritte.arte.tv/api/oauth/user/documentation/opa/endpoint/27/-api-opa-v2-videos"}
Some backslashes are added.
These backslashes prevent Logstash (target side) from correctly handling the messages.
With this, it works :)
grok {
match => { "message" => "(?<timestamp>%{YEAR}/%{MONTHNUM2}/%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}) \[%{LOGLEVEL:severity}\] %{POSINT:p_id}#%{NUMBER:t_id}:( \*%{NUMBER:c_id})? %{GREEDYDATA:errormessage}(?:, client: (?<client>%{IP}|%{HOSTNAME}))(?:, server: %{IPORHOST:server})(?:, request: %{QS:request})?(?:, upstream: %{QS:upstream})?(?:, host: %{QS:vhost})?(?:, referrer: \"%{URI:referrer}\")?"}
}
date {
match => [ "timestamp" , "yyyy/MM/dd HH:mm:ss" ]
}
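To check the corrected pattern and date format in isolation, before wiring RabbitMQ back in, a minimal sketch like the one below reads a log line from stdin and prints the parsed event (the stdin/stdout plugins ship with Logstash; the grok pattern is shortened here to the leading fields only):
input { stdin { } }
filter {
  grok {
    # capture the leading timestamp and severity; everything else goes into errormessage
    match => { "message" => "(?<timestamp>%{YEAR}/%{MONTHNUM2}/%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}) \[%{LOGLEVEL:severity}\] %{GREEDYDATA:errormessage}" }
  }
  date {
    # lowercase y (Joda year pattern), as in the working configuration above
    match => [ "timestamp" , "yyyy/MM/dd HH:mm:ss" ]
  }
}
output { stdout { codec => rubydebug } }
Pasting the nginx error line from the top of the question should produce an event whose @timestamp is derived from the parsed timestamp field (converted to UTC).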
A grok pattern is available that can solve your problem:
grok {
match => ["message","%{DATESTAMP:timestamp}" ]
}
Try y instead of Y in the date filter (see joda.time.format.DateTimeFormat):
date {
match => [ "timestamp" , "yyyy/MM/dd HH:mm:ss" ]
}
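For completeness, the date filter also accepts optional timezone and target settings; a small sketch follows, where the timezone value is only an example and should match whatever zone the nginx server writes its log in (target defaults to @timestamp anyway):
date {
  match => [ "timestamp" , "yyyy/MM/dd HH:mm:ss" ]
  # illustrative values; both are standard date filter options
  timezone => "Europe/Paris"
  target => "@timestamp"
}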
Related
ngx-socket-io connected false but flask_socketio logs show 200 OK
Any tips to debug would help; this is just the basic implementation. I am using the exact tutorial from here to set up the ngx socket client: https://www.npmjs.com/package/ngx-socket-io and flask-socketio for the server: https://flask-socketio.readthedocs.io/en/latest/ The server side shows no error, but the registered handler methods to emit/receive seem blind.
Server logs:
127.0.0.1 - - [28/Jan/2021 10:58:51] "GET /socket.io/?EIO=3&transport=polling&t=NT8Dtbg HTTP/1.1" 200 418 0.000640
(12285) accepted ('127.0.0.1', 37400)
fa1a1d40bbc349c384a121302ef567c7: Received request to upgrade to websocket
127.0.0.1 - - [28/Jan/2021 10:58:51] "GET /socket.io/?EIO=3&transport=polling&t=NT8Dtcd&sid=fa1a1d40bbc349c384a121302ef567c7 HTTP/1.1" 200 235 0.000408
127.0.0.1 - - [28/Jan/2021 10:58:51] "GET /socket.io/?EIO=3&transport=polling&t=NT8DtdE&sid=fa1a1d40bbc349c384a121302ef567c7 HTTP/1.1" 200 235 0.000268
fa1a1d40bbc349c384a121302ef567c7: Upgrade to websocket successful
5002732d42184ba6b453e7d4f35e864e: Received packet PING data None
5002732d42184ba6b453e7d4f35e864e: Sending packet PONG data None
Client logs:
config: {url: "http://127.0.0.1:5000/", options: {…}}
emptyConfig: {url: "", options: {…}}
eventObservables$: {}
ioSocket: Socket
acks: {}
connected: false
disconnected: true
flags: {}
ids: 0
io: Manager
autoConnect: true
backoff: Backoff {ms: 1000, max: 5000, factor: 2, jitter: 0.5, attempts: 0}
connecting: [Socket]
decoder: Decoder {reconstructor: null, _callbacks: {…}}
encoder: Encoder {}
encoding: false
engine: Socket {secure: false, agent: false, hostname: "127.0.0.1", port: "5000", query: {…}, …}
lastPing: Thu Jan 28 2021 11:00:56
Server code (no print):
@socketio.on('connect')
def test_connect():
    print('\n\nClient connected')
Why can't I get the connection working? Thanks
Double-check the npm and Python environments for current versions of socketio and engineio: https://github.com/miguelgrinberg/python-socketio
d7e95928d73c42fab431e94ce2df40fc: Sending packet OPEN data {'sid': 'd7e95928d73c42fab431e94ce2df40fc', 'upgrades': ['websocket'], 'pingTimeout': 60000, 'pingInterval': 25000}
Client connected
d7e95928d73c42fab431e94ce2df40fc: Sending packet MESSAGE data 0
How to parse stream of logs aggregated from multiple files with logstash?
I have logs from GitLab installed on Kubernetes. Amongst other pods, there is Sidekiq, which has a very peculiar log structure - it gathers multiple files that all then go into stdout (see the example at the end or the official documentation). I want to gather all these logs with Filebeat, send them to Logstash and process them in a sane way (parse JSONs, get important data from line logs, etc.; I would also like to add info about the original file) and send the output to Elasticsearch. However, I am struggling with how to do that - as a newbie regarding Logstash I am not sure how it works under the hood - and so far I was only able to come up with a grok that matches the line containing the file name. From one perspective it should be relatively easy - I just need some sort of state to mark which file is being processed in the log stream - but in the first place I am not sure whether Filebeat passes information about the stream to Logstash (important to distinguish which pod the logs came from), and secondly whether Logstash allows this state-based processing of a log stream. Is it possible to parse these logs and add the original filename as a field in this state-based way? Could you possibly point me in the right direction?
filter {
  grok {
    match => {"message" => "\*\*\* %{PATH:file} \*\*\*"}
  }
  if [file] == "/var/log/gitlab/production_json.log" {
    json {
      match => { ... }
    }
  } else if [file] == "/var/log/gitlab/application_json.log" {
    grok {
      match => { ... }
    }
  }
}
Please notice that even within a single file there might be multiple types of logs (/var/log/gitlab/sidekiq_exporter.log)
*** /var/log/gitlab/application.log ***
2020-11-18T10:08:28.568Z: Cannot obtain an exclusive lease for Namespace::AggregationSchedule. There must be another instance already in execution.
*** /var/log/gitlab/application_json.log ***
{"severity":"ERROR","time":"2020-11-18T10:08:28.568Z","correlation_id":"BsVuSTdkM45","message":"Cannot obtain an exclusive lease for Namespace::AggregationSchedule.
There must be another instance already in execution."} *** /var/log/gitlab/sidekiq_exporter.log *** [2020-11-18T10:08:32.076+0000] 10.103.149.75 - - [18/Nov/2020:10:08:32 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:08:42.076+0000] 10.103.149.75 - - [18/Nov/2020:10:08:42 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:08:43.771+0000] 10.103.149.75 - - [18/Nov/2020:10:08:43 UTC] "GET /liveness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:08:52.076+0000] 10.103.149.75 - - [18/Nov/2020:10:08:52 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:09:02.076+0000] 10.103.149.75 - - [18/Nov/2020:10:09:02 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:09:12.076+0000] 10.103.149.75 - - [18/Nov/2020:10:09:12 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:09:22.076+0000] 10.103.149.75 - - [18/Nov/2020:10:09:22 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:09:32.076+0000] 10.103.149.75 - - [18/Nov/2020:10:09:32 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:09:42.076+0000] 10.103.149.75 - - [18/Nov/2020:10:09:42 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:09:43.771+0000] 10.103.149.75 - - [18/Nov/2020:10:09:43 UTC] "GET /liveness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:09:52.076+0000] 10.103.149.75 - - [18/Nov/2020:10:09:52 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:10:02.076+0000] 10.103.149.75 - - [18/Nov/2020:10:10:02 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:10:12.076+0000] 10.103.149.75 - - [18/Nov/2020:10:10:12 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" 2020-11-18T10:10:15.783Z 10 TID-oslmgxbxm PagesDomainSslRenewalCronWorker JID-e4891c8d6d57d73f401da697 INFO: start 2020-11-18T10:10:15.807Z 10 TID-oslmgxbxm PagesDomainSslRenewalCronWorker JID-e4891c8d6d57d73f401da697 INFO: done: 0.024 sec [2020-11-18T10:10:22.076+0000] 10.103.149.75 - - [18/Nov/2020:10:10:22 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:10:32.076+0000] 10.103.149.75 - - [18/Nov/2020:10:10:32 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:10:42.076+0000] 10.103.149.75 - - [18/Nov/2020:10:10:42 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:10:43.771+0000] 10.103.149.75 - - [18/Nov/2020:10:10:43 UTC] "GET /liveness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" *** /var/log/gitlab/application_json.log *** {"severity":"ERROR","time":"2020-11-18T10:49:11.565Z","correlation_id":"H9wDObekY74","message":"Cannot obtain an exclusive lease for Ci::PipelineProcessing::AtomicProcessingService. There must be another instance already in execution."} *** /var/log/gitlab/application.log *** 2020-11-18T10:49:11.564Z: Cannot obtain an exclusive lease for Ci::PipelineProcessing::AtomicProcessingService. There must be another instance already in execution. 
2020-11-18T10:49:11.828Z 10 TID-gn2cjsz0a ProjectServiceWorker JID-ccb9b5b0f74ced684e15af75 INFO: done: 0.275 sec 2020-11-18T10:49:11.835Z 10 TID-gn2dwudy2 Namespaces::ScheduleAggregationWorker JID-7db9fe9200701bbc7dc7360c INFO: start 2020-11-18T10:49:11.844Z 10 TID-gn2dwudy2 Namespaces::ScheduleAggregationWorker JID-7db9fe9200701bbc7dc7360c INFO: done: 0.009 sec 2020-11-18T10:49:11.888Z 10 TID-oslmgxbxm ArchiveTraceWorker JID-999cc768143b644d051cfe82 INFO: done: 0.21 sec *** /var/log/gitlab/sidekiq_exporter.log *** [2020-11-18T10:49:12.076+0000] 10.103.149.75 - - [18/Nov/2020:10:49:12 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:49:22.076+0000] 10.103.149.75 - - [18/Nov/2020:10:49:22 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:49:32.076+0000] 10.103.149.75 - - [18/Nov/2020:10:49:32 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" [2020-11-18T10:49:42.076+0000] 10.103.149.75 - - [18/Nov/2020:10:49:42 UTC] "GET /readiness HTTP/1.1" 200 15 "-" "kube-probe/1.17+" 2020-11-18T10:49:43.216Z 10 TID-gn2cjsz0a Namespaces::RootStatisticsWorker JID-c277b38f3daa09648934d99f INFO: start 2020-11-18T10:49:43.243Z 10 TID-gn2cjsz0a Namespaces::RootStatisticsWorker JID-c277b38f3daa09648934d99f INFO: done: 0.027 sec [2020-11-18T10:49:43.771+0000] 10.103.149.75 - - [18/Nov/2020:10:49:43 UTC] "GET /liveness HTTP/1.1" 200 15 "-" "kube-probe/1.17+"
You can give all the logs path in filebeat.yml for filebeat to read the logs and send it to logstash. Example filebeat.yml for gitlab: ###################### Filebeat Configuration Example ######################### #=========================== Filebeat inputs ============================= filebeat.inputs: - paths: - /var/log/gitlab/gitlab-rails/application_json.log fields: - type: gitlab-application-json fields_under_root: true encoding: utf-8 - paths: - /var/log/gitlab/sidekiq_exporter.log fields: - type: gitlab-sidekiq-exporter fields_under_root: true encoding: utf-8 - paths: - /var/log/gitlab/gitlab-rails/api_json.log fields: - type: gitlab-api-json fields_under_root: true encoding: utf-8 - paths: - /var/log/gitlab/gitlab-rails/application.log fields: - type: gitlab-application fields_under_root: true encoding: utf-8 #============================= Filebeat modules =============================== filebeat.config.modules: # Glob pattern for configuration loading path: ${path.config}/modules.d/*.yml # Set to true to enable config reloading reload.enabled: false #----------------------------- Logstash output -------------------------------- output.logstash: # The Logstash hosts hosts: ["10.127.55.155:5066"] #================================ Processors ===================================== # Configure processors to enhance or manipulate events generated by the beat. processors: - add_host_metadata: ~ - add_cloud_metadata: ~ Now, in logstash, you can create different grok pattern to filter these logs. Here is a sample logstash.yml, input { beats { port => "5066" } } filter { if [type] == "gitlab-sidekiq-exporter" { grok { match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[(?<timestamp>%{MONTHDAY}/%{MONTH}/%{YEAR}\:%{TIME}) %{TZ:timezone}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:agent}" } overwrite => [ "message" ] } } filter { mutate { remove_tag => [ "_grokparsefailure" ] } } output { #filtered logs are getting indexed in elasticsearch elasticsearch { hosts => ["10.127.55.155:9200"] user => elastic password => elastic action => "index" index => "gitlab" } stdout { codec => rubydebug } #filtered logs can be seen as console output as well, you can comment this out as well, this is for debugging purpose only } Note: The beat input port in logstash.yml should be same, as given in output.logstash in filebeat.yml You can append the logstash.yml for filtering out application_json.log and application.log similar to that of sidekiq_exporter.log For creating and validating grok pattern to filter the logs, you can use online Grok Debugger. Here, I have used the Grok Debugger to create a pattern for filtering sidekiq_exporter.log Pattern: %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:agent}
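On the state-based part of the original question (carrying the most recent "*** file ***" marker onto the lines that follow it), one possible sketch, untested and only an illustration, keeps the last marker in an instance variable inside a ruby filter; it relies on the standard event get/set/cancel API and assumes the pipeline runs with a single worker (pipeline.workers: 1) so events keep their original order:
filter {
  grok {
    # only the "*** /path/to/file ***" separator lines will populate [file]
    match => { "message" => "\*\*\* %{PATH:file} \*\*\*" }
    tag_on_failure => []
  }
  ruby {
    code => "
      if event.get('file')
        # remember the marker and drop the separator line itself
        @current_file = event.get('file')
        event.cancel
      elsif @current_file
        # stamp ordinary log lines with the last marker seen
        event.set('file', @current_file)
      end
    "
  }
}
With the [file] field in place, the per-file if [file] == ... branches from the question can then select the appropriate json or grok filter.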
Errors when publishing my nuxtjs website on SSR mode
I have some issues while trying to publish a nuxtjs site. Usually, I was using the generate command, but for this one I need to go full SSR, so I'm going for nuxt start. But after building and starting the app, it's a mess. The build goes perfectly in the console, and the application start. The problem is when I try to access the site, it loads partially, but I got all these errors in the browser: manifest.3a7efd91c5f63f114507.js Failed to load resource: the server responded with a status of 404 () vendor.7519259bf7bdf608079e.js Failed to load resource: the server responded with a status of 404 () app.a5cb9356f53306f973dc.js Failed to load resource: the server responded with a status of 404 () default.1f3ad14df16ee86595af.js Failed to load resource: the server responded with a status of 404 () index.260dc65b69022a31ad58.js Failed to load resource: the server responded with a status of 404 () /_nuxt/pages/spot/_slug.e57cc2e78d8e0b160fe7.js Failed to load resource: the server responded with a status of 404 () manifest.3a7efd91c5f63f114507.js Failed to load resource: the server responded with a status of 404 () default.1f3ad14df16ee86595af.js Failed to load resource: the server responded with a status of 404 () index.260dc65b69022a31ad58.js Failed to load resource: the server responded with a status of 404 () vendor.7519259bf7bdf608079e.js Failed to load resource: the server responded with a status of 404 () app.a5cb9356f53306f973dc.js Failed to load resource: the server responded with a status of 404 () Nothing seems wrong during the build. When I use nuxt start, I get this: $ nuxt start nuxt:axios BaseURL: http://localhost:3042/api (Browser: /api) +0ms OPEN http://localhost:3042 Here's my server conf file: # Site global server { listen 443 ssl http2; server_name www.mywebsite.com; access_log off; location = /robots.txt { access_log off; log_not_found off; } location = /favicon.ico { access_log off; log_not_found off; } location / { proxy_pass http://127.0.0.1:3042/; include /etc/nginx/conf.d/proxy.conf; root /var/www/mywebsite/site; add_header Access-Control-Allow-Origin *; } location ~* ^.+.(jpg|jpeg|gif|css|png|js|ico|txt|srt|swf)$ { root /var/www/mywebsite/site/; expires 30d; } ssl_certificate /etc/letsencrypt/live/www.mywebsite.com/fullchain.pem; ssl_certificate_key /etc/letsencrypt/live/www.mywebsite.com/privkey.pem; } # Redirection server { listen 80; server_name mywebsite.com www.mywebsite.com; location / { return 301 https://www.mywebsite.com$request_uri; } } And my nuxt config file: const pkg = require('./package') module.exports = { mode: 'universal', loading: { color: '#bb2b4d' }, router: { linkActiveClass: '-active', base: '/' }, css: ['#/assets/icons/css/icons.css', '#/assets/snickles/snickles.css'], plugins: ['~plugins/vue-filters.js', '~plugins/vue-modal.js'], minify: { removeEmptyAttributes: false, collapseWhitespace: true, conservativeCollapse: true, collapseBooleanAttributes: true, removeTagWhitespace: false, removeStyleLinkTypeAttributes: true }, modules: [ '#nuxtjs/axios' ], axios: { }, env: { api: { spots: `https://rest.mywebsite.com/spots` } }, proxy: { }, build: { extend(config, ctx) { // Run ESLint on save if (ctx.isDev && ctx.isClient) { config.module.rules.push({ enforce: 'pre', test: /\.(js|vue)$/, loader: 'eslint-loader', exclude: /(node_modules)/ }) } } }, postcss: [require('autoprefixer')], vendor: ['moment', 'vue-js-modal'] } Did I forget anything? The most strange part is that it works perfectly well when I do the same on my own pc and not on my server. 
I checked the npm and node versions, they are the same (latest to date). Also, if testing with a demo template from NuxtJS, it works perfectly with the exact same server configuration. By the way, the server is a debian 8, with all packages up to date. Thanks in advance for any hint. Edit: If of any use, the error log: 2018/02/14 19:12:54 [error] 12981#12981: *239930 open() "/var/www/mywebsite/site/_nuxt/pages/spot/_slug.e57cc2e78d8e0b160fe7.js" failed (2: No such file or directory), client: xxx.xxx.xxx.xxx., server: www.mywebsite.com, request: "GET /_nuxt/pages/spot/_slug.e57cc2e78d8e0b160fe7.js HTTP/2.0", host: "www.mywebsite.com", referrer: "https://www.mywebsite.com/" 2018/02/14 19:12:57 [error] 12981#12981: *239930 open() "/var/www/mywebsite/site/_nuxt/manifest.3a7efd91c5f63f114507.js" failed (2: No such file or directory), client: xxx.xxx.xxx.xxx, server: www.mywebsite.com, request: "GET /_nuxt/manifest.3a7efd91c5f63f114507.js HTTP/2.0", host: "www.mywebsite.com", referrer: "https://www.mywebsite.com/" 2018/02/14 19:12:57 [error] 12981#12981: *239930 open() "/var/www/mywebsite/site/_nuxt/vendor.7519259bf7bdf608079e.js" failed (2: No such file or directory), client: xxx.xxx.xxx.xxx, server: www.mywebsite.com, request: "GET /_nuxt/vendor.7519259bf7bdf608079e.js HTTP/2.0", host: "www.mywebsite.com", referrer: "https://www.mywebsite.com/" 2018/02/14 19:12:57 [error] 12981#12981: *239930 open() "/var/www/mywebsite/site/_nuxt/app.a5cb9356f53306f973dc.js" failed (2: No such file or directory), client: xxx.xxx.xxx.xxx, server: www.mywebsite.com, request: "GET /_nuxt/app.a5cb9356f53306f973dc.js HTTP/2.0", host: "www.mywebsite.com", referrer: "https://www.mywebsite.com/" Again, it’s working perfectly fine with other nuxt projects, with a similar configuration. Indeed it can’t find these files in this folder, as they’re not in it — which is perfectly normal. It’s up to the app to get the routes to these files, which it usually does pretty well, with the same directory output (as I said, it’s supposed not to be in a _nuxt folder). Thanks.
This was tagged with nginx, so here's the nginx way of solving the problem. After the troubleshooting through the comments, you report receiving the following in your error_log:
2018/02/14 19:12:57 [error] 12981#12981: *239930 open() "/var/www/mywebsite/site/_nuxt/manifest.3a7efd91c5f63f114507.js" failed (2: No such file or directory), client: xxx.xxx.xxx.xxx, server: www.mywebsite.com, request: "GET /_nuxt/manifest.3a7efd91c5f63f114507.js HTTP/2.0", host: "www.mywebsite.com", referrer: "https://www.mywebsite.com/"
Subsequently, running find / -type f -name manifest.3a7efd91c5f63f114507.js, or similar, shows the corresponding file being located in /var/www/mywebsite/site/.nuxt/dist. As such, your nginx configuration is wrong, because it makes nginx look for these files in the incorrect folder: your config has root /var/www/mywebsite/site/; instead. The proper way may be to use a prefix-based location together with the alias directive:
location /_nuxt/ {
    alias /var/www/mywebsite/site/.nuxt/dist/;
}
However, if /_nuxt/ has content that may have to be proxy_pass'ed to the upstream, and you want to keep using the pcre-based location you already had in your original config, then an alternative like the one below is also an option (otherwise, you would have to remove that location as redundant, to make sure the prefix-based location above works; see http://nginx.org/r/location):
location ~* ^.+.(jpg|jpeg|gif|css|png|js|ico|txt|srt|swf)$ {
    rewrite ^/_nuxt(/.*) $1 break;
    root /var/www/mywebsite/site/.nuxt/dist/;
}
spark-jobserver cannot build on Spark 1.6.2
I'm trying to run the spark-jobserver 0.6.2 with Spark 1.6.2 Currently what I'm doing is this: git clone https://github.com/spark-jobserver/spark-jobserver.git git checkout tags/v0.6.2 -f sbt job-server/package At this point the system crashes with this error: [info] Compiling 35 Scala sources to /test_jobserver/spark-jobserver/job-server/target/scala-2.10/classes... [error] [error] while compiling: /test_jobserver/spark-jobserver/job-server/src/spark.jobserver/util/SparkMasterProvider.scala [error] during phase: jvm [error] library version: version 2.10.6 [error] compiler version: version 2.10.6 [error] reconstructed args: -deprecation -classpath /test_jobserver/spark-jobserver/job-server/target/scala-2.10/classes:/test_jobserver/spark-jobserver/akka-app/target/scala-2.10/classes:/test_jobserver/spark-jobserver/job-server-api/target/scala-2.10/classes:/home/marco/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:/home/marco/.ivy2/cache/com.typesafe/config/bundles/config-1.3.0.jar:/home/marco/.ivy2/cache/com.typesafe.akka/akka-cluster_2.10/jars/akka-cluster_2.10-2.3.15.jar:/home/marco/.ivy2/cache/com.typesafe.akka/akka-remote_2.10/jars/akka-remote_2.10-2.3.15.jar:/home/marco/.ivy2/cache/com.typesafe.akka/akka-actor_2.10/jars/akka-actor_2.10-2.3.15.jar:/home/marco/.ivy2/cache/io.netty/netty/bundles/netty-3.8.0.Final.jar:/home/marco/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar:/home/marco/.ivy2/cache/org.uncommons.maths/uncommons-maths/jars/uncommons-maths-1.2.2a.jar:/home/marco/.ivy2/cache/io.spray/spray-json_2.10/bundles/spray-json_2.10-1.3.2.jar:/home/marco/.ivy2/cache/io.spray/spray-can_2.10/bundles/spray-can_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-io_2.10/bundles/spray-io_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-util_2.10/bundles/spray-util_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-http_2.10/bundles/spray-http_2.10-1.3.3.jar:/home/marco/.ivy2/cache/org.parboiled/parboiled-scala_2.10/jars/parboiled-scala_2.10-1.1.7.jar:/home/marco/.ivy2/cache/org.parboiled/parboiled-core/jars/parboiled-core-1.1.7.jar:/home/marco/.ivy2/cache/io.spray/spray-caching_2.10/bundles/spray-caching_2.10-1.3.3.jar:/home/marco/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.4.2.jar:/home/marco/.ivy2/cache/io.spray/spray-routing_2.10/bundles/spray-routing_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-httpx_2.10/bundles/spray-httpx_2.10-1.3.3.jar:/home/marco/.ivy2/cache/org.jvnet.mimepull/mimepull/jars/mimepull-1.9.5.jar:/home/marco/.ivy2/cache/com.chuusai/shapeless_2.10/jars/shapeless_2.10-1.2.4.jar:/home/marco/.ivy2/cache/io.spray/spray-client_2.10/bundles/spray-client_2.10-1.3.3.jar:/home/marco/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar:/home/marco/.ivy2/cache/org.joda/joda-convert/jars/joda-convert-1.8.1.jar:/home/marco/.ivy2/cache/joda-time/joda-time/jars/joda-time-2.9.3.jar:/home/marco/.ivy2/cache/com.typesafe.slick/slick_2.10/bundles/slick_2.10-2.1.0.jar:/home/marco/.ivy2/cache/com.h2database/h2/jars/h2-1.3.176.jar:/home/marco/.ivy2/cache/commons-dbcp/commons-dbcp/jars/commons-dbcp-1.4.jar:/home/marco/.ivy2/cache/commons-pool/commons-pool/jars/commons-pool-1.5.4.jar:/home/marco/.ivy2/cache/org.flywaydb/flyway-core/jars/flyway-core-3.2.1.jar:/home/marco/.ivy2/cache/org.apache.shiro/shiro-core/bundles/shiro-core-1.2.4.jar:/home/marco/.ivy2/cache/commons-beanutils/commons-beanutils/jars/commons-beanutils-1.8.3.jar:/home/marco/.iv
y2/cache/org.scoverage/scalac-scoverage-runtime_2.10/jars/scalac-scoverage-runtime_2.10-1.1.1.jar:/home/marco/.ivy2/cache/org.scoverage/scalac-scoverage-plugin_2.10/jars/scalac-scoverage-plugin_2.10-1.1.1.jar:/home/marco/.ivy2/cache/org.apache.spark/spark-core_2.10/jars/spark-core_2.10-1.6.1.jar:/home/marco/.ivy2/cache/org.apache.avro/avro-mapred/jars/avro-mapred-1.7.7-hadoop2.jar:/home/marco/.ivy2/cache/org.apache.avro/avro-ipc/jars/avro-ipc-1.7.7-tests.jar:/home/marco/.ivy2/cache/org.apache.avro/avro-ipc/jars/avro-ipc-1.7.7.jar:/home/marco/.ivy2/cache/org.apache.avro/avro/jars/avro-1.7.7.jar:/home/marco/.ivy2/cache/org.codehaus.jackson/jackson-core-asl/jars/jackson-core-asl-1.9.13.jar:/home/marco/.ivy2/cache/org.codehaus.jackson/jackson-mapper-asl/jars/jackson-mapper-asl-1.9.13.jar:/home/marco/.ivy2/cache/org.xerial.snappy/snappy-java/bundles/snappy-java-1.1.2.jar:/home/marco/.ivy2/cache/org.apache.commons/commons-compress/jars/commons-compress-1.4.1.jar:/home/marco/.ivy2/cache/org.tukaani/xz/jars/xz-1.0.jar:/home/marco/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.10.jar:/home/marco/.ivy2/cache/com.twitter/chill_2.10/jars/chill_2.10-0.5.0.jar:/home/marco/.ivy2/cache/com.twitter/chill-java/jars/chill-java-0.5.0.jar:/home/marco/.ivy2/cache/com.esotericsoftware.kryo/kryo/bundles/kryo-2.21.jar:/home/marco/.ivy2/cache/com.esotericsoftware.reflectasm/reflectasm/jars/reflectasm-1.07-shaded.jar:/home/marco/.ivy2/cache/com.esotericsoftware.minlog/minlog/jars/minlog-1.2.jar:/home/marco/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-1.2.jar:/home/marco/.ivy2/cache/org.apache.xbean/xbean-asm5-shaded/bundles/xbean-asm5-shaded-4.4.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-client/jars/hadoop-client-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-common/jars/hadoop-common-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-annotations/jars/hadoop-annotations-2.2.0.jar:/home/marco/.ivy2/cache/com.google.code.findbugs/jsr305/jars/jsr305-1.3.9.jar:/home/marco/.ivy2/cache/commons-cli/commons-cli/jars/commons-cli-1.2.jar:/home/marco/.ivy2/cache/org.apache.commons/commons-math/jars/commons-math-2.1.jar:/home/marco/.ivy2/cache/xmlenc/xmlenc/jars/xmlenc-0.52.jar:/home/marco/.ivy2/cache/commons-httpclient/commons-httpclient/jars/commons-httpclient-3.1.jar:/home/marco/.ivy2/cache/commons-codec/commons-codec/jars/commons-codec-1.4.jar:/home/marco/.ivy2/cache/commons-net/commons-net/jars/commons-net-2.2.jar:/home/marco/.ivy2/cache/log4j/log4j/bundles/log4j-1.2.17.jar:/home/marco/.ivy2/cache/commons-lang/commons-lang/jars/commons-lang-2.5.jar:/home/marco/.ivy2/cache/commons-configuration/commons-configuration/jars/commons-configuration-1.6.jar:/home/marco/.ivy2/cache/commons-collections/commons-collections/jars/commons-collections-3.2.1.jar:/home/marco/.ivy2/cache/commons-digester/commons-digester/jars/commons-digester-1.8.jar:/home/marco/.ivy2/cache/commons-beanutils/commons-beanutils-core/jars/commons-beanutils-core-1.8.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-auth/jars/hadoop-auth-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/jars/hadoop-hdfs-2.2.0.jar:/home/marco/.ivy2/cache/org.mortbay.jetty/jetty-util/jars/jetty-util-6.1.26.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-app/jars/hadoop-mapreduce-client-app-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-common/jars/hadoop-mapreduce-client-common-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-yarn-common/jars/hadoop-yarn-common-2.
2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-yarn-api/jars/hadoop-yarn-api-2.2.0.jar:/home/marco/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.10.jar:/home/marco/.ivy2/cache/com.google.inject/guice/jars/guice-3.0.jar:/home/marco/.ivy2/cache/javax.inject/javax.inject/jars/javax.inject-1.jar:/home/marco/.ivy2/cache/aopalliance/aopalliance/jars/aopalliance-1.0.jar:/home/marco/.ivy2/cache/org.sonatype.sisu.inject/cglib/jars/cglib-2.2.1-v20090111.jar:/home/marco/.ivy2/cache/com.sun.jersey.jersey-test-framework/jersey-test-framework-grizzly2/jars/jersey-test-framework-grizzly2-1.9.jar:/home/marco/.ivy2/cache/com.sun.jersey/jersey-server/bundles/jersey-server-1.9.jar:/home/marco/.ivy2/cache/asm/asm/jars/asm-3.1.jar:/home/marco/.ivy2/cache/com.sun.jersey/jersey-json/bundles/jersey-json-1.9.jar:/home/marco/.ivy2/cache/org.codehaus.jettison/jettison/bundles/jettison-1.1.jar:/home/marco/.ivy2/cache/stax/stax-api/jars/stax-api-1.0.1.jar:/home/marco/.ivy2/cache/com.sun.xml.bind/jaxb-impl/jars/jaxb-impl-2.2.3-1.jar:/home/marco/.ivy2/cache/javax.xml.bind/jaxb-api/jars/jaxb-api-2.2.2.jar:/home/marco/.ivy2/cache/javax.activation/activation/jars/activation-1.1.jar:/home/marco/.ivy2/cache/org.codehaus.jackson/jackson-jaxrs/jars/jackson-jaxrs-1.8.3.jar:/home/marco/.ivy2/cache/org.codehaus.jackson/jackson-xc/jars/jackson-xc-1.8.3.jar:/home/marco/.ivy2/cache/com.sun.jersey.contribs/jersey-guice/jars/jersey-guice-1.9.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-yarn-client/jars/hadoop-yarn-client-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-core/jars/hadoop-mapreduce-client-core-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-yarn-server-common/jars/hadoop-yarn-server-common-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-shuffle/jars/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-mapreduce-client-jobclient/jars/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.spark/spark-launcher_2.10/jars/spark-launcher_2.10-1.6.1.jar:/home/marco/.ivy2/cache/org.spark-project.spark/unused/jars/unused-1.0.0.jar:/home/marco/.ivy2/cache/org.apache.spark/spark-network-common_2.10/jars/spark-network-common_2.10-1.6.1.jar:/home/marco/.ivy2/cache/org.apache.spark/spark-network-shuffle_2.10/jars/spark-network-shuffle_2.10-1.6.1.jar:/home/marco/.ivy2/cache/org.fusesource.leveldbjni/leveldbjni-all/bundles/leveldbjni-all-1.8.jar:/home/marco/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.4.4.jar:/home/marco/.ivy2/cache/com.fasterxml.jackson.core/jackson-annotations/bundles/jackson-annotations-2.4.4.jar:/home/marco/.ivy2/cache/com.fasterxml.jackson.core/jackson-core/bundles/jackson-core-2.4.4.jar:/home/marco/.ivy2/cache/org.apache.spark/spark-unsafe_2.10/jars/spark-unsafe_2.10-1.6.1.jar:/home/marco/.ivy2/cache/net.java.dev.jets3t/jets3t/jars/jets3t-0.7.1.jar:/home/marco/.ivy2/cache/org.apache.curator/curator-recipes/bundles/curator-recipes-2.4.0.jar:/home/marco/.ivy2/cache/org.apache.curator/curator-framework/bundles/curator-framework-2.4.0.jar:/home/marco/.ivy2/cache/org.apache.curator/curator-client/bundles/curator-client-2.4.0.jar:/home/marco/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.5.jar:/home/marco/.ivy2/cache/jline/jline/jars/jline-0.9.94.jar:/home/marco/.ivy2/cache/com.google.guava/guava/bundles/guava-14.0.1.jar:/home/marco/.ivy2/cache/org.eclipse.jetty.orbit/javax.servlet/orbits/javax.servlet-3.0.0.v201
112011016.jar:/home/marco/.ivy2/cache/org.apache.commons/commons-lang3/jars/commons-lang3-3.3.2.jar:/home/marco/.ivy2/cache/org.apache.commons/commons-math3/jars/commons-math3-3.4.1.jar:/home/marco/.ivy2/cache/org.slf4j/jul-to-slf4j/jars/jul-to-slf4j-1.7.10.jar:/home/marco/.ivy2/cache/org.slf4j/jcl-over-slf4j/jars/jcl-over-slf4j-1.7.10.jar:/home/marco/.ivy2/cache/com.ning/compress-lzf/bundles/compress-lzf-1.0.3.jar:/home/marco/.ivy2/cache/net.jpountz.lz4/lz4/jars/lz4-1.3.0.jar:/home/marco/.ivy2/cache/org.roaringbitmap/RoaringBitmap/bundles/RoaringBitmap-0.5.11.jar:/home/marco/.ivy2/cache/com.typesafe.akka/akka-slf4j_2.10/jars/akka-slf4j_2.10-2.3.11.jar:/home/marco/.ivy2/cache/org.json4s/json4s-jackson_2.10/jars/json4s-jackson_2.10-3.2.10.jar:/home/marco/.ivy2/cache/org.json4s/json4s-core_2.10/jars/json4s-core_2.10-3.2.10.jar:/home/marco/.ivy2/cache/org.json4s/json4s-ast_2.10/jars/json4s-ast_2.10-3.2.10.jar:/home/marco/.ivy2/cache/com.thoughtworks.paranamer/paranamer/jars/paranamer-2.6.jar:/home/marco/.ivy2/cache/org.scala-lang/scalap/jars/scalap-2.10.0.jar:/home/marco/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.10.0.jar:/home/marco/.ivy2/cache/com.sun.jersey/jersey-core/bundles/jersey-core-1.9.jar:/home/marco/.ivy2/cache/org.apache.mesos/mesos/jars/mesos-0.21.1-shaded-protobuf.jar:/home/marco/.ivy2/cache/com.clearspring.analytics/stream/jars/stream-2.7.0.jar:/home/marco/.ivy2/cache/io.dropwizard.metrics/metrics-core/bundles/metrics-core-3.1.2.jar:/home/marco/.ivy2/cache/io.dropwizard.metrics/metrics-jvm/bundles/metrics-jvm-3.1.2.jar:/home/marco/.ivy2/cache/io.dropwizard.metrics/metrics-json/bundles/metrics-json-3.1.2.jar:/home/marco/.ivy2/cache/io.dropwizard.metrics/metrics-graphite/bundles/metrics-graphite-3.1.2.jar:/home/marco/.ivy2/cache/com.fasterxml.jackson.module/jackson-module-scala_2.10/bundles/jackson-module-scala_2.10-2.4.4.jar:/home/marco/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.10.4.jar:/home/marco/.ivy2/cache/org.apache.ivy/ivy/jars/ivy-2.4.0.jar:/home/marco/.ivy2/cache/oro/oro/jars/oro-2.0.8.jar:/home/marco/.ivy2/cache/org.tachyonproject/tachyon-client/jars/tachyon-client-0.8.2.jar:/home/marco/.ivy2/cache/commons-io/commons-io/jars/commons-io-2.4.jar:/home/marco/.ivy2/cache/org.tachyonproject/tachyon-underfs-hdfs/jars/tachyon-underfs-hdfs-0.8.2.jar:/home/marco/.ivy2/cache/org.tachyonproject/tachyon-underfs-s3/jars/tachyon-underfs-s3-0.8.2.jar:/home/marco/.ivy2/cache/org.tachyonproject/tachyon-underfs-local/jars/tachyon-underfs-local-0.8.2.jar:/home/marco/.ivy2/cache/net.razorvine/pyrolite/jars/pyrolite-4.9.jar:/home/marco/.ivy2/cache/net.sf.py4j/py4j/jars/py4j-0.9.jar -feature -bootclasspath /usr/lib/jvm/java-8-openjdk-amd64/jre/lib/resources.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/rt.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jsse.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jce.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/charsets.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jfr.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/classes:/home/marco/.sbt/boot/scala-2.10.6/lib/scala-library.jar -language:implicitConversions -language:postfixOps [error] [error] last tree to typer: Literal(Constant(collection.Set)) [error] symbol: null [error] symbol definition: null [error] tpe: Class(classOf[scala.collection.Set]) [error] symbol owners: [error] context owners: object DefaultSparkMasterProvider -> package util [error] [error] == Enclosing template or block == [error] [error] 
Template( // val <local DefaultSparkMasterProvider>: <notype> in object DefaultSparkMasterProvider, tree.tpe=spark.jobserver.util.DefaultSparkMasterProvider.type [error] "java.lang.Object", "spark.jobserver.util.SparkMasterProvider" // parents [error] ValDef( [error] private [error] "_" [error] <tpt> [error] <empty> [error] ) [error] // 2 statements [error] DefDef( // def getSparkMaster(config: com.typesafe.config.Config): String in object DefaultSparkMasterProvider [error] <method> [error] "getSparkMaster" [error] [] [error] // 1 parameter list [error] ValDef( // config: com.typesafe.config.Config [error] <param> <triedcooking> [error] "config" [error] <tpt> // tree.tpe=com.typesafe.config.Config [error] <empty> [error] ) [error] <tpt> // tree.tpe=String [error] Apply( // def getString(x$1: String): String in trait Config, tree.tpe=String [error] "config"."getString" // def getString(x$1: String): String in trait Config, tree.tpe=(x$1: String)String [error] "spark.master" [error] ) [error] ) [error] DefDef( // def <init>(): spark.jobserver.util.DefaultSparkMasterProvider.type in object DefaultSparkMasterProvider [error] <method> [error] "<init>" [error] [] [error] List(Nil) [error] <tpt> // tree.tpe=spark.jobserver.util.DefaultSparkMasterProvider.type [error] Block( // tree.tpe=Unit [error] Apply( // def <init>(): Object in class Object, tree.tpe=Object [error] DefaultSparkMasterProvider.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object [error] Nil [error] ) [error] () [error] ) [error] ) [error] ) [error] [error] == Expanded type of tree == [error] [error] ConstantType(value = Constant(collection.Set)) [error] [error] uncaught exception during compilation: java.io.IOException [error] File name too long [error] two errors found [error] (job-server/compile:compileIncremental) Compilation failed [error] Total time: 16 s, completed Oct 25, 2016 3:32:36 PM I didn't find anything, somebody knows how to do this? Thank you
Apparently, you can't build this in an encrypted folder on Ubuntu (hence the "File name too long" error). Moving the project folder to a non-encrypted disk partition did the trick. For more info, see: https://github.com/scala/pickling/issues/10
Reading a log file from given path using logstash
input {
  file {
    path => ["D:/logstash-2.3.4/temp/logs/localhost_access_log.2016-08-24.log"]
    start_position => "beginning"
  }
}
filter {
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  stdout { codec => rubydebug }
}
Now, after running Logstash, I am unable to see any output in the Logstash command window; that is, the logs inside the given file are not being fetched. Some of the sample logs in my localhost_access_log.2016-08-24 log file are below:
127.0.0.1 - - [24/Aug/2016:10:07:54 +0530] "GET / HTTP/1.1" 200 11452
0:0:0:0:0:0:0:1 - - [24/Aug/2016:10:08:09 +0530] "GET /Migration/firstpage.jsp HTTP/1.1" 404 1040
127.0.0.1 - - [24/Aug/2016:10:08:39 +0530] "GET / HTTP/1.1" 200 11452
0:0:0:0:0:0:0:1 - - [24/Aug/2016:10:08:41 +0530] "GET /Migration/firstpage.jsp HTTP/1.1" 500 3750
0:0:0:0:0:0:0:1 - - [24/Aug/2016:10:09:38 +0530] "GET /Mortgage/faces/NewFile.jsp HTTP/1.1" 404 1046
Is there any problem with the input code or the date filter code? Can anyone tell me where I am making a mistake?
Did you try keeping the stdout {} empty, like this, within the output section of your conf file in order to check the output from your Logstash console? As @baudsp mentioned, it's better to use a grok filter when you're dealing with log files. Something like this:
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
Source: Parsing Logs with Logstash
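Putting the pieces together, a minimal end-to-end sketch for that access log could look like the following. Note that the sample lines in the question have no referrer or agent fields, so COMMONAPACHELOG is used here rather than COMBINEDAPACHELOG (swap it back if your logs do include those fields); the path is the one from the question:
input {
  file {
    path => ["D:/logstash-2.3.4/temp/logs/localhost_access_log.2016-08-24.log"]
    start_position => "beginning"
  }
}
filter {
  grok {
    # COMMONAPACHELOG extracts clientip, timestamp, verb, request, response, bytes, among others
    match => { "message" => "%{COMMONAPACHELOG}" }
  }
  date {
    # "timestamp" is the HTTPDATE field extracted by the pattern above
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  stdout { codec => rubydebug }
}
Also keep in mind that start_position => "beginning" only applies to files Logstash has not seen before; if the file was already read once, its sincedb entry has to be cleared for it to be re-read from the start.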