Trying to pull data from a public API using the Logstash http_poller input plugin:
input {
  http_poller {
    urls => {
      method => "GET"
      url => "https://api.example.com/v1/service/"
    }
    request_timeout => 60
    schedule => { cron => "0 * * * *" }
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
I keep getting a bad URL error:
[ERROR][logstash.pipeline] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Invalid URL GET>...]
Any idea what's causing this? The URL for the API is correct...
Turns out it was the method => "GET" line: removing it worked like a charm. The urls hash expects name => url entries, so Logstash was treating method as a URL named "method" whose value was "GET", which is exactly the "Invalid URL GET" in the error.
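For anyone who does want to keep an explicit method, the plugin also accepts a nested hash per URL, keyed by an arbitrary name. A sketch of that shape (the key my_service is made up; check the http_poller docs for your Logstash version):

```
urls => {
  my_service => {
    method => get
    url => "https://api.example.com/v1/service/"
  }
}
```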
Related
I'm trying to schedule a job on Node.js using node-sched (I've also tried node-cron-tz, which supports timezones). On my current dev machine it runs correctly, but on the customer's test/prod environment it executes 2 hours later.
I need to run this job at the time stored in a DB, which is normally 0 8 * * *.
The code I used in my latest test is
scheduleService.addSqlTask = function (item) {
  if (item == null)
    return;

  logger.info('Adding job {Item} to schedule', { Item: item.stored_procedure });
  //logger.info("Adding {Item} : {Cron} to tasks", { Item: item.stored_procedure, Cron: item.cron_tab });

  cron.schedule(item.cron_tab,
    async () => {
      try {
        logger.info("Executing {Item} to schedule", { Item: item.stored_procedure });
        await dataStore.executeSql(item);
      } catch (e) {
        logger.error(e.message);
      }
    },
    {
      scheduled: true,
      timezone: "Europe/Rome"
    }
  );
}
In this case it behaves wrongly even on my machine, since it doesn't account for daylight saving time...
How can I handle this?
I'm trying to configure an input for Logstash 5 with the Apache Drill JDBC driver (https://drill.apache.org/docs/using-the-jdbc-driver/).
Below is my JDBC input configuration for Logstash.
input {
  jdbc {
    jdbc_driver_library => "jdbc_jars/drill-jdbc-all-1.10.0.jar"
    jdbc_driver_class => "org.apache.drill.jdbc.Driver"
    jdbc_connection_string => "jdbc:drill:zk=local"
    jdbc_user => "dfs"
    schedule => "* * * * *"
    statement => "select * from `sample.json`;"
  }
}
Essentially I get a Logstash WARN of "Failed test_connection". So although Logstash launches, the DB connection is failing.
Any suggestions?
I see a few problems with your configuration.
You need to provide a valid IP address and port for a ZooKeeper node that Drill is using. The line you gave Logstash, jdbc_connection_string => "jdbc:drill:zk=local", tells it that ZooKeeper is running on the same node as Logstash. What you need to provide instead is jdbc_connection_string => "jdbc:drill:zk=zk_hostname_or_ip:zk_port". Ask whoever set up your Drill cluster for the hostname or IP and the port of your ZooKeeper node.
dfs is not a Drill user; it is the name of one of Drill's storage plugins. If you want to run your query on a file stored on HDFS, change
statement => "select * from `sample.json`;"
to
statement => "select * from dfs.`/path/to/sample.json`;"
If you do not have authentication configured for Drill, your config should look like this:
input {
  jdbc {
    jdbc_driver_library => "jdbc_jars/drill-jdbc-all-1.10.0.jar"
    jdbc_driver_class => "org.apache.drill.jdbc.Driver"
    jdbc_connection_string => "jdbc:drill:zk=zk_hostname_or_ip:zk_port"
    schedule => "* * * * *"
    statement => "select * from dfs.`/path/to/sample.json`;"
  }
}
If you have authentication configured for Drill and you know your Drill username and password, your config should look like this:
input {
  jdbc {
    jdbc_driver_library => "jdbc_jars/drill-jdbc-all-1.10.0.jar"
    jdbc_driver_class => "org.apache.drill.jdbc.Driver"
    jdbc_connection_string => "jdbc:drill:zk=zk_hostname_or_ip:zk_port"
    schedule => "* * * * *"
    statement => "select * from dfs.`/path/to/sample.json`;"
    jdbc_user => "myusername"
    jdbc_password => "mypassword"
  }
}
I've just started with Angular 2. I've built the Angular 2 sample given in https://angular.io/guide/quickstart
When I run the project in Firefox using the
npm start
command in a terminal, the connection gets dropped after the output shows once. The error is:
The connection to ws://localhost:3000/browser-sync/socket.io/?EIO=3&transport=websocket&sid=6YFGHWy7oD7T7qioAAAA was interrupted while the page was loading
Any idea how to fix this issue?
I don't know how you manage your web socket, but you could consider using the following code. The idea is to wrap the web socket in an observable.
For this you could use a service like the one below. The initializeWebSocket method creates a shared (hot) observable that wraps a WebSocket object.
import { Observable } from 'rxjs/Observable';
import 'rxjs/add/operator/share';

export class WebSocketService {
  ws: WebSocket;
  wsObservable: Observable<any>;

  initializeWebSocket(url) {
    this.wsObservable = Observable.create((observer) => {
      this.ws = new WebSocket(url);

      this.ws.onopen = (e) => {
        (...)
      };

      this.ws.onclose = (e) => {
        // A clean close completes the observable; anything else is an error
        if (e.wasClean) {
          observer.complete();
        } else {
          observer.error(e);
        }
      };

      this.ws.onerror = (e) => {
        observer.error(e);
      };

      this.ws.onmessage = (e) => {
        observer.next(JSON.parse(e.data));
      };

      // Teardown: close the socket when the last subscriber unsubscribes
      return () => {
        this.ws.close();
      };
    }).share();
  }
}
You could add a sendData method to send data over the web socket:
export class WebSocketService {
  (...)

  sendData(message) {
    this.ws.send(JSON.stringify(message));
  }
}
The last point is to make things a bit more robust, i.e. filter received messages on a criterion and retry when there is a disconnection. For this, you need to wrap the initial websocket observable in another one. This way you can retry when the connection is lost and filter on criteria such as the client identifier (in the sample, the received data is JSON and contains a sender attribute).
import 'rxjs/add/operator/filter';
import 'rxjs/add/operator/retryWhen';
import 'rxjs/add/operator/delay';

export class WebSocketService {
  (...)

  createClientObservable(clientId) {
    return Observable.create((observer) => {
      let subscription = this.wsObservable
        .filter((data) => data.sender !== clientId)
        .subscribe(observer);

      return () => {
        subscription.unsubscribe();
      };
    // Resubscribe 3 seconds after each error; returning a bare timer here
    // would retry only once, so the error notifier itself is delayed instead.
    }).retryWhen((errors) => errors.delay(3000));
  }
}
You can see that disconnections are handled in this code using the observable's retryWhen operator.
I am trying to grab messages from IRC with Logstash, but I am not getting anything. Here is my config file:
input {
  irc {
    channels => ["#logstash"]
    host => "irc://irc.freenode.net"
    user => "abcde"
  }
}
filter {
}
output {
  stdout {}
}
Is there something I am missing?
Here is the documentation on that plugin for Logstash:
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-irc.html
I am using Logstash 1.5.5.
Is there a way in Puppet to catch a failure when a resource is applied? For example, when a declaration like
file { '/var/tmp/test':
  ensure => file,
  mode   => '0755',
}
fails, invoke something like
exec { 'Register some failure':
  command => '/var/tmp/register failure for /var/tmp/test',
}
?
You can try this:
exec { 'Notify a failure':
  command   => "/var/tmp/register failure for /var/tmp/test",
  path      => "/bin:",
  subscribe => File["/var/tmp/test"],
}