Is it possible to create a message using values from two other messages in Graylog, using only Graylog's default functionality (such as pipelines and streams)?
For example, if two messages arrive in Graylog 30 seconds apart:
{
  message: "view enter",
  timestamp: "2018-01-01 10:10:00.00"
},
{
  message: "view leave",
  timestamp: "2018-01-01 10:10:30.00"
}
I would like to create a third message with a field showing the difference between the two timestamp fields.
{
  message: "view visit duration",
  duration: 30
}
I have a bot that sets a variable to true when a radio station starts playing. I want that variable to be unique for each server the bot is used in, but I don't know how to do this.
This is my code for the system:
const Radio = {
  Predvaja: false,
  Radio1: false,
  RadioCity: false,
  RadioDJCity: false,
  RadioCenter: false,
  RadioCenter100: false,
  RadioMTV: false,
  RadioPtuj: false,
  RadioOtvoreni: false,
  RadioAktual: false,
  RadioVeseljak: false,
  RadioDalmacija: false
}

if (msg === 'radio 1') {
  Radio.Predvaja = true;
  Radio.Radio1 = true;
}
Servers Map
You should create a global Map object in your bot. Have a look at some examples of how to use it.
A Map stores key-value pairs and is optimized for efficient key lookups, so you can use the server ID as the key for each server's information about the radio stations that are playing right now.
let stationsPerServer = new Map();

//when a server adds your bot
client.on("guildCreate", guild => {
  stationsPerServer.set(guild.id, new Set());
});

//station activation per server
function activateStation(guildId, stationName) {
  const stations = stationsPerServer.get(guildId);
  stations.add(stationName);
  //add any other custom actions like broadcasting to a voice channel
}

//station deactivation per server
function deactivateStation(guildId, stationName) {
  stationsPerServer.get(guildId).delete(stationName);
  //add any other custom actions like leaving a voice channel
}

//is the station playing in this server?
function isStationPlaying(guildId, stationName) {
  return stationsPerServer.get(guildId).has(stationName);
}

//deactivate a station for all servers
function deactivateStationGlobally(stationName) {
  for (let stations of stationsPerServer.values()) {
    stations.delete(stationName);
    //add any other custom actions like leaving the voice channel
  }
}
Stations vector
What you want to store is basically a binary vector. You have at least three options for doing that:
Dedicated object
That's your current solution. An object full of predefined boolean values. Not very flexible.
Using a set
It is much better to keep a Set of the station names that are playing right now. A Set stores each value only once and is optimized for fast lookups. The boolean value in your object structure then becomes a query for whether a station is present in the set of active stations.
There are several advantages when using sets:
You can later add other stations easily, or remove existing ones if they stop broadcasting without changing the structure of your object.
You don't need to store the negative (false) values for each server. This would be storage-efficient if you had lots of stations and lots of servers.
If necessary, you can keep a single global set of all known stations for reference.
So per server you can do something like:
let activeStations = new Set(); //initialize the set
//add some stations
//(equals setting the station field to true)
activeStations.add('RadioPtuj');
activeStations.add('RadioDalmacija');
//now the set contains {'RadioPtuj', 'RadioDalmacija'}
//adding a station twice does not change the set
//(a set stores each member just once,
// unlike an array, which does not care about duplicates)
activeStations.add('RadioDalmacija');
//the set still contains only {'RadioPtuj', 'RadioDalmacija'}
//find out if a station is active
//(equals checking the boolean station field)
activeStations.has('RadioPtuj'); //true
activeStations.has('RadioOtvoreni'); //false
//remove an inactive station
//(equals setting the station field to false)
activeStations.delete('RadioDalmacija');
//now the set contains {'RadioPtuj'}
//deleting a station that is not present in the set has no effect
activeStations.delete('RadioMTV');
//the set still contains {'RadioPtuj'}
Enum
JavaScript does not support enums out of the box, but you could assign binary codes to the stations and use bitwise operations and masks to replace the set of strings with numbers. This would only become relevant if you had thousands of servers running your bot.
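A minimal sketch of that idea (the bit assignments below are arbitrary and not part of your existing code):
//assign each station one bit (arbitrary codes, extend as needed)
const StationBit = {
  RadioPtuj: 1 << 0,
  RadioDalmacija: 1 << 1,
  RadioMTV: 1 << 2
};
let activeStations = 0; //no stations playing
activeStations |= StationBit.RadioPtuj; //activate (set to true)
activeStations |= StationBit.RadioDalmacija;
const mtvPlaying = (activeStations & StationBit.RadioMTV) !== 0; //false
activeStations &= ~StationBit.RadioDalmacija; //deactivate (set to false)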
Adding more server data
The Map is great for keeping all the per-server data, not only the set of active stations. I assume you will need to store dispatcher objects once you start streaming the stations into voice channels, etc. You can pack all of this into a server class and have the Map store an instance of that class for each server.
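A rough sketch of what that could look like (the class and field names are only an illustration):
class ServerState {
  constructor() {
    this.activeStations = new Set(); //stations playing on this server
    this.dispatcher = null; //e.g. the voice stream dispatcher, once you start streaming
  }
}

let servers = new Map();

client.on("guildCreate", guild => {
  servers.set(guild.id, new ServerState());
});

//later: servers.get(guildId).activeStations.add('RadioPtuj');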
I have set up an ELK stack on one server and Filebeat on two other servers to send data directly to Logstash.
The setup is working fine and I get the log results I need, but in the fields section of the Kibana UI (left side) I see a "host.hostname" field that contains the two servers' FQDNs (i.e. "ip-113-331-116-35.us-east-1.compute.internal" and "ip-122-231-123-35.us-east-1.compute.internal").
I want to alias or rename those values to Production-1 and Production-2 respectively in the Kibana UI.
How can I change those values without breaking anything?
If you need any code snippet, let me know.
You can use the translate filter in the filter block of your Logstash pipeline to rename the values.
filter {
  translate {
    field => "[host][hostname]"
    destination => "[host][hostname]"
    override => true
    dictionary => {
      "ip-113-331-116-35.us-east-1.compute.internal" => "Production-1"
      "ip-122-231-123-35.us-east-1.compute.internal" => "Production-2"
    }
  }
}
Since host.hostname is an ECS field, I would not suggest renaming this particular field.
In my opinion you have two choices:
1.) Create a pipeline in Logstash
You can set up a simple pipeline in Logstash where you use the mutate filter plugin and do an add_field operation. This will create a new field on your event with the value of host.hostname. Here's a quick example:
filter {
  if [host][hostname] {
    mutate {
      add_field => { "your_cool_field_name" => "%{[host][hostname]}" }
    }
  }
}
2.) Set up a custom mapping/index template
You can define field aliases within your custom mappings. I recommend reading this article about field aliases.
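For reference, a field alias in an index mapping looks roughly like this (the index name and the alias name server_name are just placeholders, not part of your setup):
PUT your-index
{
  "mappings": {
    "properties": {
      "server_name": {
        "type": "alias",
        "path": "host.hostname"
      }
    }
  }
}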
We're using Logstash to capture log messages.
Application logs sometimes contain very long lines, and Splunk cannot ingest messages longer than 10k or so (by default).
How do I drop large messages with Logstash?
Requires Logstash >= 5:
filter {
  if [message] {
    ruby {
      code => "event.cancel if event.get('message').bytesize > 8192"
    }
  }
}
Basically, my 'Maintopic' topic receives three types of XML files ('TEST', 'DEV', 'PROD').
'MainSubscription' subscribes to that topic, and based on the XML file type I need to route the XML files to the respective child topics.
See the message flow below.
Maintopic --> MainSubscription (filter on XML node type) --> child topic 1 (xml node type = 'TEST')
                                                         --> child topic 2 (xml node type = 'DEV')
                                                         --> child topic 3 (xml node type = 'PROD')
I can add a subscription to 'Maintopic', but where can I define all the filter logic for routing the files?
I am new to the Azure cloud; how can I do this? I don't even know where to start.
Service Bus supports three filter conditions:
Boolean filters - The TrueFilter and FalseFilter either cause all arriving messages (true) or none of the arriving messages (false) to be selected for the subscription.
SQL Filters - A SqlFilter holds a SQL-like conditional expression that is evaluated in the broker against the arriving messages' user-defined properties and system properties.
Correlation Filters - A CorrelationFilter holds a set of conditions that are matched against one or more of an arriving message's user and system properties.
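For illustration, a CorrelationFilter that matches on the message Label could look like this (a minimal sketch; the subscription name is just an example, and namespaceManager is the same NamespaceManager used in the examples below):
//matches only messages whose Label is exactly "TEST"
namespaceManager.CreateSubscription(topicName, "TestSubscription", new CorrelationFilter { Label = "TEST" });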
You must create a filtered subscription that will only receive the messages you are interested in.
A filter can be based on any properties of the BrokeredMessage, except the body, since that would require every message to be deserialized in order to be handed to the correct subscriptions. You can use a SQL filter.
An example of a SQL filter is below:
if (!namespaceManager.SubscriptionExists(topicName, filteredSubName1))
{
    namespaceManager.CreateSubscription(topicName, filteredSubName1, new SqlFilter("From LIKE '%Smith'"));
}
You don't send your messages directly to a subscription; you send them to the topic, and that forwards them to all the relevant subscriptions based on their filters. Below is an example:
var message1 = new BrokeredMessage("Second message");
message1.Properties["From"] = "Alan Smith";
var client = TopicClient.CreateFromConnectionString(connectionString, topicName);
client.Send(message1);
Below is how you receive messages:
var subClient = SubscriptionClient.CreateFromConnectionString(connectionString, topicName, subscriptionName);
var received = subClient.ReceiveBatch(10, TimeSpan.FromSeconds(5));
foreach (var message in received)
{
    Console.WriteLine("{0} '{1}' Label: '{2}' From: '{3}'",
        subscriptionName,
        message.GetBody<string>(),
        message.Label,
        message.Properties["From"]);
}
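Applied to your scenario, a rough sketch could look like the following (assuming the sender sets a user property, here called NodeType, on each message; the property and subscription names are only illustrations, and xmlPayload/topicClient stand in for your own payload and TopicClient):
//create one filtered subscription per file type on 'Maintopic'
foreach (var nodeType in new[] { "TEST", "DEV", "PROD" })
{
    var subName = "Sub" + nodeType;
    if (!namespaceManager.SubscriptionExists("Maintopic", subName))
    {
        namespaceManager.CreateSubscription("Maintopic", subName,
            new SqlFilter("NodeType = '" + nodeType + "'"));
    }
}

//the sender sets the property before publishing the XML payload
var xmlMessage = new BrokeredMessage(xmlPayload);
xmlMessage.Properties["NodeType"] = "TEST";
topicClient.Send(xmlMessage);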
If you look at: http://developer.github.com/v3/pulls/ it shows you how to get pull requests for a given repository.
How do we get "my pull requests" from the GitHub API similar to the data displayed on the GitHub dashboard?
I asked GitHub directly. A rep told me to use the search endpoint: search for issues authored by you that are open and of type pr.
https://api.github.com/search/issues?q=state%3Aopen+author%3Adavidxia+type%3Apr
If you're using a Python client library like PyGithub you can do
issues = gh.search_issues('', state='open', author='davidxia', type='pr')
You can also use the GraphQL API v4 to get all your pull requests:
{
  user(login: "bertrandmartel") {
    pullRequests(first: 100, states: OPEN) {
      totalCount
      nodes {
        createdAt
        number
        title
      }
      pageInfo {
        hasNextPage
        endCursor
      }
    }
  }
}
Try it in the explorer
or using viewer:
{
  viewer {
    pullRequests(first: 100, states: OPEN) {
      totalCount
      nodes {
        createdAt
        number
        title
      }
      pageInfo {
        hasNextPage
        endCursor
      }
    }
  }
}
First you have to realize that you must authenticate using either Basic Authentication or a token. Next you have to realize that there is no simple way to do this, so you will have to be clever.
To be specific, if you probe https://api.github.com/issues, you'll notice that the issues there have a hash called pull_request which should have 3 URLs: html, diff, and patch. All three will be non-null if the issue is also a Pull Request. (Pro-tip: They're the same thing as far as GitHub is concerned…sort of.)
If you iterate over your issues and filter for ones where those attributes are not null, then you'll have your pull requests.
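A minimal sketch of that filtering, assuming the requests library and a personal access token (both are assumptions, not part of the original answer):
import requests

TOKEN = "your-personal-access-token"  # assumption: authenticate with a token

resp = requests.get(
    "https://api.github.com/issues",
    headers={"Authorization": "token " + TOKEN},
)
resp.raise_for_status()

# keep only the issues that are also pull requests
my_pull_requests = [issue for issue in resp.json() if issue.get("pull_request")]

for pr in my_pull_requests:
    print(pr["number"], pr["title"])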