I am currently trying to print the integers from a list of lists of lists, and I am unsure of the most effective way to do this.
An example of this list is as follows:
[ [ [ 2,3 ], [ 1,6 ] ]
, [ [ 5,9 ], [ 2,9 ] ]
, [ [ 6,2 ], [ 7,7 ] ] ]
My hope is to print a string such as "231659296277".
Any advice would be greatly appreciated.
Since it's a 3-layered list, you can just concatenate it twice:
concat . concat $
[ [ [ 2,3 ], [ 1,6 ] ]
, [ [ 5,9 ], [ 2,9 ] ]
, [ [ 6,2 ], [ 7,7 ] ]
]
-- [2,3,1,6,5,9,2,9,6,2,7,7]
If you want to convert it to a string, then you can >>= show:
[2,3,1,6,5,9,2,9,6,2,7,7] >>= show
-- "231659296277"
I have a GeoJSON document on which I want to perform some GEOS transforms, such as calculating an intersection, subtracting one polygon from another, etc.
I have been able to create a geo_types::Polygon from the document but haven't been able to convert that to a GEOS Polygon. The documentation for the geos library says that this is possible, but I am getting compilation errors.
use serde_json::{Result, Value};
use geo_geojson;
use geos::from_geo::TryInto;
use geos::{Error, Geometry};
fn main() {
let data = r#"
{
"type" : "Feature",
"properties" : {},
"geometry" : {
"type" : "Polygon",
"coordinates" : [ [ [ -80.2006099057282, 25.7667778809006], [ -80.2005810927863, 25.7667893295156],
[ -80.2005511360631, 25.7667981904308], [ -80.2005203313322, 25.7668043699427], [ -80.2004889842378, 25.7668078025078],
[ -80.2004574067358, 25.766808451653], [ -80.2004259134638, 25.7668063104759], [ -80.2003948180789, 25.7668014017381],
[ -80.2003644296081, 25.7667937775553], [ -80.2003350488499, 25.7667835186779], [ -80.2003069648777, 25.7667707333574],
[ -80.2002804517018, 25.7667555557905], [ -80.2002557651654, 25.7667381441435], [ -80.2002331401646, 25.7667186781729],
[ -80.2002127882898, 25.7666973564876], [ -80.200194895997, 25.7666743935394], [ -80.2001796233871, 25.7666500164743],
[ -80.2001671036392, 25.7666244620256], [ -80.2001574430594, 25.7665979736533], [ -80.2001507216193, 25.7665707991263],
[ -80.2001469937692, 25.7665431886774], [ -80.2001462892496, 25.766515393747], [ -80.2001486136429, 25.7664876661955],
[ -80.200153948486, 25.7664602577369], [ -80.2001622509086, 25.7664334192908], [ -80.2001734529129, 25.7664073999664],
[ -80.2001874605259, 25.7663824454984], [ -80.2002041531028, 25.7663587960835], [ -80.2002233830273, 25.7663366837049],
[ -80.2002449759842, 25.7663163291135], [ -80.2002687318761, 25.7662979386737], [ -80.2002944263789, 25.7662817012691],
[ -80.2003218130656, 25.7662677854259], [ -80.2003506260038, 25.7662563367582], [ -80.2003805827209, 25.7662474758012],
[ -80.2004113874437, 25.7662412962597], [ -80.2004427345288, 25.766237863678], [ -80.2004743120208, 25.7662372145296],
[ -80.200505805283, 25.7662393557171], [ -80.2005369006592, 25.7662442644785], [ -80.2005672891229, 25.7662518886976],
[ -80.2005966698762, 25.7662621476228], [ -80.200624753846, 25.7662749330014], [ -80.2006512670223, 25.7662901106348],
[ -80.2006759535611, 25.7663075223545], [ -80.2006985785666, 25.7663269884019], [ -80.200718930447, 25.7663483101652],
[ -80.2007368227461, 25.7663712731897], [ -80.2007520953618, 25.7663956503261], [ -80.2007646151143, 25.7664212048378],
[ -80.2007742756976, 25.766447693262], [ -80.2007809971394, 25.7664748678266], [ -80.20078472499, 25.766502478297],
[ -80.2007854295097, 25.7665302732315], [ -80.2007831051166, 25.7665580007696], [ -80.2007777702746, 25.7665854091976],
[ -80.2007694678545, 25.7666122475983], [ -80.2007582658544, 25.7666382668644], [ -80.2007442582467, 25.7666632212646],
[ -80.2007275656759, 25.7666868706053], [ -80.2007083357575, 25.7667089829062], [ -80.2006867428059, 25.7667293374198],
[ -80.2006629869175, 25.7667477277848], [ -80.200637292416, 25.7667639651196], [ -80.2006099057282, 25.7667778809006]]]
}
}"#;
// Parse the string of data into serde_json::Value.
let serialized: Value = serde_json::from_str(data).unwrap();
let collection: geo_types::GeometryCollection<f64> = geo_geojson::from_str(&serialized.to_string()).unwrap();
for geom in collection {
let poly = geom.into_polygon().unwrap();
let converted_poly: geos::Geometry = (&poly).try_into().expect("failed conversion");
}
}
I expect this to compile, with converted_poly being the converted GEOS geometry. Instead, I get this from the compiler:
could not find from_geo in geos
no method named try_into found for type &geo_types::polygon::Polygon<f64> in the current scope
Both the import and the try_into call are referenced on the first page of the documentation for the geos crate under the "Conversion from rust-geo" section.
The from_geo module is behind a feature flag:
#[cfg(any(feature = "geo", feature = "dox"))]
pub mod from_geo;
You need to specify that feature when you add the crate to your Cargo.toml:
[dependencies]
geos = { version = "5.0.0", features = ["geo"] }
You should also consider filing an issue with the crate asking for this to be documented.
See also:
Cargo documentation for feature flags
Below is our log entry, and we want to extract the highlighted values from it using a Grok expression (http://grokdebug.herokuapp.com/discover):
sys tmp usr var Purging cache - END (PID: 4477, QN: 51/51, ET: 0) anaconda-post.log bin dev etc home lib lib64 lost+found media mnt opt
We need help writing a Grok expression to get the above values.
This works for me:
QN: %{NUMBER:QN1}/%{NUMBER:QN2}, ET: %{NUMBER:ET}
Results in the following output:
{
"QN1": [
[
"51"
]
],
"BASE10NUM": [
[
"51",
"51",
"0"
]
],
"QN2": [
[
"51"
]
],
"ET": [
[
"0"
]
]
}
Can anyone please tell me the Grok pattern for this log? I am new to Logstash; any help is appreciated.
"ppsweb1 [ERROR] [JJN01234313887b4319ad0536bf6324j34h5469624340M] [913h56a5-e359-4a75-be9a-fae60d1a5ecb] 2016-07-28 13:14:58.848 [http-nio-8080-exec-4] PaymentAction - Net amount 149644"
I tried the following:
%{WORD:field1} \[%{LOGLEVEL:field2}\] \[%{NOTSPACE:field3}\] \[%{NOTSPACE:field4}\] %{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:field5}\] %{WORD:field6} - %{GREEDYDATA:field7} %{NUMBER:filed8}
And I got the output as:
{
"field1": [
[
"ppsweb1"
]
],
"field2": [
[
"ERROR"
]
],
"field3": [
[
"JJN01234313887b4319ad0536bf6324j34h5469624340M"
]
],
"field4": [
[
"913h56a5-e359-4a75-be9a-fae60d1a5ecb"
]
],
"timestamp": [
[
"2016-07-28 13:14:58.848"
]
],
"field5": [
[
"http-nio-8080-exec-4"
]
],
"field6": [
[
"PaymentAction"
]
],
"field7": [
[
"Net amount"
]
],
"filed8": [
[
"149644"
]
]
}
You can change the field names as you like. You haven't mentioned anything about the expected output in your question, so this is just to give you a basic idea; for further modifications you can use http://grokdebug.herokuapp.com/ to verify your filter.
Note: I have used basic patterns here; more specific patterns are available, and you can play around with the debugger to suit your requirements.
Good luck!
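If you are running this inside Logstash rather than just the online debugger, the pattern goes into a grok filter in your pipeline configuration. A minimal sketch, assuming the raw line arrives in the default message field:
filter {
  grok {
    match => { "message" => "%{WORD:field1} \[%{LOGLEVEL:field2}\] \[%{NOTSPACE:field3}\] \[%{NOTSPACE:field4}\] %{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:field5}\] %{WORD:field6} - %{GREEDYDATA:field7} %{NUMBER:filed8}" }
  }
}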
I have two maps as below (these are the logs of my output... sorry for the bad Groovy):
map1 = [
[ "name1":value1, "name2":value2, "name3":value3 ],
[ "name1":value1, "name2":value20, "name3":value30 ]
]
map2 = [
[ "name1":value1, "name2":value4, "name3":value5, "name4":value6 ],
[ "name1":value1, "name2":value7, "name3":value8, "name4":value9 ]
]
I need to set name2 and name3 of map2 to the name2 and name3 values from map1 when "name1":value1 matches in both maps.
Required output:
map2 = [
[ "name1":value1, "name2":value2, "name3":value3, "name4":value6 ],
[ "name1":value1, "name2":value20, "name3":value30, "name4":value9 ]
]
I tried looping through both of them, but the values get overwritten (as it is a map), and the result is as below:
map2 = [
[ "name1":value1, "name2":value20, "name3":value30, "name4":value9 ],
[ "name1":value1, "name2":value20, "name3":value30, "name4":value9 ]
]
First of all, map1 and map2 are lists (of maps), not maps.
Assuming both lists have the same cardinality (and matching order), you can achieve this simply with:
list2.eachWithIndex { item, i ->
    if (list2[i].name1 == list1[i].name1) {
        list2[i].name2 = list1[i].name2
        list2[i].name3 = list1[i].name3
    }
}
assert list2 == [
[ "name1":'value1', "name2":'value2', "name3":'value3', "name4":'value6' ],
[ "name1":'value1', "name2":'value20', "name3":'value30', "name4":'value9' ]
]
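For completeness, here is a self-contained version you can paste into the Groovy console, with placeholder strings ('value1', 'value2', ...) standing in for the real values from the question:
def list1 = [
    [ "name1": 'value1', "name2": 'value2',  "name3": 'value3'  ],
    [ "name1": 'value1', "name2": 'value20', "name3": 'value30' ]
]
def list2 = [
    [ "name1": 'value1', "name2": 'value4', "name3": 'value5', "name4": 'value6' ],
    [ "name1": 'value1', "name2": 'value7', "name3": 'value8', "name4": 'value9' ]
]

// Copy name2/name3 across wherever name1 matches at the same index.
list2.eachWithIndex { item, i ->
    if (list2[i].name1 == list1[i].name1) {
        list2[i].name2 = list1[i].name2
        list2[i].name3 = list1[i].name3
    }
}

assert list2 == [
    [ "name1": 'value1', "name2": 'value2',  "name3": 'value3',  "name4": 'value6' ],
    [ "name1": 'value1', "name2": 'value20', "name3": 'value30', "name4": 'value9' ]
]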