How to read cassandra FQL logs in java? - cassandra

I have a bunch of Cassandra FQL logs with the ".cq4" extension. I would like to read them in Java. Is there a Java class that those log entries can be mapped into?
These are the logs I see.
I want to read them with this code:
import net.openhft.chronicle.Chronicle;
import net.openhft.chronicle.ChronicleQueueBuilder;
import net.openhft.chronicle.ExcerptTailer;

import java.io.IOException;

public class Main {
    public static void main(String[] args) throws IOException {
        Chronicle chronicle = ChronicleQueueBuilder.indexed("/Users/pavelorekhov/Desktop/fql_logs").build();
        ExcerptTailer tailer = chronicle.createTailer();
        while (tailer.nextIndex()) {
            tailer.readInstance(/*class goes here*/);
        }
    }
}
I think from the code and screenshot you can understand what kind of class I need in order to read log entries into objects. Does that class exist in some Cassandra Maven dependency?

You are using Chronicle 3.x, which is very old.
I suggest using Chronicle 5.20.123, which is the version Cassandra uses.
I would assume Cassandra has its own tool for reading the contents of these files; however, you can dump the raw messages with net.openhft.chronicle.queue.main.DumpMain.
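If you just want to eyeball what is in the queue files, a minimal sketch of my own (not from the original answer) could look like the following; it assumes DumpMain takes the queue directory as its first program argument, as the Chronicle command-line tools usually do:
import net.openhft.chronicle.queue.main.DumpMain;

public class DumpFqlLogs {
    public static void main(String[] args) throws Exception {
        // Point the tool at the directory containing the .cq4 files;
        // it prints the raw wire contents of each excerpt to stdout.
        DumpMain.main(new String[]{"/Users/pavelorekhov/Desktop/fql_logs"});
    }
}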

I ended up cloning Cassandra's GitHub repo from here: https://github.com/apache/cassandra
In their code they have the FQLQueryIterator class, which you can use to read logs, like so:
SingleChronicleQueue scq = SingleChronicleQueueBuilder.builder().path("/Users/pavelorekhov/Desktop/fql_logs").build();
ExcerptTailer excerptTailer = scq.createTailer();
FQLQueryIterator iterator = new FQLQueryIterator(excerptTailer, 1);
while (iterator.hasNext()) {
    FQLQuery fqlQuery = iterator.next(); // object that holds the log entry
    // do whatever you need to do with that log entry...
}
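A small note of mine, not from the original answer: SingleChronicleQueue is Closeable, so it is worth closing the queue once you are done iterating, for example with try-with-resources:
try (SingleChronicleQueue scq = SingleChronicleQueueBuilder.builder()
        .path("/Users/pavelorekhov/Desktop/fql_logs")
        .build()) {
    ExcerptTailer excerptTailer = scq.createTailer();
    FQLQueryIterator iterator = new FQLQueryIterator(excerptTailer, 1);
    while (iterator.hasNext()) {
        FQLQuery fqlQuery = iterator.next();
        // process the entry...
    }
}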

Related

Spring state machine UML stays in memory

I've been working with Spring State Machines for over a year now, trying different ways to implement them according to my requirements, and I've come across a serious issue when I use UML.
I use Papyrus to draw the UML, and I have many UML models stored in a certain location. The one I need to use is selected dynamically; that has been done successfully. Now I have come across a serious problem. Below is the code showing how I have loaded the UML.
Resource resource = new FileSystemResource(stmDir+"/"+model+".uml");
UmlStateMachineModelFactory umlBuilder = new UmlStateMachineModelFactory(resource);
umlBuilder.setStateMachineComponentResolver(resolveActionConfig(model));
StateMachineModelFactory<String, String> modelFactory = umlBuilder;
Builder<String, String> builder = StateMachineBuilder.builder();
builder.configureModel().withModel().factory(modelFactory);
builder.configureConfiguration().withConfiguration().beanFactory(new StaticListableBeanFactory());
stateMachine = builder.build();
And as you can see, I use new UmlStateMachineModelFactory(resource);
The UmlStateMachineModelFactory class has the following code:
@Override
public StateMachineModel<String, String> build() {
    Model model = null;
    try {
        model = UmlUtils.getModel(getResourceUri(resolveResource()).getPath());
    } catch (IOException e) {
        throw new IllegalArgumentException("Cannot build build model from resource " + resource + " or location " + location, e);
    }
    UmlModelParser parser = new UmlModelParser(model, this);
    DataHolder dataHolder = parser.parseModel();
    // we don't set configurationData here, so assume null
    return new DefaultStateMachineModel<String, String>(null, dataHolder.getStatesData(), dataHolder.getTransitionsData());
}
Every time I create a UmlStateMachineModelFactory, it in turn creates a UmlModelParser. That class has the following imports:
import org.eclipse.emf.common.util.EList;
import org.eclipse.emf.ecore.util.EcoreUtil;
import org.eclipse.uml2.uml.Activity;
import org.eclipse.uml2.uml.Constraint;
import org.eclipse.uml2.uml.Event;
import org.eclipse.uml2.uml.Model;
import org.eclipse.uml2.uml.OpaqueBehavior;
import org.eclipse.uml2.uml.OpaqueExpression;
import org.eclipse.uml2.uml.PackageableElement;
import org.eclipse.uml2.uml.Pseudostate;
import org.eclipse.uml2.uml.PseudostateKind;
import org.eclipse.uml2.uml.Region;
import org.eclipse.uml2.uml.Signal;
import org.eclipse.uml2.uml.SignalEvent;
import org.eclipse.uml2.uml.State;
import org.eclipse.uml2.uml.StateMachine;
import org.eclipse.uml2.uml.TimeEvent;
import org.eclipse.uml2.uml.Transition;
import org.eclipse.uml2.uml.Trigger;
import org.eclipse.uml2.uml.UMLPackage;
import org.eclipse.uml2.uml.Vertex;
These remain in memory, causing the application to use up a large amount of memory that doesn't get collected by the garbage collector. This is causing a lot of trouble, as we are using this in a large-scale application and many instances are created every few minutes.
Please suggest a workaround.
EDIT: I managed to create a singleton wrapper for this, but the problem persists regardless. My colleague found out that the loaded resources do not unload, so every time I call builder.build(),
ResourceSet resourceSet = new ResourceSetImpl();
resourceSet.getPackageRegistry().put(UMLPackage.eNS_URI, UMLPackage.eINSTANCE);
resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put(UMLResource.FILE_EXTENSION, UMLResource.Factory.INSTANCE);
resourceSet.createResource(modelUri);
Resource resource = resourceSet.getResource(modelUri, true);
this is called. I wonder if this is causing the heap build-up. Please help.
I pushed some fixes per gh572 to master and 1.2.x. Hopefully those work for you. At least I was able to see garbage collection work better. I'm planning to create releases later this week.
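Until those releases are out, one workaround sketch (mine, not from the thread) is to explicitly unload the EMF resources once builder.build() has returned, assuming you have access to the ResourceSet shown above:
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.ResourceSet;

// Unload every resource in the set once the state machine has been built,
// so the parsed UML model objects become unreachable and can be collected.
static void unloadAll(ResourceSet resourceSet) {
    for (Resource resource : resourceSet.getResources()) {
        resource.unload();
    }
    resourceSet.getResources().clear();
}
Resource.unload() discards the in-memory model objects the resource holds, which is what otherwise keeps the parsed UML reachable between builds.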

Read & write data into cassandra using apache flink Java API

I intend to use Apache Flink to read/write data into Cassandra. I was hoping to use flink-connector-cassandra, but I don't find good documentation/examples for the connector.
Can you please point me to the right way to read and write data from Cassandra using Apache Flink? I see only sink examples, which are purely for writing. Is Apache Flink meant for reading data from Cassandra too, similar to Apache Spark?
I had the same question, and this is what I was looking for. I don't know if it is oversimplified for what you need, but I figured I should show it nonetheless.
ClusterBuilder cb = new ClusterBuilder() {
    @Override
    public Cluster buildCluster(Cluster.Builder builder) {
        return builder.addContactPoint("urlToUse.com").withPort(9042).build();
    }
};

CassandraInputFormat<Tuple2<String, String>> cassandraInputFormat = new CassandraInputFormat<>("SELECT * FROM example.cassandraconnectorexample", cb);
cassandraInputFormat.configure(null);
cassandraInputFormat.open(null);

Tuple2<String, String> testOutputTuple = new Tuple2<>();
cassandraInputFormat.nextRecord(testOutputTuple);

System.out.println("column1: " + testOutputTuple.f0);
System.out.println("column2: " + testOutputTuple.f1);
The way I figured this out was by finding the code for the CassandraInputFormat class and seeing how it worked (http://www.javatips.net/api/flink-master/flink-connectors/flink-connector-cassandra/src/main/java/org/apache/flink/batch/connectors/cassandra/CassandraInputFormat.java). Based on the name, I honestly expected it to just be a format and not the full class for reading from Cassandra, and I have a feeling others might be thinking the same thing.
ClusterBuilder cb = new ClusterBuilder() {
    @Override
    public Cluster buildCluster(Cluster.Builder builder) {
        return builder.addContactPoint("localhost").withPort(9042).build();
    }
};

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

InputFormat inputFormat = new CassandraInputFormat<Tuple3<Integer, Integer, Integer>>("SELECT * FROM test.example;", cb);//, TypeInformation.of(Tuple3.class));
DataStreamSource t = env.createInput(inputFormat, TupleTypeInfo.of(new TypeHint<Tuple3<Integer, Integer, Integer>>() {}));
tableEnv.registerDataStream("t1", t);
Table t2 = tableEnv.sql("select * from t1");
t2.printSchema();
You can create a class that extends RichFlatMapFunction:
class MongoMapper extends RichFlatMapFunction[JsonNode, JsonNode] {
  var userCollection: MongoCollection[Document] = _

  override def open(parameters: Configuration): Unit = {
    // do something here like opening connection
    val client: MongoClient = MongoClient("mongodb://localhost:10000")
    userCollection = client.getDatabase("gp_stage").getCollection("users").withReadPreference(ReadPreference.secondaryPreferred())
    super.open(parameters)
  }

  override def flatMap(event: JsonNode, out: Collector[JsonNode]): Unit = {
    // Do something here per record and this function can make use of objects initialized via open
    userCollection.find(Filters.eq("_id", somevalue)).limit(1).first().subscribe(
      (result: Document) => {
        // println(result)
      },
      (t: Throwable) => {
        println(t)
      },
      () => {
        out.collect(event)
      }
    )
  }
}
Basically, the open function executes once per worker and flatMap executes once per record. The example is for Mongo, but it can be used similarly for Cassandra.
In your case, as I understand it, the first step of your pipeline is reading data from Cassandra, so rather than writing a RichFlatMapFunction you should write your own RichSourceFunction.
As a reference, you can have a look at the simple implementation of WikipediaEditsSource.
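As a rough illustration of that suggestion, here is a minimal sketch of mine (not from the thread) of a RichSourceFunction that reads rows from Cassandra with the DataStax driver and emits them as Tuple2 values; the contact point, the test.example query, and the assumption that the first two columns are strings are all placeholders:
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.source.RichSourceFunction;
import org.apache.flink.streaming.api.functions.source.SourceFunction.SourceContext;

public class CassandraSource extends RichSourceFunction<Tuple2<String, String>> {
    private transient Cluster cluster;
    private transient Session session;
    private volatile boolean running = true;

    @Override
    public void open(Configuration parameters) throws Exception {
        // Runs once per parallel task: open the connection here,
        // just like open() in the Mongo example above.
        cluster = Cluster.builder().addContactPoint("localhost").withPort(9042).build();
        session = cluster.connect();
    }

    @Override
    public void run(SourceContext<Tuple2<String, String>> ctx) throws Exception {
        // Emit one record downstream per row of the result set.
        for (Row row : session.execute("SELECT * FROM test.example")) {
            if (!running) {
                break;
            }
            ctx.collect(new Tuple2<>(row.getString(0), row.getString(1)));
        }
    }

    @Override
    public void cancel() {
        running = false;
    }

    @Override
    public void close() throws Exception {
        // Clean up driver resources when the task shuts down.
        if (session != null) session.close();
        if (cluster != null) cluster.close();
    }
}
It could then be plugged into the job with env.addSource(new CassandraSource()) in place of env.createInput(...).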

task not serializable error while performing an RDD map function scala

Trust me on this one, I have tried really hard, toiling night and day, but I just could not get hold of this Task not serializable error, which is now totally eating me up!
And I understand there are many similar questions floating around on SO, but either I am really too dumb to get those (I am not expecting any spoon feeding here) or mine is a different bug story altogether.
Totally requesting you guys to have a look at this one:
class RootServer(val config: Config) extends Configurable with Server with Serializable {

  var PKsAffectedDF: DataFrame = dataSource.getAffectedPKs(prevBookMark, currBookMark)
  if (PKsAffectedDF.rdd.isEmpty()) {
    Holder.log.info("[SQL] no records to upsert in the internal Status Table == for bookmarks : " + prevBookMark + " ==and== " + currBookMark + " for table == " + dataSource.db + "_" + dataSource.table)
  }
  val PKsAffectedDF_json = PKsAffectedDF.toJSON
  PKsAffectedDF_json.foreachPartition { partitionOfRecords => {
      var props = new Properties()
      props.put("", "")
      props.put("", "")
      props.put("", "")
      props.put("", "")
      val producer = new KafkaProducer[String, String](props)
      partitionOfRecords.foreach {
        case x: String => {
          println(x)
          val message = new ProducerRecord[String, String]("[TOPIC] " + dbname + "_" + dbtable, dbname, x)
          producer.send(message)
        }
      }
    }
  }
}
Now, Config is the Typesafe Config class, which I believe is serializable, and the Server class is a basic abstract class with just two methods. I am totally stuck on what would be giving the following stacktrace:
http://pastebin.com/5yPA1s7e Sorry for pastebinning it, but the entire stacktrace is there.
Thanks in advance, peeps.

Revit Api Load Command - Auto Reload

I'm working with the Revit API, and one of its problems is that it locks the .dll once the command is run. You have to exit Revit before the command can be rebuilt, which is very time-consuming.
After some research, I came across this post on GitHub that streams the command .dll into memory, thus hiding it from Revit and letting you rebuild the VS project as much as you like.
The AutoReload class implements the Revit IExternalCommand interface, which is the link into the Revit program.
But the AutoReload class hides the actual source DLL from Revit, so Revit can't lock the DLL and lets one rebuild the source file.
The only problem is I can't figure out how to implement it and have Revit execute the command. I guess my general C# knowledge is still too limited.
I created an entry in the RevitAddin.addin manifest that points to the AutoReload command, but nothing happens.
I've tried to follow all the comments in the posted code, but nothing seems to work, and I've had no luck finding a contact for the developer.
Found at: https://gist.github.com/6084730.git
using System;

namespace Mine
{
    // helper class
    public class PluginData
    {
        public DateTime _creation_time;
        public Autodesk.Revit.UI.IExternalCommand _instance;

        public PluginData(Autodesk.Revit.UI.IExternalCommand instance)
        {
            _instance = instance;
        }
    }

    //
    // Base class for auto-reloading external commands that reside in other dll's
    // (that Revit never knows about, and therefore cannot lock)
    //
    public class AutoReload : Autodesk.Revit.UI.IExternalCommand
    {
        // keep a static dictionary of loaded modules (so the data persists between calls to Execute)
        static System.Collections.Generic.Dictionary<string, PluginData> _dictionary;

        String _path; // to the dll
        String _class_full_name;

        public AutoReload(String path, String class_full_name)
        {
            if (_dictionary == null)
            {
                _dictionary = new System.Collections.Generic.Dictionary<string, PluginData>();
            }
            if (!_dictionary.ContainsKey(class_full_name))
            {
                PluginData data = new PluginData(null);
                _dictionary.Add(class_full_name, data);
            }
            _path = path;
            _class_full_name = class_full_name;
        }

        public Autodesk.Revit.UI.Result Execute(
            Autodesk.Revit.UI.ExternalCommandData commandData,
            ref string message,
            Autodesk.Revit.DB.ElementSet elements)
        {
            PluginData data = _dictionary[_class_full_name];
            DateTime creation_time = new System.IO.FileInfo(_path).LastWriteTime;
            if (creation_time.CompareTo(data._creation_time) > 0)
            {
                // dll file has been modified, or this is the first time we execute this command.
                data._creation_time = creation_time;
                byte[] assembly_bytes = System.IO.File.ReadAllBytes(_path);
                System.Reflection.Assembly assembly = System.Reflection.Assembly.Load(assembly_bytes);
                foreach (Type type in assembly.GetTypes())
                {
                    if (type.IsClass && type.FullName == _class_full_name)
                    {
                        data._instance = Activator.CreateInstance(type) as Autodesk.Revit.UI.IExternalCommand;
                        break;
                    }
                }
            }
            // now actually call the command
            return data._instance.Execute(commandData, ref message, elements);
        }
    }

    //
    // Derive a class from AutoReload for every auto-reloadable command. Hardcode the path
    // to the dll and the full name of the IExternalCommand class in the constructor of the base class.
    //
    [Autodesk.Revit.Attributes.Transaction(Autodesk.Revit.Attributes.TransactionMode.Manual)]
    [Autodesk.Revit.Attributes.Regeneration(Autodesk.Revit.Attributes.RegenerationOption.Manual)]
    public class AutoReloadExample : AutoReload
    {
        public AutoReloadExample()
            : base("C:\\revit2014plugins\\ExampleCommand.dll", "Mine.ExampleCommand")
        {
        }
    }
}
There is an easier approach: Add-in Manager.
Go to the Revit Developer Center and download the Revit SDK, unzip/install it, then check the \Revit 2016 SDK\Add-In Manager folder. With this tool you can load/reload DLLs without having to modify your code.
There is also some additional information in this blog post.
This is how you can use the above code:
1. Create a new VS class project; name it anything (e.g. AutoLoad)
2. Copy & paste the above code in between the namespace region
3. Reference RevitAPI.dll & RevitAPIUI.dll
4. Scroll down to the AutoReloadExample class and replace the path to point to your dll
5. Replace "Mine.ExampleCommand" with your plugin's namespace.mainclass
6. Build the solution
7. Create an .addin manifest that points to this new loader (e.g. AutoLoad.dll)
8. Your .addin should include "FullClassName" AutoLoad.AutoReloadExample (see the manifest sketch below)
This method uses reflection to create an instance of your plugin and prevents Revit from locking your dll file! You can add more of your commands just by adding new classes like AutoReloadExample and point to them with separate .addin files.
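For steps 7 and 8, a Revit .addin manifest for the loader might look roughly like the sketch below. This is my own illustration, not from the original answer; the GUID, path and names are placeholders you need to replace with your own:
<?xml version="1.0" encoding="utf-8"?>
<RevitAddIns>
  <AddIn Type="Command">
    <Assembly>C:\revit2014plugins\AutoLoad.dll</Assembly>
    <AddInId>11111111-2222-3333-4444-555555555555</AddInId>
    <FullClassName>AutoLoad.AutoReloadExample</FullClassName>
    <Text>AutoReload Example</Text>
    <VendorId>MINE</VendorId>
  </AddIn>
</RevitAddIns>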
Cheers

GWT-GXT FileUploadField

I tried making a form in GXT to upload files; although I have seen several examples on the net, I failed to get a simple FileUploadField to work to save the file locally.
Code fragment:
formPanel = new FormPanel();
formPanel.setBodyBorder(false);
formPanel.setHeaderVisible(false);
formPanel.setAction(GWT.getModuleBaseURL() + "fileUpload");
formPanel.setEncoding(Encoding.MULTIPART);
formPanel.setMethod(Method.POST);
formPanel.setButtonAlign(HorizontalAlignment.CENTER);
formPanel.setHeaderVisible(true);

fileUploadField = new FileUploadField();
fileUploadField.setName("fileName");
fileUploadField.setAllowBlank(false);
fileUploadField.setFieldLabel("Archivo");
fileUploadField.addListener(Events.OnChange, new Listener<BaseEvent>() {
    public void handleEvent(BaseEvent BaseEvent) {
        aSubmitButton.setEnabled(true);
    }
});

aSubmitButton = new Button("OK");
aSubmitButton.setEnabled(false);
aSubmitButton.setId("submit_button");
aSubmitButton.addSelectionListener(new SelectionListener<ButtonEvent>() {
    @Override
    public void componentSelected(ButtonEvent inButtonEvent) {
        formPanel.submit();
    }
});
The above code is the declaration of FormPanel and FileUploadField.
We use the gwtupload-0.6.3-compat.jar library to do the job.
Basically, the idea is that on the server side you need to create a servlet which is going to accept your uploaded files. The mentioned library provides the UploadAction servlet extension to facilitate that.
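For the server side, a minimal sketch of mine (not from the original answer) of such a servlet built on gwtupload's UploadAction might look like this; the target directory is a placeholder:
import java.io.File;
import java.util.List;

import javax.servlet.http.HttpServletRequest;

import org.apache.commons.fileupload.FileItem;

import gwtupload.server.UploadAction;
import gwtupload.server.exceptions.UploadActionException;

public class FileUploadServlet extends UploadAction {

    @Override
    public String executeAction(HttpServletRequest request, List<FileItem> sessionFiles)
            throws UploadActionException {
        for (FileItem item : sessionFiles) {
            if (!item.isFormField()) {
                try {
                    // Write each uploaded file to a local directory (placeholder path).
                    File target = new File("/tmp/uploads/" + item.getName());
                    item.write(target);
                } catch (Exception e) {
                    throw new UploadActionException("Error saving file: " + e.getMessage());
                }
            }
        }
        // The returned string is sent back to the client as the server response.
        return "OK";
    }
}
The servlet would also need to be mapped in web.xml to the path the form posts to ("fileUpload" in the question's code).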
On the client side you can use one of the gwtupload components. We use MultiUploader, for instance. It's literally a few lines of code; the main code is in the listener:
private IUploader.OnFinishUploaderHandler onFinishUploaderHandler = new IUploader.OnFinishUploaderHandler() {
    public void onFinish(IUploader uploader) {
        if (uploader.getStatus() == Status.SUCCESS) {
            // What you want to do when file is uploaded.
        }
    }
};
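To attach that handler, the client-side wiring is roughly the following (a hedged sketch of mine, not from the original answer; MultiUploader comes from gwtupload.client and RootPanel from GWT):
MultiUploader uploader = new MultiUploader();
uploader.addOnFinishUploadHandler(onFinishUploaderHandler);
RootPanel.get().add(uploader);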
The rest is taken care of by the component. Since the library is for GWT, it comes with source code, so you can see what it's doing behind the scenes and read the extensive comments in the code.
It's free to use, of course.
