How to add JanusGraph imports to the Gremlin Groovy script engine?

I use GremlinGroovyScriptEngine (part of gremlin-server) to evaluate Gremlin queries given as strings, like this:
final ScriptEngine engine = new GremlinGroovyScriptEngine();
engine.eval("g.V().count().next();");
... everything was fine until I started using JanusGraph-specific elements in queries, like this (last line):
final ScriptEngine engine = new GremlinGroovyScriptEngine();
//== Set binding with traversal/graph/transaction to script engine ===
JanusGraphManagement mgmt = jg.openManagement();
SimpleBindings trBinding = new SimpleBindings();
trBinding.putAll(this.bindings);
trBinding.put("mgmt", mgmt);
engine.setBindings(trBinding, ScriptContext.ENGINE_SCOPE);
result = engine.eval("mgmt.makePropertyKey('zzzzzz').dataType(String.class).cardinality(Cardinality.SINGLE).make();");
... in that case I got:
MissingPropertyException: No such property: SINGLE for class: org.apache.tinkerpop.gremlin.structure.VertexProperty$Cardinality
As a workaround I use the fully qualified name org.janusgraph.core.Cardinality.SINGLE in the query.
As I understand it, it should be possible to register all the JanusGraph-specific imports with the script engine when it is created.
The JanusGraph-specific imports are defined in the JanusGraphGremlinPlugin class, which I use during script-engine initialization like this:
JanusGraphGremlinPlugin graphGremlinPlugin = JanusGraphGremlinPlugin.instance();
GremlinScriptEngineManager engineManager = new CachedGremlinScriptEngineManager();
/* Create gremlin script engine */
GremlinGroovyScriptEngine engine = GremlinGroovyScriptEngine.class
.cast(engineManager.getEngineByName("gremlin-groovy"));
... but it does not work. It seems the engineManager does not apply any plugins: after the engine is created, engine.getPlugins().size() returns 0.
The engine also has a direct method for loading plugins:
...
engine.loadPlugins(Collections.singletonList(graphGremlinPlugin))
...
... but it takes a List of instances of org.apache.tinkerpop.gremlin.groovy.plugin.GremlinPlugin, a class which is deprecated (replaced by org.apache.tinkerpop.gremlin.jsr223.GremlinPlugin).
Moreover, JanusGraphGremlinPlugin extends org.apache.tinkerpop.gremlin.jsr223.AbstractGremlinPlugin, so it cannot be passed to .loadPlugins().
Does anyone know how the JanusGraphGremlinPlugin class can be used to add the JanusGraph-specific imports to the Gremlin Groovy engine?

You need to add the plugin to the GremlinScriptEngineManager instance:
GremlinScriptEngineManager engineManager = new CachedGremlinScriptEngineManager();
engineManager.addPlugin(JanusGraphGremlinPlugin.instance());
GremlinGroovyScriptEngine engine = (GremlinGroovyScriptEngine) engineManager.getEngineByName("gremlin-groovy");
As long as the plugin is added before you instantiate the engine, it should work.

Related

How can I call this jOOQ generated function taking a custom bound type?

I originally had the following SQL function:
CREATE FUNCTION resolve_device(query JSONB) RETURNS JSONB...
and the following code calling the method generated by jOOQ:
final JsonArray jsonArray = jooqDWH.select(resolveDevice(queryJson)).fetchOne().value1().getAsJsonArray();
final JsonObject o = jsonArray.get(0).getAsJsonObject();
This worked fine. I needed to return a real device object rather than a JSON blob though, so I changed the SQL function to:
CREATE FUNCTION resolve_device(query JSONB) RETURNS SETOF device...
and the code to:
final ResolveDeviceRecord deviceRecord = jooqDWH.fetchOne(resolveDevice(queryJson));
but I am getting a runtime error:
org.jooq.exception.SQLDialectNotSupportedException: Type class com.google.gson.JsonElement is not supported in dialect DEFAULT
Many other parts of my code continue to work fine with my custom binding that converts JsonElement to JSONB, but something about the change to this function's signature caused it to stop working.
I tried a few different variants of DSL.field() and DSL.val() to try to force it to be recognized but have not had any luck so far.
This could be a bug in jOOQ or a misconfiguration in your code generator. I'll update my answer once it is clear what went wrong.
Workaround:
Meanwhile, here's a workaround using plain SQL:
// Manually create a data type from your custom JSONB binding first:
final DataType<JsonObject> jsonb = SQLDataType.OTHER.asConvertedDataType(jsonBinding);
// Then, create an explicit bind variable using that data type:
final ResolveDeviceRecord deviceRecord = jooqDWH
    .fetchOptional(table("resolve_device({0})", val(queryJson, jsonb)))
    .map(r -> r.into(ResolveDeviceRecord.class))
    .orElse(null);

How to self-discover variables within a Groovy script?

The scenario: I can write Groovy code that will be executed by a script engine inside another application.
The only thing I know is the function name and that it takes one argument, for example:
def runGroovyCode(name1) { ... }
Is there a way to figure out, from within the Groovy code itself, what other variables or objects (besides the name1 passed in) are available to it?
I hope I described this clearly. It's the Groovy code self-discovering what external variables (data) are within its scope.
Basically, I need more data for my Groovy code, and I need to confirm whether name1 is the only data I have or whether there are more variables available whose names I don't know (and therefore cannot access).
I need to find out what variables or objects are available in the script-engine environment my Groovy code runs in.
There is no further documentation, so my Groovy code is essentially running in a black box.
Have a look at the groovy.lang.Script class. It has a property called binding that contains all the variables passed to the script.
Here is an example:
import javax.script.ScriptEngineManager
import javax.script.SimpleBindings

class Main {
    def static SCRIPT = """
        // the binding variable contains all the bindings of the script
        println binding.variables
    """

    public static void main(String[] args) {
        def factory = new ScriptEngineManager();
        def engine = factory.getEngineByName("groovy");
        engine.eval(SCRIPT, new SimpleBindings([param1: 1, param2: 2]));
    }
}
Output:
[param1:1, param2:2, context:javax.script.SimpleScriptContext#2f7c2f4f, out:java.io.PrintWriter#6af93788]
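For completeness: on the host-application side, the variables a script can discover are exactly the entries of the javax.script.Bindings map passed to eval(). Since Bindings is just a Map&lt;String, Object&gt;, the host can enumerate them in plain Java before the script ever runs. A minimal JDK-only sketch (the variable names are made up):

```java
import javax.script.Bindings;
import javax.script.SimpleBindings;
import java.util.LinkedHashMap;
import java.util.Map;

public class InspectBindings {
    public static void main(String[] args) {
        Map<String, Object> vars = new LinkedHashMap<>();
        vars.put("param1", 1);
        vars.put("param2", 2);

        // SimpleBindings wraps the given map; these are exactly the
        // names a Groovy script would see in binding.variables.
        Bindings bindings = new SimpleBindings(vars);
        for (Map.Entry<String, Object> e : bindings.entrySet()) {
            System.out.println(e.getKey() + "=" + e.getValue());
        }
    }
}
```

This is just the host-side view of the same map; the script-side view is the `binding.variables` shown in the answer above.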

SoapUI: load custom properties from a file using Groovy

I am trying to write a Groovy script that loads the custom properties for a test suite from a properties file.
The properties file has around six different attributes.
I have tried quite a few approaches, e.g. loading from a Properties test step and expanding the properties with Groovy, but have not been successful.
If anyone could advise on how to achieve this, it would be much appreciated.
Thanks in advance.
Here is a Groovy script that reads a properties file and sets the properties at test-suite level:
def props = new Properties()
// Replace the path with your file name below. Use / instead of \ as the path separator, even on Windows.
new File("/absolute/path/of/test.properties").withInputStream { s ->
    props.load(s)
}
props.each {
    context.testCase.testSuite.setPropertyValue(it.key, it.value)
}
The script above sets the properties at test-suite level for the suite that contains the Groovy script step.
Unfortunately, in my case I want the properties in the same order as in the input file, and this method does not preserve that order.
I wanted to load a 'Project properties' file containing sorted properties, and each time I used this method it stored them unsorted.
I had to use a more straightforward method (see below). If anyone knows a more elegant/practical way to do it, I'm interested.
def filename = context.expand( '${#TestCase#filename}' )
def propertiesFile = new File(filename)
assert propertiesFile.exists(), "$filename does not exist"

def project = testRunner.testCase.testSuite.project

// Remove existing properties
project.propertyNames.collect { project.removeProperty(it) }

// Load the properties from the external file
propertiesFile.eachLine { line ->
    def firstIndexOf = line.indexOf('=') // properties are stored as key=value in the file
    def key = line.substring(0, firstIndexOf)
    def value = line.substring(firstIndexOf + 1)
    project.setPropertyValue(key, value)
}
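A note on why the order gets lost in the first approach: java.util.Properties extends Hashtable, so its iteration order is unrelated to the order of the lines in the file. If order matters, one option is to parse the key=value lines yourself into a LinkedHashMap, which iterates in insertion order, which is essentially what the script above does. A minimal plain-Java sketch of the same idea (file name and contents are made up for illustration):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class OrderedProperties {
    // Parse key=value lines, keeping the order in which they appear in the file.
    static Map<String, String> load(Path file) throws IOException {
        Map<String, String> props = new LinkedHashMap<>();
        for (String line : Files.readAllLines(file)) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("#")) continue; // skip blanks and comments
            int eq = line.indexOf('=');
            if (eq < 0) continue; // not a key=value line
            props.put(line.substring(0, eq).trim(), line.substring(eq + 1).trim());
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("test", ".properties");
        Files.write(tmp, List.of("zebra=1", "alpha=2", "mango=3"));
        // LinkedHashMap preserves the file order: zebra, alpha, mango
        load(tmp).forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```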

ModelMapper Provider ignored on first level properties of a mapping

I have already successfully used a Provider in a ModelMapper transformation, but I've stumbled upon a weird situation: the Provider is considered only for the objects beyond the "first level" of the transformation, e.g.:
I have two hierarchies:
1st) TipoConsultarProcessoResposta, TipoProcessoJudicial and TipoDocumento
2nd) ConsultarProcessoResposta, ProcessoJudicial and Documento
TipoConsultarProcessoResposta has a TipoProcessoJudicial, which in turn has a List of TipoDocumento; the 2nd hierarchy resembles the first.
I am converting from the first hierarchy to the second. The provider works fine for the TipoDocumento-to-Documento conversion, but it is being ignored for the TipoProcessoJudicial-to-ProcessoJudicial conversion.
Here is the relevant part of the code:
modelMapper = new ModelMapper();
modelMapper.getConfiguration().setMatchingStrategy(STRICT);
modelMapper.addMappings(new DocumentoPropertyMap()).setProvider(documentoProvider);
modelMapper.addMappings(new ProcessoJudicialPropertyMap()).setProvider(processoJudicialProvider);
ConsultarProcessoResposta resposta = modelMapper.map(tipoConsultarProcessoResposta, ConsultarProcessoResposta.class);
DocumentoPropertyMap extends PropertyMap<TipoDocumento, Documento> and ProcessoJudicialPropertyMap extends PropertyMap<TipoProcessoJudicial, ProcessoJudicial>.
The thing is that DocumentoProvider is being called, but ProcessoJudicialProvider is not. ModelMapper tries to invoke a global Provider, which fails as well, and falls back to instantiating through the constructor.

Return codes for Jira workflow script validators

I'm writing a workflow validator in Groovy to link two issues based on a custom field value entered at case creation. The link between a custom field value and a Jira issue must be unique; in other words, only one issue may have a particular custom field value. If more than one issue has the input custom field value, the validation should fail.
How or what do I return to cause a workflow validator to fail?
Example code:
// Set up jqlQueryParser object
jqlQueryParser = ComponentManager.getComponentInstanceOfType(JqlQueryParser.class) as JqlQueryParser

// Form the JQL query
query = jqlQueryParser.parseQuery('<my_jql_query>')

// Set up SearchService object used to query Jira
searchService = componentManager.getSearchService()

// Run the query to get all issues with Article number that match input
results = searchService.search(componentManager.getJiraAuthenticationContext().getUser(), query, PagerFilter.getUnlimitedFilter())

// Log at FATAL level because we should never have more than one case associated with a given KB article
if (results.getIssues().size() > 1) {
    for (r in results.getIssues()) {
        log.fatal('Custom field has more than one Jira issue associated with it. ' + r.getKey() + ' is one of the offending issues')
    }
    return "?????"
}

// Create link from new Improvement to parent issue
for (r in results) {
    IssueLinkManager.createIssueLink(issue.getId(), r.getId(), 10201, 1, getJiraAuthenticationContext().getUser())
}
Try something like:
import com.opensymphony.workflow.InvalidInputException

throw new InvalidInputException("Validation failure")
This is based on Groovy Script Runner. If it doesn't work for you, I would recommend using some sort of framework to make scripting easier; I like Groovy Script Runner, the Jira Scripting Suite, or the Behaviours Plugin. All of them make script writing easier and much more intuitive.
