Groovy "No such property" error while creating dynamic JSON - groovy

I am trying to create a dynamic JSON payload from CSV data in JMeter with a JSR223 PreProcessor.
Below is the code I am using; the CSV supplies the data for Id and Name:
def builder = new groovy.json.JsonBuilder()
@groovy.transform.Immutable
class Items {
String Id
String Name
}
def items = new File("Item.txt").readLines().collect { line -> new Items(line.split(",")[0], line.split(",")[1]) }
builder.Rule(
__type: "DataCollectionRule",
DeviceFamily: '${__P(DeviceFamily)}',
RuleId: 0,
Name: 'test-${__time(yyyy-MM-dd'T'hh:mm:ss)}-${__counter(TRUE,)}',
Targets:
[
Groups :
[
[
Id: '${logicalid1_1}',
]
],
Devices:
[
]
],
StartDate: '/Date(${__time(,)})/',
IsEnabled: true,
Priority: 0,
AlertType: 0,
DeliverySchedule:
[
Id : 1,
Name : "Every 30 Minutes",
Period : "30M"
],
CollectionSchedule:
[
Id : 1,
Name : "Every 30 Minutes",
Period : "30M"
],
Items : items.collect() [
[
Id : it.Id,
Name : it.Name
]
],
LocationAccuracy:
[
UseGPS : false,
DistanceInMeters : 100,
ReportToServer : true,
AccuracyInMeters : 10
],
HasDolphinCounters: false,
EnrollmentCertificateId: null,
EnrollmentCertificateName: "",
DatabaseHighWatermark: 28,
DatabaseLowWatermark: 14,
DeviceHighWatermark: 400,
DeviceLowWatermark: 200
)
sampler.getArguments().removeAllArguments()
sampler.addNonEncodedArgument('', builder.toPrettyString(), '')
sampler.setPostBodyRaw(true);
While running the test I am getting HTTP 400 Bad Request.
The log message is shown below:
2018-09-24 13:49:23,669 ERROR o.a.j.m.JSR223PreProcessor: Problem in JSR223 script, JSR223 PreProcessor
javax.script.ScriptException: groovy.lang.MissingPropertyException: No such property: it for class: Script32
at org.codehaus.groovy.jsr223.GroovyScriptEngineImpl.eval(GroovyScriptEngineImpl.java:320) ~[groovy-all-2.4.13.jar:2.4.13]
at org.codehaus.groovy.jsr223.GroovyCompiledScript.eval(GroovyCompiledScript.java:72) ~[groovy-all-2.4.13.jar:2.4.13]
at javax.script.CompiledScript.eval(Unknown Source) ~[?:1.8.0_151]
at org.apache.jmeter.util.JSR223TestElement.processFileOrScript(JSR223TestElement.java:221) ~[ApacheJMeter_core.jar:4.0 r1823414]
at org.apache.jmeter.modifiers.JSR223PreProcessor.process(JSR223PreProcessor.java:44) [ApacheJMeter_components.jar:4.0 r1823414]
at org.apache.jmeter.threads.JMeterThread.runPreProcessors(JMeterThread.java:849) [ApacheJMeter_core.jar:4.0 r1823414]
at org.apache.jmeter.threads.JMeterThread.executeSamplePackage(JMeterThread.java:467) [ApacheJMeter_core.jar:4.0 r1823414]
at org.apache.jmeter.threads.JMeterThread.processSampler(JMeterThread.java:416) [ApacheJMeter_core.jar:4.0 r1823414]
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:250) [ApacheJMeter_core.jar:4.0 r1823414]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_151]
Caused by: groovy.lang.MissingPropertyException: No such property: it for class: Script32
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53) ~[groovy-all-2.4.13.jar:2.4.13]
at org.codehaus.groovy.runtime.callsite.PogoGetPropertySite.getProperty(PogoGetPropertySite.java:52) ~[groovy-all-2.4.13.jar:2.4.13]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGroovyObjectGetProperty(AbstractCallSite.java:307) ~[groovy-all-2.4.13.jar:2.4.13]
at Script32.run(Script32.groovy:46) ~[?:?]
at org.codehaus.groovy.jsr223.GroovyScriptEngineImpl.eval(GroovyScriptEngineImpl.java:317) ~[groovy-all-2.4.13.jar:2.4.13]
... 9 more
The CSV is as follows:
-1,BatteryStatus
-3,AvailableMemory
-5,AvailableStorage
Thank you in advance

You have to use {} for a closure here:
items.collect() { // wrong: [
// ...
} // wrong: ]
Or just items.collect { ... }
With the [] the compiler sees a map literal instead of a closure, and you get the above error (the implicit closure parameter it is never defined).
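For example, a self-contained sketch with made-up data showing the working form:
def items = [[Id: '-1', Name: 'BatteryStatus'], [Id: '-3', Name: 'AvailableMemory']]
// With {} collect receives a closure, so the implicit parameter `it` exists:
def mapped = items.collect { [Id: it.Id, Name: it.Name] }
assert mapped == items
// With [] the body would be parsed as a map/list literal and `it` would be undefined.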

I believe you should copy and paste the example code more accurately; to wit, your "Items" section should look like:
Items: items.collect() {
[
Id : it.Id,
Name: it.Name
]
}
Also be aware that you should not use JMeter Functions or Variables directly in Groovy scripts, as they conflict with the GString template feature and make caching of compiled scripts impossible, negatively impacting performance.
So I would also recommend changing:
${__P(DeviceFamily)} to props.get('DeviceFamily')
${__time(yyyy-MM-dd'T'hh:mm:ss)} to new Date().format("yyyy-MM-dd'T'hh:mm:ss")
etc.
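For illustration, a minimal sketch of those replacements in plain Groovy (props and vars are the standard JSR223 bindings; the vars.get('myCounter') stand-in for __counter is an assumption, not an exact equivalent):
def deviceFamily = props.get('DeviceFamily')               // instead of ${__P(DeviceFamily)}
def timestamp = new Date().format("yyyy-MM-dd'T'hh:mm:ss") // instead of ${__time(...)}
def name = 'test-' + timestamp + '-' + vars.get('myCounter') // hypothetical counter variable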
See The Groovy Templates Cheat Sheet for JMeter article for more information on Groovy scripting in JMeter.

Related

Back-end (JVM) Internal error: Failed to generate expression: KtCallExpression

The decompiler tool of my IDE is not working.
Here is the code I want to decompile:
data class FetchOrdersState(
val orders: List<Order> = listOf(),
val isLoading: Boolean = false,
val error: String = ""
) : OrderState {
companion object {
val Empty = FetchOrdersState()
}
}
This is the error I am getting in the IDE when I open Tools > Kotlin > Show Kotlin Bytecode:
org.jetbrains.kotlin.codegen.CompilationException: Back-end (JVM) Internal error:
Failed to generate expression: KtCallExpression
File being compiled: (20,31) in /Users/.../src/commonMain/kotlin/.../OrderState.kt
The root cause java.lang.IllegalStateException was thrown at: org.jetbrains.kotlin.codegen.state.KotlinTypeMapper$Companion.getPackageMemberContainingClassesInfo(KotlinTypeMapper.kt:1386)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.genQualified(ExpressionCodegen.java:356)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.genQualified(ExpressionCodegen.java:314)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.gen(ExpressionCodegen.java:430)
at org.jetbrains.kotlin.codegen.DefaultParameterValueLoader.lambda$static$0(DefaultParameterValueLoader.java:30)
at org.jetbrains.kotlin.codegen.FunctionCodegen.lambda$generateDefaultImplBody$5(FunctionCodegen.java:1278)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.runWithShouldMarkLineNumbers(ExpressionCodegen.java:1559)
at org.jetbrains.kotlin.codegen.FunctionCodegen.generateDefaultImplBody(FunctionCodegen.java:1274)
at org.jetbrains.kotlin.codegen.FunctionCodegen.generateDefaultIfNeeded(FunctionCodegen.java:1203)
at org.jetbrains.kotlin.codegen.ConstructorCodegen.generatePrimaryConstructor(ConstructorCodegen.java:112)
at org.jetbrains.kotlin.codegen.ImplementationBodyCodegen.generateConstructors(ImplementationBodyCodegen.java:457)
at org.jetbrains.kotlin.codegen.ClassBodyCodegen.generateBody(ClassBodyCodegen.java:96)
at org.jetbrains.kotlin.codegen.MemberCodegen.generate(MemberCodegen.java:132)
at org.jetbrains.kotlin.codegen.MemberCodegen.genClassOrObject(MemberCodegen.java:305)
at org.jetbrains.kotlin.codegen.MemberCodegen.genClassOrObject(MemberCodegen.java:289)
at org.jetbrains.kotlin.codegen.PackageCodegenImpl.generateClassesAndObjectsInFile(PackageCodegenImpl.java:119)
at org.jetbrains.kotlin.codegen.PackageCodegenImpl.generateFile(PackageCodegenImpl.java:138)
at org.jetbrains.kotlin.codegen.PackageCodegenImpl.generate(PackageCodegenImpl.java:70)
at org.jetbrains.kotlin.codegen.DefaultCodegenFactory.generatePackage(CodegenFactory.kt:77)
at org.jetbrains.kotlin.codegen.DefaultCodegenFactory.generateModule(CodegenFactory.kt:62)
at org.jetbrains.kotlin.codegen.KotlinCodegenFacade.compileCorrectFiles(KotlinCodegenFacade.java:35)
at org.jetbrains.kotlin.idea.core.KotlinCompilerIde.compile(KotlinCompilerIde.kt:144)
at org.jetbrains.kotlin.idea.internal.KotlinBytecodeToolWindow$Companion.compileSingleFile(KotlinBytecodeToolWindow.kt:272)
at org.jetbrains.kotlin.idea.internal.KotlinBytecodeToolWindow$Companion.getBytecodeForFile(KotlinBytecodeToolWindow.kt:235)
at org.jetbrains.kotlin.idea.internal.KotlinBytecodeToolWindow$UpdateBytecodeToolWindowTask.processRequest(KotlinBytecodeToolWindow.kt:111)
at org.jetbrains.kotlin.idea.internal.KotlinBytecodeToolWindow$UpdateBytecodeToolWindowTask.processRequest(KotlinBytecodeToolWindow.kt:60)
at org.jetbrains.kotlin.idea.util.LongRunningReadTask$1$1.run(LongRunningReadTask.java:115)
at com.intellij.openapi.application.impl.ApplicationImpl.runReadAction(ApplicationImpl.java:866)
at org.jetbrains.kotlin.idea.util.LongRunningReadTask.lambda$runWithWriteActionPriority$0(LongRunningReadTask.java:235)
at com.intellij.openapi.progress.impl.CoreProgressManager.lambda$runProcess$2(CoreProgressManager.java:178)
at com.intellij.openapi.progress.impl.CoreProgressManager.registerIndicatorAndRun(CoreProgressManager.java:658)
at com.intellij.openapi.progress.impl.CoreProgressManager.executeProcessUnderProgress(CoreProgressManager.java:610)
at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:65)
at com.intellij.openapi.progress.impl.CoreProgressManager.runProcess(CoreProgressManager.java:165)
at org.jetbrains.kotlin.idea.util.LongRunningReadTask.runWithWriteActionPriority(LongRunningReadTask.java:235)
at org.jetbrains.kotlin.idea.util.LongRunningReadTask$1.run(LongRunningReadTask.java:110)
at com.intellij.util.RunnableCallable.call(RunnableCallable.java:20)
at com.intellij.util.RunnableCallable.call(RunnableCallable.java:11)
at com.intellij.openapi.application.impl.ApplicationImpl$1.call(ApplicationImpl.java:276)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:668)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:665)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1.run(Executors.java:665)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.IllegalStateException: No implClassName for @kotlin.internal.InlineOnly public inline fun <T> listOf(): kotlin.collections.List<T> defined in kotlin.collections[DeserializedSimpleFunctionDescriptor@43dad151]
at org.jetbrains.kotlin.codegen.state.KotlinTypeMapper$Companion.getPackageMemberContainingClassesInfo(KotlinTypeMapper.kt:1386)
at org.jetbrains.kotlin.codegen.state.KotlinTypeMapper$Companion.getPackageMemberOwnerInternalName(KotlinTypeMapper.kt:1369)
at org.jetbrains.kotlin.codegen.state.KotlinTypeMapper$Companion.internalNameForPackageMemberOwner(KotlinTypeMapper.kt:1315)
at org.jetbrains.kotlin.codegen.state.KotlinTypeMapper$Companion.access$internalNameForPackageMemberOwner(KotlinTypeMapper.kt:1283)
at org.jetbrains.kotlin.codegen.state.KotlinTypeMapper.mapOwner(KotlinTypeMapper.kt:163)
at org.jetbrains.kotlin.codegen.state.KotlinTypeMapper.mapOwner(KotlinTypeMapper.kt:142)
at org.jetbrains.kotlin.codegen.state.KotlinTypeMapper.mapToCallableMethod(KotlinTypeMapper.kt:544)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.resolveToCallable(ExpressionCodegen.java:2727)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.invokeFunction(ExpressionCodegen.java:2602)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.invokeFunction(ExpressionCodegen.java:2576)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.visitCallExpression(ExpressionCodegen.java:2470)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.visitCallExpression(ExpressionCodegen.java:125)
at org.jetbrains.kotlin.psi.KtCallExpression.accept(KtCallExpression.java:35)
at org.jetbrains.kotlin.codegen.ExpressionCodegen.genQualified(ExpressionCodegen.java:332)
... 45 more
This works absolutely fine if I comment out the listOf() line from my data class.

Query Custom Record using Custom Field in Mulesoft

I want to query a custom record based on a custom field of that record in Anypoint Studio. I tried using the Search operation of the NetSuite connector, but it seems I am unable to write the DataWeave code accordingly.
I used the code below:
%dw 2.0
output application/java
---
{
customFieldList: {
customField: [{
internalId: "8",
scriptId: "abc"
} as Object {
class : "org.mule.module.netsuite.extension.api.ListOrRecordRef"
} ]
} as Object {
class : "org.mule.module.netsuite.extension.api.CustomFieldList"
},
recType: {
internalId: "10078"
}
} as Object {
class : "org.mule.module.netsuite.extension.api.CustomRecordSearchBasic"
}
I used a basic custom record search. I am getting the error below:
Message : null
Element : testFlow2/processors/0 @ test:test.xml:18 (Transform Message)
Element DSL : <ee:transform doc:name="Transform Message" doc:id="c6dbb0ee-4979-4667-bab1-e0b5d72a6b2b">
<ee:message>
<ee:set-payload>%dw 2.0
output application/java
---
{
customFieldList: {
customField: [{
internalId: "8",
scriptId: "abc"
} as Object {
class : "org.mule.module.netsuite.extension.api.ListOrRecordRef"
} ]
} as Object {
class : "org.mule.module.netsuite.extension.api.CustomFieldList"
},
recType: {
internalId: "10078"
}
} as Object {
class : "org.mule.module.netsuite.extension.api.CustomRecordSearchBasic"
}</ee:set-payload>
</ee:message>
</ee:transform>
Error type : MULE:UNKNOWN
FlowStack : at testFlow2(testFlow2/processors/0 @ test:test.xml:18 (Transform Message))
--------------------------------------------------------------------------------
Root Exception stack trace:
java.lang.NullPointerException
at org.mule.weave.v2.module.pojo.exception.CannotInstantiateException.message(CannotInstantiateException.scala:7)
at org.mule.weave.v2.parser.exception.LocatableException.getMessage(LocatableException.scala:18)
at org.mule.weave.v2.parser.exception.LocatableException.getMessage$(LocatableException.scala:15)
at org.mule.weave.v2.module.pojo.exception.CannotInstantiateException.getMessage(CannotInstantiateException.scala:6)
at org.mule.runtime.core.internal.el.dataweave.DataWeaveExpressionLanguageAdaptor$1.handledException(DataWeaveExpressionLanguageAdaptor.java:298)
at org.mule.runtime.core.internal.el.dataweave.DataWeaveExpressionLanguageAdaptor$1.evaluate(DataWeaveExpressionLanguageAdaptor.java:309)
at org.mule.runtime.core.internal.el.DefaultExpressionManagerSession.evaluate(DefaultExpressionManagerSession.java:105)
at com.mulesoft.mule.runtime.core.internal.processor.SetPayloadTransformationTarget.process(SetPayloadTransformationTarget.java:32)
at com.mulesoft.mule.runtime.core.internal.processor.TransformMessageProcessor.lambda$0(TransformMessageProcessor.java:92)
at java.util.Optional.ifPresent(Optional.java:159)
at com.mulesoft.mule.runtime.core.internal.processor.TransformMessageProcessor.process(TransformMessageProcessor.java:92)
at org.mule.runtime.core.api.util.func.CheckedFunction.apply(CheckedFunction.java:25)
at org.mule.runtime.core.api.rx.Exceptions.lambda$checkedFunction$2(Exceptions.java:84)
at org.mule.runtime.core.internal.util.rx.Operators.lambda$nullSafeMap$0(Operators.java:47)
at reactor.core.* (1 elements filtered from stack; set debug level logging or '-Dmule.verbose.exceptions=true' for everything)(Unknown Source)
at org.mule.runtime.core.privileged.processor.chain.* (2 elements filtered from stack; set debug level logging or '-Dmule.verbose.exceptions=true' for everything)(Unknown Source)
at reactor.core.* (6 elements filtered from stack; set debug level logging or '-Dmule.verbose.exceptions=true' for everything)(Unknown Source)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.mule.service.scheduler.internal.AbstractRunnableFutureDecorator.doRun(AbstractRunnableFutureDecorator.java:111)
at org.mule.service.scheduler.internal.RunnableFutureDecorator.run(RunnableFutureDecorator.java:54)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I don't know where I am going wrong.
Any help is appreciated.
Thank you so much, but I figured it out somehow and got the solution. I can accept other solutions too. I just added a class and it worked.
Thank you again.

saveToCassandra works with Cassandra Lucene plugin?

I am implementing the example from the Lucene plugin for Cassandra page (https://github.com/Stratio/cassandra-lucene-index), and when I try to save the data using saveToCassandra I get a NoSuchElementException.
If I use CassandraConnector.withSessionDo I am able to add elements into Cassandra and no exception is raised.
The tables:
CREATE KEYSPACE demo
WITH REPLICATION = {'class' : 'SimpleStrategy', 'replication_factor': 1};
USE demo;
CREATE TABLE tweets (
id INT PRIMARY KEY,
user TEXT,
body TEXT,
time TIMESTAMP,
latitude FLOAT,
longitude FLOAT
);
CREATE CUSTOM INDEX tweets_index ON tweets ()
USING 'com.stratio.cassandra.lucene.Index'
WITH OPTIONS = {
'refresh_seconds' : '1',
'schema' : '{
fields : {
id : {type : "integer"},
user : {type : "string"},
body : {type : "text", analyzer : "english"},
time : {type : "date", pattern : "yyyy/MM/dd", sorted : true},
place : {type : "geo_point", latitude:"latitude", longitude:"longitude"}
}
}'
};
The code :
import org.apache.spark.{SparkConf, SparkContext, Logging}
import com.datastax.spark.connector.cql.CassandraConnector
import com.datastax.spark.connector._
object App extends Logging{
def main(args: Array[String]) {
// Get the cassandra IP and create the spark context
val cassandraIP = System.getenv("CASSANDRA_IP");
val sparkConf = new SparkConf(true)
.set("spark.cassandra.connection.host", cassandraIP)
.set("spark.cleaner.ttl", "3600")
.setAppName("Simple Spark Cassandra Example")
val sc = new SparkContext(sparkConf)
// Works
CassandraConnector(sparkConf).withSessionDo { session =>
session.execute("INSERT INTO demo.tweets(id, user, body, time, latitude, longitude) VALUES (19, 'Name', 'Body', '2016-03-19 09:00:00-0300', 39, 39)")
}
// Does not work
val demo = sc.parallelize(Seq((9, "Name", "Body", "2016-03-29 19:00:00-0300", 29, 29)))
// Raises the exception
demo.saveToCassandra("demo", "tweets", SomeColumns("id", "user", "body", "time", "latitude", "longitude"))
}
}
The exception:
16/03/28 14:15:41 INFO CassandraConnector: Connected to Cassandra cluster: Test Cluster
Exception in thread "main" java.util.NoSuchElementException: Column not found in demo.tweets
at com.datastax.spark.connector.cql.StructDef$$anonfun$columnByName$2.apply(Schema.scala:60)
at com.datastax.spark.connector.cql.StructDef$$anonfun$columnByName$2.apply(Schema.scala:60)
at scala.collection.Map$WithDefault.default(Map.scala:52)
at scala.collection.MapLike$class.apply(MapLike.scala:141)
at scala.collection.AbstractMap.apply(Map.scala:58)
at com.datastax.spark.connector.cql.TableDef$$anonfun$9.apply(Schema.scala:153)
at com.datastax.spark.connector.cql.TableDef$$anonfun$9.apply(Schema.scala:152)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
at scala.collection.immutable.Map$Map1.foreach(Map.scala:109)
at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
at com.datastax.spark.connector.cql.TableDef.<init>(Schema.scala:152)
at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchTables$1$2.apply(Schema.scala:283)
at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchTables$1$2.apply(Schema.scala:271)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
at scala.collection.immutable.Set$Set4.foreach(Set.scala:137)
at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchTables$1(Schema.scala:271)
at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:295)
at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:294)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:153)
at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:306)
at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1(Schema.scala:294)
at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:307)
at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:304)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:121)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:120)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:109)
at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:139)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:304)
at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:275)
at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
at com.webradar.spci.spark.cassandra.App$.main(App.scala:27)
at com.webradar.spci.spark.cassandra.App.main(App.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
EDITED:
Versions
Spark 1.6.0
Cassandra 3.0.3
Lucene plugin 3.0.3.1
For Jar creation I used maven-assembly-plugin to get a fat JAR.
If I remove the custom index I am able to use saveToCassandra.
It seems that the problem is caused by the Cassandra Spark driver, not by the plugin.
Since CASSANDRA-10217, Cassandra 3.x per-row indexes no longer need to be created on a fake column. Thus, from Cassandra 3.x the "CREATE CUSTOM INDEX %s ON %s(%s)" column-based syntax is replaced with the new "CREATE CUSTOM INDEX %s ON %s()" row-based syntax. However, the DataStax Spark driver doesn't seem to support this new feature yet.
When com.datastax.spark.connector.RDDFunctions.saveToCassandra is called, it tries to load the table schema and the index schema related to a table column. Since the new index syntax no longer has the fake column, this results in a NoSuchElementException due to an empty column name.
However, saveToCassandra works well if you execute the same example with the prior fake-column syntax:
CREATE KEYSPACE demo
WITH REPLICATION = {'class' : 'SimpleStrategy', 'replication_factor': 1};
USE demo;
CREATE TABLE tweets (
id INT PRIMARY KEY,
user TEXT,
body TEXT,
time TIMESTAMP,
latitude FLOAT,
longitude FLOAT,
lucene TEXT
);
CREATE CUSTOM INDEX tweets_index ON tweets (lucene)
USING 'com.stratio.cassandra.lucene.Index'
WITH OPTIONS = {
'refresh_seconds' : '1',
'schema' : '{
fields : {
id : {type : "integer"},
user : {type : "string"},
body : {type : "text", analyzer : "english"},
time : {type : "date", pattern : "yyyy/MM/dd", sorted : true},
place : {type : "geo_point", latitude:"latitude", longitude:"longitude"}
}
}'
};

Failing scripts in Groovy using Grab

The following Groovy script fails when run from the command line:
@Grab("org.apache.poi:poi:3.9")
println "test"
Error:
unexpected token: println @ line 2, column 1.
println "test"
^
1 error
Removing the Grab, it works!
Anything I missed?
$>groovy -v
Groovy Version: 2.1.7 JVM: 1.7.0_25 Vendor: Oracle Corporation OS: Linux
Annotations can only be applied to certain targets. See SO: Why can't I do a method call after a @Grab declaration in a Groovy script?
@Grab("org.apache.poi:poi:3.9")
dummy = null
println "test"
Alternatively you can use grab as a method call:
import static groovy.grape.Grape.grab
grab(group: "org.apache.poi", module: "poi", version: "3.9")
println "test"
For more information refer to Groovy Language Documentation > Dependency management with Grape.
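As another sketch, the annotation can also be attached to an import instead of a bare statement (the POI class here is just an arbitrary example target for the annotation):
@Grab("org.apache.poi:poi:3.9")
import org.apache.poi.hssf.usermodel.HSSFWorkbook

println "test"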
File 'Grabber.groovy'
package org.taste

import groovy.grape.Grape

// artifacts => [[<group>, <module>, <version>, [<classifier>]], ..]
static def grab(List<List> artifacts) {
    ClassLoader classLoader = new groovy.lang.GroovyClassLoader()
    def eal = Grape.getEnableAutoDownload()
    artifacts.each { artifact ->
        Map param = [
            classLoader: classLoader,
            group      : artifact.get(0),
            module     : artifact.get(1),
            version    : artifact.get(2),
            classifier : (artifact.size() < 4) ? null : artifact.get(3)
        ]
        println param
        Grape.grab(param)
    }
    Grape.setEnableAutoDownload(eal)
}
Usage:
package org.taste
import org.taste.Grabber
Grabber.grab([
[ "org.codehaus.groovy.modules.http-builder", "http-builder", '0.7.1'],
[ "org.postgresql", "postgresql", '42.3.1', null ],
[ "com.oracle.database.jdbc", "ojdbc8", '12.2.0.1', null]
])

ElasticSearch 1.5 upgrade: JodaTime conversion script error

I'm working through upgrading our ElasticSearch version from 1.3 to 1.5. We use the Java API heavily. We use the following script in an ES query:
{
"script" : {
"script" : "values contains (int)doc['timestamp'].date.toDateTime(DateTimeZone.forID('America/New_York')).getMonthOfYear()",
"params" : {
"values" : [ 1 ]
},
"lang" : "groovy"
}
}
This works with 1.3, but gives the following error in 1.5:
org.elasticsearch.action.search.SearchPhaseExecutionException: Failed to execute phase [query_fetch], all shards failed; shardFailures {[XwOu9zq0TMi2uOptdfIS7w][eventdata][0]: QueryPhaseExecutionException[[eventdata][0]: query[filtered(ConstantScore(ScriptFilter(values contains (int)doc['timestamp'].date.toDateTime(DateTimeZone.forID('America/New_York')).getMonthOfYear().toString())))->cache(org.elasticsearch.index.search.nested.NonNestedDocsFilter@2a8c0465)],from[0],size[10]: Query Failed [Failed to execute main query]]; nested: GroovyScriptExecutionException[MissingMethodException[No signature of method: Script1.contains() is applicable for argument types: (java.lang.Class) values: [int]
Possible solutions: toString(), toString(), notify()]]; }
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:238)
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$1.onFailure(TransportSearchTypeAction.java:184)
at org.elasticsearch.search.action.SearchServiceTransportAction$23.run(SearchServiceTransportAction.java:565)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
What's the best way to address this?
That looks like a brackets issue with the cast, so it thinks you're passing int to contains. Try adding brackets to help out the parser:
values.contains((int)doc['timestamp'].date.toDateTime(DateTimeZone.forID('America/New_York')).getMonthOfYear())
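As a quick plain-Groovy sanity check of the fixed call shape (made-up values, no Elasticsearch involved):
def values = [1]
def monthOfYear = 1 // stands in for the doc['timestamp'] expression
assert values.contains((int) monthOfYear)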
