What could be the reasons for JsonMappingException? - apache-spark

I get an exception while the Spark (3.0.1) application is running:
com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.10.0 requires Jackson Databind version >= 2.10.0 and < 2.11.0
I have already seen questions about similar Jackson errors, but they are different from my case: here the Jackson dependency versions already match up and should be compatible:
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-annotations-2.10.0.jar
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-core-2.10.0.jar
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-databind-2.10.0.jar
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-dataformat-yaml-2.10.0.jar
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-datatype-jsr310-2.10.3.jar
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-module-jaxb-annotations-2.10.0.jar
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-module-paranamer-2.10.0.jar
[Driver] INFO ru.kontur.srs.infra.launching.daemon.DaemonLauncher$ - /hadoop/yarn/local/usercache/SrsApp/filecache/4368/__spark_libs__4872112349984824286.zip/jackson-module-scala_2.12-2.10.0.jar
I even double-checked it this way:
def main(args: Array[String]): Unit = {
  // ...
  logger.info("JACKSON VERSION: " + new ObjectMapper().version())
  // ...
}
> [Driver] INFO ru.kontur.khajiit.reports.builders.raw_products.ReportsRawProductsReportBuilderDaemon$ - JACKSON VERSION: 2.10.0
What could be the reason that 2.10.0 does not pass the >= 2.10.0 and < 2.11.0 check, and how can this problem be solved?
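One way to narrow this down (a hypothetical diagnostic, not taken from the logs above) is to log which jar jackson-databind is actually loaded from at runtime, since a second, older copy elsewhere on the classpath would explain the failed check even though the bundled jars match:
// Hypothetical diagnostic: print where ObjectMapper actually comes from,
// in case another jackson-databind copy shadows the 2.10.0 jar listed above.
val source = classOf[com.fasterxml.jackson.databind.ObjectMapper]
  .getProtectionDomain.getCodeSource
logger.info("jackson-databind loaded from: " +
  Option(source).map(_.getLocation.toString).getOrElse("unknown"))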

Related

Error when running apache spark-kafka-hbase jar from cmd

First of all, I'm new to this tech stack, so if I don't present all the details, please let me know.
Here's my problem: I'm trying to make a jar archive of an Apache Spark/Kafka app. To package my app into a jar I use the sbt-assembly plugin:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
Packaging the jar succeeds.
Now if I try to run it with:
spark-submit kafka-consumer.jar
the app boots up successfully.
I want to do the same with the java -jar command, but unfortunately it fails.
Here's what the stack trace looks like:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/03/29 11:16:23 INFO SparkContext: Running Spark version 2.4.4
20/03/29 11:16:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/03/29 11:16:23 INFO SparkContext: Submitted application: KafkaConsumer
20/03/29 11:16:23 INFO SecurityManager: Changing view acls to: popar
20/03/29 11:16:23 INFO SecurityManager: Changing modify acls to: popar
20/03/29 11:16:23 INFO SecurityManager: Changing view acls groups to:
20/03/29 11:16:23 INFO SecurityManager: Changing modify acls groups to:
20/03/29 11:16:23 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(popar); groups with view permissions: Set(); users with modify permissions: Set(popar); groups with modify permissions: Set()
20/03/29 11:16:25 INFO Utils: Successfully started service 'sparkDriver' on port 55595.
20/03/29 11:16:25 INFO SparkEnv: Registering MapOutputTracker
20/03/29 11:16:25 INFO SparkEnv: Registering BlockManagerMaster
20/03/29 11:16:25 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/03/29 11:16:25 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/03/29 11:16:25 INFO DiskBlockManager: Created local directory at C:\Users\popar\AppData\Local\Temp\blockmgr-77af3fbc-264e-451c-9df3-5b7dda58f3a8
20/03/29 11:16:25 INFO MemoryStore: MemoryStore started with capacity 898.5 MB
20/03/29 11:16:26 INFO SparkEnv: Registering OutputCommitCoordinator
20/03/29 11:16:26 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/03/29 11:16:26 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-0IISN4F.mshome.net:4040
20/03/29 11:16:26 INFO Executor: Starting executor ID driver on host localhost
20/03/29 11:16:26 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55636.
20/03/29 11:16:26 INFO NettyBlockTransferService: Server created on DESKTOP-0IISN4F.mshome.net:55636
20/03/29 11:16:26 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/03/29 11:16:26 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, DESKTOP-0IISN4F.mshome.net, 55636, None)
20/03/29 11:16:26 INFO BlockManagerMasterEndpoint: Registering block manager DESKTOP-0IISN4F.mshome.net:55636 with 898.5 MB RAM, BlockManagerId(driver, DESKTOP-0IISN4F.mshome.net, 55636, None)
20/03/29 11:16:26 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, DESKTOP-0IISN4F.mshome.net, 55636, None)
20/03/29 11:16:26 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, DESKTOP-0IISN4F.mshome.net, 55636, None)
Exception in thread "main" java.io.IOException: No FileSystem for scheme: file
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2798)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2809)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:181)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
at org.apache.spark.streaming.StreamingContext.checkpoint(StreamingContext.scala:239)
at KafkaConsumer$.main(KafkaConsumer.scala:85)
at KafkaConsumer.main(KafkaConsumer.scala)
As you can see it fails with: Exception in thread "main" java.io.IOException: No FileSystem for scheme: file
Here is the definition of my main class:
import Service._
import kafka.serializer.StringDecoder
import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.{Put, Scan, Table}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.log4j.{Level, Logger}
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}
import scala.collection.JavaConversions
import scala.util.Try
object KafkaConsumer {

  def setupLogging(): Unit = {
    val rootLogger = Logger.getRootLogger
    rootLogger.setLevel(Level.ERROR)
  }

  def persistToHbase[A](rdd: A): Unit = {
    val connection = getHbaseConnection
    val admin = connection.getAdmin
    val columnFamily1 = "personal_data"
    val table = connection.getTable(TableName.valueOf("employees"))
    val scan = new Scan()
    scan.addFamily(columnFamily1.getBytes())
    val totalRows: Int = getLastRowNumber(scan, columnFamily1, table)
    persistRdd(rdd, table, columnFamily1, totalRows + 1)
    Try(table.close())
    Try(admin.close())
    Try(connection.close())
  }

  private def getLastRowNumber[A](scan: Scan,
                                  columnFamily: String,
                                  table: Table): Int = {
    val scanner = table.getScanner(scan)
    val values = scanner.iterator()
    val seq = JavaConversions.asScalaIterator(values).toIndexedSeq
    seq.size
  }

  def persistRdd[A](rdd: A,
                    table: Table,
                    columnFamily: String,
                    rowNumber: Int): Unit = {
    val row = Bytes.toBytes(String.valueOf(rowNumber))
    val put = new Put(row)
    val qualifier = "test_column"
    put.addColumn(columnFamily.getBytes(),
                  qualifier.getBytes(),
                  String.valueOf(rdd).getBytes())
    table.put(put)
  }

  def main(args: Array[String]): Unit = {
    // create the context with a one-second batch of data, using all CPU cores
    val context = new StreamingContext("local[*]", "KafkaConsumer", Seconds(1))
    // hostname:port for the Kafka brokers
    val kafkaParams = Map("metadata.broker.list" -> "192.168.56.22:9092")
    // list of topics you want to listen to from Kafka
    val topics = List("employees").toSet
    setupLogging()
    // create a Kafka stream, which will contain (topic, message) pairs;
    // we map(_._2) at the end to keep only the messages, i.e. individual lines of data
    val stream: DStream[String] = KafkaUtils
      .createDirectStream[String, String, StringDecoder, StringDecoder](
        context,
        kafkaParams,
        topics)
      .map(_._2)
    // debug print
    stream.print()
    // stream.foreachRDD(rdd => rdd.foreach(persistToHbase(_)))
    context.checkpoint("C:/checkpoint/")
    context.start()
    context.awaitTermination()
  }
}
and the build.sbt looks like this:
import sbt._
import Keys._

name := "kafka-consumer"
version := "0.1"
scalaVersion := "2.11.8"

lazy val sparkVersion = "2.4.4"
lazy val sparkStreamingKafkaVersion = "1.6.3"
lazy val hbaseVersion = "2.2.1"
lazy val hadoopVersion = "2.8.0"
lazy val hadoopCoreVersion = "1.2.1"

resolvers in Global ++= Seq(
  "Sbt plugins" at "https://dl.bintray.com/sbt/sbt-plugin-releases"
)

lazy val commonSettings = Seq(
  version := "0.1",
  organization := "com.rares",
  scalaVersion := "2.11.8",
  test in assembly := {}
)

lazy val excludeJPountz =
  ExclusionRule(organization = "net.jpountz.lz4", name = "lz4")
lazy val excludeHadoop =
  ExclusionRule(organization = "org.apache.hadoop")

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % sparkVersion excludeAll (excludeJPountz, excludeHadoop),
  "org.apache.spark" % "spark-streaming-kafka_2.11" % sparkStreamingKafkaVersion,
  "org.apache.spark" % "spark-streaming_2.11" % sparkVersion excludeAll (excludeJPountz),
  "org.apache.hadoop" % "hadoop-client" % hadoopVersion,
  "org.apache.hbase" % "hbase-server" % hbaseVersion,
  "org.apache.hbase" % "hbase-client" % hbaseVersion,
  "org.apache.hbase" % "hbase-common" % hbaseVersion
)

// Fat jar
assemblyMergeStrategy in assembly := {
  case PathList("org", "aopalliance", xs @ _*) => MergeStrategy.last
  case PathList("javax", "inject", xs @ _*) => MergeStrategy.last
  case PathList("net", "jpountz", xs @ _*) => MergeStrategy.last
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case PathList("jetty-dir.css", xs @ _*) => MergeStrategy.last
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("com", "sun", xs @ _*) => MergeStrategy.last
  case PathList("hdfs-default.xml", xs @ _*) => MergeStrategy.last
  case PathList("javax", xs @ _*) => MergeStrategy.last
  case PathList("mapred-default.xml", xs @ _*) => MergeStrategy.last
  case PathList("core-default.xml", xs @ _*) => MergeStrategy.last
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
  // case "git.properties" => MergeStrategy.last
  // case PathList("org", "apache", "jasper", xs @ _*) => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

assemblyJarName in assembly := "kafka-consumer.jar"
Any advice will be deeply appreciated!!!
OK, so here is what helped me out. Add the Hadoop configuration to the Spark context as follows:
val hadoopConfiguration = context.sparkContext.hadoopConfiguration
hadoopConfiguration.set(
  "fs.hdfs.impl",
  classOf[org.apache.hadoop.hdfs.DistributedFileSystem].getName)
hadoopConfiguration.set(
  "fs.file.impl",
  classOf[org.apache.hadoop.fs.LocalFileSystem].getName)
Works like a charm!
A big thanks to this: hadoop No FileSystem for scheme: file
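For what it's worth, the likely root cause is the fat-jar merge: the build.sbt above discards everything under META-INF, including the META-INF/services/org.apache.hadoop.fs.FileSystem files through which Hadoop discovers its FileSystem implementations. A hedged alternative sketch, assuming sbt-assembly's MergeStrategy.filterDistinctLines, is to merge those service registrations instead of discarding them:
// Sketch: merge (rather than discard) service-loader registrations so that
// Hadoop can still discover LocalFileSystem/DistributedFileSystem at runtime.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.filterDistinctLines
  case PathList("META-INF", xs @ _*)             => MergeStrategy.discard
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}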

GraphFactory could not find gremlin.graph property configration

Summary
While trying to start the Gremlin server with OrientDB, I got this error: GraphFactory could not find [org.apache.tinkerpop.gremlin.orientdb.OrientEmbeddedFactory]
Detail
I am using the configuration below:
Gremlin: apache-tinkerpop-gremlin-server-3.3.1
OrientDB: orientdb-tp3-3.0.2
To download the jar files I used: bin/gremlin-server.sh -i com.orientechnologies orientdb-gremlin 3.0.2
gremlinpython: 3.3.0
gremlin-server-orientdb.yaml file
host: localhost
port: 8182
scriptEvaluationTimeout: 30000
channelizer: org.apache.tinkerpop.gremlin.server.channel.WebSocketChannelizer
graphs: {
  graph: conf/orientdb-empty.properties
}
scriptEngines: {
  gremlin-groovy: {
    plugins: { org.apache.tinkerpop.gremlin.server.jsr223.GremlinServerGremlinPlugin: {},
               org.apache.tinkerpop.gremlin.orientdb.jsr223.OrientDBGremlinPlugin: {},
               org.apache.tinkerpop.gremlin.jsr223.ImportGremlinPlugin: {classImports: [java.lang.Math], methodImports: [java.lang.Math#*]},
               org.apache.tinkerpop.gremlin.jsr223.ScriptFileGremlinPlugin: {files: [../config/demodb.groovy]}}}}
serializers:
- { className: org.apache.tinkerpop.gremlin.driver.ser.GryoMessageSerializerV3d0, config: { ioRegistries: [org.apache.tinkerpop.gremlin.orientdb.io.OrientIoRegistry] }} # application/vnd.gremlin-v3.0+gryo
- { className: org.apache.tinkerpop.gremlin.driver.ser.GryoMessageSerializerV3d0, config: { serializeResultToString: true }} # application/vnd.gremlin-v3.0+gryo-stringd
- { className: org.apache.tinkerpop.gremlin.driver.ser.GraphSONMessageSerializerV3d0, config: { ioRegistries: [org.apache.tinkerpop.gremlin.orientdb.io.OrientIoRegistry] }} # application/json
processors:
- { className: org.apache.tinkerpop.gremlin.server.op.session.SessionOpProcessor, config: { sessionTimeout: 28800000 }}
- { className: org.apache.tinkerpop.gremlin.server.op.traversal.TraversalOpProcessor, config: { cacheExpirationTime: 600000, cacheMaxSize: 1000 }}
metrics: {
  consoleReporter: {enabled: true, interval: 180000},
  csvReporter: {enabled: true, interval: 180000, fileName: /tmp/gremlin-server-metrics.csv},
  jmxReporter: {enabled: true},
  slf4jReporter: {enabled: true, interval: 180000}}
strictTransactionManagement: false
maxInitialLineLength: 4096
maxHeaderSize: 8192
maxChunkSize: 8192
maxContentLength: 65536
maxAccumulationBufferComponents: 1024
resultIterationBatchSize: 64
writeBufferLowWaterMark: 32768
writeBufferHighWaterMark: 65536
authentication: {
  authenticator: com.orientechnologies.tinkerpop.server.auth.OGremlinServerAuthenticator
}
ssl: {
  enabled: false}
orientdb-empty.properties file
gremlin.graph=org.apache.tinkerpop.gremlin.orientdb.OrientEmbeddedFactory
I also tried with this:
gremlin.graph=org.apache.tinkerpop.gremlin.orientdb.OrientGraph
Stack trace:
admin-12#admin:~/Documents/apache-tinkerpop-gremlin-server-3.3.1/bin$ ./gremlin-server.sh conf/gremlin-server-orientdb.yaml
[INFO] GremlinServer -
\,,,/
(o o)
-----oOOo-(3)-oOOo-----
[INFO] GremlinServer - Configuring Gremlin Server from conf/gremlin-server-orientdb.yaml
[INFO] MetricManager - Configured Metrics ConsoleReporter configured with report interval=180000ms
[INFO] MetricManager - Configured Metrics CsvReporter configured with report interval=180000ms to fileName=/tmp/gremlin-server-metrics.csv
[INFO] MetricManager - Configured Metrics JmxReporter configured with domain= and agentId=
[INFO] MetricManager - Configured Metrics Slf4jReporter configured with interval=180000ms and loggerName=org.apache.tinkerpop.gremlin.server.Settings$Slf4jReporterMetrics
[WARN] DefaultGraphManager - Graph [graph] configured at [conf/orientdb-empty.properties] could not be instantiated and will not be available in Gremlin Server. GraphFactory message: GraphFactory could not find [org.apache.tinkerpop.gremlin.orientdb.OrientEmbeddedFactory] - Ensure that the jar is in the classpath
java.lang.RuntimeException: GraphFactory could not find [org.apache.tinkerpop.gremlin.orientdb.OrientEmbeddedFactory] - Ensure that the jar is in the classpath
at org.apache.tinkerpop.gremlin.structure.util.GraphFactory.open(GraphFactory.java:63)
at org.apache.tinkerpop.gremlin.structure.util.GraphFactory.open(GraphFactory.java:104)
at org.apache.tinkerpop.gremlin.server.util.DefaultGraphManager.lambda$new$0(DefaultGraphManager.java:57)
at java.util.LinkedHashMap$LinkedEntrySet.forEach(LinkedHashMap.java:671)
at org.apache.tinkerpop.gremlin.server.util.DefaultGraphManager.<init>(DefaultGraphManager.java:55)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:80)
at org.apache.tinkerpop.gremlin.server.GremlinServer.<init>(GremlinServer.java:111)
at org.apache.tinkerpop.gremlin.server.GremlinServer.main(GremlinServer.java:325)
[INFO] ServerGremlinExecutor - Initialized Gremlin thread pool. Threads in pool named with pattern gremlin-*
Exception in thread "main" java.lang.IllegalStateException: java.lang.reflect.InvocationTargetException
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.initializeGremlinScriptEngineManager(GremlinExecutor.java:448)
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.<init>(GremlinExecutor.java:105)
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.<init>(GremlinExecutor.java:74)
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor$Builder.create(GremlinExecutor.java:590)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:128)
at org.apache.tinkerpop.gremlin.server.GremlinServer.<init>(GremlinServer.java:111)
at org.apache.tinkerpop.gremlin.server.GremlinServer.main(GremlinServer.java:325
Updated
[WARN] Slf4JLogger - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
java.lang.NullPointerException
at com.orientechnologies.tinkerpop.server.auth.OGremlinServerAuthenticator.authenticate(OGremlinServerAuthenticator.java:34)
at org.apache.tinkerpop.gremlin.server.auth.SimpleAuthenticator$PlainTextSaslAuthenticator.getAuthenticatedUser(SimpleAuthenticator.java:143)
at org.apache.tinkerpop.gremlin.server.handler.SaslAuthenticationHandler.channelRead(SaslAuthenticationHandler.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
You are mixing a lot of different versions, so it's hard to say what will work. First of all, TinkerPop recommends that you try to match the version of the server with the version of the client. That means that if you use 3.3.1 on the server, then you should try to use 3.3.1 of a client (in your case gremlin-python). Next, you are using orientdb-gremlin 3.0.2, which appears to be bound to TinkerPop 3.3.0:
https://github.com/orientechnologies/orientdb-gremlin/blob/3.0.2/driver/pom.xml
which means that for best results you should probably use 3.3.0 for Gremlin Server and gremlin-python. Now, while I mention all this about "matching versions", I will say that it is possible to mix versions, but matching will help limit the things that can go wrong while you're just getting started, so I'd encourage you to start there.
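For example, with the Python client that would mean pinning the matching release (a hypothetical command; gremlinpython is the package name on PyPI):
pip install gremlinpython==3.3.0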
As for your error, I think you installed the wrong dependencies. You should have done:
bin/gremlin-server.sh -i com.orientechnologies orientdb-gremlin-server 3.0.2
as orientdb-gremlin-server will bring in OrientEmbeddedFactory as well as the orientdb-gremlin dependencies. I also think that your orientdb-empty.properties file is missing some configuration options - see what is defaulted and what is not here.
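For reference, a minimal orientdb-empty.properties might look like the sketch below. Everything besides the gremlin.graph line is an assumption about the orientdb-gremlin driver's option names, so verify them against the linked defaults before relying on them:
gremlin.graph=org.apache.tinkerpop.gremlin.orientdb.OrientEmbeddedFactory
# assumed option names; check orientdb-gremlin's defaults
orient-url=plocal:/tmp/orientdb-graph
orient-user=admin
orient-pass=admin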

Cannot write to Druid through SparkStreaming and Tranquility

I am trying to write results from a Spark Streaming job to a Druid datasource. Spark successfully completes its jobs and hands off to Druid. Druid starts indexing but does not write anything.
My code and logs are as follows:
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import scala.util.parsing.json._
import com.metamx.tranquility.spark.BeamRDD._
import org.joda.time.{DateTime, DateTimeZone}
object MyDirectStreamDriver {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext()
    val ssc = new StreamingContext(sc, Minutes(5))
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "[$hadoopURL]:6667",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "use_a_separate_group_id_for_each_stream",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )
    val eventStream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Array("events_test"), kafkaParams))
    val t = eventStream.map(record => record.value)
      .flatMap(_.split("(?<=\\}),(?=\\{)"))
      .map(JSON.parseRaw(_).getOrElse(new JSONObject(Map("" -> ""))).asInstanceOf[JSONObject])
      .map(x => (new DateTime(),
                 x.obj.getOrElse("OID", "").asInstanceOf[String],
                 x.obj.getOrElse("STATUS", "").asInstanceOf[Double].toInt))
      .map(x => MyEvent(x._1, x._2, x._3))
    t.saveAsTextFiles("/user/username/result", "txt")
    t.foreachRDD(rdd => rdd.propagate(new MyEventBeamFactory))
    ssc.start
    ssc.awaitTermination
  }
}
case class MyEvent(time: DateTime, oid: String, status: Int) {
  @JsonValue
  def toMap: Map[String, Any] = Map(
    "timestamp" -> (time.getMillis / 1000),
    "oid" -> oid,
    "status" -> status
  )
}
object MyEvent {
  implicit val MyEventTimestamper = new Timestamper[MyEvent] {
    def timestamp(a: MyEvent) = a.time
  }
  val Columns = Seq("time", "oid", "status")
  def fromMap(d: Dict): MyEvent = {
    MyEvent(
      new DateTime(long(d("timestamp")) * 1000),
      str(d("oid")),
      int(d("status"))
    )
  }
}
import org.apache.curator.framework.CuratorFrameworkFactory
import org.apache.curator.retry.BoundedExponentialBackoffRetry
import io.druid.granularity._
import io.druid.query.aggregation.LongSumAggregatorFactory
import com.metamx.common.Granularity
import org.joda.time.Period

class MyEventBeamFactory extends BeamFactory[MyEvent] {
  // Return a singleton, so the same connection is shared across all tasks in the same JVM.
  def makeBeam: Beam[MyEvent] = MyEventBeamFactory.BeamInstance

  object MyEventBeamFactory {
    val BeamInstance: Beam[MyEvent] = {
      // Tranquility uses ZooKeeper (through the Curator framework) for coordination.
      val curator = CuratorFrameworkFactory.newClient(
        "{IP_2}:2181",
        new BoundedExponentialBackoffRetry(100, 3000, 5)
      )
      curator.start()
      val indexService = DruidEnvironment("druid/overlord") // Your overlord's druid.service, with slashes replaced by colons.
      val discoveryPath = "/druid/discovery" // Your overlord's druid.discovery.curator.path
      val dataSource = "events_druid"
      val dimensions = IndexedSeq("oid")
      val aggregators = Seq(new LongSumAggregatorFactory("status", "status"))
      // Expects simpleEvent.timestamp to return a Joda DateTime object.
      DruidBeams
        .builder((event: MyEvent) => event.time)
        .curator(curator)
        .discoveryPath(discoveryPath)
        .location(DruidLocation(indexService, dataSource))
        .rollup(DruidRollup(SpecificDruidDimensions(dimensions), aggregators, QueryGranularities.MINUTE))
        .tuning(
          ClusteredBeamTuning(
            segmentGranularity = Granularity.HOUR,
            windowPeriod = new Period("PT10M"),
            partitions = 1,
            replicants = 1
          )
        )
        .buildBeam()
    }
  }
}
This is the druid indexing task log: (index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0)
2017-12-28T13:05:19,299 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
"type" : "index_realtime",
"id" : "index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0",
"resource" : {
"availabilityGroup" : "events_druid-2017-12-28T13:00:00.000Z-0000",
"requiredCapacity" : 1
},
"spec" : {
"dataSchema" : {
"dataSource" : "events_druid",
"parser" : {
"type" : "map",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "timestamp",
"format" : "iso",
"missingValue" : null
},
"dimensionsSpec" : {
"dimensions" : [ "oid" ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "longSum",
"name" : "status",
"fieldName" : "status",
"expression" : null
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "HOUR",
"queryGranularity" : {
"type" : "duration",
"duration" : 60000,
"origin" : "1970-01-01T00:00:00.000Z"
},
"rollup" : true,
"intervals" : null
}
},
"ioConfig" : {
"type" : "realtime",
"firehose" : {
"type" : "clipped",
"delegate" : {
"type" : "timed",
"delegate" : {
"type" : "receiver",
"serviceName" : "firehose:druid:overlord:events_druid-013-0000-0000",
"bufferSize" : 100000
},
"shutoffTime" : "2017-12-28T14:15:00.000Z"
},
"interval" : "2017-12-28T13:00:00.000Z/2017-12-28T14:00:00.000Z"
},
"firehoseV2" : null
},
"tuningConfig" : {
"type" : "realtime",
"maxRowsInMemory" : 75000,
"intermediatePersistPeriod" : "PT10M",
"windowPeriod" : "PT10M",
"basePersistDirectory" : "/tmp/1514466313873-0",
"versioningPolicy" : {
"type" : "intervalStart"
},
"rejectionPolicy" : {
"type" : "none"
},
"maxPendingPersists" : 0,
"shardSpec" : {
"type" : "linear",
"partitionNum" : 0
},
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : "lz4",
"metricCompression" : "lz4",
"longEncoding" : "longs"
},
"buildV9Directly" : true,
"persistThreadPriority" : 0,
"mergeThreadPriority" : 0,
"reportParseExceptions" : false,
"handoffConditionTimeout" : 0,
"alertTimeout" : 0
}
},
"context" : null,
"groupId" : "index_realtime_events_druid",
"dataSource" : "events_druid"
}
2017-12-28T13:05:19,312 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Attempting to lock file[/apps/druid/tasks/index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0/lock].
2017-12-28T13:05:19,313 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Acquired lock file[/apps/druid/tasks/index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0/lock] in 1ms.
2017-12-28T13:05:19,317 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0
2017-12-28T13:05:19,323 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0] location changed to [TaskLocation{host='hadooptest9.{host}', port=8100}].
2017-12-28T13:05:19,323 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0] status changed to [RUNNING].
2017-12-28T13:05:19,327 INFO [main] org.eclipse.jetty.server.Server - jetty-9.3.19.v20170502
2017-12-28T13:05:19,350 INFO [task-runner-0-priority-0] io.druid.segment.realtime.plumber.RealtimePlumber - Creating plumber using rejectionPolicy[io.druid.segment.realtime.plumber.NoopRejectionPolicyFactory$1#7925d517]
2017-12-28T13:05:19,351 INFO [task-runner-0-priority-0] io.druid.server.coordination.CuratorDataSegmentServerAnnouncer - Announcing self[DruidServerMetadata{name='hadooptest9.{host}:8100', host='hadooptest9.{host}:8100', maxSize=0, tier='_default_tier', type='realtime', priority='0'}] at [/druid/announcements/hadooptest9.{host}:8100]
2017-12-28T13:05:19,382 INFO [task-runner-0-priority-0] io.druid.segment.realtime.plumber.RealtimePlumber - Expect to run at [2017-12-28T14:10:00.000Z]
2017-12-28T13:05:19,392 INFO [task-runner-0-priority-0] io.druid.segment.realtime.plumber.RealtimePlumber - Starting merge and push.
2017-12-28T13:05:19,392 INFO [task-runner-0-priority-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] segments. Attempting to hand off segments that start before [1970-01-01T00:00:00.000Z].
2017-12-28T13:05:19,392 INFO [task-runner-0-priority-0] io.druid.segment.realtime.plumber.RealtimePlumber - Found [0] sinks to persist and merge
2017-12-28T13:05:19,451 INFO [task-runner-0-priority-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Connecting firehose: firehose:druid:overlord:events_druid-013-0000-0000
2017-12-28T13:05:19,453 INFO [task-runner-0-priority-0] io.druid.segment.realtime.firehose.EventReceiverFirehoseFactory - Found chathandler of class[io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider]
2017-12-28T13:05:19,453 INFO [task-runner-0-priority-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[firehose:druid:overlord:events_druid-013-0000-0000]
2017-12-28T13:05:19,454 INFO [task-runner-0-priority-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='firehose:druid:overlord:events_druid-013-0000-0000', host='hadooptest9.{host}', port=8100}]
2017-12-28T13:05:19,502 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
2017-12-28T13:05:19,502 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider as a provider class
2017-12-28T13:05:19,502 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering io.druid.server.initialization.jetty.CustomExceptionMapper as a provider class
2017-12-28T13:05:19,502 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering io.druid.server.StatusResource as a root resource class
2017-12-28T13:05:19,505 INFO [main] com.sun.jersey.server.impl.application.WebApplicationImpl - Initiating Jersey application, version 'Jersey: 1.19.3 10/24/2016 03:43 PM'
2017-12-28T13:05:19,515 INFO [task-runner-0-priority-0] io.druid.segment.realtime.firehose.ServiceAnnouncingChatHandlerProvider - Registering Eventhandler[events_druid-013-0000-0000]
2017-12-28T13:05:19,515 INFO [task-runner-0-priority-0] io.druid.curator.discovery.CuratorServiceAnnouncer - Announcing service[DruidNode{serviceName='events_druid-013-0000-0000', host='hadooptest9.{host}', port=8100}]
2017-12-28T13:05:19,529 WARN [task-runner-0-priority-0] org.apache.curator.utils.ZKPaths - The version of ZooKeeper being used doesn't support Container nodes. CreateMode.PERSISTENT will be used instead.
2017-12-28T13:05:19,535 INFO [task-runner-0-priority-0] io.druid.server.metrics.EventReceiverFirehoseRegister - Registering EventReceiverFirehoseMetric for service [firehose:druid:overlord:events_druid-013-0000-0000]
2017-12-28T13:05:19,536 INFO [task-runner-0-priority-0] io.druid.data.input.FirehoseFactory - Firehose created, will shut down at: 2017-12-28T14:15:00.000Z
2017-12-28T13:05:19,574 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.initialization.jetty.CustomExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2017-12-28T13:05:19,576 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
2017-12-28T13:05:19,583 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider to GuiceManagedComponentProvider with the scope "Singleton"
2017-12-28T13:05:19,845 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.http.security.StateResourceFilter to GuiceInstantiatedComponentProvider
2017-12-28T13:05:19,863 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.http.SegmentListerResource to GuiceInstantiatedComponentProvider
2017-12-28T13:05:19,874 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
2017-12-28T13:05:19,876 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
2017-12-28T13:05:19,880 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.query.lookup.LookupListeningResource to GuiceInstantiatedComponentProvider
2017-12-28T13:05:19,882 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.query.lookup.LookupIntrospectionResource to GuiceInstantiatedComponentProvider
2017-12-28T13:05:19,883 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2017-12-28T13:05:19,896 WARN [main] com.sun.jersey.spi.inject.Errors - The following warnings have been detected with resource and/or provider classes:
WARNING: A HTTP GET method, public void io.druid.server.http.SegmentListerResource.getSegments(long,long,long,javax.servlet.http.HttpServletRequest) throws java.io.IOException, MUST return a non-void type.
2017-12-28T13:05:19,905 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler#2fba0dac{/,null,AVAILABLE}
2017-12-28T13:05:19,914 INFO [main] org.eclipse.jetty.server.AbstractConnector - Started ServerConnector#25218a4d{HTTP/1.1,[http/1.1]}{0.0.0.0:8100}
2017-12-28T13:05:19,914 INFO [main] org.eclipse.jetty.server.Server - Started #6014ms
2017-12-28T13:05:19,915 INFO [main] io.druid.java.util.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.listener.announcer.ListenerResourceAnnouncer.start()] on object[io.druid.query.lookup.LookupResourceListenerAnnouncer#426710f0].
2017-12-28T13:05:19,919 INFO [main] io.druid.server.listener.announcer.ListenerResourceAnnouncer - Announcing start time on [/druid/listeners/lookups/__default/hadooptest9.{host}:8100]
2017-12-28T13:05:20,517 WARN [task-runner-0-priority-0] io.druid.segment.realtime.firehose.PredicateFirehose - [0] InputRow(s) ignored as they do not satisfy the predicate
This is index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0 payload:
{
  "task": "index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0",
  "payload": {
    "id": "index_realtime_events_druid_2017-12-28T13:00:00.000Z_0_0",
    "resource": {
      "availabilityGroup": "events_druid-2017-12-28T13:00:00.000Z-0000",
      "requiredCapacity": 1
    },
    "spec": {
      "dataSchema": {
        "dataSource": "events_druid",
        "parser": {
          "type": "map",
          "parseSpec": {
            "format": "json",
            "timestampSpec": { "column": "timestamp", "format": "iso", "missingValue": null },
            "dimensionsSpec": { "dimensions": ["oid"], "spatialDimensions": [] }
          }
        },
        "metricsSpec": [
          { "type": "longSum", "name": "status", "fieldName": "status", "expression": null }
        ],
        "granularitySpec": {
          "type": "uniform",
          "segmentGranularity": "HOUR",
          "queryGranularity": { "type": "duration", "duration": 60000, "origin": "1970-01-01T00:00:00.000Z" },
          "rollup": true,
          "intervals": null
        }
      },
      "ioConfig": {
        "type": "realtime",
        "firehose": {
          "type": "clipped",
          "delegate": {
            "type": "timed",
            "delegate": {
              "type": "receiver",
              "serviceName": "firehose:druid:overlord:events_druid-013-0000-0000",
              "bufferSize": 100000
            },
            "shutoffTime": "2017-12-28T14:15:00.000Z"
          },
          "interval": "2017-12-28T13:00:00.000Z/2017-12-28T14:00:00.000Z"
        },
        "firehoseV2": null
      },
      "tuningConfig": {
        "type": "realtime",
        "maxRowsInMemory": 75000,
        "intermediatePersistPeriod": "PT10M",
        "windowPeriod": "PT10M",
        "basePersistDirectory": "/tmp/1514466313873-0",
        "versioningPolicy": { "type": "intervalStart" },
        "rejectionPolicy": { "type": "none" },
        "maxPendingPersists": 0,
        "shardSpec": { "type": "linear", "partitionNum": 0 },
        "indexSpec": {
          "bitmap": { "type": "concise" },
          "dimensionCompression": "lz4",
          "metricCompression": "lz4",
          "longEncoding": "longs"
        },
        "buildV9Directly": true,
        "persistThreadPriority": 0,
        "mergeThreadPriority": 0,
        "reportParseExceptions": false,
        "handoffConditionTimeout": 0,
        "alertTimeout": 0
      }
    },
    "context": null,
    "groupId": "index_realtime_events_druid",
    "dataSource": "events_druid"
  }
}
This is the end of the Spark job stderr:
50:09 INFO ZooKeeper: Client environment:os.version=3.10.0-514.10.2.el7.x86_64
17/12/28 14:50:09 INFO ZooKeeper: Client environment:user.name=yarn
17/12/28 14:50:09 INFO ZooKeeper: Client environment:user.home=/home/yarn
17/12/28 14:50:09 INFO ZooKeeper: Client environment:user.dir=/data1/hadoop/yarn/local/usercache/hdfs/appcache/application_1512485869804_6924/container_e58_1512485869804_6924_01_000002
17/12/28 14:50:09 INFO ZooKeeper: Initiating client connection, connectString={IP2}:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState#5967905
17/12/28 14:50:09 INFO ClientCnxn: Opening socket connection to server {IP2}/{IP2}:2181. Will not attempt to authenticate using SASL (unknown error)
17/12/28 14:50:09 INFO ClientCnxn: Socket connection established, initiating session, client: /{IP6}:42704, server: {IP2}/{IP2}:2181
17/12/28 14:50:09 INFO ClientCnxn: Session establishment complete on server {IP2}/{IP2}:2181, sessionid = 0x25fa4ea15980119, negotiated timeout = 40000
17/12/28 14:50:10 INFO ConnectionStateManager: State change: CONNECTED
17/12/28 14:50:10 INFO Version: HV000001: Hibernate Validator 5.1.3.Final
17/12/28 14:50:10 INFO JsonConfigurator: Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='extensions', hadoopDependenciesDir='hadoop-dependencies', hadoopContainerDruidClasspath='null', loadList=null}]
17/12/28 14:50:10 INFO LoggingEmitter: Start: started [true]
17/12/28 14:50:11 INFO FinagleRegistry: Adding resolver for scheme[disco].
17/12/28 14:50:11 INFO CachedKafkaConsumer: Initial fetch for spark-executor-use_a_separate_group_id_for_each_stream events_test 0 6658
17/12/28 14:50:12 INFO ClusteredBeam: Global latestCloseTime[2017-12-28T12:00:00.000Z] for identifier[druid:overlord/events_druid] has moved past timestamp[2017-12-28T12:00:00.000Z], not creating merged beam
17/12/28 14:50:12 INFO ClusteredBeam: Turns out we decided not to actually make beams for identifier[druid:overlord/events_druid] timestamp[2017-12-28T12:00:00.000Z]. Returning None.
17/12/28 14:50:12 WARN MapPartitioner: Cannot partition object of class[class MyEvent] by time and dimensions. Consider implementing a Partitioner.
17/12/28 14:50:12 INFO ClusteredBeam: Global latestCloseTime[2017-12-28T12:00:00.000Z] for identifier[druid:overlord/events_druid] has moved past timestamp[2017-12-28T12:00:00.000Z], not creating merged beam
17/12/28 14:50:12 INFO ClusteredBeam: Turns out we decided not to actually make beams for identifier[druid:overlord/events_druid] timestamp[2017-12-28T12:00:00.000Z]. Returning None.
17/12/28 14:50:12 INFO ClusteredBeam: Global latestCloseTime[2017-12-28T12:00:00.000Z] for identifier[druid:overlord/events_druid] has moved past timestamp[2017-12-28T12:00:00.000Z], not creating merged beam
17/12/28 14:50:12 INFO ClusteredBeam: Turns out we decided not to actually make beams for identifier[druid:overlord/events_druid] timestamp[2017-12-28T12:00:00.000Z]. Returning None.
17/12/28 14:50:12 INFO ClusteredBeam: Global latestCloseTime[2017-12-28T12:00:00.000Z] for identifier[druid:overlord/events_druid] has moved past timestamp[2017-12-28T12:00:00.000Z], not creating merged beam
17/12/28 14:50:12 INFO ClusteredBeam: Turns out we decided not to actually make beams for identifier[druid:overlord/events_druid] timestamp[2017-12-28T12:00:00.000Z]. Returning None.
17/12/28 14:50:12 INFO ClusteredBeam: Global latestCloseTime[2017-12-28T12:00:00.000Z] for identifier[druid:overlord/events_druid] has moved past timestamp[2017-12-28T12:00:00.000Z], not creating merged beam
17/12/28 14:50:12 INFO ClusteredBeam: Turns out we decided not to actually make beams for identifier[druid:overlord/events_druid] timestamp[2017-12-28T12:00:00.000Z]. Returning None.
17/12/28 14:50:16 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 1541 bytes result sent to driver
I have also written the results to a text file to make sure the data is arriving and well formed. Here are a few lines of the text file:
MyEvent(2017-12-28T16:10:00.387+03:00,0010,1)
MyEvent(2017-12-28T16:10:00.406+03:00,0030,1)
MyEvent(2017-12-28T16:10:00.417+03:00,0010,1)
MyEvent(2017-12-28T16:10:00.431+03:00,0010,1)
MyEvent(2017-12-28T16:10:00.448+03:00,0010,1)
MyEvent(2017-12-28T16:10:00.464+03:00,0030,1)
Help is much appreciated. Thanks.
This problem was solved by adding a timestampSpec to the DruidBeams builder, as follows:
DruidBeams
  .builder((event: MyEvent) => event.time)
  .curator(curator)
  .discoveryPath(discoveryPath)
  .location(DruidLocation(indexService, dataSource))
  .rollup(DruidRollup(SpecificDruidDimensions(dimensions), aggregators, QueryGranularities.MINUTE))
  .tuning(
    ClusteredBeamTuning(
      segmentGranularity = Granularity.HOUR,
      windowPeriod = new Period("PT10M"),
      partitions = 1,
      replicants = 1
    )
  )
  .timestampSpec(new TimestampSpec("timestamp", "posix", null))
  .buildBeam()
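For completeness, a hedged note on the class used in the fix: TimestampSpec here is assumed to be Druid's input-spec class, imported as below.
// Assumed import for the fix above; the constructor is
// TimestampSpec(timestampColumn, timestampFormat, missingValue).
// "posix" matches the epoch-seconds value that MyEvent.toMap writes into
// "timestamp", whereas the task spec shown above defaulted to the "iso" format.
import io.druid.data.input.impl.TimestampSpec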

Intern sends headers data into address bar

Every time I run ./node_modules/.bin/intern-runner config=tests/intern
Intern sends some junk data to the browser address bar and tries to execute it.
Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%20style%3D%22left%3A%200%3B%20position%3A%20absolute%3B%20top%3A%20-1000px%3B%22%3Ea%3C%2Fdiv%3E])
???
And then it finally gets a valid URL where it can continue with the testing suite.
10:36:14.837 INFO - Executing: [get: http://localhost:9000/index.html])
I've tested it on Firefox 27, 32, and 38.2, on a Linux box and on Windows 7 + Cygwin, and found the same behaviour everywhere. It doesn't appear to be a problem with Selenium (I tried grid and standalone configurations) or with the driver itself.
My config is mostly from https://github.com/theintern/intern-tutorial
This behaviour causes problems with IE (11 in my case), as it's not able to get past the first junk request and gets stuck on it.
10:35:04.654 INFO - Launching a standalone Selenium Server
10:35:04.815 INFO - Java: Oracle Corporation 24.85-b03
10:35:04.815 INFO - OS: Linux 3.10.0-229.11.1.el7.x86_64 amd64
10:35:04.854 INFO - v2.47.1, with Core v2.47.1. Built from revision 411b314
10:35:05.207 INFO - Driver provider org.openqa.selenium.ie.InternetExplorerDriver registration is skipped:
registration capabilities Capabilities [{platform=WINDOWS, ensureCleanSession=true, browserName=internet explorer, version=}] does not match the current platform LINUX
10:35:05.208 INFO - Driver provider org.openqa.selenium.edge.EdgeDriver registration is skipped:
registration capabilities Capabilities [{platform=WINDOWS, browserName=MicrosoftEdge, version=}] does not match the current platform LINUX
10:35:05.208 INFO - Driver class not found: com.opera.core.systems.OperaDriver
10:35:05.208 INFO - Driver provider com.opera.core.systems.OperaDriver is not registered
10:35:05.660 INFO - RemoteWebDriver instances should connect to: http://127.0.0.1:4444/wd/hub
10:35:05.660 INFO - Selenium Server is up and running
10:36:08.181 INFO - Executing: [new session: Capabilities [{idle-timeout=60, platform=LINUX, browserName=firefox, browserstack.selenium_version=2.45.0, name=tests/intern, version=38.2}]])
10:36:08.190 INFO - Creating a new session for Capabilities [{idle-timeout=60, platform=LINUX, browserName=firefox, browserstack.selenium_version=2.45.0, name=tests/intern, version=38.2}]
10:36:13.368 INFO - Done: [new session: Capabilities [{idle-timeout=60, platform=LINUX, browserName=firefox, browserstack.selenium_version=2.45.0, name=tests/intern, version=38.2}]]
10:36:13.378 INFO - Executing: [get: about:blank])
10:36:13.412 INFO - Done: [get: about:blank]
10:36:13.420 INFO - Executing: [get local storage size])
10:36:13.422 INFO - Executing: [get location context])
10:36:13.425 INFO - Executing: [get application cache status])
10:36:13.427 INFO - Executing: [take screenshot])
10:36:13.428 INFO - Executing: [execute async script: arguments[0](true);, []])
10:36:13.435 WARN - Exception thrown
org.openqa.selenium.UnsupportedCommandException: driver (org.openqa.selenium.firefox.FirefoxDriver) does not support org.openqa.selenium.html5.LocationContext
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: 'xxxxx', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
at org.openqa.selenium.remote.server.handler.html5.Utils.convert(Utils.java:91)
at org.openqa.selenium.remote.server.handler.html5.Utils.getLocationContext(Utils.java:57)
at org.openqa.selenium.remote.server.handler.html5.GetLocationContext.call(GetLocationContext.java:32)
at org.openqa.selenium.remote.server.handler.html5.GetLocationContext.call(GetLocationContext.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.openqa.selenium.remote.server.DefaultSession$1.run(DefaultSession.java:176)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
10:36:13.440 WARN - Exception: driver (org.openqa.selenium.firefox.FirefoxDriver) does not support org.openqa.selenium.html5.LocationContext
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: '-------', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
10:36:13.438 WARN - Exception thrown
org.openqa.selenium.UnsupportedCommandException: driver (org.openqa.selenium.firefox.FirefoxDriver) does not support org.openqa.selenium.html5.WebStorage
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: '-------', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
at org.openqa.selenium.remote.server.handler.html5.Utils.convert(Utils.java:91)
at org.openqa.selenium.remote.server.handler.html5.Utils.getWebStorage(Utils.java:62)
at org.openqa.selenium.remote.server.handler.html5.GetLocalStorageSize.call(GetLocalStorageSize.java:31)
at org.openqa.selenium.remote.server.handler.html5.GetLocalStorageSize.call(GetLocalStorageSize.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.openqa.selenium.remote.server.DefaultSession$1.run(DefaultSession.java:176)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
10:36:13.467 WARN - Exception: driver (org.openqa.selenium.firefox.FirefoxDriver) does not support org.openqa.selenium.html5.WebStorage
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: '-------', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
10:36:13.437 WARN - Exception thrown
org.openqa.selenium.UnsupportedCommandException: driver (org.openqa.selenium.firefox.FirefoxDriver) does not support org.openqa.selenium.html5.ApplicationCache
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: '-------', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
at org.openqa.selenium.remote.server.handler.html5.Utils.convert(Utils.java:91)
at org.openqa.selenium.remote.server.handler.html5.Utils.getApplicationCache(Utils.java:47)
at org.openqa.selenium.remote.server.handler.html5.GetAppCacheStatus.call(GetAppCacheStatus.java:32)
at org.openqa.selenium.remote.server.handler.html5.GetAppCacheStatus.call(GetAppCacheStatus.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.openqa.selenium.remote.server.DefaultSession$1.run(DefaultSession.java:176)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
10:36:13.487 WARN - Exception: driver (org.openqa.selenium.firefox.FirefoxDriver) does not support org.openqa.selenium.html5.ApplicationCache
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: '-------', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
10:36:13.503 INFO - Executing: [get window size])
10:36:13.504 INFO - Executing: [doubleclick: no args])
10:36:13.505 INFO - Executing: [Long press: null])
10:36:13.513 INFO - Done: [take screenshot]
10:36:13.539 INFO - Done: [execute async script: arguments[0](true);, []]
10:36:13.557 INFO - Done: [get window size]
10:36:13.563 INFO - Executing: [set window size])
10:36:13.572 WARN - Exception thrown
java.lang.UnsupportedOperationException: Underlying driver does not implement advanced user interactions yet.
at org.openqa.selenium.support.events.EventFiringWebDriver.getTouch(EventFiringWebDriver.java:312)
at org.openqa.selenium.remote.server.handler.interactions.touch.LongPressOnElement.call(LongPressOnElement.java:41)
at org.openqa.selenium.remote.server.handler.interactions.touch.LongPressOnElement.call(LongPressOnElement.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.openqa.selenium.remote.server.DefaultSession$1.run(DefaultSession.java:176)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
10:36:13.574 WARN - Exception: Underlying driver does not implement advanced user interactions yet.
10:36:13.574 WARN - Exception thrown
org.openqa.selenium.WebDriverException: [JavaScript Error: "Argument to isShown must be of type Element" {file: "file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/synthetic-mouse.js" line: 8547}]'[JavaScript Error: "Argument to isShown must be of type Element" {file: "file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/synthetic-mouse.js" line: 8547}]' when calling method: [wdIMouse::doubleClick]
Command duration or timeout: 12 milliseconds
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: '-------', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Session ID: 6ba8c75c-021b-4a46-8368-6dd3c0487142
Driver info: org.openqa.selenium.firefox.FirefoxDriver
Capabilities [{platform=LINUX, acceptSslCerts=true, javascriptEnabled=true, cssSelectorsEnabled=true, databaseEnabled=true, browserName=firefox, handlesAlerts=true, nativeEvents=false, webStorageEnabled=true, rotatable=false, locationContextEnabled=true, applicationCacheEnabled=true, takesScreenshot=true, version=38.2.1}]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.openqa.selenium.remote.ErrorHandler.createThrowable(ErrorHandler.java:206)
at org.openqa.selenium.remote.ErrorHandler.throwIfResponseFailed(ErrorHandler.java:158)
at org.openqa.selenium.remote.RemoteWebDriver.execute(RemoteWebDriver.java:595)
at org.openqa.selenium.remote.RemoteWebDriver.execute(RemoteWebDriver.java:618)
at org.openqa.selenium.remote.RemoteExecuteMethod.execute(RemoteExecuteMethod.java:33)
at org.openqa.selenium.remote.RemoteMouse.doubleClick(RemoteMouse.java:71)
at org.openqa.selenium.support.events.internal.EventFiringMouse.doubleClick(EventFiringMouse.java:46)
at org.openqa.selenium.remote.server.handler.interactions.DoubleClickInSession.call(DoubleClickInSession.java:34)
at org.openqa.selenium.remote.server.handler.interactions.DoubleClickInSession.call(DoubleClickInSession.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.openqa.selenium.remote.server.DefaultSession$1.run(DefaultSession.java:176)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.openqa.selenium.WebDriverException: [JavaScript Error: "Argument to isShown must be of type Element" {file: "file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/synthetic-mouse.js" line: 8547}]'[JavaScript Error: "Argument to isShown must be of type Element" {file: "file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/synthetic-mouse.js" line: 8547}]' when calling method: [wdIMouse::doubleClick]
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: '-------', ip: '-------', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
at <anonymous class>.FirefoxDriver.prototype.mouseDoubleClick(file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/driver-component.js:11092)
at <anonymous class>.DelayedCommand.prototype.executeInternal_/h(file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/command-processor.js:12643)
at <anonymous class>.DelayedCommand.prototype.executeInternal_(file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/command-processor.js:12648)
at <anonymous class>.DelayedCommand.prototype.execute/<(file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/command-processor.js:12590)
10:36:13.578 WARN - Exception: [JavaScript Error: "Argument to isShown must be of type Element" {file: "file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/synthetic-mouse.js" line: 8547}]'[JavaScript Error: "Argument to isShown must be of type Element" {file: "file:///tmp/anonymous6679311816744638276webdriver-profile/extensions/fxdriver#googlecode.com/components/synthetic-mouse.js" line: 8547}]' when calling method: [wdIMouse::doubleClick]
Build info: version: '2.47.1', revision: '411b314', time: '2015-07-30 03:03:16'
System info: host: 'xxxxx', ip: 'xxxxxxxxx', os.name: 'Linux', os.arch: 'amd64', os.version: '3.10.0-229.11.1.el7.x86_64', java.version: '1.7.0_85'
Driver info: driver.version: unknown
10:36:13.597 INFO - Done: [set window size]
10:36:13.601 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Ctitle%3Ea%3C%2Ftitle%3E])
10:36:13.650 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Ctitle%3Ea%3C%2Ftitle%3E]
10:36:13.655 INFO - Executing: [get title])
10:36:13.664 INFO - Done: [get title]
10:36:13.669 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cstyle%3E%23a%7Bwidth%3A8px%3Bheight%3A8px%3B-ms-transform%3Ascale(0.5)%3B-moz-transform%3Ascale(0.5)%3B-webkit-transform%3Ascale(0.5)%3Btransform%3Ascale(0.5)%3B%7D%3C%2Fstyle%3E%3Cdiv%20id%3D%22a%22%3E%3C%2Fdiv%3E])
10:36:13.691 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cstyle%3E%23a%7Bwidth%3A8px%3Bheight%3A8px%3B-ms-transform%3Ascale(0.5)%3B-moz-transform%3Ascale(0.5)%3B-webkit-transform%3Ascale(0.5)%3Btransform%3Ascale(0.5)%3B%7D%3C%2Fstyle%3E%3Cdiv%20id%3D%22a%22%3E%3C%2Fdiv%3E]
10:36:13.699 INFO - Executing: [execute script: return (function () {
var bbox = document.getElementById('a').getBoundingClientRect();
return bbox.right - bbox.left === 4;
}).apply(this, arguments);, []])
10:36:13.707 INFO - Done: [execute script: return (function () {
var bbox = document.getElementById('a').getBoundingClientRect();
return bbox.right - bbox.left === 4;
}).apply(this, arguments);, []]
10:36:13.711 INFO - Executing: [get: about:blank])
10:36:13.734 INFO - Done: [get: about:blank]
10:36:13.742 INFO - Executing: [fetching available log types])
10:36:13.744 INFO - Executing: [find element: By.tagName: html])
10:36:13.746 INFO - Executing: [find element: By.tagName: html])
10:36:13.751 INFO - Done: [fetching available log types]
10:36:13.795 INFO - Done: [find element: By.tagName: html]
10:36:13.801 INFO - Executing: [tag name: 0 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> tag name: html]])
10:36:13.803 INFO - Done: [find element: By.tagName: html]
10:36:13.811 INFO - Done: [tag name: 0 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> tag name: html]]
10:36:13.812 INFO - Executing: [get element attribute: 0 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> tag name: html], nonexisting])
10:36:13.819 INFO - Done: [get element attribute: 0 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> tag name: html], nonexisting]
10:36:13.823 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%3E%3C%2Fdiv%3E])
10:36:13.847 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%3E%3C%2Fdiv%3E]
10:36:13.851 INFO - Executing: [execute script: return document.getElementById("a");, []])
10:36:13.859 INFO - Done: [execute script: return document.getElementById("a");, []]
10:36:13.862 INFO - Executing: [tag name: 1 [org.openqa.selenium.remote.RemoteWebElement@294ad511 -> unknown locator]])
10:36:13.869 INFO - Done: [tag name: 1 [org.openqa.selenium.remote.RemoteWebElement@294ad511 -> unknown locator]]
10:36:13.873 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%20style%3D%22opacity%3A%20.1%3B%22%3Ea%3C%2Fdiv%3E])
10:36:13.898 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%20style%3D%22opacity%3A%20.1%3B%22%3Ea%3C%2Fdiv%3E]
10:36:13.902 INFO - Executing: [execute script: var o = document.getElementById("a").style.opacity; return o && o.charAt(0) === "0";, []])
10:36:13.910 INFO - Done: [execute script: var o = document.getElementById("a").style.opacity; return o && o.charAt(0) === "0";, []]
10:36:13.913 INFO - Executing: [execute script: document.getElementById("a").style.opacity = "0";, []])
10:36:13.919 INFO - Done: [execute script: document.getElementById("a").style.opacity = "0";, []]
10:36:13.923 INFO - Executing: [find element: By.id: a])
10:36:13.929 INFO - Done: [find element: By.id: a]
10:36:13.936 INFO - Executing: [is displayed: 2 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]])
10:36:13.943 INFO - Done: [is displayed: 2 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]]
10:36:13.949 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%20style%3D%22left%3A%200%3B%20position%3A%20absolute%3B%20top%3A%20-1000px%3B%22%3Ea%3C%2Fdiv%3E])
10:36:13.970 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%20style%3D%22left%3A%200%3B%20position%3A%20absolute%3B%20top%3A%20-1000px%3B%22%3Ea%3C%2Fdiv%3E]
10:36:13.974 INFO - Executing: [find element: By.id: a])
10:36:13.980 INFO - Done: [find element: By.id: a]
10:36:13.984 INFO - Executing: [is displayed: 3 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]])
10:36:13.995 INFO - Done: [is displayed: 3 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]]
10:36:13.999 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cform%20method%3D%22get%22%20action%3D%22about%3Ablank%22%3E%3Cinput%20id%3D%22a%22%20type%3D%22submit%22%20name%3D%22a%22%20value%3D%22a%22%3E%3C%2Fform%3E])
10:36:14.023 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cform%20method%3D%22get%22%20action%3D%22about%3Ablank%22%3E%3Cinput%20id%3D%22a%22%20type%3D%22submit%22%20name%3D%22a%22%20value%3D%22a%22%3E%3C%2Fform%3E]
10:36:14.031 INFO - Executing: [find element: By.id: a])
10:36:14.045 INFO - Done: [find element: By.id: a]
10:36:14.050 INFO - Executing: [submit: 4 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]])
10:36:14.069 INFO - Done: [submit: 4 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]]
10:36:14.072 INFO - Executing: [get current url])
10:36:14.079 INFO - Done: [get current url]
10:36:14.084 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%20style%3D%22margin%3A%203000px%3B%22%3E%3C%2Fdiv%3E])
10:36:14.105 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cdiv%20id%3D%22a%22%20style%3D%22margin%3A%203000px%3B%22%3E%3C%2Fdiv%3E]
10:36:14.109 INFO - Executing: [find element: By.id: a])
10:36:14.122 INFO - Done: [find element: By.id: a]
10:36:14.126 INFO - Executing: [get location: 5 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]])
10:36:14.135 INFO - Done: [get location: 5 [[FirefoxDriver: firefox on LINUX (6ba8c75c-021b-4a46-8368-6dd3c0487142)] -> id: a]]
10:36:14.140 INFO - Executing: [get: about:blank?1])
10:36:14.165 INFO - Done: [get: about:blank?1]
10:36:14.170 INFO - Executing: [refresh])
10:36:14.191 INFO - Done: [refresh]
10:36:14.200 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cscript%3Ecounter%20%3D%200%3B%20var%20d%20%3D%20document%3B%20d.onclick%20%3D%20d.onmousedown%20%3D%20d.onmouseup%20%3D%20function%20()%20%7B%20counter%2B%2B%3B%20%7D%3B%3C%2Fscript%3E])
10:36:14.220 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cscript%3Ecounter%20%3D%200%3B%20var%20d%20%3D%20document%3B%20d.onclick%20%3D%20d.onmousedown%20%3D%20d.onmouseup%20%3D%20function%20()%20%7B%20counter%2B%2B%3B%20%7D%3B%3C%2Fscript%3E]
10:36:14.224 INFO - Executing: [find element: By.tagName: html])
10:36:14.232 INFO - Done: [find element: By.tagName: html]
10:36:14.236 INFO - Executing: [mousemove: 6 false])
10:36:14.255 INFO - Done: [mousemove: 6 false]
10:36:14.360 INFO - Executing: [doubleclick: no args])
10:36:14.379 INFO - Done: [doubleclick: no args]
10:36:14.382 INFO - Executing: [execute script: return counter;, []])
10:36:14.396 INFO - Done: [execute script: return counter;, []]
10:36:14.402 INFO - Executing: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cstyle%3E%23a%7Bwidth%3A8px%3Bheight%3A8px%3B-ms-transform%3Ascale(0.5)%3B-moz-transform%3Ascale(0.5)%3B-webkit-transform%3Ascale(0.5)%3Btransform%3Ascale(0.5)%3B%7D%3C%2Fstyle%3E%3Cdiv%20id%3D%22a%22%3E%3C%2Fdiv%3E])
10:36:14.429 INFO - Done: [get: data:text/html;charset=utf-8,%3C!DOCTYPE%20html%3E%3Cstyle%3E%23a%7Bwidth%3A8px%3Bheight%3A8px%3B-ms-transform%3Ascale(0.5)%3B-moz-transform%3Ascale(0.5)%3B-webkit-transform%3Ascale(0.5)%3Btransform%3Ascale(0.5)%3B%7D%3C%2Fstyle%3E%3Cdiv%20id%3D%22a%22%3E%3C%2Fdiv%3E]
10:36:14.434 INFO - Executing: [execute script: return document.getElementById("a");, []])
10:36:14.442 INFO - Done: [execute script: return document.getElementById("a");, []]
10:36:14.446 INFO - Executing: [get element size: 7 [org.openqa.selenium.remote.RemoteWebElement@7bae314f -> unknown locator]])
10:36:14.451 INFO - Done: [get element size: 7 [org.openqa.selenium.remote.RemoteWebElement@7bae314f -> unknown locator]]
10:36:14.454 INFO - Executing: [get: about:blank])
10:36:14.476 INFO - Done: [get: about:blank]
10:36:14.483 INFO - Executing: [get current url])
10:36:14.484 INFO - Executing: [execute script: return (function getCoverageData() {
function stringify(value) {
function escapeString(/*string*/ str) {
return ('"' + str.replace(/(["\\])/g, '\\$1') + '"')
.replace(/[\f]/g, '\\f')
.replace(/[\b]/g, '\\b')
.replace(/[\n]/g, '\\n')
.replace(/[\t]/g, '\\t')
.replace(/[\r]/g, '\\r'); // string
}
function serialize(value, key) {
.....
10:36:14.837 INFO - Executing: [get: http://localhost:9000/index.html])
The Intern config:
define({
    // Default desired capabilities for all environments. Individual capabilities can be overridden by any of the
    // specified browser environments in the `environments` array below as well. See
    // <https://theintern.github.io/intern/#option-capabilities> for links to the different capabilities options for
    // different services.
    //
    // Note that the `build` capability will be filled in with the current commit ID or build tag from the CI
    // environment automatically
    capabilities: {
        'browserstack.selenium_version': '2.45.0'
    },

    // Browsers to run integration testing against. Note that version numbers must be strings if used with Sauce
    // OnDemand. Options that will be permutated are browserName, version, platform, and platformVersion; any other
    // capabilities options specified for an environment will be copied as-is
    environments: [
        //{ browserName: 'internet explorer', version: '11', platform: 'WIN8' },
        //{ browserName: 'internet explorer', version: '10', platform: 'WIN8' },
        //{ browserName: 'internet explorer', version: '9', platform: 'WINDOWS' },
        { browserName: 'firefox', version: '38.2', platform: [ 'LINUX' ] }
        //{ browserName: 'chrome', version: '39', platform: [ 'WINDOWS', 'MAC' ] },
        //{ browserName: 'safari', version: '8', platform: 'MAC' }
    ],

    maxConcurrency: 2,
    tunnel: 'NullTunnel',

    loaderOptions: {
        // Packages that should be registered with the loader in each testing environment
        packages: [ { name: 'myPackage', location: '.' } ]
    },

    // Non-functional test suite(s) to run in each browser
    suites: [ 'tests/aa' ],

    // Functional test suite(s) to execute against each browser once non-functional tests are completed
    functionalSuites: [ 'tests/bb' ],

    // A regular expression matching URLs to files that should not be included in code coverage analysis
    excludeInstrumentation: /^(?:tests|node_modules)\//
});
Make sure you've followed the directions under Required Configuration on the InternetExplorerDriver wiki page. I ran into an issue like this as well; adding the registry setting mentioned on that page fixed it:
For IE 11 only, you will need to set a registry entry on the target computer so that the driver can maintain a connection to the instance of Internet Explorer it creates. For 32-bit Windows installations, the key you must examine in the registry editor is HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BFCACHE. For 64-bit Windows installations, the key is HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BFCACHE. Please note that the FEATURE_BFCACHE subkey may or may not be present, and should be created if it is not present. Important: Inside this key, create a DWORD value named iexplore.exe with the value of 0.
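For reference, assuming the 64-bit key path quoted above, that whole manual step should be reproducible from an elevated command prompt with a single reg add command (drop the Wow6432Node segment on 32-bit Windows):
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BFCACHE" /v iexplore.exe /t REG_DWORD /d 0 /f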
As for the "junk requests", they're a standard part of Leadfoot's start-up process. Leadfoot first runs a battery of tests to detect known WebDriver implementation bugs.

submitting spark job to mesos cluster using java driver code

I am trying to submit a Spark job via a Java driver to a Mesos cluster. No firewalls are active, and the Mesos console shows 8 GB of RAM and 3 cores available. Yet I get "check cluster UI to see if enough resources are available".
Here's the code:
import java.util.Properties;
import java.util.Set;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf();
conf.setMaster(mesosmasterstring);
conf.setAppName("demotask1");
conf.setJars(scalaJarList);

// FileLoader and EnvironmentConstants are our own helper classes.
Properties sparkProps = FileLoader
        .load(EnvironmentConstants.SPARK_PROPS_FILE_PATH);
Set<Object> propNames = sparkProps.keySet();
for (Object kobj : propNames) {
    conf.set(kobj.toString().trim(),
            sparkProps.getProperty(kobj.toString()).trim());
}
// I have set the following via the code above:
// spark.executor.uri=https://s3-us-west-2.amazonaws.com/minimo-dev2/spark/spark-1.3.1-bin-hadoop2.4.tgz
SparkContext sc = new SparkContext(conf);
JavaSparkContext javaSparkContext = new JavaSparkContext(sc);
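For comparison, here is a minimal self-contained sketch of the same setup with the Mesos-relevant properties set inline rather than loaded from the properties file. The class name, master URL, and memory value are placeholders chosen for illustration, not the real ones; the one constraint worth checking is that each executor (plus overhead) fits inside a single Mesos offer, and the master log below shows offers of only cpus:1 and mem:2735 per slave.
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;

public class DemoTask1 {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                // Placeholder master URL; substitute the real Mesos master host:port.
                .setMaster("mesos://mesosMaster:5050")
                .setAppName("demotask1")
                .set("spark.executor.uri",
                        "https://s3-us-west-2.amazonaws.com/minimo-dev2/spark/spark-1.3.1-bin-hadoop2.4.tgz")
                // Placeholder: keep the requested executor memory below the mem
                // advertised in a single Mesos offer (2735 MB in the log below).
                .set("spark.executor.memory", "2g");

        SparkContext sc = new SparkContext(conf);
        JavaSparkContext javaSparkContext = new JavaSparkContext(sc);

        // ... job logic goes here ...

        sc.stop();
    }
}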
My webserver log (the host on which the driver process runs):
2015-08-20 15:56:04,981 WARN [TaskSchedulerImpl] - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
2015-08-20 15:56:05,974 DEBUG [TaskSchedulerImpl] - parentName: , name: TaskSet_0, runningTasks: 0
2015-08-20 15:56:06,975 DEBUG [TaskSchedulerImpl] - parentName: , name: TaskSet_0, runningTasks: 0
2015-08-20 15:56:07,975 DEBUG [TaskSchedulerImpl] - parentName: , name: TaskSet_0, runningTasks: 0
2015-08-20 15:56:11,979 DEBUG [TaskSchedulerImpl] - parentName: , name: TaskSet_0, runningTasks: 0
2015-08-20 15:56:12,979 DEBUG [TaskSchedulerImpl] - parentName: , name: TaskSet_0, runningTasks: 0
2015-08-20 15:56:15,692 DEBUG [BlockManagerMasterActor] - [actor] received message ExpireDeadHosts from Actor[akka://sparkDriver/user/BlockManagerMaster#1596206704]
2015-08-20 15:56:15,693 DEBUG [BlockManagerMasterActor] - [actor] handled message (0.036696 ms) ExpireDeadHosts from Actor[akka://sparkDriver/user/BlockManagerMaster#1596206704]
2015-08-20 15:56:17,982 DEBUG [TaskSchedulerImpl] - parentName: , name: TaskSet_0, runningTasks: 0
2015-08-20 15:56:18,983 DEBUG [TaskSchedulerImpl] - parentName: , name: TaskSet_0, runningTasks: 0
2015-08-20 15:56:19,981 WARN [TaskSchedulerImpl] - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Mesos master log:
I0820 15:56:10.687042 32074 master.cpp:4290] Sending 1 offers to framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
I0820 15:56:10.689906 32074 master.cpp:2884] Processing DECLINE call for offers: [ 20150820-152021-1496194988-5050-32040-O934 ] for framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
I0820 15:56:10.690024 32074 hierarchical.hpp:761] Recovered cpus():1; mem():2735; disk():4031; ports():[31000-32000] (total: cpus():1; mem():2735; disk():4031; ports():[31000-32000], allocated: ) on slave 20150817-195212-2955812780-5050-29267-S0 from framework 20150820-152021-1496194988-5050-32040-0000
I0820 15:56:11.688359 32075 master.cpp:4290] Sending 1 offers to framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
I0820 15:56:11.690852 32075 master.cpp:2884] Processing DECLINE call for offers: [ 20150820-152021-1496194988-5050-32040-O935 ] for framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
I0820 15:56:11.690968 32075 hierarchical.hpp:761] Recovered cpus():1; mem():2735; disk():4031; ports():[31000-32000] (total: cpus():1; mem():2735; disk():4031; ports():[31000-32000], allocated: ) on slave 20150817-195212-2955812780-5050-29267-S1 from framework 20150820-152021-1496194988-5050-32040-0000
I0820 15:56:13.690409 32077 master.cpp:4290] Sending 1 offers to framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
I0820 15:56:13.692711 32077 master.cpp:2884] Processing DECLINE call for offers: [ 20150820-152021-1496194988-5050-32040-O936 ] for framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
I0820 15:56:13.692827 32077 hierarchical.hpp:761] Recovered cpus():1; mem():2735; disk():4031; ports():[31000-32000] (total: cpus():1; mem():2735; disk():4031; ports():[31000-32000], allocated: ) on slave 20150817-195212-2955812780-5050-29267-S2 from framework 20150820-152021-1496194988-5050-32040-0000
I0820 15:56:15.693084 32070 master.cpp:4290] Sending 1 offers to framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
I0820 15:56:15.695425 32070 master.cpp:2884] Processing DECLINE call for offers: [ 20150820-152021-1496194988-5050-32040-O937 ] for framework 20150820-152021-1496194988-5050-32040-0000 (demotask1) at scheduler-62a62438-bb19-4740-8428-7d632c958ef8@driverHost:33527
Mesos slave log:
I0820 15:56:00.962321 32133 slave.cpp:4179] Querying resource estimator for oversubscribable resources
I0820 15:56:00.962446 32133 slave.cpp:4193] Received oversubscribable resources from the resource estimator
I0820 15:56:15.967767 32128 slave.cpp:4179] Querying resource estimator for oversubscribable resources
I0820 15:56:15.967869 32128 slave.cpp:4193] Received oversubscribable resources from the resource estimator
I0820 15:56:30.331955 32132 slave.cpp:3842] Current disk usage 18.49%. Max allowed age: 5.005413813717211days
I0820 15:56:30.969789 32132 slave.cpp:4179] Querying resource estimator for oversubscribable resources
I0820 15:56:30.969914 32132 slave.cpp:4193] Received oversubscribable resources from the resource estimator
I0820 15:56:45.971680 32129 slave.cpp:4179] Querying resource estimator for oversubscribable resources
I0820 15:56:45.971807 32129 slave.cpp:4193] Received oversubscribable resources from the resource estimator
I0820 15:57:00.972995 32134 slave.cpp:4179] Querying resource estimator for oversubscribable resources

Resources