Override Spark Core_2.12 (v3.3.0) logging configuration

I'm using Grails 2.5.4 and a SparkSession instance to generate Parquet output. I recently upgraded spark-core and its related dependencies to the latest version (v3.3.0).
During SparkSession initialization via builder(), I noticed some extra logs being displayed:
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
22/07/13 11:58:54 WARN Utils: Your hostname, XY resolves to a loopback address: 127.0.1.1; using 1XX.1XX.0.1XX instead (on interface wlo1)
22/07/13 11:58:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
22/07/13 11:58:54 INFO SparkContext: Running Spark version 3.3.0
22/07/13 11:58:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/07/13 11:58:54 INFO ResourceUtils: ==============================================================
22/07/13 11:58:54 INFO ResourceUtils: No custom resources configured for spark.driver.
22/07/13 11:58:54 INFO ResourceUtils: ==============================================================
22/07/13 11:58:54 INFO SparkContext: Submitted application: ABCDE
22/07/13 11:58:54 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
22/07/13 11:58:54 INFO ResourceProfile: Limiting resource is cpu
22/07/13 11:58:54 INFO ResourceProfileManager: Added ResourceProfile id: 0
22/07/13 11:58:54 INFO SecurityManager: Changing view acls to: xy
22/07/13 11:58:54 INFO SecurityManager: Changing modify acls to: xy
22/07/13 11:58:54 INFO SecurityManager: Changing view acls groups to:
22/07/13 11:58:54 INFO SecurityManager: Changing modify acls groups to:
22/07/13 11:58:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(xy); groups with view permissions: Set(); users with modify permissions: Set(xy); groups with modify permissions: Set()
22/07/13 11:58:54 INFO Utils: Successfully started service 'sparkDriver' on port 39483.
22/07/13 11:58:54 INFO SparkEnv: Registering MapOutputTracker
22/07/13 11:58:54 INFO SparkEnv: Registering BlockManagerMaster
22/07/13 11:58:54 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/07/13 11:58:54 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/07/13 11:58:54 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/07/13 11:58:55 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-cf39a58e-e5bc-4a26-b92a-d945a0deb8e7
22/07/13 11:58:55 INFO MemoryStore: MemoryStore started with capacity 2004.6 MiB
22/07/13 11:58:55 INFO SparkEnv: Registering OutputCommitCoordinator
22/07/13 11:58:55 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/07/13 11:58:55 INFO Executor: Starting executor ID driver on host 1XX.1XX.0.1XX
22/07/13 11:58:55 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
22/07/13 11:58:55 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33993.
22/07/13 11:58:55 INFO NettyBlockTransferService: Server created on 192.168.0.135:33993
22/07/13 11:58:55 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/07/13 11:58:55 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.0.135, 33993, None)
22/07/13 11:58:55 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.0.135:33993 with 2004.6 MiB RAM, BlockManagerId(driver, 192.168.0.135, 33993, None)
22/07/13 11:58:55 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.0.135, 33993, None)
22/07/13 11:58:55 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.0.135, 33993, None)
Before initializing the SparkSession instance via the builder() method, I configured the logger levels programmatically:
Configurator.setLevel("org", Level.ERROR)
Configurator.setLevel("org.apache.spark", Level.ERROR)
Configurator.setLevel("akka", Level.ERROR)
Configurator.setLevel("scala", Level.ERROR)
Configurator.setLevel("java", Level.ERROR)
Configurator.setLevel("org.slf4j", Level.ERROR)
Configurator.setLevel("com", Level.ERROR)
Configurator.setLevel("javax", Level.ERROR)
Configurator.setLevel("jakarta", Level.ERROR)
Configurator.setLevel("io", Level.ERROR)
Configurator.setLevel("net", Level.ERROR)
Despite this, it's still picking up Spark's default log4j2.properties file. Is there a way I can override the logging configuration?
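Two approaches generally work with Spark 3.x's Log4j 2 based logging (the file locations below are assumptions for a typical Grails/Gradle layout): Spark only falls back to org/apache/spark/log4j2-defaults.properties when it finds no other Log4j 2 configuration, so either place a log4j2.properties on the application classpath (e.g. under src/main/resources or grails-app/conf) or point the JVM at one explicitly with -Dlog4j2.configurationFile=file:/path/to/log4j2.properties. A minimal log4j2.properties that keeps everything at ERROR could look like:

# Root logger: send ERROR and above to the console
rootLogger.level = error
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Silence Spark explicitly as well, mirroring the Configurator calls above
logger.spark.name = org.apache.spark
logger.spark.level = error

A file-based configuration also sidesteps an ordering problem: the "Using Spark's default log4j profile" line suggests Spark applied its bundled defaults when the context started, which can wipe out level changes made programmatically beforehand, whereas a configuration Spark detects as user-provided is left alone.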

Related

ERROR SparkContext: Failed to add None to Spark environment

I first submit a Spark job like this from a PySpark file:
os.system(f'spark-submit --master local --jars ./examples/lib/app.jar app.py')
Then in the submitted app.py file, I create a new SparkSession like this:
spark = SparkSession.builder.appName(appName) \
.config('spark.jars') \
.getOrCreate()
Error message:
23/01/17 11:02:52 INFO SparkContext: Running Spark version 3.3.0
23/01/17 11:02:52 INFO ResourceUtils: ==============================================================
23/01/17 11:02:52 INFO ResourceUtils: No custom resources configured for spark.driver.
23/01/17 11:02:52 INFO ResourceUtils: ==============================================================
23/01/17 11:02:52 INFO SparkContext: Submitted application: symbolic_test
23/01/17 11:02:52 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/01/17 11:02:52 INFO ResourceProfile: Limiting resource is cpu
23/01/17 11:02:53 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/01/17 11:02:53 INFO SecurityManager: Changing view acls to: annie
23/01/17 11:02:53 INFO SecurityManager: Changing modify acls to: annie
23/01/17 11:02:53 INFO SecurityManager: Changing view acls groups to:
23/01/17 11:02:53 INFO SecurityManager: Changing modify acls groups to:
23/01/17 11:02:53 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(annie); groups with view permissions: Set(); users with modify permissions: Set(annie); groups with modify permissions: Set()
23/01/17 11:02:53 INFO Utils: Successfully started service 'sparkDriver' on port 42141.
23/01/17 11:02:53 INFO SparkEnv: Registering MapOutputTracker
23/01/17 11:02:53 INFO SparkEnv: Registering BlockManagerMaster
23/01/17 11:02:53 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/01/17 11:02:53 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/01/17 11:02:53 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/01/17 11:02:53 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-e4cc3b01-a6d5-4454-ad2d-4d0f42066479
23/01/17 11:02:53 INFO MemoryStore: MemoryStore started with capacity 434.4 MiB
23/01/17 11:02:53 INFO SparkEnv: Registering OutputCommitCoordinator
23/01/17 11:02:53 INFO Utils: Successfully started service 'SparkUI' on port 4040.
23/01/17 11:02:53 ERROR SparkContext: Failed to add None to Spark environment
java.io.FileNotFoundException: Jar /home/annie/exampleApp/example/None not found
at org.apache.spark.SparkContext.addLocalJarFile$1(SparkContext.scala:1949)
at org.apache.spark.SparkContext.addJar(SparkContext.scala:2004)
at org.apache.spark.SparkContext.$anonfun$new$12(SparkContext.scala:507)
at org.apache.spark.SparkContext.$anonfun$new$12$adapted(SparkContext.scala:507)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:507)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:238)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.base/java.lang.Thread.run(Thread.java:829)
When creating the Spark session through PySpark, I get the error messages above, which only arise when I add .config('spark.jars').
I've set my $SPARK_HOME variable correctly...
Any help will be appreciated!
If your code sample is accurate, you are not assigning any value to the spark.jars key while creating the Spark session. Assigning the jar path as the value should resolve the error:
spark = SparkSession.builder.appName(appName) \
    .config('spark.jars', './examples/lib/app.jar') \
    .getOrCreate()
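For what it's worth, spark.jars takes a comma-separated list of jar paths, so multiple local jars can be shipped the same way (the second path below is only a placeholder):

.config('spark.jars', './examples/lib/app.jar,./examples/lib/extra.jar')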

EMR Serverless Spark Executors Timeout

I have an EMR Serverless application that is getting stuck in executor timeouts for some reason. I have tested all the S3 connections and they work. The problem happens while executing a query against Spark tables.
The EMR version is: emr-6.7.0
The same job was able to run on Spark 3.1.1 in k8s, so it may be something version-related.
My spark session setup:
spark = (SparkSession.builder
    .config("spark.hadoop.fs.s3a.fast.upload", True)
    .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
    .config("spark.sql.legacy.parquet.datetimeRebaseModeInWrite", "CORRECTED")
    .config("spark.sql.autoBroadcastJoinThreshold", -1)
    .config("spark.sql.shuffle.partitions", "1000")
    .config("spark.sql.adaptive.enabled", "true")
    .config("spark.sql.adaptive.coalescePartitions.enabled", "true")
    .config("spark.sql.adaptive.advisoryPartitionSizeInBytes", "268435456")
    .config("spark.jars.packages", "org.apache.hadoop:hadoop-aws:3.2.0")
    .config("spark.jars.packages", "mysql:mysql-connector-java:8.0.17")
    .enableHiveSupport().getOrCreate()
)
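Side note on this setup: spark.jars.packages is a single comma-delimited list of Maven coordinates, and calling .config() twice with the same key overwrites the earlier value. If both packages are intended, they would normally be combined into one call:

.config("spark.jars.packages", "org.apache.hadoop:hadoop-aws:3.2.0,mysql:mysql-connector-java:8.0.17")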
Driver log:
Ivy Default Cache set to: /home/hadoop/.ivy2/cache
The jars for the packages stored in: /home/hadoop/.ivy2/jars
org.apache.hadoop#hadoop-aws added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-b943cb44-441b-41b2-8ea1-c44496d2e550;1.0
confs: [default]
found org.apache.hadoop#hadoop-aws;3.2.0 in central
found com.amazonaws#aws-java-sdk-bundle;1.11.375 in central
downloading https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.2.0/hadoop-aws-3.2.0.jar ...
[SUCCESSFUL ] org.apache.hadoop#hadoop-aws;3.2.0!hadoop-aws.jar (20ms)
downloading https://repo1.maven.org/maven2/com/amazonaws/aws-java-sdk-bundle/1.11.375/aws-java-sdk-bundle-1.11.375.jar ...
[SUCCESSFUL ] com.amazonaws#aws-java-sdk-bundle;1.11.375!aws-java-sdk-bundle.jar (960ms)
:: resolution report :: resolve 823ms :: artifacts dl 984ms
:: modules in use:
com.amazonaws#aws-java-sdk-bundle;1.11.375 from central in [default]
org.apache.hadoop#hadoop-aws;3.2.0 from central in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 2 | 2 | 2 | 0 || 2 | 2 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-b943cb44-441b-41b2-8ea1-c44496d2e550
confs: [default]
2 artifacts copied, 0 already retrieved (96887kB/80ms)
22/09/28 12:28:25 INFO SparkContext: Running Spark version 3.2.1-amzn-0
22/09/28 12:28:25 INFO ResourceUtils: ==============================================================
22/09/28 12:28:25 INFO ResourceUtils: No custom resources configured for spark.driver.
22/09/28 12:28:25 INFO ResourceUtils: ==============================================================
22/09/28 12:28:25 INFO SparkContext: Submitted application: spark_segmentacao_caminhoneiros.py
22/09/28 12:28:25 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 4, script: , vendor: , memory -> name: memory, amount: 14336, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
22/09/28 12:28:25 INFO ResourceProfile: Limiting resource is cpus at 4 tasks per executor
22/09/28 12:28:25 INFO ResourceProfileManager: Added ResourceProfile id: 0
22/09/28 12:28:25 INFO SecurityManager: Changing view acls to: hadoop
22/09/28 12:28:25 INFO SecurityManager: Changing modify acls to: hadoop
22/09/28 12:28:25 INFO SecurityManager: Changing view acls groups to:
22/09/28 12:28:25 INFO SecurityManager: Changing modify acls groups to:
22/09/28 12:28:25 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
22/09/28 12:28:26 INFO Utils: Successfully started service 'sparkDriver' on port 33303.
22/09/28 12:28:26 INFO SparkEnv: Registering MapOutputTracker
22/09/28 12:28:26 INFO SparkEnv: Registering BlockManagerMaster
22/09/28 12:28:26 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/09/28 12:28:26 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/09/28 12:28:26 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/09/28 12:28:26 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-22f8e599-8bdc-4d65-a5c1-f9ab0bf5f01c
22/09/28 12:28:26 INFO MemoryStore: MemoryStore started with capacity 7.3 GiB
22/09/28 12:28:26 INFO SparkEnv: Registering OutputCommitCoordinator
22/09/28 12:28:26 INFO SubResultCacheManager: Sub-result caches are disabled.
22/09/28 12:28:26 INFO log: Logging initialized @8694ms to org.sparkproject.jetty.util.log.Slf4jLog
22/09/28 12:28:26 INFO Server: jetty-9.4.43.v20210629; built: 2021-06-30T11:07:22.254Z; git: 526006ecfa3af7f1a27ef3a288e2bef7ea9dd7e8; jvm 1.8.0_342-b07
22/09/28 12:28:26 INFO Server: Started @8797ms
22/09/28 12:28:26 INFO AbstractConnector: Started ServerConnector@2f0a0570{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
22/09/28 12:28:26 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1afe3ab7{/jobs,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4318eaf1{/jobs/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@31a7233b{/jobs/job,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2db507b1{/jobs/job/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@69ab6402{/stages,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@34f8396d{/stages/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@e3b0050{/stages/stage,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1e50a487{/stages/stage/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@58d7e2db{/stages/pool,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@73151501{/stages/pool/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1a9ef059{/storage,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7e3d6570{/storage/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2e69b2c6{/storage/rdd,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@40447208{/storage/rdd/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2431cfcb{/environment,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2e719959{/environment/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@2ec050df{/executors,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@72d76b63{/executors/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7670552a{/executors/threadDump,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1b0420c5{/executors/threadDump/json,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@60bcb746{/static,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6aea04c3{/,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@476f8bfa{/api,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@652b8b55{/jobs/job/kill,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4d0d77ae{/stages/stage/kill,null,AVAILABLE,@Spark}
22/09/28 12:28:26 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:4040
22/09/28 12:28:26 INFO SparkContext: Added JAR file:/tmp/spark-bc069368-d1ab-4d24-a4e3-f7a8634a3d52/uber-jars-1.0-SNAPSHOT.jar at spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/uber-jars-1.0-SNAPSHOT.jar with timestamp 1664368105728
22/09/28 12:28:26 INFO SparkContext: Added JAR file:///home/hadoop/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.2.0.jar at spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/org.apache.hadoop_hadoop-aws-3.2.0.jar with timestamp 1664368105728
22/09/28 12:28:26 INFO SparkContext: Added JAR file:///home/hadoop/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar at spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar with timestamp 1664368105728
22/09/28 12:28:26 INFO SparkContext: Added file file:/tmp/spark-bc069368-d1ab-4d24-a4e3-f7a8634a3d52/varname.zip at spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/varname.zip with timestamp 1664368105728
22/09/28 12:28:26 INFO Utils: Copying /tmp/spark-bc069368-d1ab-4d24-a4e3-f7a8634a3d52/varname.zip to /tmp/spark-8a3a402d-55f0-4a4f-a4d1-ce318ac97655/userFiles-5a356bf3-38d4-424a-a72e-036ab107a80c/varname.zip
22/09/28 12:28:26 INFO SparkContext: Added file file:///home/hadoop/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.2.0.jar at spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/org.apache.hadoop_hadoop-aws-3.2.0.jar with timestamp 1664368105728
22/09/28 12:28:26 INFO Utils: Copying /home/hadoop/.ivy2/jars/org.apache.hadoop_hadoop-aws-3.2.0.jar to /tmp/spark-8a3a402d-55f0-4a4f-a4d1-ce318ac97655/userFiles-5a356bf3-38d4-424a-a72e-036ab107a80c/org.apache.hadoop_hadoop-aws-3.2.0.jar
22/09/28 12:28:26 INFO SparkContext: Added file file:///home/hadoop/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar at spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar with timestamp 1664368105728
22/09/28 12:28:26 INFO Utils: Copying /home/hadoop/.ivy2/jars/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar to /tmp/spark-8a3a402d-55f0-4a4f-a4d1-ce318ac97655/userFiles-5a356bf3-38d4-424a-a72e-036ab107a80c/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar
22/09/28 12:28:27 INFO Utils: Using initial executors = 3, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
22/09/28 12:28:27 INFO ExecutorContainerAllocator: Set total expected execs to {0=3}
22/09/28 12:28:27 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34635.
22/09/28 12:28:27 INFO NettyBlockTransferService: Server created on [2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:34635
22/09/28 12:28:27 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/09/28 12:28:27 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, [2600:1f18:1837:bf02:a556:ccd:86d7:a6c9], 34635, None)
22/09/28 12:28:27 INFO BlockManagerMasterEndpoint: Registering block manager [2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:34635 with 7.3 GiB RAM, BlockManagerId(driver, [2600:1f18:1837:bf02:a556:ccd:86d7:a6c9], 34635, None)
22/09/28 12:28:27 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, [2600:1f18:1837:bf02:a556:ccd:86d7:a6c9], 34635, None)
22/09/28 12:28:27 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, [2600:1f18:1837:bf02:a556:ccd:86d7:a6c9], 34635, None)
22/09/28 12:28:27 INFO ExecutorContainerAllocator: Going to request 3 executors for ResourceProfile Id: 0, target: 3 already provisioned: 0.
22/09/28 12:28:27 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@74d28ce8{/metrics/json,null,AVAILABLE,@Spark}
22/09/28 12:28:27 INFO DefaultEmrServerlessRMClient: Creating containers with container role SPARK_EXECUTOR and keys: Set(1, 2, 3)
22/09/28 12:28:27 INFO SingleEventLogFileWriter: Logging events to file:/var/log/spark/apps/00f4ck9kasg9e001.inprogress
22/09/28 12:28:27 INFO Utils: Using initial executors = 3, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
22/09/28 12:28:27 WARN ExecutorAllocationManager: Dynamic allocation without a shuffle service is an experimental feature.
22/09/28 12:28:27 INFO ExecutorContainerAllocator: Set total expected execs to {0=3}
22/09/28 12:28:27 INFO DefaultEmrServerlessRMClient: Containers created with container role SPARK_EXECUTOR. key to container id map: Map(2 -> b6c1c208-d6ae-f116-456c-a70e62753a3e, 1 -> eec1c208-d6a4-a06f-416f-d3542eb67229, 3 -> 20c1c208-d6b9-a01b-3d7d-4e5d1ab9d5ee)
22/09/28 12:28:32 INFO EmrServerlessClusterSchedulerBackend$EmrServerlessDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (2600:1f18:1837:bf02:751c:4b79:c015:1299:36790) with ID 2, ResourceProfileId 0
22/09/28 12:28:32 INFO ExecutorMonitor: New executor 2 has registered (new total is 1)
22/09/28 12:28:32 INFO EmrServerlessClusterSchedulerBackend$EmrServerlessDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (2600:1f18:1837:bf02:5500:4064:5306:1a1b:54690) with ID 3, ResourceProfileId 0
22/09/28 12:28:32 INFO ExecutorMonitor: New executor 3 has registered (new total is 2)
22/09/28 12:28:33 INFO BlockManagerMasterEndpoint: Registering block manager [2600:1f18:1837:bf02:751c:4b79:c015:1299]:37079 with 7.9 GiB RAM, BlockManagerId(2, [2600:1f18:1837:bf02:751c:4b79:c015:1299], 37079, None)
22/09/28 12:28:33 INFO BlockManagerMasterEndpoint: Registering block manager [2600:1f18:1837:bf02:5500:4064:5306:1a1b]:40287 with 7.9 GiB RAM, BlockManagerId(3, [2600:1f18:1837:bf02:5500:4064:5306:1a1b], 40287, None)
22/09/28 12:28:57 INFO EmrServerlessClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000000000(ns)
22/09/28 12:28:57 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
22/09/28 12:28:57 INFO SharedState: Warehouse path is 'file:/home/hadoop/spark-warehouse'.
22/09/28 12:28:57 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7ab37b5f{/SQL,null,AVAILABLE,@Spark}
22/09/28 12:28:57 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3223cfe1{/SQL/json,null,AVAILABLE,@Spark}
22/09/28 12:28:57 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@76ac5ba2{/SQL/execution,null,AVAILABLE,@Spark}
22/09/28 12:28:57 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@105afb2a{/SQL/execution/json,null,AVAILABLE,@Spark}
22/09/28 12:28:57 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7877cc29{/static/sql,null,AVAILABLE,@Spark}
22/09/28 12:28:57 WARN SQLConf: The SQL config 'spark.sql.legacy.parquet.datetimeRebaseModeInWrite' has been deprecated in Spark v3.2 and may be removed in the future. Use 'spark.sql.parquet.datetimeRebaseModeInWrite' instead.
22/09/28 12:28:57 WARN SQLConf: The SQL config 'spark.sql.legacy.parquet.datetimeRebaseModeInWrite' has been deprecated in Spark v3.2 and may be removed in the future. Use 'spark.sql.parquet.datetimeRebaseModeInWrite' instead.
22/09/28 12:28:58 WARN SQLConf: The SQL config 'spark.sql.legacy.parquet.datetimeRebaseModeInWrite' has been deprecated in Spark v3.2 and may be removed in the future. Use 'spark.sql.parquet.datetimeRebaseModeInWrite' instead.
22/09/28 12:28:58 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
22/09/28 12:29:08 WARN package: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.sql.debug.maxToStringFields'.
22/09/28 12:29:11 WARN SQLConf: The SQL config 'spark.sql.legacy.parquet.datetimeRebaseModeInWrite' has been deprecated in Spark v3.2 and may be removed in the future. Use 'spark.sql.parquet.datetimeRebaseModeInWrite' instead.
22/09/28 12:29:12 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
22/09/28 12:29:12 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
22/09/28 12:29:16 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
22/09/28 12:29:16 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore UNKNOWN@10.95.30.61
22/09/28 12:29:16 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
22/09/28 12:29:17 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
22/09/28 12:36:26 WARN HeartbeatReceiver: Removing executor 26 with no recent heartbeats: 176265 ms exceeds timeout 120000 ms
22/09/28 12:39:26 WARN HeartbeatReceiver: Removing executor 3 with no recent heartbeats: 161128 ms exceeds timeout 120000 ms
22/09/28 12:39:26 ERROR TaskSchedulerImpl: Lost executor 3 on [2600:1f18:1837:bf02:5500:4064:5306:1a1b]: Executor heartbeat timed out after 161128 ms
22/09/28 12:39:26 WARN TaskSetManager: Lost task 0.0 in stage 26.0 (TID 1321) ([2600:1f18:1837:bf02:5500:4064:5306:1a1b] executor 3): ExecutorLostFailure (executor 3 exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 161128 ms
Executor 26 logs:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/spark/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
22/09/28 12:32:25 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 30@ip-10-95-26-159.ec2.internal
22/09/28 12:32:25 INFO SignalUtils: Registering signal handler for TERM
22/09/28 12:32:25 INFO SignalUtils: Registering signal handler for HUP
22/09/28 12:32:25 INFO SignalUtils: Registering signal handler for INT
22/09/28 12:32:25 INFO SecurityManager: Changing view acls to: hadoop
22/09/28 12:32:25 INFO SecurityManager: Changing modify acls to: hadoop
22/09/28 12:32:25 INFO SecurityManager: Changing view acls groups to:
22/09/28 12:32:25 INFO SecurityManager: Changing modify acls groups to:
22/09/28 12:32:25 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
22/09/28 12:32:26 INFO TransportClientFactory: Successfully created connection to /2600:1f18:1837:bf02:a556:ccd:86d7:a6c9:33303 after 134 ms (55 ms spent in bootstraps)
22/09/28 12:32:26 INFO SecurityManager: Changing view acls to: hadoop
22/09/28 12:32:26 INFO SecurityManager: Changing modify acls to: hadoop
22/09/28 12:32:26 INFO SecurityManager: Changing view acls groups to:
22/09/28 12:32:26 INFO SecurityManager: Changing modify acls groups to:
22/09/28 12:32:26 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
22/09/28 12:32:26 INFO TransportClientFactory: Successfully created connection to /2600:1f18:1837:bf02:a556:ccd:86d7:a6c9:33303 after 5 ms (3 ms spent in bootstraps)
22/09/28 12:32:26 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-37e26761-99b2-4b65-94b7-4df6bf9905ea
22/09/28 12:32:26 INFO MemoryStore: MemoryStore started with capacity 7.9 GiB
22/09/28 12:32:26 INFO SubResultCacheManager: Sub-result caches are disabled.
22/09/28 12:32:26 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303
22/09/28 12:32:26 INFO ResourceUtils: ==============================================================
22/09/28 12:32:26 INFO ResourceUtils: No custom resources configured for spark.executor.
22/09/28 12:32:26 INFO ResourceUtils: ==============================================================
22/09/28 12:32:26 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
22/09/28 12:32:26 INFO Executor: Starting executor ID 26 on host [2600:1f18:1837:bf02:4600:e58e:ddf0:59df]
22/09/28 12:32:26 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43365.
22/09/28 12:32:26 INFO NettyBlockTransferService: Server created on [2600:1f18:1837:bf02:4600:e58e:ddf0:59df]:43365
22/09/28 12:32:26 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/09/28 12:32:26 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(26, [2600:1f18:1837:bf02:4600:e58e:ddf0:59df], 43365, None)
22/09/28 12:32:26 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(26, [2600:1f18:1837:bf02:4600:e58e:ddf0:59df], 43365, None)
22/09/28 12:32:26 INFO BlockManager: Initialized BlockManager: BlockManagerId(26, [2600:1f18:1837:bf02:4600:e58e:ddf0:59df], 43365, None)
22/09/28 12:32:26 INFO Executor: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar with timestamp 1664368105728
22/09/28 12:32:26 INFO TransportClientFactory: Successfully created connection to /2600:1f18:1837:bf02:a556:ccd:86d7:a6c9:33303 after 5 ms (3 ms spent in bootstraps)
22/09/28 12:32:26 INFO Utils: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar to /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/fetchFileTemp1966826708488121221.tmp
22/09/28 12:32:27 INFO PlatformInfo: Unable to read clusterId from http://localhost:8321/configuration, trying extra instance data file: /var/lib/instance-controller/extraInstanceData.json
22/09/28 12:32:27 INFO PlatformInfo: Unable to read clusterId from /var/lib/instance-controller/extraInstanceData.json, trying EMR job-flow data file: /var/lib/info/job-flow.json
22/09/28 12:32:27 INFO PlatformInfo: Unable to read clusterId from /var/lib/info/job-flow.json, out of places to look
22/09/28 12:32:27 INFO DefaultAWSCredentialsProviderFactory: Unable to create provider using constructor: DefaultAWSCredentialsProviderChain(java.net.URI, org.apache.hadoop.conf.Configuration)
22/09/28 12:32:27 INFO ClientConfigurationFactory: Set initial getObject socket timeout to 2000 ms.
22/09/28 12:32:27 INFO CoarseGrainedExecutorBackend: eagerFSInit: Eagerly initialized FileSystem at s3://does/not/exist in 1165 ms
22/09/28 12:32:29 INFO Utils: Copying /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/-1178519531664368105728_cache to /home/hadoop/./com.amazonaws_aws-java-sdk-bundle-1.11.375.jar
22/09/28 12:32:29 INFO Executor: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/varname.zip with timestamp 1664368105728
22/09/28 12:32:29 INFO Utils: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/varname.zip to /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/fetchFileTemp6614333905641289896.tmp
22/09/28 12:32:29 INFO Utils: Copying /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/-9167713411664368105728_cache to /home/hadoop/./varname.zip
22/09/28 12:32:29 INFO Executor: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/org.apache.hadoop_hadoop-aws-3.2.0.jar with timestamp 1664368105728
22/09/28 12:32:29 INFO Utils: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/files/org.apache.hadoop_hadoop-aws-3.2.0.jar to /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/fetchFileTemp4499682470385493011.tmp
22/09/28 12:32:29 INFO Utils: Copying /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/8736416361664368105728_cache to /home/hadoop/./org.apache.hadoop_hadoop-aws-3.2.0.jar
22/09/28 12:32:29 INFO Executor: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/uber-jars-1.0-SNAPSHOT.jar with timestamp 1664368105728
22/09/28 12:32:29 INFO Utils: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/uber-jars-1.0-SNAPSHOT.jar to /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/fetchFileTemp8998725698456889956.tmp
22/09/28 12:32:32 INFO Utils: Copying /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/-13557803421664368105728_cache to /home/hadoop/./uber-jars-1.0-SNAPSHOT.jar
22/09/28 12:32:32 INFO Executor: Adding file:/home/hadoop/./uber-jars-1.0-SNAPSHOT.jar to class loader
22/09/28 12:32:32 INFO Executor: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar with timestamp 1664368105728
22/09/28 12:32:32 INFO Utils: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/com.amazonaws_aws-java-sdk-bundle-1.11.375.jar to /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/fetchFileTemp2300668029112739372.tmp
22/09/28 12:32:35 INFO Utils: /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/-1974999101664368105728_cache has been previously copied to /home/hadoop/./com.amazonaws_aws-java-sdk-bundle-1.11.375.jar
22/09/28 12:32:35 INFO Executor: Adding file:/home/hadoop/./com.amazonaws_aws-java-sdk-bundle-1.11.375.jar to class loader
22/09/28 12:32:35 INFO Executor: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/org.apache.hadoop_hadoop-aws-3.2.0.jar with timestamp 1664368105728
22/09/28 12:32:35 INFO Utils: Fetching spark://[2600:1f18:1837:bf02:a556:ccd:86d7:a6c9]:33303/jars/org.apache.hadoop_hadoop-aws-3.2.0.jar to /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/fetchFileTemp8882461784948851806.tmp
22/09/28 12:32:35 INFO Utils: /tmp/spark-91625202-5612-49eb-b355-3f637abe1934/14299127831664368105728_cache has been previously copied to /home/hadoop/./org.apache.hadoop_hadoop-aws-3.2.0.jar
22/09/28 12:32:35 INFO Executor: Adding file:/home/hadoop/./org.apache.hadoop_hadoop-aws-3.2.0.jar to class loader
22/09/28 12:33:37 ERROR CoarseGrainedExecutorBackend: RECEIVED SIGNAL TERM
22/09/28 12:33:37 INFO MemoryStore: MemoryStore cleared
22/09/28 12:33:37 INFO BlockManager: BlockManager stopped
22/09/28 12:33:37 INFO ShutdownHookManager: Shutdown hook called
22/09/28 12:33:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-91625202-5612-49eb-b355-3f637abe1934
Based on the log provided, it looks like you didn't configure executor memory when you created the EMR application?
ERROR CoarseGrainedExecutorBackend: RECEIVED SIGNAL TERM only indicates that your executor was killed; it says nothing about the reason. Since your log shows the executor was killed during data fetching, rather than at the start of the fetch or during executor initialization, I suspect it was killed by running out of memory (OOM). Try increasing the executor memory when you create your EMR application. You can also check whether there is any data skew in your job, since that can trigger OOM as well.
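If you go the route of raising executor memory, here is a minimal sketch of passing it through sparkSubmitParameters when starting an EMR Serverless job run (the application ID, role ARN, and S3 path are placeholders, and the memory you can request is bounded by the application's worker configuration):

import boto3

client = boto3.client('emr-serverless')

client.start_job_run(
    applicationId='00abcdef12345678',  # placeholder application ID
    executionRoleArn='arn:aws:iam::123456789012:role/emr-serverless-job-role',  # placeholder
    jobDriver={
        'sparkSubmit': {
            'entryPoint': 's3://my-bucket/spark_segmentacao_caminhoneiros.py',  # placeholder bucket
            # ask for more than the 14336 MiB per executor shown in the log
            'sparkSubmitParameters': '--conf spark.executor.memory=24g --conf spark.executor.cores=4',
        }
    },
)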

Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve '`product`' given input columns: [jsontostructs(message)];

C:\Users\sorun\.jdks\openjdk-14.0.1\bin\java.exe "-javaagent:D:\Intellij IDEA\IntelliJ IDEA 2020.1.1\lib\idea_rt.jar=50945:D:\Intellij IDEA\IntelliJ IDEA 2020.1.1\bin" -Dfile.encoding=UTF-8 -classpath [full Maven classpath; notable entries: spark-sql_2.11-2.2.0.jar, spark-core_2.11-2.2.0.jar, spark-catalyst_2.11-2.2.0.jar, spark-sql-kafka-0-10_2.11-2.2.0.jar, kafka-clients-0.10.0.1.jar, hadoop-client-2.6.5.jar, scala-library-2.11.8.jar, and their transitive dependencies] StreamingConsumer
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/06/19 12:39:42 INFO SparkContext: Running Spark version 2.2.0
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/C:/Users/sorun/.m2/repository/org/apache/hadoop/hadoop-auth/2.6.5/hadoop-auth-2.6.5.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/06/19 12:39:43 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/06/19 12:39:44 INFO SparkContext: Submitted application: Streaming-kafka
20/06/19 12:39:44 INFO SecurityManager: Changing view acls to: OZAN-OKAN
20/06/19 12:39:44 INFO SecurityManager: Changing modify acls to: OZAN-OKAN
20/06/19 12:39:44 INFO SecurityManager: Changing view acls groups to:
20/06/19 12:39:44 INFO SecurityManager: Changing modify acls groups to:
20/06/19 12:39:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(OZAN-OKAN); groups with view permissions: Set(); users with modify permissions: Set(OZAN-OKAN); groups with modify permissions: Set()
20/06/19 12:39:45 INFO Utils: Successfully started service 'sparkDriver' on port 50966.
20/06/19 12:39:45 INFO SparkEnv: Registering MapOutputTracker
20/06/19 12:39:45 INFO SparkEnv: Registering BlockManagerMaster
20/06/19 12:39:45 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/06/19 12:39:45 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/06/19 12:39:45 INFO DiskBlockManager: Created local directory at C:\Users\sorun\AppData\Local\Temp\blockmgr-0794380e-6e2b-4559-bf6c-7d10c2074bc8
20/06/19 12:39:45 INFO MemoryStore: MemoryStore started with capacity 1040.4 MB
20/06/19 12:39:45 INFO SparkEnv: Registering OutputCommitCoordinator
20/06/19 12:39:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/06/19 12:39:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.56.1:4040
20/06/19 12:39:46 INFO Executor: Starting executor ID driver on host localhost
20/06/19 12:39:46 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50975.
20/06/19 12:39:46 INFO NettyBlockTransferService: Server created on 192.168.56.1:50975
20/06/19 12:39:46 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/06/19 12:39:46 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:50975 with 1040.4 MB RAM, BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.56.1, 50975, None)
20/06/19 12:39:46 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/Users/sorun/IdeaProjects/spark-streaming-kafka/spark-warehouse/').
20/06/19 12:39:46 INFO SharedState: Warehouse path is 'file:/C:/Users/sorun/IdeaProjects/spark-streaming-kafka/spark-warehouse/'.
20/06/19 12:39:47 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
20/06/19 12:39:47 INFO CatalystSqlParser: Parsing command: string
20/06/19 12:39:49 INFO SparkSqlParser: Parsing command: CAST(value AS STRING) message
Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve '`product`' given input columns: [jsontostructs(message)];
at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:88)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:288)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4$$anonfun$apply$10.apply(TreeNode.scala:323)
at scala.collection.MapLike$MappedValues$$anonfun$iterator$3.apply(MapLike.scala:246)
at scala.collection.MapLike$MappedValues$$anonfun$iterator$3.apply(MapLike.scala:246)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.IterableLike$$anon$1.foreach(IterableLike.scala:311)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.MapBuilder.$plus$plus$eq(MapBuilder.scala:25)
at scala.collection.TraversableViewLike$class.force(TraversableViewLike.scala:88)
at scala.collection.IterableLike$$anon$1.force(IterableLike.scala:311)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:331)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:268)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:268)
at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:279)
at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:289)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$6.apply(QueryPlan.scala:298)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:298)
at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUp(QueryPlan.scala:268)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:78)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:78)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:91)
at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.resolveAndBind(ExpressionEncoder.scala:256)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:206)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:170)
at org.apache.spark.sql.Dataset$.apply(Dataset.scala:61)
at org.apache.spark.sql.Dataset.as(Dataset.scala:380)
at StreamingConsumer.main(StreamingConsumer.java:24)
20/06/19 12:39:50 INFO SparkContext: Invoking stop() from shutdown hook
20/06/19 12:39:50 INFO SparkUI: Stopped Spark web UI at http://192.168.56.1:4040
20/06/19 12:39:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/06/19 12:39:50 INFO MemoryStore: MemoryStore cleared
20/06/19 12:39:50 INFO BlockManager: BlockManager stopped
20/06/19 12:39:50 INFO BlockManagerMaster: BlockManagerMaster stopped
20/06/19 12:39:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/06/19 12:39:50 INFO SparkContext: Successfully stopped SparkContext
20/06/19 12:39:50 INFO ShutdownHookManager: Shutdown hook called
20/06/19 12:39:50 INFO ShutdownHookManager: Deleting directory C:\Users\sorun\AppData\Local\Temp\spark-b70ecbcc-e6cf-4328-9069-97cc41cc72d7
Process finished with exit code 1
Exception in thread "main" org.apache.spark.sql.AnalysisException:
cannot resolve '`product`' given input columns: [jsontostructs(message)];
The exception message says that the column you are selecting (product) is not available in the DataFrame; the only input column is jsontostructs(message). Alias the from_json output (for example as json) and select the fields through that alias.
Also, if your model has a "message" field, add it to the schema StructType:
StructType schema = new StructType()
        .add("product", "string")
        .add("time", DataTypes.TimestampType)
        .add("message", DataTypes.StringType);
Then pass this schema to from_json and select through the alias:
Dataset<SearchProductModel> data = load
        .selectExpr("CAST(value AS STRING) as message")
        .select(functions.from_json(functions.col("message"), schema).as("json"))
        .select("json.*") // flattens the struct into product, time, message columns
        .as(Encoders.bean(SearchProductModel.class));
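As a quick sanity check, you can print the parsed schema before binding to the bean. A minimal sketch, assuming the same load DataFrame and schema variables as above:
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.functions;

// Sketch: inspect the columns produced by from_json before mapping to the bean.
Dataset<Row> parsed = load
        .selectExpr("CAST(value AS STRING) as message")
        .select(functions.from_json(functions.col("message"), schema).as("json"))
        .select("json.*");
parsed.printSchema(); // should list: product, time, message
If a field of SearchProductModel is missing from this output, the as(Encoders.bean(...)) call fails during analysis with the same kind of AnalysisException as above.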

Jupyter notebook error when starting Spark application using pyspark kernel

I've been trying to configure Jupyter Notebook with the PySpark kernel. I am actually new to this and to Ubuntu. When I tried to run some code in the notebook using the PySpark kernel, I received the error log below.
Note that it used to work before, but without SQL magic; this started happening after I installed sparkmagic to use SQL magic.
Appreciate your help, thanks.
ID YARN Application ID Kind State Spark UI Driver log Current session?
1 None pyspark idle ✔
The code failed because of a fatal error:
Session 1 unexpectedly reached final status 'error'. See logs:
stdout:
stderr:
19/10/12 16:47:57 WARN Utils: Your hostname, majd-desktop resolves to a loopback address: 127.0.1.1; using 192.168.1.6 instead (on interface enp1s0)
19/10/12 16:47:57 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/10/12 16:47:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
log4j:WARN No appenders could be found for logger (io.netty.util.internal.logging.InternalLoggerFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/10/12 16:48:00 INFO SparkContext: Running Spark version 2.4.4
19/10/12 16:48:00 INFO SparkContext: Submitted application: livy-session-1
19/10/12 16:48:00 INFO SecurityManager: Changing view acls to: majd
19/10/12 16:48:00 INFO SecurityManager: Changing modify acls to: majd
19/10/12 16:48:00 INFO SecurityManager: Changing view acls groups to:
19/10/12 16:48:00 INFO SecurityManager: Changing modify acls groups to:
19/10/12 16:48:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(majd); groups with view permissions: Set(); users with modify permissions: Set(majd); groups with modify permissions: Set()
19/10/12 16:48:00 INFO Utils: Successfully started service 'sparkDriver' on port 33779.
19/10/12 16:48:00 INFO SparkEnv: Registering MapOutputTracker
19/10/12 16:48:00 INFO SparkEnv: Registering BlockManagerMaster
19/10/12 16:48:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/10/12 16:48:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/10/12 16:48:00 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d9d22c37-be4c-4498-b115-2011ee176dbf
19/10/12 16:48:00 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
19/10/12 16:48:00 INFO SparkEnv: Registering OutputCommitCoordinator
19/10/12 16:48:00 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/10/12 16:48:00 INFO Utils: Successfully started service 'SparkUI' on port 4041.
19/10/12 16:48:00 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.6:4041
19/10/12 16:48:00 INFO SparkContext: Added JAR file:///home/majd/anaconda3/share/apache-livy-0.4.0.60ee047/rsc/target/jars/livy-api-0.4.0-incubating-SNAPSHOT.jar at spark://192.168.1.6:33779/jars/livy-api-0.4.0-incubating-SNAPSHOT.jar with timestamp 1570888080918
19/10/12 16:48:00 INFO SparkContext: Added JAR file:///home/majd/anaconda3/share/apache-livy-0.4.0.60ee047/rsc/target/jars/livy-rsc-0.4.0-incubating-SNAPSHOT.jar at spark://192.168.1.6:33779/jars/livy-rsc-0.4.0-incubating-SNAPSHOT.jar with timestamp 1570888080919
19/10/12 16:48:00 INFO SparkContext: Added JAR file:///home/majd/anaconda3/share/apache-livy-0.4.0.60ee047/rsc/target/jars/netty-all-4.0.29.Final.jar at spark://192.168.1.6:33779/jars/netty-all-4.0.29.Final.jar with timestamp 1570888080919
19/10/12 16:48:00 INFO SparkContext: Added JAR file:///home/majd/anaconda3/share/apache-livy-0.4.0.60ee047/repl/scala-2.11/target/jars/commons-codec-1.9.jar at spark://192.168.1.6:33779/jars/commons-codec-1.9.jar with timestamp 1570888080919
19/10/12 16:48:00 INFO SparkContext: Added JAR file:///home/majd/anaconda3/share/apache-livy-0.4.0.60ee047/repl/scala-2.11/target/jars/livy-core_2.11-0.4.0-incubating-SNAPSHOT.jar at spark://192.168.1.6:33779/jars/livy-core_2.11-0.4.0-incubating-SNAPSHOT.jar with timestamp 1570888080920
19/10/12 16:48:00 INFO SparkContext: Added JAR file:///home/majd/anaconda3/share/apache-livy-0.4.0.60ee047/repl/scala-2.11/target/jars/livy-repl_2.11-0.4.0-incubating-SNAPSHOT.jar at spark://192.168.1.6:33779/jars/livy-repl_2.11-0.4.0-incubating-SNAPSHOT.jar with timestamp 1570888080920
19/10/12 16:48:00 INFO Executor: Starting executor ID driver on host localhost
19/10/12 16:48:01 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38259.
19/10/12 16:48:01 INFO NettyBlockTransferService: Server created on 192.168.1.6:38259
19/10/12 16:48:01 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/10/12 16:48:01 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.6, 38259, None)
19/10/12 16:48:01 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.6:38259 with 366.3 MB RAM, BlockManagerId(driver, 192.168.1.6, 38259, None)
19/10/12 16:48:01 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.6, 38259, None)
19/10/12 16:48:01 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.6, 38259, None).
Some things to try:
a) Make sure Spark has enough available resources for Jupyter to create a Spark context.
b) Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.
c) Restart the kernel.
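Regarding suggestion (b): sparkmagic kernels talk to a Livy server and are configured through a JSON file that by default lives at ~/.sparkmagic/config.json. A minimal sketch of the relevant entry, assuming a local Livy server on its default port 8998 (adjust the URL to your endpoint):
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://localhost:8998",
    "auth": "None"
  }
}
If Livy is not running or the session dies during startup, the kernel surfaces it as a final status 'error' like the one above, so the Livy server logs (not just the empty session stdout/stderr shown here) are usually the next place to look.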

Spark Standalone Mode, application runs, but executor is killed with exitStatus 1

I am new to Apache Spark and was trying to run the example Pi calculation application on my local Spark setup (using a standalone cluster).
The master, the slave, and the driver are all running on my local machine.
What I am noticing is that Pi is calculated successfully; however, in the slave logs I see that the worker/executor is being killed with exitStatus 1.
I do not see any errors/exceptions logged to the console otherwise.
I tried finding help on a similar issue, but most of the search hits referred to exitStatus 137 etc. (e.g., Spark application kills executor)
I have failed miserably to understand why the worker is being killed instead of completing the execution in the 'EXITED' state. I think it's related to how I am executing the app, but I am not quite clear on what I am doing wrong.
Can someone guide me on identifying the root cause?
Given below are the code I am using for the Pi calculation and the logs of the master, slave, and driver, respectively.
Pi Calculation Application
package sparky

import org.apache.spark.scheduler._
import org.apache.spark.sql.SparkSession

import scala.math.random

object Application {
  def runSpark(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .appName("Spark Pi")
      .getOrCreate()
    spark.sparkContext.addSparkListener(new MyListener())

    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    val count = spark.sparkContext.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / (n - 1))
    spark.stop()
  }

  def main(args: Array[String]) = {
    Application.runSpark(args)
  }
}
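MyListener is not shown in the question. For context, a minimal sketch of what such a listener might look like, assuming it only logs application lifecycle events (which would also explain the bare timestamp in the driver output below):
import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}

// Hypothetical stand-in for the MyListener referenced above.
class MyListener extends SparkListener {
  override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd): Unit = {
    // SparkListenerApplicationEnd carries the end time in epoch millis.
    println(new java.sql.Timestamp(applicationEnd.time))
  }
}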
Master Console Output
C:\Servers\apache-spark\2.2.0\bin
λ start-master.cmd -h 0.0.0.0
C:\Platforms\Java\jdk1.8.0_65\bin\java -cp "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" -Xmx1g org.apache.spark.deploy.master.Master
18/01/25 09:01:30,099 INFO Master: Started daemon with process name: 14900@somemachine
18/01/25 09:01:30,580 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/25 09:01:30,680 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:30,681 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:30,682 INFO SecurityManager: Changing view acls groups to:
18/01/25 09:01:30,683 INFO SecurityManager: Changing modify acls groups to:
18/01/25 09:01:30,684 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(someuser); groups with view permissions: Set(); users with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:31,711 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
18/01/25 09:01:31,829 INFO Master: Starting Spark master at spark://0.0.0.0:7077
18/01/25 09:01:31,833 INFO Master: Running Spark version 2.2.0
18/01/25 09:01:31,903 INFO log: Logging initialized @2692ms
18/01/25 09:01:31,960 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:32,025 INFO Server: Started @2816ms
18/01/25 09:01:32,057 INFO AbstractConnector: Started ServerConnector@106ca013{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
18/01/25 09:01:32,058 INFO Utils: Successfully started service 'MasterUI' on port 8080.
18/01/25 09:01:32,087 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@41cc88b{/app,null,AVAILABLE,@Spark}
18/01/25 09:01:32,088 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1c63bda6{/app/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,089 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@45ae273f{/,null,AVAILABLE,@Spark}
18/01/25 09:01:32,090 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7a319c60{/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,098 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@23510beb{/static,null,AVAILABLE,@Spark}
18/01/25 09:01:32,099 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@462c632c{/app/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:32,101 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@436ef27b{/driver/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:32,104 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://192.168.56.1:8080
18/01/25 09:01:32,119 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:32,130 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6f7d1cba{/,null,AVAILABLE}
18/01/25 09:01:32,134 INFO AbstractConnector: Started ServerConnector@3f9e9637{HTTP/1.1,[http/1.1]}{0.0.0.0:6066}
18/01/25 09:01:32,134 INFO Server: Started @2925ms
18/01/25 09:01:32,134 INFO Utils: Successfully started service on port 6066.
18/01/25 09:01:32,135 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
18/01/25 09:01:32,358 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7b3e5adb{/metrics/master/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,362 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@139cbe00{/metrics/applications/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,399 INFO Master: I have been elected leader! New state: ALIVE
18/01/25 09:01:41,225 INFO Master: Registering worker 192.168.56.1:48591 with 4 cores, 14.4 GB RAM
18/01/25 09:01:53,510 INFO Master: Registering app Spark Pi
18/01/25 09:01:53,515 INFO Master: Registered app Spark Pi with ID app-20180125090153-0000
18/01/25 09:01:53,569 INFO Master: Launching executor app-20180125090153-0000/0 on worker worker-20180125090140-192.168.56.1-48591
18/01/25 09:02:00,262 INFO Master: Received unregister request from application app-20180125090153-0000
18/01/25 09:02:00,269 INFO Master: Removing app app-20180125090153-0000
18/01/25 09:02:00,323 WARN Master: Got status update for unknown executor app-20180125090153-0000/0
18/01/25 09:02:00,338 INFO Master: 127.0.0.1:48625 got disassociated, removing it.
18/01/25 09:02:00,345 INFO Master: 192.168.56.1:48620 got disassociated, removing it.
Slave Console Output
C:\Servers\apache-spark\2.2.0\bin
λ start-slave.cmd -h 0.0.0.0
C:\Platforms\Java\jdk1.8.0_65\bin\java -cp "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" -Xmx1g org.apache.spark.deploy.worker.Worker spark://0.0.0.0:7077
18/01/25 09:01:38,054 INFO Worker: Started daemon with process name: 14532@somemachine
18/01/25 09:01:38,546 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/25 09:01:38,644 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:38,645 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:38,646 INFO SecurityManager: Changing view acls groups to:
18/01/25 09:01:38,647 INFO SecurityManager: Changing modify acls groups to:
18/01/25 09:01:38,648 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(someuser); groups with view permissions: Set(); users with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:39,655 INFO Utils: Successfully started service 'sparkWorker' on port 48591.
18/01/25 09:01:40,521 INFO Worker: Starting Spark worker 192.168.56.1:48591 with 4 cores, 14.4 GB RAM
18/01/25 09:01:40,526 INFO Worker: Running Spark version 2.2.0
18/01/25 09:01:40,527 INFO Worker: Spark home: C:\Servers\apache-spark\2.2.0\bin\..
18/01/25 09:01:40,586 INFO log: Logging initialized @3430ms
18/01/25 09:01:40,636 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:40,657 INFO Server: Started @3503ms
18/01/25 09:01:40,787 WARN Utils: Service 'WorkerUI' could not bind on port 8081. Attempting port 8082.
18/01/25 09:01:40,797 INFO AbstractConnector: Started ServerConnector@24c54ec4{HTTP/1.1,[http/1.1]}{0.0.0.0:8082}
18/01/25 09:01:40,797 INFO Utils: Successfully started service 'WorkerUI' on port 8082.
18/01/25 09:01:40,832 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6e86345{/logPage,null,AVAILABLE,@Spark}
18/01/25 09:01:40,833 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@43dbfd42{/logPage/json,null,AVAILABLE,@Spark}
18/01/25 09:01:40,834 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@768b7729{/,null,AVAILABLE,@Spark}
18/01/25 09:01:40,836 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@382e7183{/json,null,AVAILABLE,@Spark}
18/01/25 09:01:40,844 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@459d7b70{/static,null,AVAILABLE,@Spark}
18/01/25 09:01:40,845 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5bf4fc9c{/log,null,AVAILABLE,@Spark}
18/01/25 09:01:40,849 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://192.168.56.1:8082
18/01/25 09:01:40,853 INFO Worker: Connecting to master 0.0.0.0:7077...
18/01/25 09:01:40,885 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4e93ba9d{/metrics/json,null,AVAILABLE,@Spark}
18/01/25 09:01:40,971 INFO TransportClientFactory: Successfully created connection to /0.0.0.0:7077 after 82 ms (0 ms spent in bootstraps)
18/01/25 09:01:41,246 INFO Worker: Successfully registered with master spark://0.0.0.0:7077
18/01/25 09:01:53,621 INFO Worker: Asked to launch executor app-20180125090153-0000/0 for Spark Pi
18/01/25 09:01:53,661 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:53,663 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:53,664 INFO SecurityManager: Changing view acls groups to:
18/01/25 09:01:53,668 INFO SecurityManager: Changing modify acls groups to:
18/01/25 09:01:53,669 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(someuser); groups with view permissions: Set(); users with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:53,695 INFO ExecutorRunner: Launch command: "C:\Platforms\Java\jdk1.8.0_65\bin\java" "-cp" "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" "-Xmx1024M" "-Dspark.driver.port=48620" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@192.168.56.1:48620" "--executor-id" "0" "--hostname" "192.168.56.1" "--cores" "4" "--app-id" "app-20180125090153-0000" "--worker-url" "spark://Worker@192.168.56.1:48591"
18/01/25 09:02:00,297 INFO Worker: Asked to kill executor app-20180125090153-0000/0
18/01/25 09:02:00,303 INFO ExecutorRunner: Runner thread for executor app-20180125090153-0000/0 interrupted
18/01/25 09:02:00,305 INFO ExecutorRunner: Killing process!
18/01/25 09:02:00,323 INFO Worker: Executor app-20180125090153-0000/0 finished with state KILLED exitStatus 1
18/01/25 09:02:00,336 INFO ExternalShuffleBlockResolver: Application app-20180125090153-0000 removed, cleanupLocalDirs = true
18/01/25 09:02:00,340 INFO Worker: Cleaning up local directories for application app-20180125090153-0000
Driver Console Output
9:01:47 AM: Executing task 'submitToSpark'...
C:\Applications\scala\sparky\app\build\libs\sparky-app-0.0.1.jar
:app:compileJava NO-SOURCE
:app:compileScala UP-TO-DATE
:app:processResources NO-SOURCE
:app:classes UP-TO-DATE
:app:jar UP-TO-DATE
:runner:submitToSpark
C:\Platforms\Java\jdk1.8.0_65\bin\java -cp "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" -Xmx1g org.apache.spark.deploy.SparkSubmit --master spark://localhost:7077 C:\Applications\scala\sparky\app\build\libs\sparky-app-0.0.1.jar
18/01/25 09:01:51,111 INFO SparkContext: Running Spark version 2.2.0
18/01/25 09:01:51,465 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/25 09:01:51,677 INFO SparkContext: Submitted application: Spark Pi
18/01/25 09:01:51,711 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:51,712 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:51,712 INFO SecurityManager: Changing view acls groups to:
18/01/25 09:01:51,713 INFO SecurityManager: Changing modify acls groups to:
18/01/25 09:01:51,714 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(someuser); groups with view permissions: Set(); users with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:52,639 INFO Utils: Successfully started service 'sparkDriver' on port 48620.
18/01/25 09:01:52,669 INFO SparkEnv: Registering MapOutputTracker
18/01/25 09:01:52,695 INFO SparkEnv: Registering BlockManagerMaster
18/01/25 09:01:52,699 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/01/25 09:01:52,700 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/01/25 09:01:52,712 INFO DiskBlockManager: Created local directory at C:\Users\someuser\AppData\Local\Temp\blockmgr-f9908c61-a91a-43d5-8d24-e0fd86d55d1c
18/01/25 09:01:52,740 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
18/01/25 09:01:52,808 INFO SparkEnv: Registering OutputCommitCoordinator
18/01/25 09:01:52,924 INFO log: Logging initialized @3539ms
18/01/25 09:01:53,009 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:53,038 INFO Server: Started @3654ms
18/01/25 09:01:53,067 INFO AbstractConnector: Started ServerConnector@21a5fd96{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/01/25 09:01:53,067 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/01/25 09:01:53,099 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@40bffbca{/jobs,null,AVAILABLE,@Spark}
18/01/25 09:01:53,100 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6c4f9535{/jobs/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,100 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@30c31dd7{/jobs/job,null,AVAILABLE,@Spark}
18/01/25 09:01:53,101 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@c1fca1e{/jobs/job/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,102 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@344344fa{/stages,null,AVAILABLE,@Spark}
18/01/25 09:01:53,103 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@70e659aa{/stages/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,103 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@285f09de{/stages/stage,null,AVAILABLE,@Spark}
18/01/25 09:01:53,105 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@48e64352{/stages/stage/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,106 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4362d7df{/stages/pool,null,AVAILABLE,@Spark}
18/01/25 09:01:53,106 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1c25b8a7{/stages/pool/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,107 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@750fe12e{/storage,null,AVAILABLE,@Spark}
18/01/25 09:01:53,108 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3e587920{/storage/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,108 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@24f43aa3{/storage/rdd,null,AVAILABLE,@Spark}
18/01/25 09:01:53,109 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1e11bc55{/storage/rdd/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,110 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@70e0accd{/environment,null,AVAILABLE,@Spark}
18/01/25 09:01:53,112 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6ab72419{/environment/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,112 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4fdfa676{/executors,null,AVAILABLE,@Spark}
18/01/25 09:01:53,113 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5be82d43{/executors/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,114 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@345e5a17{/executors/threadDump,null,AVAILABLE,@Spark}
18/01/25 09:01:53,115 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@443dbe42{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,125 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1734f68{/static,null,AVAILABLE,@Spark}
18/01/25 09:01:53,125 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@31c269fd{/,null,AVAILABLE,@Spark}
18/01/25 09:01:53,127 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@47747fb9{/api,null,AVAILABLE,@Spark}
18/01/25 09:01:53,128 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@70eecdc2{/jobs/job/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:53,129 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7db0565c{/stages/stage/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:53,133 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.56.1:4040
18/01/25 09:01:53,174 INFO SparkContext: Added JAR file:/C:/Applications/scala/sparky/app/build/libs/sparky-app-0.0.1.jar at spark://192.168.56.1:48620/jars/sparky-app-0.0.1.jar with timestamp 1516888913174
18/01/25 09:01:53,318 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://localhost:7077...
18/01/25 09:01:53,389 INFO TransportClientFactory: Successfully created connection to localhost/127.0.0.1:7077 after 42 ms (0 ms spent in bootstraps)
18/01/25 09:01:53,554 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20180125090153-0000
18/01/25 09:01:53,577 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 48642.
18/01/25 09:01:53,578 INFO NettyBlockTransferService: Server created on 192.168.56.1:48642
18/01/25 09:01:53,582 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/01/25 09:01:53,590 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,595 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:48642 with 366.3 MB RAM, BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,600 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,601 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,667 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20180125090153-0000/0 on worker-20180125090140-192.168.56.1-48591 (192.168.56.1:48591) with 4 cores
18/01/25 09:01:53,668 INFO StandaloneSchedulerBackend: Granted executor ID app-20180125090153-0000/0 on hostPort 192.168.56.1:48591 with 4 cores, 1024.0 MB RAM
18/01/25 09:01:53,901 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@74fef3f7{/metrics/json,null,AVAILABLE,@Spark}
18/01/25 09:01:55,026 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20180125090153-0000/0 is now RUNNING
18/01/25 09:01:55,096 INFO EventLoggingListener: Logging events to file:///C:/Dustbin/spark-events/app-20180125090153-0000
18/01/25 09:01:55,127 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
18/01/25 09:01:55,218 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/Applications/scala/sparky/runner/spark-warehouse/').
18/01/25 09:01:55,219 INFO SharedState: Warehouse path is 'file:/C:/Applications/scala/sparky/runner/spark-warehouse/'.
18/01/25 09:01:55,228 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@50a691d3{/SQL,null,AVAILABLE,@Spark}
18/01/25 09:01:55,228 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3b95d13c{/SQL/json,null,AVAILABLE,@Spark}
18/01/25 09:01:55,229 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@54d901aa{/SQL/execution,null,AVAILABLE,@Spark}
18/01/25 09:01:55,230 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@573284a5{/SQL/execution/json,null,AVAILABLE,@Spark}
18/01/25 09:01:55,233 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@507b79f7{/static/sql,null,AVAILABLE,@Spark}
18/01/25 09:01:56,232 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/01/25 09:01:56,609 INFO SparkContext: Starting job: reduce at Application.scala:29
18/01/25 09:01:56,636 INFO DAGScheduler: Got job 0 (reduce at Application.scala:29) with 2 output partitions
18/01/25 09:01:56,637 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at Application.scala:29)
18/01/25 09:01:56,638 INFO DAGScheduler: Parents of final stage: List()
18/01/25 09:01:56,640 INFO DAGScheduler: Missing parents: List()
18/01/25 09:01:56,654 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at Application.scala:25), which has no missing parents
18/01/25 09:01:56,815 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1800.0 B, free 366.3 MB)
18/01/25 09:01:56,980 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1168.0 B, free 366.3 MB)
18/01/25 09:01:56,984 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.56.1:48642 (size: 1168.0 B, free: 366.3 MB)
18/01/25 09:01:56,988 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
18/01/25 09:01:57,016 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at Application.scala:25) (first 15 tasks are for partitions Vector(0, 1))
18/01/25 09:01:57,018 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
18/01/25 09:01:58,617 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.56.1:48660) with ID 0
18/01/25 09:01:58,661 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.56.1, executor 0, partition 0, PROCESS_LOCAL, 4829 bytes)
18/01/25 09:01:58,665 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 192.168.56.1, executor 0, partition 1, PROCESS_LOCAL, 4829 bytes)
18/01/25 09:01:59,242 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:48678 with 366.3 MB RAM, BlockManagerId(0, 192.168.56.1, 48678, None)
18/01/25 09:01:59,819 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.56.1:48678 (size: 1168.0 B, free: 366.3 MB)
18/01/25 09:02:00,139 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1500 ms on 192.168.56.1 (executor 0) (1/2)
18/01/25 09:02:00,142 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1478 ms on 192.168.56.1 (executor 0) (2/2)
18/01/25 09:02:00,143 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
18/01/25 09:02:00,150 INFO DAGScheduler: ResultStage 0 (reduce at Application.scala:29) finished in 3.109 s
18/01/25 09:02:00,156 INFO DAGScheduler: Job 0 finished: reduce at Application.scala:29, took 3.546255 s
Pi is roughly 3.1363756818784094
18/01/25 09:02:00,168 INFO AbstractConnector: Stopped Spark@21a5fd96{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/01/25 09:02:00,170 INFO SparkUI: Stopped Spark web UI at http://192.168.56.1:4040
18/01/25 09:02:00,247 INFO StandaloneSchedulerBackend: Shutting down all executors
18/01/25 09:02:00,249 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
18/01/25 09:02:00,269 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/01/25 09:02:00,300 INFO MemoryStore: MemoryStore cleared
18/01/25 09:02:00,301 INFO BlockManager: BlockManager stopped
18/01/25 09:02:00,321 INFO BlockManagerMaster: BlockManagerMaster stopped
18/01/25 09:02:00,328 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/01/25 09:02:00,353 INFO SparkContext: Successfully stopped SparkContext
2018-01-25 09:02:00.353
18/01/25 09:02:00,358 INFO ShutdownHookManager: Shutdown hook called
18/01/25 09:02:00,360 INFO ShutdownHookManager: Deleting directory C:\Users\someuser\AppData\Local\Temp\spark-ac6369a0-abb8-476e-a527-91e0a8011302
BUILD SUCCESSFUL in 13s
3 actionable tasks: 1 executed, 2 up-to-date
9:02:01 AM: Task execution finished 'submitToSpark'.

Resources