I am trying to install Spark on Hadoop-YARN and I am getting an error that I believe is due to a configuration mistake. I have a fully functioning Hadoop-YARN installation on Ubuntu. When I execute the spark-submit or spark-shell command I get the error below. I would like to know whether I have set the IP addresses correctly in the respective files. Currently I'm using the same IP for both Hadoop and Spark. Since I want to configure Spark to use HDFS and YARN, do I need separate IP addresses for SPARK_LOCAL_IP and SPARK_MASTER_IP in spark-env.sh?
ERROR spark.SparkContext: Error initializing SparkContext.
java.net.ConnectException: Call From hadoop-VirtualBox/127.0.1.1 to
hadoop-VirtualBox:9000 failed on connection exception:
java.net.ConnectException: Connection refused;
Following are the versions of the software I'm using:
Ubuntu: 18.04.1 LTS
Hadoop: 3.0.3
Spark: 2.4.4
Scala: 2.12.0
Java: 1.8.0
I downloaded a pre-built version of Spark for Hadoop from this link. Following is the entry given to the Hadoop host in /etc/hosts:
127.0.0.1 localhost
127.0.1.1 hadoop-VirtualBox #hadoop node master
.profile configuration file (I have my environment set up in .profile instead of .bashrc)
export PATH=$PATH:/usr/local/hadoop/bin/:/usr/local/hadoop/sbin/
export CLASSPATH=$CLASSPATH:/usr/local/hadoop/lib/*:.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
PATH=/usr/local/Spark/bin:$PATH
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export YARN_CONF_DIR=/usr/local/hadoop/etc/hadoop
export SPARK_HOME=/usr/local/Spark
export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native:$LD_LIBRARY_PATH
export SCALA_HOME=/usr/local/Scala
export PATH=$SCALA_HOME/bin:$PATH
spark-env.sh
export SCALA_HOME=/usr/local/Scala
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SPARK_WORKER_MEMORY=1g
export SPARK_WORKER_INSTANCES=2
export SPARK_MASTER_IP=127.0.1.1
#export SPARK_MASTER_PORT=9000
export SPARK_WORKER_DIR=/usr/local/Spark/tmp
# Options read in YARN client mode
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export SPARK_CONF_DIR=/usr/local/Spark/conf
export YARN_CONF_DIR=/usr/local/hadoop/etc/hadoop
export SPARK_EXECUTOR_INSTANCES=2
export SPARK_EXECUTOR_CORES=2
export SPARK_EXECUTOR_MEMORY=1G
export SPARK_DRIVER_MEMORY=1G
export SPARK_YARN_APP_NAME=Spark
spark-defaults.conf
spark.master yarn
spark.eventLog.enabled true
spark.eventLog.dir hdfs://hadoop-VirtualBox:9000/spark-logs
spark.yarn.am.memory 512m
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.yarn.jars hdfs://hadoop-VirtualBox:9000/spark-jars
First I start the Hadoop services and then the Spark services, as follows:
start-dfs.sh
start-yarn.sh
jps
hdfs dfs -mkdir /spark-logs
hdfs dfs -mkdir /spark-jars
#spark-jars.zip is a zip file of the jars folder in $SPARK_HOME
hdfs dfs -put /usr/local/Spark/spark-jars.zip /spark-jars
cd /usr/local/Spark/sbin
./start-all.sh
spark-submit --class org.apache.spark.examples.JavaSparkPi --master yarn --deploy-mode client /usr/local/Spark/examples/jars/spark-examples_2.11-2.4.4.jar 10
Following is the trace in the terminal.
2019-10-20 11:55:39,512 WARN util.Utils: Your hostname, hadoop-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface enp0s3)
2019-10-20 11:55:39,519 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
2019-10-20 11:55:43,942 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-10-20 11:55:47,883 INFO spark.SparkContext: Running Spark version 2.4.4
2019-10-20 11:55:48,135 INFO spark.SparkContext: Submitted application: JavaSparkPi
2019-10-20 11:55:48,858 INFO spark.SecurityManager: Changing view acls to: hadoop
2019-10-20 11:55:48,858 INFO spark.SecurityManager: Changing modify acls to: hadoop
2019-10-20 11:55:48,859 INFO spark.SecurityManager: Changing view acls groups to:
2019-10-20 11:55:48,859 INFO spark.SecurityManager: Changing modify acls groups to:
2019-10-20 11:55:48,859 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
2019-10-20 11:55:50,722 INFO util.Utils: Successfully started service 'sparkDriver' on port 44765.
2019-10-20 11:55:53,863 INFO spark.SparkEnv: Registering MapOutputTracker
2019-10-20 11:55:54,364 INFO spark.SparkEnv: Registering BlockManagerMaster
2019-10-20 11:55:54,395 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-10-20 11:55:54,407 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2019-10-20 11:55:55,024 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-75df7314-58f9-4c97-b827-e66072015afa
2019-10-20 11:55:55,815 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
2019-10-20 11:55:56,962 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2019-10-20 11:55:58,780 INFO util.log: Logging initialized #26940ms
2019-10-20 11:56:00,794 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2019-10-20 11:56:01,372 INFO server.Server: Started #29549ms
2019-10-20 11:56:01,754 INFO server.AbstractConnector: Started ServerConnector#6b648010{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-10-20 11:56:01,772 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
2019-10-20 11:56:02,378 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#6ac4944a{/jobs,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,422 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#1e34c607{/jobs/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,468 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#5215cd9a{/jobs/job,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,528 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#31198ceb{/jobs/job/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,596 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#9257031{/stages,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,656 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#75201592{/stages/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,672 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#7726e185{/stages/stage,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,721 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#5dda14d0{/stages/stage/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,759 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#1db0ec27{/stages/pool,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,815 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#3d9fc57a{/stages/pool/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,855 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#d4ab71a{/storage,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,923 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#3b4ef7{/storage/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,941 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#1af05b03{/storage/rdd,null,AVAILABLE,#Spark}
2019-10-20 11:56:02,982 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#5987e932{/storage/rdd/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,051 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#1ad777f{/environment,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,098 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#5bbbdd4b{/environment/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,135 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#438bad7c{/executors,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,160 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#25230246{/executors/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,194 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#4fdf8f12{/executors/threadDump,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,234 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#4a8b5227{/executors/threadDump/json,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,479 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#54f5f647{/static,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,503 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#2899a8db{/,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,559 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#1e8823d2{/api,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,602 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#4c432866{/jobs/job/kill,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,657 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler#12365c88{/stages/stage/kill,null,AVAILABLE,#Spark}
2019-10-20 11:56:03,776 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.2.15:4040
2019-10-20 11:56:04,336 INFO spark.SparkContext: Added JAR file:/usr/local/Spark/examples/jars/spark-examples_2.11-2.4.4.jar at spark://10.0.2.15:44765/jars/spark-examples_2.11-2.4.4.jar with timestamp 1571568964292
2019-10-20 11:56:15,228 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2019-10-20 11:56:19,448 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
2019-10-20 11:56:20,449 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2019-10-20 11:56:20,455 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
2019-10-20 11:56:20,476 INFO yarn.Client: Setting up container launch context for our AM
2019-10-20 11:56:20,553 INFO yarn.Client: Setting up the launch environment for our AM container
2019-10-20 11:56:20,753 INFO yarn.Client: Preparing resources for our AM container
2019-10-20 11:56:21,859 INFO yarn.Client: Deleted staging directory hdfs://localhost:9000/user/hadoop/.sparkStaging/application_1571568174433_0002
2019-10-20 11:56:21,887 ERROR spark.SparkContext: Error initializing SparkContext.
java.net.ConnectException: Call From hadoop-VirtualBox/127.0.1.1 to hadoop-VirtualBox:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1479)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy13.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1657)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$5.apply(Client.scala:528)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$5.apply(Client.scala:524)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:524)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:865)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:179)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:183)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at org.apache.spark.examples.JavaSparkPi.main(JavaSparkPi.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
... 47 more
2019-10-20 11:56:22,320 INFO server.AbstractConnector: Stopped Spark#6b648010{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-10-20 11:56:22,378 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
2019-10-20 11:56:22,721 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
2019-10-20 11:56:22,922 INFO cluster.YarnClientSchedulerBackend: Stopped
2019-10-20 11:56:23,156 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2019-10-20 11:56:23,355 INFO memory.MemoryStore: MemoryStore cleared
2019-10-20 11:56:23,365 INFO storage.BlockManager: BlockManager stopped
2019-10-20 11:56:23,566 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
2019-10-20 11:56:23,569 WARN metrics.MetricsSystem: Stopping a MetricsSystem that is not running
2019-10-20 11:56:23,627 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2019-10-20 11:56:23,725 INFO spark.SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.net.ConnectException: Call From hadoop-VirtualBox/127.0.1.1 to hadoop-VirtualBox:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1479)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy13.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1657)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$5.apply(Client.scala:528)
at org.apache.spark.deploy.yarn.Client$$anonfun$prepareLocalResources$5.apply(Client.scala:524)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:524)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:865)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:179)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:183)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at org.apache.spark.examples.JavaSparkPi.main(JavaSparkPi.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
... 47 more
2019-10-20 11:56:23,899 INFO util.ShutdownHookManager: Shutdown hook called
2019-10-20 11:56:23,913 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-1659f92f-aa82-4f31-9183-f9b95d9375e3
2019-10-20 11:56:23,946 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-f353495e-4f40-48b2-91a3-a3e2caeb3500
You must set your local IP in the spark-env.sh file, like:
SPARK_LOCAL_IP="<your local IP>"
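For example, based on the warning at the top of the trace above ("using 10.0.2.15 instead (on interface enp0s3)"), a minimal sketch would be to add the following to $SPARK_HOME/conf/spark-env.sh, substituting your own VM's non-loopback address:
# Bind Spark to the VM's real interface instead of the 127.0.1.1 loopback entry
export SPARK_LOCAL_IP=10.0.2.15
The connection to hadoop-VirtualBox:9000 will also keep being refused unless the NameNode is actually listening on the address that hostname resolves to, so it is worth checking that fs.defaultFS in core-site.xml matches the host and port you use in spark-defaults.conf.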
I am running CDH 5.16 standalone, single node, on a RHEL 7 server.
I have written a simple Spark job that reads a text file from HDFS and loads it as a Parquet file into a separate location in HDFS (a minimal sketch of the kind of job is included at the end of this question). I build the jar with SBT and deploy it to the cluster with spark-submit, but whenever I run this code on the server, the following error is thrown:
19/06/07 12:56:04 INFO spark.SparkContext: Running Spark version 1.6.0
19/06/07 12:56:04 INFO spark.SecurityManager: Changing view acls to: ak_bng
19/06/07 12:56:04 INFO spark.SecurityManager: Changing modify acls to: ak_bng
19/06/07 12:56:04 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ak_bng); users with modify permissions: Set(ak_bng)
19/06/07 12:56:05 INFO util.Utils: Successfully started service 'sparkDriver' on port 44220.
19/06/07 12:56:05 INFO slf4j.Slf4jLogger: Slf4jLogger started
19/06/07 12:56:05 INFO Remoting: Starting remoting
19/06/07 12:56:05 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem#10.188.223.5:36304]
19/06/07 12:56:05 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem#10.188.223.5:36304]
19/06/07 12:56:05 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 36304.
19/06/07 12:56:05 INFO spark.SparkEnv: Registering MapOutputTracker
19/06/07 12:56:05 INFO spark.SparkEnv: Registering BlockManagerMaster
19/06/07 12:56:05 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-c38a27e3-c483-4f56-ab7f-56e4c1be0832
19/06/07 12:56:05 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
19/06/07 12:56:06 INFO spark.SparkEnv: Registering OutputCommitCoordinator
19/06/07 12:56:06 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
19/06/07 12:56:06 INFO ui.SparkUI: Started SparkUI at http://10.188.223.5:4040
19/06/07 12:56:06 INFO spark.SparkContext: Added JAR file:/home/ak_bng/spark_jars/Simple_Project-assembly-1.0.jar at spark://10.188.223.5:44220/jars/Simple_Project-assembly-1.0.jar with timestamp 1559892366578
19/06/07 12:56:06 INFO executor.Executor: Starting executor ID driver on host localhost
19/06/07 12:56:06 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46170.
19/06/07 12:56:06 INFO netty.NettyBlockTransferService: Server created on 46170
19/06/07 12:56:06 INFO storage.BlockManager: external shuffle service port = 7337
19/06/07 12:56:06 INFO storage.BlockManagerMaster: Trying to register BlockManager
19/06/07 12:56:06 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:46170 with 530.0 MB RAM, BlockManagerId(driver, localhost, 46170)
19/06/07 12:56:06 INFO storage.BlockManagerMaster: Registered BlockManager
19/06/07 12:56:07 INFO scheduler.EventLoggingListener: Logging events to hdfs://indelsrv185.in.kworld.kpmg.com:8020/user/spark/applicationHistory/local-1559892366602
19/06/07 12:56:07 INFO spark.SparkContext: Registered listener com.cloudera.spark.lineage.ClouderaNavigatorListener
19/06/07 12:56:08 INFO parquet.ParquetRelation: Listing hdfs://10.188.223.5:8020/user/ak_bng/products on driver
19/06/07 12:56:08 INFO parquet.ParquetRelation: Listing hdfs://10.188.223.5:8020/user/ak_bng/categories on driver
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:312)
at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
at org.apache.spark.io.SnappyCompressionCodec$.liftedTree1$1(CompressionCodec.scala:169)
at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version$lzycompute(CompressionCodec.scala:168)
at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version(CompressionCodec.scala:168)
at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:74)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:81)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1334)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy$.apply(DataSourceStrategy.scala:126)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:48)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:46)
at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:53)
at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:53)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$toString$5.apply(QueryExecution.scala:81)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$toString$5.apply(QueryExecution.scala:81)
at org.apache.spark.sql.execution.QueryExecution.stringOrError(QueryExecution.scala:61)
at org.apache.spark.sql.execution.QueryExecution.toString(QueryExecution.scala:81)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:50)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation.run(InsertIntoHadoopFsRelation.scala:106)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:56)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:56)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:256)
at org.apache.spark.sql.DataFrameWriter.dataSource$lzycompute$1(DataFrameWriter.scala:181)
at org.apache.spark.sql.DataFrameWriter.org$apache$spark$sql$DataFrameWriter$$dataSource$1(DataFrameWriter.scala:181)
at org.apache.spark.sql.DataFrameWriter$$anonfun$save$1.apply$mcV$sp(DataFrameWriter.scala:188)
at org.apache.spark.sql.DataFrameWriter.executeAndCallQEListener(DataFrameWriter.scala:154)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:188)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:172)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:370)
at SimpleApp$.main(SimpleApp.scala:169)
at SimpleApp.main(SimpleApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.UnsatisfiedLinkError: /tmp/snappy-1.0.4.1-libsnappyjava.so: /tmp/snappy-1.0.4.1-libsnappyjava.so: failed to map segment from shared object: Operation not permitted
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.xerial.snappy.SnappyNativeLoader.load(SnappyNativeLoader.java:39)
... 65 more
Exception in thread "main" java.lang.IllegalArgumentException: java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
at org.apache.spark.io.SnappyCompressionCodec$.liftedTree1$1(CompressionCodec.scala:171)
at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version$lzycompute(CompressionCodec.scala:168)
at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version(CompressionCodec.scala:168)
at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:74)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:81)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1334)
at org.apache.spark.sql.execution.datasources.DataSourceStrategy$.apply(DataSourceStrategy.scala:126)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:48)
at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:46)
at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:53)
at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:53)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:51)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation.run(InsertIntoHadoopFsRelation.scala:106)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:56)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:56)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:256)
at org.apache.spark.sql.DataFrameWriter.dataSource$lzycompute$1(DataFrameWriter.scala:181)
at org.apache.spark.sql.DataFrameWriter.org$apache$spark$sql$DataFrameWriter$$dataSource$1(DataFrameWriter.scala:181)
at org.apache.spark.sql.DataFrameWriter$$anonfun$save$1.apply$mcV$sp(DataFrameWriter.scala:188)
at org.apache.spark.sql.DataFrameWriter.executeAndCallQEListener(DataFrameWriter.scala:154)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:188)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:172)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:370)
at SimpleApp$.main(SimpleApp.scala:169)
at SimpleApp.main(SimpleApp.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
at org.apache.spark.io.SnappyCompressionCodec$.liftedTree1$1(CompressionCodec.scala:169)
... 53 more
19/06/07 12:56:08 INFO spark.SparkContext: Invoking stop() from shutdown hook
19/06/07 12:56:08 INFO ui.SparkUI: Stopped Spark web UI at http://10.188.223.5:4040
19/06/07 12:56:08 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/06/07 12:56:09 INFO storage.MemoryStore: MemoryStore cleared
19/06/07 12:56:09 INFO storage.BlockManager: BlockManager stopped
19/06/07 12:56:09 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/06/07 12:56:09 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/06/07 12:56:09 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
19/06/07 12:56:09 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
19/06/07 12:56:09 INFO Remoting: Remoting shut down
19/06/07 12:56:09 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
19/06/07 12:56:09 INFO spark.SparkContext: Successfully stopped SparkContext
19/06/07 12:56:09 INFO util.ShutdownHookManager: Shutdown hook called
19/06/07 12:56:09 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-111712ef-39a8-41b6-bf6d-5d317d954fa1
spark-submit command:
spark-submit --class SimpleApp --master local[8] /home/ak_bng/spark_jars/Simple_Project-assembly-1.0.jar
I went through a few links to resolve this issue (Snappy Compression not working due to tmp folder privileges, Apache Spark - Parquet / Snappy compression error), but none of them really provided a solution for this.
I had run Spark on HDFS (a separate installation) successfully without any errors before. The problem started once CDH was installed.
I am new to setting up clusters and don't quite understand what the issue is here or how to resolve it. Can someone please shed some light on this?
I am using:
CDH 5.16
Spark 1.6.0
Server OS: RHEL 7
Hadoop 2.6
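For reference, the job is essentially of the following shape. This is only a minimal sketch of what is described above, not the actual SimpleApp source, and the HDFS paths are placeholders.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SimpleApp"))
    val sqlContext = new SQLContext(sc)

    // Read a plain text file from HDFS as a single-column DataFrame (Spark 1.6 API)
    val df = sqlContext.read.text("hdfs:///user/ak_bng/input.txt")

    // Write it back to a separate HDFS location as Parquet; this is the step
    // that fails with the Snappy codec error shown in the log above
    df.write.parquet("hdfs:///user/ak_bng/output_parquet")

    sc.stop()
  }
}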
My spark-shell --master yarn session came up with the following error when I started it.
Can you help me understand the reason for this container failure? There are no errors or info present in the application logs.
[root@Master ~]# spark-shell --master yarn-client
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/spark-1.6.1-bin-2.6.0-cdh5.7.0/lib/spark-assembly-1.6.1-hadoop2.6.0-cdh5.7.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/04/23 00:40:03 INFO spark.SecurityManager: Changing view acls to: root
18/04/23 00:40:03 INFO spark.SecurityManager: Changing modify acls to: root
18/04/23 00:40:03 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
18/04/23 00:40:04 INFO spark.HttpServer: Starting HTTP Server
18/04/23 00:40:04 INFO server.Server: jetty-8.y.z-SNAPSHOT
18/04/23 00:40:04 INFO server.AbstractConnector: Started SocketConnector#0.0.0.0:46698
18/04/23 00:40:04 INFO util.Utils: Successfully started service 'HTTP class server' on port 46698.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.6.1
/_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.
18/04/23 00:40:06 INFO spark.SparkContext: Running Spark version 1.6.1
18/04/23 00:40:06 INFO spark.SecurityManager: Changing view acls to: root
18/04/23 00:40:06 INFO spark.SecurityManager: Changing modify acls to: root
18/04/23 00:40:06 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
18/04/23 00:40:06 INFO util.Utils: Successfully started service 'sparkDriver' on port 56838.
18/04/23 00:40:06 INFO slf4j.Slf4jLogger: Slf4jLogger started
18/04/23 00:40:06 INFO Remoting: Starting remoting
18/04/23 00:40:07 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem#127.0.0.1:52497]
18/04/23 00:40:07 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 52497.
18/04/23 00:40:07 INFO spark.SparkEnv: Registering MapOutputTracker
18/04/23 00:40:07 INFO spark.SparkEnv: Registering BlockManagerMaster
18/04/23 00:40:07 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-3d448992-8dce-467c-9146-c3382d586e6b
18/04/23 00:40:07 INFO storage.MemoryStore: MemoryStore started with capacity 2.7 GB
18/04/23 00:40:07 INFO spark.SparkEnv: Registering OutputCommitCoordinator
18/04/23 00:40:09 INFO server.Server: jetty-8.y.z-SNAPSHOT
18/04/23 00:40:09 INFO server.AbstractConnector: Started SelectChannelConnector#0.0.0.0:4040
18/04/23 00:40:09 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
18/04/23 00:40:09 INFO ui.SparkUI: Started SparkUI at http://127.0.0.1:4040
18/04/23 00:40:09 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.1.254:8032
18/04/23 00:40:09 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
18/04/23 00:40:09 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2048 MB per container)
18/04/23 00:40:09 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
18/04/23 00:40:09 INFO yarn.Client: Setting up container launch context for our AM
18/04/23 00:40:09 INFO yarn.Client: Setting up the launch environment for our AM container
18/04/23 00:40:09 WARN yarn.Client: SPARK_JAR detected in the system environment. This variable has been deprecated in favor of the spark.yarn.jar configuration variable.
18/04/23 00:40:09 INFO yarn.Client: Preparing resources for our AM container
18/04/23 00:40:10 WARN yarn.Client: SPARK_JAR detected in the system environment. This variable has been deprecated in favor of the spark.yarn.jar configuration variable.
18/04/23 00:40:10 INFO yarn.Client: Uploading resource file:/usr/spark-1.6.1-bin-2.6.0-cdh5.7.0/lib/spark-assembly-1.6.1-hadoop2.6.0-cdh5.7.0.jar -> hdfs://master:9000/user/root/.sparkStaging/application_1524413274967_0004/spark-assembly-1.6.1-hadoop2.6.0-cdh5.7.0.jar
18/04/23 00:40:11 INFO yarn.Client: Uploading resource file:/tmp/spark-0e77eb17-395d-425b-bda7-a8b3e7f35ee1/__spark_conf__477163183947757155.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1524413274967_0004/__spark_conf__477163183947757155.zip
18/04/23 00:40:11 INFO spark.SecurityManager: Changing view acls to: root
18/04/23 00:40:11 INFO spark.SecurityManager: Changing modify acls to: root
18/04/23 00:40:11 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
18/04/23 00:40:11 INFO yarn.Client: Submitting application 4 to ResourceManager
18/04/23 00:40:11 INFO impl.YarnClientImpl: Submitted application application_1524413274967_0004
18/04/23 00:40:12 INFO yarn.Client: Application report for application_1524413274967_0004 (state: ACCEPTED)
18/04/23 00:40:12 INFO yarn.Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: root.root
start time: 1524415211241
final status: UNDEFINED
tracking URL: http://Master:8088/proxy/application_1524413274967_0004/
user: root
18/04/23 00:40:13 INFO yarn.Client: Application report for application_1524413274967_0004 (state: ACCEPTED)
18/04/23 00:40:14 INFO yarn.Client: Application report for application_1524413274967_0004 (state: ACCEPTED)
18/04/23 00:40:15 INFO yarn.Client: Application report for application_1524413274967_0004 (state: FAILED)
18/04/23 00:40:15 INFO yarn.Client:
client token: N/A
diagnostics: Application application_1524413274967_0004 failed 2 times due to AM Container for appattempt_1524413274967_0004_000002 exited with exitCode: -1000
For more detailed output, check application tracking page:http://Master:8088/proxy/application_1524413274967_0004/Then, click on links to logs of each attempt.
Diagnostics: Wrong FS: file://usr/hadoop/tmp/nm-local-dir, expected: file:///
Failing this attempt. Failing the application.
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: root.root
start time: 1524415211241
final status: FAILED
tracking URL: http://Master:8088/cluster/app/application_1524413274967_0004
user: root
18/04/23 00:40:15 INFO yarn.Client: Deleting staging directory .sparkStaging/application_1524413274967_0004
18/04/23 00:40:15 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:124)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:15)
at $line3.$read$$iwC.<init>(<console>:24)
at $line3.$read.<init>(<console>:26)
at $line3.$read$.<init>(<console>:30)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
18/04/23 00:40:15 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
18/04/23 00:40:15 INFO ui.SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
18/04/23 00:40:15 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
18/04/23 00:40:15 INFO cluster.YarnClientSchedulerBackend: Asking each executor to shut down
18/04/23 00:40:15 INFO cluster.YarnClientSchedulerBackend: Stopped
18/04/23 00:40:15 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/04/23 00:40:15 INFO storage.MemoryStore: MemoryStore cleared
18/04/23 00:40:15 INFO storage.BlockManager: BlockManager stopped
18/04/23 00:40:15 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
18/04/23 00:40:15 WARN metrics.MetricsSystem: Stopping a MetricsSystem that is not running
18/04/23 00:40:15 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/04/23 00:40:15 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
18/04/23 00:40:15 INFO spark.SparkContext: Successfully stopped SparkContext
18/04/23 00:40:15 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
18/04/23 00:40:15 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:124)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.lang.NullPointerException
at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:16: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:16: error: not found: value sqlContext
import sqlContext.sql
^
scala>
The error line below explains the problem: the URI is missing one '/' (slash).
Wrong FS: file://usr/hadoop/tmp/nm-local-dir, expected: file:///
Make sure your hdfs-site.xml is configured properly for the NameNode directory; check the dfs.namenode.name.dir property.
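For illustration only, here is a minimal sketch of how such a local-filesystem URI should look in hdfs-site.xml; the directory paths are hypothetical, so substitute the ones from your own installation. The key point is the three slashes: the file: scheme, an empty authority, and then an absolute path.
<!-- hdfs-site.xml (example paths, adjust to your setup) -->
<property>
  <name>dfs.namenode.name.dir</name>
  <!-- file:// + /absolute/path gives file:///absolute/path -->
  <value>file:///usr/local/hadoop/hadoop_data/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///usr/local/hadoop/hadoop_data/datanode</value>
</property>
The same rule applies to any local directory given as a URI, for example yarn.nodemanager.local-dirs in yarn-site.xml, which is where the nm-local-dir path in the error message usually comes from.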
Can you provide the full driver stack trace? You can get it with the command below:
yarn logs -applicationId <application_id>
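In case it is not obvious where the application id comes from, it can be listed with the YARN CLI (or read off the ResourceManager web UI):
# list applications, including finished/failed ones, and note the id of the failed run
yarn application -list -appStates ALL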
I am very new to Apache Spark and am trying to run Spark on my local machine.
First I tried to start the master using the following command:
./sbin/start-master.sh
The master started successfully. Then I tried to start the worker using
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077 -c 1 -m 512M
which eventually failed with the following log:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/06/09 17:01:58 INFO Worker: Started daemon with process name: 9301@sumit-Inspiron-5537
17/06/09 17:01:58 INFO SignalUtils: Registered signal handler for TERM
17/06/09 17:01:58 INFO SignalUtils: Registered signal handler for HUP
17/06/09 17:01:58 INFO SignalUtils: Registered signal handler for INT
17/06/09 17:01:58 WARN Utils: Your hostname, sumit-Inspiron-5537 resolves to a loopback address: 127.0.1.1; using 192.168.1.16 instead (on interface wlp2s0)
17/06/09 17:01:58 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/06/09 17:01:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/06/09 17:01:59 INFO SecurityManager: Changing view acls to: sumit
17/06/09 17:01:59 INFO SecurityManager: Changing modify acls to: sumit
17/06/09 17:01:59 INFO SecurityManager: Changing view acls groups to:
17/06/09 17:01:59 INFO SecurityManager: Changing modify acls groups to:
17/06/09 17:01:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sumit); groups with view permissions: Set(); users with modify permissions: Set(sumit); groups with modify permissions: Set()
17/06/09 17:01:59 INFO Utils: Successfully started service 'sparkWorker' on port 35827.
17/06/09 17:02:00 INFO Worker: Starting Spark worker 192.168.1.16:35827 with 1 cores, 512.0 MB RAM
17/06/09 17:02:00 INFO Worker: Running Spark version 2.1.1
17/06/09 17:02:00 INFO Worker: Spark home: /home/sumit/spark-2.1.1-bin-hadoop2.7
17/06/09 17:02:00 WARN Utils: Service 'WorkerUI' could not bind on port 8081. Attempting port 8082.
17/06/09 17:02:00 WARN Utils: Service 'WorkerUI' could not bind on port 8082. Attempting port 8083.
17/06/09 17:02:00 INFO Utils: Successfully started service 'WorkerUI' on port 8083.
17/06/09 17:02:00 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://192.168.1.16:8083
17/06/09 17:02:00 INFO Worker: Connecting to master localhost:7077...
17/06/09 17:02:00 WARN Worker: Failed to connect to master localhost:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
at org.apache.spark.deploy.worker.Worker$$anonfun$org$apache$spark$deploy$worker$Worker$$tryRegisterAllMasters$1$$anon$1.run(Worker.scala:218)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to connect to localhost/127.0.0.1:7077
at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:232)
at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:182)
at org.apache.spark.rpc.netty.NettyRpcEnv.createClient(NettyRpcEnv.scala:197)
at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:194)
at org.apache.spark.rpc.netty.Outbox$$anon$1.call(Outbox.scala:190)
... 4 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:7077
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:257)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:291)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:640)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
... 1 more
The worker has gone into a retry loop of registration attempts. I checked localhost:8080 and the master is working properly. Please suggest what can be done in this situation to get the worker up and running, because only then can the Spark job be run. Thank you.
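For completeness, this is the pattern I have been following, shown again below. The master URL is only an example based on what the master's web UI at localhost:8080 reports as its bind address (the log above suggests it may be the machine hostname or 192.168.1.16 rather than localhost), so it should be replaced with whatever spark:// URL actually appears there; this is just a sketch, not a confirmed fix.
./sbin/start-master.sh
# the master log / web UI shows the exact URL it is listening on, e.g. spark://sumit-Inspiron-5537:7077
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://sumit-Inspiron-5537:7077 -c 1 -m 512M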