I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using
"./bin/spark-shell".
It fails with the error below. I also tried installing different versions of Spark, but they all produce the same error. This is the second time I'm running Spark; my previous run worked fine.
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.6.0
/_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:444)
at sun.nio.ch.Net.bind(Net.java:436)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:444)
at sun.nio.ch.Net.bind(Net.java:436)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
java.lang.NullPointerException
at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:16: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:16: error: not found: value sqlContext
import sqlContext.sql
Then I added
export SPARK_LOCAL_IP="127.0.0.1"
to spark-env.sh, and the error changed to:
ERROR : No route to host
java.net.ConnectException: No route to host
at java.net.Inet6AddressImpl.isReachable0(Native Method)
at java.net.Inet6AddressImpl.isReachable(Inet6AddressImpl.java:77)
at java.net.InetAddress.isReachable(InetAddress.java:475)
...
<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql
The following steps might help:
Get your hostname by using the "hostname" command.
Make an entry in the /etc/hosts file for your hostname, if it's not already present, as follows:
127.0.0.1 your_hostname
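To see what address the JVM (and therefore Spark) resolves for your hostname, a small diagnostic sketch like the one below can confirm whether the /etc/hosts entry took effect (this Scala snippet is my addition, not part of the original answer):

import java.net.InetAddress

// Prints the local hostname and the address it resolves to. If the
// /etc/hosts entry is missing or stale, getLocalHost may throw an
// UnknownHostException or resolve to an unreachable address, which is
// exactly what makes the 'sparkDriver' bind fail.
object HostCheck extends App {
  val local = InetAddress.getLocalHost
  println(s"hostname: ${local.getHostName}")
  println(s"resolved: ${local.getHostAddress}")
}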
I always get that when switching between networks. This solves it:
$ sudo hostname -s 127.0.0.1
I've built it from the current master branch at version 2.0.0-SNAPSHOT. After adding export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh it worked for me. I'm using macOS 10.10.5, so it could be a version issue?
Just set spark.driver.host to localhost if you use an IDE:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf()
    .setMaster("local[2]")
    .setAppName("AnyName")
    .set("spark.driver.host", "localhost");
JavaSparkContext sc = new JavaSparkContext(conf);
I think there are two errors:
Your Spark local IP was not correct and needs to be changed to 127.0.0.1.
You didn't define sqlContext properly.
For 1. I tried:
1) exported SPARK_LOCAL_IP="127.0.0.1" in ~/.bash_profile
2) added export SPARK_LOCAL_IP="127.0.0.1" in load-spark-env.sh under $SPARK_HOME
But neither worked. Then I tried the following and it worked:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().
  setAppName("SparkExample").
  setMaster("local[*]").
  set("spark.driver.bindAddress", "127.0.0.1")
val sc = new SparkContext(conf)
For 2. you can try:

import org.apache.spark.sql.SparkSession

val sqlContext = SparkSession.builder.config("spark.master", "local[*]").getOrCreate()

and then import sqlContext.implicits._
The builder in SparkSession will automatically use the existing SparkContext if there is one; otherwise it will create one. You can also create one explicitly if necessary.
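A short sketch of that reuse behavior (my illustration, with a hypothetical app name):

import org.apache.spark.sql.SparkSession

// getOrCreate() returns the active session when one exists, so both
// handles below share the same underlying SparkContext.
val first = SparkSession.builder.master("local[*]").appName("demo").getOrCreate()
val second = SparkSession.builder.getOrCreate()
assert(first.sparkContext eq second.sparkContext)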
If you don't want to change the hostname of your Mac, you can do the following:
Find the template file spark-env.sh.template on your machine (it is probably in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/).
cp spark-env.sh.template spark-env.sh
Add export SPARK_LOCAL_IP=127.0.0.1 under the comment for the local IP.
Start spark-shell and enjoy.
If you are running Scala code in an IDE and face the same issue, and you are not using SparkConf() as pointed out above but SparkSession(), then you can bind the localhost address as follows. Note that set() only works on SparkConf(); with SparkSession() you should use .config() to set the Spark configuration, as shown below:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("CSE512-Phase1")
  .master("local[*]")
  .config("spark.driver.bindAddress", "localhost")
  .getOrCreate()
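To confirm the setting was picked up, a quick check (my addition, not from the original answer):

// Reads the driver's configuration back from the running SparkContext.
println(spark.sparkContext.getConf.get("spark.driver.bindAddress")) // expect "localhost"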
On a Mac, add the following to your .bash_profile:
export SPARK_LOCAL_IP=127.0.0.1
This happens when you switch between different networks (VPN - PROD, CI, etc., based on your company's networks for accessing different environments).
I had the same issue whenever I switched VPNs.
Update /etc/hosts (with sudo) with the hostname value on your Mac.
Sometimes a firewall prevents creating and binding a socket. Make sure your firewall is not enabled. You should also check the IP of your machine in /etc/hosts and make sure it's OK, then try again:
sudo ufw disable
JavaSparkContext sparkContext = new JavaSparkContext("local[4]", "Appname");
export SPARK_LOCAL_IP=127.0.0.1
Just doing the above worked for me.
On a Mac, check the IP in System Preferences -> Network -> click the Wi-Fi you are connected to (it should show a green icon) -> check the IP just above your network name.
Make the following entry in ../conf/spark-env.sh:
SPARK_MASTER_HOST=<<your-ip>>
SPARK_LOCAL_IP=<<your-ip>>
and then try spark-shell.
Making the above changes worked for me.
This happens due to a network switch, whenever you change networks while working with Spark.
For an instant fix, set "spark.driver.bindAddress" to "localhost" or "127.0.0.1" in your Spark application:
import org.apache.spark.{SparkConf, SparkContext}

// Create a SparkContext using every core of the local machine
val confSpark = new SparkConf().set("spark.driver.bindAddress", "localhost")
val sc = new SparkContext("local[*]", "appname", conf = confSpark)
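If the bind address took effect, a trivial job should now run; a quick smoke test (my addition, not from the original answer):

// Should print 500500.0 once the driver has started successfully.
println(sc.parallelize(1 to 1000).sum())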
Related
I have installed Apache Spark by following these instructions. When I get to step 5, where I have to execute start-master.sh in the terminal, I get the following output:
21/09/25 12:41:33 WARN Utils: Your hostname, petar-X580VD resolves to a loopback address: 127.0.1.1; using 192.168.0.105 instead (on interface wlp3s0)
21/09/25 12:41:33 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
at org.apache.spark.internal.config.package$.<init>(package.scala:1095)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
at scala.Option.orElse(Option.scala:447)
at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:1022)
at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:1022)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module #4434095f
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
... 13 more
I don't know how to fix this.
As @werner suggested in the comments, changing to Java version 11 fixed the problem. (The InaccessibleObjectException comes from the Java module system blocking Spark's reflective access to java.nio; Spark releases of that era run on Java 8 or 11, while newer JDKs enforce module encapsulation by default.)
I installed Spark 2.3.0 on Ubuntu 18.04 with Java 1.8, following these steps:
https://github.com/ashishtam/apache-spark-multi-node-installation/blob/master/index.md
I have a simple cluster made of 1 master and 1 slave, the /etc/hosts file being
# For Spark cluster
172.16.10.20 master
172.16.10.30 slave01
#127.0.0.1 localhost.localdomain localhost
::1 localhost6.localdomain6 localhost6
# The following lines are desirable for IPv6 capable hosts
::1 localhost ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
ff02::3 ip6-allhosts
Once Spark was installed and configured, I started the master and the slaves:
./sbin/start-master.sh
./sbin/start-slaves.sh spark://master-pc:7077
Checking the status with jps on the master node:
> jps
13440 Master
13701 Worker
13963 Jps
and on the slave node:
> jps
4202 Master
4332 Worker
4958 Jps
On the master node, when opening the spark-shell I got:
18/08/30 19:12:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://master:4040
Spark context available as 'sc' (master = spark://master:7077, app id = app-20180830191300-0000).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.3.0
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)
Type in expressions to have them evaluated.
Type :help for more information.
scala> Exception in thread "main" java.io.IOException: Resource temporarily unavailable
at java.io.FileInputStream.read0(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:207)
at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:169)
at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:137)
at jline.internal.NonBlockingInputStream.read(NonBlockingInputStream.java:246)
at jline.internal.InputStreamReader.read(InputStreamReader.java:261)
at jline.internal.InputStreamReader.read(InputStreamReader.java:198)
at jline.console.ConsoleReader.readCharacter(ConsoleReader.java:2145)
at jline.console.ConsoleReader.readLine(ConsoleReader.java:2349)
at jline.console.ConsoleReader.readLine(ConsoleReader.java:2269)
at scala.tools.nsc.interpreter.jline.InteractiveReader.readOneLine(JLineReader.scala:57)
at scala.tools.nsc.interpreter.InteractiveReader$class.readLine(InteractiveReader.scala:38)
at scala.tools.nsc.interpreter.jline.InteractiveReader.readLine(JLineReader.scala:28)
at scala.tools.nsc.interpreter.ILoop.readOneLine(ILoop.scala:404)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:413)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:923)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:76)
at org.apache.spark.repl.Main$.main(Main.scala:56)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
and it broke...
Out of curiosity, I did the same on the slave node:
18/08/30 19:18:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
18/08/30 19:18:43 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.3.0
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
I ended up with a Spark shell, but with errors about the Spark driver.
I have checked the IP in /etc/hosts, the SPARK_HOME variable on the master node, the configuration files of Spark, the ssh keys...
The Spark local IP is also defined (for the master node) as:
export SPARK_LOCAL_IP=172.16.10.20
Does anyone see what I am doing wrong here?
The aim is to get started with Spark by executing some examples and investigating the output.
I have cloned the Apache Spark repository, built it following the instructions in the README, and ran ./bin/spark-shell, which results in:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
16/11/10 08:47:48 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:505)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:490)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:745)
16/11/10 08:47:48 ERROR SparkContext: Error stopping SparkContext after init error.
java.lang.NullPointerException
at org.apache.spark.SparkContext.stop(SparkContext.scala:1764)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:591)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2309)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:843)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:835)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:835)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:101)
at $line3.$read$$iw$$iw.<init>(<console>:15)
at $line3.$read$$iw.<init>(<console>:42)
at $line3.$read.<init>(<console>:44)
at $line3.$read$.<init>(<console>:48)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.$print$lzycompute(<console>:7)
at $line3.$eval$.$print(<console>:6)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:68)
at org.apache.spark.repl.Main$.main(Main.scala:51)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:505)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:490)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:745)
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.1.0-SNAPSHOT
/_/
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_92)
Type in expressions to have them evaluated.
Type :help for more information.
Running one of the examples fails as well:
scala> sc.parallelize(1 to 1000).count()
<console>:18: error: not found: value sc
sc.parallelize(1 to 1000).count()
Try the following two directions:
1) Help Spark find your IP:
If your hostname isn't included in /etc/hosts, add it:
127.0.0.1 your_hostname
Set the environment variable SPARK_LOCAL_IP="127.0.0.1".
2) If one exists, kill any old Spark process.
You need to set the environment variable SPARK_LOCAL_IP="127.0.0.1",
or run:
export SPARK_LOCAL_IP="127.0.0.1"
If this doesn't solve your problem, run the command hostname in a terminal and make sure that hostname exists in your /etc/hosts file.
This solution is not for someone running a cluster of their own machines, but for the Cloudera VM. While working on the Cloudera VM, I was getting the same error and the above solution didn't work, because when I ran
cat /etc/hosts
it showed
192.168.245.*** quickstart.cloudera
and this IP was different from the actual IP my VM was running on (you can check that with ifconfig).
The solution is to shut down the VM and restart it.
I am running a Kafka server. (When I use the command bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning, it gives me all my data in my topic.)
When I want to test the example JavaDirectKafkaWordCount in Spark in order to understand how it works, I get the following error:
$ ./run-example streaming.JavaDirectKafkaWordCount localhost:2181 test
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/08/17 11:19:33 INFO StreamingExamples: Setting log level to [WARN] for streaming example. To override add a custom log4j.properties to the classpath.
16/08/17 11:19:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/17 11:19:33 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 10.66.212.132 instead (on interface enp5s0)
16/08/17 11:19:33 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Exception in thread "main" org.apache.spark.SparkException: java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
at scala.util.Either.fold(Either.scala:97)
at org.apache.spark.streaming.kafka.KafkaCluster$.checkErrors(KafkaCluster.scala:365)
at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:222)
at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:607)
at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
at org.apache.spark.examples.streaming.JavaDirectKafkaWordCount.main(JavaDirectKafkaWordCount.java:71)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I would like to know what the error means and how I could possibly solve it.
Thank you very much for your attention and your help.
Joe
I used the following command to run the Spark Java word count example:
time spark-submit --deploy-mode cluster --master spark://192.168.0.7:6066 --class org.apache.spark.examples.JavaWordCount /home/pi/Desktop/example/new/target/javaword.jar /books_50.txt
When I run it, the following is the output:
Running Spark using the REST application submission protocol.
16/07/18 03:55:41 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submission successfully created as driver-20160718035543-0000. Polling submission state...
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Submitting a request for the status of submission driver-20160718035543-0000 in spark://192.168.0.7:6066.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: State of driver driver-20160718035543-0000 is now RUNNING.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Driver is running on worker worker-20160718041005-192.168.0.12-42405 at 192.168.0.12:42405.
16/07/18 03:55:44 INFO rest.RestSubmissionClient: Server responded with CreateSubmissionResponse:
{
"action" : "CreateSubmissionResponse",
"message" : "Driver successfully submitted as driver-20160718035543-0000",
"serverSparkVersion" : "1.6.2",
"submissionId" : "driver-20160718035543-0000",
"success" : true
}
I checked the particular worker (192.168.0.12) for its log, and it says:
Launch Command: "/usr/lib/jvm/jdk-8-oracle-arm32-vfp-hflt/jre/bin/java" "-cp" "/opt/spark/conf/:/opt/spark/lib/spark-assembly-1.6.2-hadoop2.6.0.jar:/opt/spark/lib/datanucleus-api-jdo-3.2.6.jar:/opt/spark/lib/datanucleus-core-3.2.10.jar:/opt/spark/lib/datanucleus-rdbms-3.2.9.jar" "-Xms1024M" "-Xmx1024M" "-Dspark.driver.supervise=false" "-Dspark.app.name=org.apache.spark.examples.JavaWordCount" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/home/pi/Desktop/example/new/target/javaword.jar" "-Dspark.master=spark://192.168.0.7:7077" "-Dspark.executor.memory=10M" "org.apache.spark.deploy.worker.DriverWrapper" "spark://Worker#192.168.0.12:42405" "/opt/spark/work/driver-20160718035543-0000/javaword.jar" "org.apache.spark.examples.JavaWordCount" "/books_50.txt"
========================================
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/07/18 04:10:58 INFO SecurityManager: Changing view acls to: pi
16/07/18 04:10:58 INFO SecurityManager: Changing modify acls to: pi
16/07/18 04:10:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pi); users with modify permissions: Set(pi)
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
16/07/18 04:11:00 WARN Utils: Service 'Driver' could not bind on port 0. Attempting port 1.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'Driver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'Driver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
My spark-env.sh file (for the master) contains:
export SPARK_MASTER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"
My spark-env.sh file (for the worker) contains:
export SPARK_WORKER_WEBUI_PORT="8080"
export SPARK_MASTER_IP="192.168.0.7"
export SPARK_EXECUTOR_MEMORY="10M"
Please help...!!
I had the same issue when trying to run the shell, and was able to get this working by setting the SPARK_LOCAL_IP environment variable. You can assign this from the command line when running the shell:
SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell
For a more permanent solution, create a spark-env.sh file in the conf directory of your Spark root. Add the following line:
SPARK_LOCAL_IP=127.0.0.1
Give execute permissions to the script using chmod +x ./conf/spark-env.sh, and this will set this environment variable by default.
I am using Maven/SBT to manage dependencies, and the Spark core is contained in a jar file.
You can override SPARK_LOCAL_IP at runtime by setting "spark.driver.bindAddress" (here in Scala):
import org.apache.spark.{SparkConf, SparkContext}

val config = new SparkConf()
config.setMaster("local[*]")
config.setAppName("Test App")
config.set("spark.driver.bindAddress", "127.0.0.1")
val sc = new SparkContext(config)
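For context (my note, not from the original answer): spark.driver.bindAddress, available since Spark 2.1, is the local address the driver binds its sockets to, while spark.driver.host is the address it advertises to executors. You can read the resolved values back after startup:

// Hedged check: confirms which address the driver actually bound.
println(sc.getConf.get("spark.driver.bindAddress"))       // expect "127.0.0.1"
println(sc.getConf.get("spark.driver.host", "<not set>")) // advertised address, if any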
I also had this issue.
The reason (for me) was that the IP of my local system was not reachable from my local system.
I know that statement makes no sense, but please read the following.
My system name (uname -n) shows that my system is named "sparkmaster".
In my /etc/hosts file, I had assigned a fixed IP address, "192.168.1.70", to the sparkmaster system. There were additional fixed IP addresses for sparknode01 and sparknode02 at ...1.71 and ...1.72 respectively.
Due to some other problems I had, I needed to change all of my network adapters to DHCP. This meant that they were getting addresses like 192.168.90.123.
The DHCP addresses were not in the same network as the ...1.70 range, and there was no route configured.
When Spark starts, it seems to want to connect to the host named by uname (i.e. sparkmaster in my case). This was the IP 192.168.1.70, but there was no way to connect to it because that address was on an unreachable network.
My solution was to change one of my Ethernet adapters back to a fixed static address (i.e. 192.168.1.70) and voila - problem solved.
So the issue seems to be that when Spark starts in "local mode" it attempts to connect to a system named after your system's name (rather than localhost).
I guess this makes sense if you want to set up a cluster (like I did), but it can result in the above confusing message.
Possibly putting your system's host name on the 127.0.0.1 entry in /etc/hosts may also solve this problem, but I did not try it.
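A check along those lines (a diagnostic sketch I'm adding; "sparkmaster" stands in for your own hostname) shows whether the name your system resolves is actually reachable, which is effectively the test Spark performs at startup:

import java.net.InetAddress

// Resolves the hostname and pings it with a 2-second timeout; an
// unreachable address here matches the "No route to host" failure above.
val addr = InetAddress.getByName("sparkmaster")
println(s"resolves to ${addr.getHostAddress}, reachable: ${addr.isReachable(2000)}")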
You need to enter the hostname in your /etc/hosts file.
Something like:
127.0.0.1 localhost "hostname"
This is possibly a duplicate of Spark 1.2.1 standalone cluster mode spark-submit is not working.
I have tried the same steps but was able to run the job. Kindly post the full spark-env.sh and spark-defaults if possible.
I had this problem, and it was because I had replaced the real IP with my IP in /etc/hosts.
This issue is related to the IP address alone. Error messages in the log file are not informative.
Check with the following 3 steps:
Check your IP address - it can be checked with the ifconfig or ip commands. If your service is not a public service, IP addresses starting with 192.168 should be good enough. 127.0.0.1 cannot be used if you are planning a cluster.
Check your environment variable SPARK_MASTER_HOST - make sure there are no typos in the name of the variable or the actual IP address.
env | grep SPARK_
Check that the port you are planning to use for sparkMaster is free, with the netstat command. Do not use a port below 1024. For example:
netstat -a | grep 9123
After your sparkMaster starts running, if you are not able to see the web UI from a different machine, open the web UI port with the iptables command.
Use the following with DataFrames:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder
  .appName("BinarizerExample")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()
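As a quick check that the session came up without the bind error (a sketch I'm adding; the sample data is made up):

import spark.implicits._

// Builds a tiny DataFrame and prints it; if the driver could not bind,
// the failure would already have occurred at getOrCreate().
val df = Seq((0, 0.1), (1, 0.8), (2, 0.2)).toDF("id", "feature")
df.show()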
First option:
The following steps might help:
Get your hostname by using the "hostname" command:
xxxxxx.ssssss (e) base ~ hostname
xxxxxx.ssssss.net
Make an entry in the /etc/hosts file for your hostname, if it's not present, as follows:
127.0.0.1 xxxxxx.ssssss.net
Second option:
You can set the spark.driver.bindAddress value in your spark.conf file:
spark.driver.bindAddress=127.0.0.1
Thanks!!
I solved this problem by modifying the slaves file at spark-2.4.0-bin-hadoop2.7/conf/slaves.
Please check your configuration.