java.net.BindException is a common exception thrown when Spark tries to initialize the SparkContext. It is especially common when you run Spark locally.
16/01/04 13:49:40 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)
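The failure typically shows up the moment the driver starts, for example when launching a local shell. An illustrative session (spark-shell and the --master local[2] flag are standard Spark usage; this exact run is not taken from the post):

[osboxes@wk1 ~]$ spark-shell --master local[2]
...
16/01/04 13:49:40 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!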
Reason
The most common reason is that Spark is trying to bind the driver service to localhost (that is, your computer) and is not able to do so.
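A quick way to confirm this diagnosis is to check whether your machine's hostname resolves at all. A minimal check, assuming a Linux shell like the one used in this post (hostname and ping are standard utilities; the output shown is illustrative):

[osboxes@wk1 ~]$ ping -c 1 $(hostname)
ping: unknown host wk1.hirw.com

If the lookup fails like this, Spark will hit the same resolution problem when it tries to bind the driver.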
Solution
Find the hostname of your computer and add it to /etc/hosts.
Find hostname
The hostname command will print your machine's hostname:
[osboxes@wk1 ~]$ hostname
Wk1.hirw.com
Add hostname to hosts file
Add an entry to your /etc/hosts file like the one below:
[osboxes@wk1 ~]$ cat /etc/hosts
127.0.0.1   wk1.hirw.com
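If you prefer a one-liner, you can append the entry directly instead of editing the file by hand. A sketch, assuming you have sudo access (tee -a appends to the file, and $(hostname) picks up the name found in the previous step):

[osboxes@wk1 ~]$ echo "127.0.0.1   $(hostname)" | sudo tee -a /etc/hosts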
If you are using Windows, the hosts file is under C:\Windows\System32\drivers\etc.
With this entry in place, when Spark looks up your hostname it will resolve to 127.0.0.1, and it will be able to bind to that address.
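If editing /etc/hosts is not an option, Spark can also be told which address to bind to explicitly. Two alternatives from the Spark documentation, not from this post: the SPARK_LOCAL_IP environment variable, and the spark.driver.bindAddress property (available in newer Spark versions):

[osboxes@wk1 ~]$ export SPARK_LOCAL_IP=127.0.0.1
[osboxes@wk1 ~]$ spark-shell

or

[osboxes@wk1 ~]$ spark-shell --conf spark.driver.bindAddress=127.0.0.1

Either approach forces the driver to bind to the loopback address instead of relying on hostname resolution.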