Problem Description
A Scala program running on Win7 cannot connect to the Spark cluster running on Linux; it keeps failing with errors like this:

16/01/06 10:59:32 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/01/06 10:59:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/01/06 10:59:32 INFO SparkUI: Started SparkUI at http://Silence-PC:4040
16/01/06 10:59:32 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@10.1.170.201:7077/user/Master...
16/01/06 10:59:33 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster@10.1.170.201:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
16/01/06 10:59:52 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@10.1.170.201:7077/user/Master...
16/01/06 10:59:52 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster@10.1.170.201:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
16/01/06 11:00:12 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@10.1.170.201:7077/user/Master...
16/01/06 11:00:12 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster@10.1.170.201:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
16/01/06 11:00:32 ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
16/01/06 11:00:32 WARN SparkDeploySchedulerBackend: Application ID is not initialized yet.
16/01/06 11:00:32 ERROR TaskSchedulerImpl: Exiting due to error from cluster scheduler: All masters are unresponsive! Giving up.
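For reference, below is a minimal driver sketch of the setup the log implies: a standalone master at spark://10.1.170.201:7077 (taken from the log) reached from a Spark 1.x client. The "[Disassociated]" / "All masters are unresponsive" pattern commonly comes from the driver's Spark dependency not matching the cluster's version exactly, or from the master being unable to connect back to the Windows driver; the spark.driver.host value below is a hypothetical Win7 IP and must be replaced with an address the cluster can actually reach.

import org.apache.spark.{SparkConf, SparkContext}

object ConnectTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("ConnectTest")
      // Master URL from the log above; it must match the cluster exactly.
      .setMaster("spark://10.1.170.201:7077")
      // Hypothetical driver address: the master must be able to reach this host back.
      .set("spark.driver.host", "10.1.170.100")
    val sc = new SparkContext(conf)
    // Trivial job, only to confirm the application registers with the master.
    println(sc.parallelize(1 to 100).count())
    sc.stop()
  }
}

If this still fails with the same log, check that the spark-core version in the project's build matches the cluster and that no firewall on the Win7 side blocks the callback ports.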
Solutions
Solution 2:
I ran into this problem too. How do I fix it? My IDEA project reports exactly this error when connecting to the cluster.
Solution 3:
Try copying the XML configuration files you set up under $SPARK_HOME/conf/ into the project's src/main/resources directory (if your project is built with Maven). A quick way to confirm the copy worked is sketched below.
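In a Maven build, anything under src/main/resources is packaged onto the runtime classpath, which is how the Spark/Hadoop client code can pick the files up. The following sketch just checks that a copied file is visible as a classpath resource; "core-site.xml" is only a placeholder name, substitute whichever file you copied from $SPARK_HOME/conf/.

object ResourceCheck {
  def main(args: Array[String]): Unit = {
    // Look up the copied config file on the classpath.
    // "core-site.xml" is a hypothetical example name.
    val url = getClass.getResource("/core-site.xml")
    println(if (url == null) "config file NOT found on the classpath"
            else s"config file found at $url")
  }
}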