Problem description
Question: I installed Spark 1.5.1, but it fails as soon as I start it. Could anyone help me figure this out? Error output:

[root@Master sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
failed to launch org.apache.spark.deploy.master.Master:
  /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../bin/spark-class: line 78: cygpath: command not found
  Error: Could not find or load main class org.apache.spark.deploy.master.Master
full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
Slave2.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Master.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Master.Hadoop.out
Slave1.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out
Slave2.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Slave2.Hadoop:   Error: Could not find or load main class org.apache.spark.launcher.Main
Slave2.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Master.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Master.Hadoop:   /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../bin/spark-class: line 78: cygpath: command not found
Master.Hadoop:   Error: Could not find or load main class org.apache.spark.deploy.worker.Worker
Master.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Master.Hadoop.out
Slave1.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Slave1.Hadoop:   Error: Could not find or load main class org.apache.spark.launcher.Main
Slave1.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out

Environment:

export JAVA_HOME=/usr/jdk1.7.0_05
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export JAVA_OPTS="-Xms512M -Xmx1024M"
export CATALINA_OPTS="-Djava.awt.headless=true"
HADOOP_HOME=/opt/spark/hadoop-2.7.1
SCALA_HOME=/usr/local/scala/scala-2.11.7
SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
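One thing worth noting about the environment above: `HADOOP_HOME`, `SCALA_HOME`, and `SPARK_HOME` are assigned without `export`, so child processes (including the Spark start scripts) may not see them. A minimal cleaned-up sketch, reusing the paths from the question:

```shell
# Cleaned-up environment sketch; paths copied from the question.
# The three variables below must be exported, otherwise scripts launched
# from this shell cannot read them.
export JAVA_HOME=/usr/jdk1.7.0_05
export JRE_HOME=$JAVA_HOME/jre
export HADOOP_HOME=/opt/spark/hadoop-2.7.1
export SCALA_HOME=/usr/local/scala/scala-2.11.7
export SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
```

Whether the missing `export` explains this particular failure is not confirmed in the thread; it is simply a latent problem in the posted configuration.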
Solutions
Solution 2:
Did you ever solve this? I'm hitting the same problem; any guidance would be appreciated.
Solution 3:
"failed to launch org.apache.spark.deploy.master.Master" means the master itself never came up, so everything downstream fails as well. Start by looking at the master's log.
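Following up on that: "cygpath: command not found" on a Linux box is odd, since cygpath exists only under Cygwin. One possible cause (an assumption, not confirmed in this thread) is that the scripts were damaged in transit, e.g. picked up Windows CRLF line endings. A self-contained sketch of detecting and stripping CRLF, which in practice you would run against $SPARK_HOME/bin/spark-class:

```shell
# Demo on a temp file so the sketch is self-contained; replace "$f" with
# the real script path (e.g. $SPARK_HOME/bin/spark-class) in practice.
f=$(mktemp)
printf 'echo hello\r\n' > "$f"   # simulate a CRLF-damaged script line
grep -q $'\r' "$f" && echo "CRLF line endings found"
sed -i 's/\r$//' "$f"            # strip trailing carriage returns in place
grep -q $'\r' "$f" || echo "clean"
rm -f "$f"
```

If the scripts really are damaged, re-extracting the original spark-1.5.1-bin-hadoop2.6 tarball on the Linux host is the simplest fix.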
Solution 4:
Try adding the Spark jar path to your CLASSPATH.
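A sketch of what that suggestion could look like. In pre-2.0 binary distributions the Spark classes ship as a single assembly jar under $SPARK_HOME/lib; the exact jar name below is an assumption, so verify it with `ls "$SPARK_HOME/lib"` first:

```shell
# SPARK_HOME copied from the question; the assembly jar name is typical
# for spark-1.5.1-bin-hadoop2.6 but must be checked against your lib/ dir.
export SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export CLASSPATH="$SPARK_HOME/lib/spark-assembly-1.5.1-hadoop2.6.0.jar:$CLASSPATH"
echo "$CLASSPATH"
```

Note that spark-class normally builds its own classpath via the launcher, so this is more of a diagnostic workaround than a root-cause fix.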
Solution 5:
OP, did you ever solve this? How did you fix it? Please share.
Solution 6:
Add SPARK_HOME/sbin to your PATH.
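That suggestion amounts to the following, using the SPARK_HOME path from the question (adjust to your install):

```shell
# Put the Spark admin scripts (start-all.sh, stop-all.sh, ...) on PATH.
export SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export PATH="$SPARK_HOME/sbin:$PATH"
```

This lets you run start-all.sh without the full path; the original error was not a missing script, though, so on its own it may not cure the launch failure.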