Problem description
My cluster has 3 machines: a master with 2 GB of RAM and two workers with 1 GB each. Could this failure be caused by my memory being too small? Thanks. The error output is as follows:

[hadoop@master spark-1.3.0-bin-hadoop2.4]$ ./bin/spark-submit --class SimpleApp --master spark://172.21.7.182:7077 ~/spark_wordcount/target/scala-2.10/simple-project_2.10-1.0.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/04/22 14:30:30 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
15/04/22 14:30:32 INFO SparkDeploySchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@bananapi:38979/user/Executor#266790798] with ID 1
15/04/22 14:30:32 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:32 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:32 INFO SparkDeploySchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@bananapi:43806/user/Executor#-850130035] with ID 0
15/04/22 14:30:33 INFO BlockManagerMasterActor: Registering block manager bananapi:60321 with 267.3 MB RAM, BlockManagerId(1, bananapi, 60321)
15/04/22 14:30:33 INFO BlockManagerMasterActor: Registering block manager bananapi:51018 with 267.3 MB RAM, BlockManagerId(0, bananapi, 51018)
15/04/22 14:30:34 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, bananapi): java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1155)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:164)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:87)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:58)
    at org.apache.spark.scheduler.Task.run(Task.scala:64)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:68)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
    at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:166)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1152)
    ... 11 more
Caused by: java.lang.IllegalArgumentException
    at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
    ... 20 more
15/04/22 14:30:34 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on executor bananapi: java.io.IOException (java.lang.reflect.InvocationTargetException) [duplicate 1]
15/04/22 14:30:34 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:34 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:34 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on executor bananapi: java.io.IOException (java.lang.reflect.InvocationTargetException) [duplicate 2]
15/04/22 14:30:34 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 4, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:35 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on executor bananapi: java.io.IOException (java.lang.reflect.InvocationTargetException) [duplicate 3]
15/04/22 14:30:35 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 5, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:35 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4) on executor bananapi: java.io.IOException (java.lang.reflect.InvocationTargetException) [duplicate 4]
15/04/22 14:30:35 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 6, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:35 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 5) on executor bananapi: java.io.IOException (java.lang.reflect.InvocationTargetException) [duplicate 5]
15/04/22 14:30:35 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 7, bananapi, PROCESS_LOCAL, 1389 bytes)
15/04/22 14:30:35 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 6) on executor bananapi: java.io.IOException (java.lang.reflect.InvocationTargetException) [duplicate 6]
15/04/22 14:30:35 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
15/04/22 14:30:35 INFO TaskSchedulerImpl: Cancelling stage 0
15/04/22 14:30:35 INFO TaskSchedulerImpl: Stage 0 was cancelled
15/04/22 14:30:35 INFO DAGScheduler: Job 0 failed: count at SimpleApp.scala:11, took 20.358100 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, bananapi): java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1155)
    at org.apache.spark.broadcast.TorrentBroadcast.readBroadcastBlock(TorrentBroadcast.scala:164)
    at org.apache.spark.broadcast.TorrentBroadcast._value$lzycompute(TorrentBroadcast.scala:64)
    at org.apache.spark.broadcast.TorrentBroadcast._value(TorrentBroadcast.scala:64)
    at org.apache.spark.broadcast.TorrentBroadcast.getValue(TorrentBroadcast.scala:87)
    at org.apache.spark.broadcast.Broadcast.value(Broadcast.scala:70)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:58)
    at org.apache.spark.scheduler.Task.run(Task.scala:64)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:68)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
    at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
    at org.apache.spark.broadcast.TorrentBroadcast$$anonfun$readBroadcastBlock$1.apply(TorrentBroadcast.scala:166)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1152)
    ... 11 more
Caused by: java.lang.IllegalArgumentException
    at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
    ... 20 more
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1203)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1192)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1191)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1191)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:693)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:693)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1393)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1354)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
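Judging from the innermost cause (a java.lang.IllegalArgumentException thrown in SnappyCompressionCodec.<init>), the executors are failing to initialize the Snappy compression codec rather than running out of memory; on ARM boards such as the Banana Pi, the native library bundled with snappy-java often fails to load. One common workaround, sketched here under the assumption of a standard Spark 1.3 standalone setup like the one in the log, is to switch broadcast/shuffle compression to a pure-JVM codec:

```shell
# Workaround sketch: switch Spark's compression codec away from Snappy
# (whose native library may not load on ARM) to the JVM-only lzf codec.
./bin/spark-submit \
  --class SimpleApp \
  --master spark://172.21.7.182:7077 \
  --conf spark.io.compression.codec=lzf \
  ~/spark_wordcount/target/scala-2.10/simple-project_2.10-1.0.jar
```

The same setting can also be made permanent by adding the line `spark.io.compression.codec lzf` to conf/spark-defaults.conf on the driver machine.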
Solutions
Solution 2:
1 GB is indeed a bit small. But problems like this are usually caused by missing dependencies: things that run fine in the shell become much stricter under spark-submit. When I submit jobs I often hit errors similar to yours; most come down to missing jars, and sometimes you also need to add extra parameters or explicitly specify jars on the spark-submit command line. It also depends on how you deployed the cluster, since the commands differ by deploy mode. I am fairly new to this myself, so take these points only as hints.
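To make the points above concrete, both missing jars and tight memory can be addressed directly on the spark-submit command line; this is only an illustrative sketch, and the dependency jar paths below are made-up placeholders:

```shell
# Sketch: ship extra dependency jars to the executors with --jars, and cap
# driver/executor memory so the 1 GB workers are not over-committed.
# (/path/to/dep1.jar and /path/to/dep2.jar are hypothetical examples.)
./bin/spark-submit \
  --class SimpleApp \
  --master spark://172.21.7.182:7077 \
  --jars /path/to/dep1.jar,/path/to/dep2.jar \
  --driver-memory 512m \
  --executor-memory 512m \
  ~/spark_wordcount/target/scala-2.10/simple-project_2.10-1.0.jar
```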
Solution 3:
java.io.IOException is an I/O exception — could your program be failing while reading or writing a file?