java.io.IOException: Call to Hadoop.Master/IP:port failed on local exception

Problem description

java.io.IOException: Call to Hadoop.Master/192.168.10.196:9111 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1144)
    at org.apache.hadoop.ipc.Client.call(Client.java:1112)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at com.sun.proxy.$Proxy5.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:411)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:135)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:276)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:241)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1411)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1429)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:238)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:372)
    at org.apache.hadoop.examples.WordCount.run(WordCount.java:75)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.WordCount.main(WordCount.java:85)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:841)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:786)

Solution

Solution 2:
Hi, the error your program reports is an EOFException, which means the end of a file or stream was reached unexpectedly during input. Check your program code; a short sketch of this failure mode follows below.
Solution 3:
Thanks, it's resolved. I found the cause of the error.
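For context on the EOFException named in Solution 2, the sketch below (not the original poster's code; the truncated byte array is invented purely for illustration) reproduces the root cause shown in the trace: DataInputStream.readInt() needs four bytes, and if the stream ends first it throws EOFException. Inside the Hadoop IPC client this usually means the NameNode closed the connection before a full response arrived, which is often a symptom of pointing the client at the wrong address/port or of a client/server version mismatch.

import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class EofDemo {
    public static void main(String[] args) throws IOException {
        // Only two bytes are available, so readInt() cannot assemble a 4-byte int.
        byte[] truncated = new byte[] {0x00, 0x01};
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(truncated))) {
            in.readInt(); // throws EOFException, just like Client$Connection.receiveResponse above
        } catch (EOFException e) {
            System.out.println("EOFException: the stream ended before a full value was read");
        }
    }
}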

Time: 2024-08-03 00:37:48

Related articles on java.io.IOException: Call to Hadoop.Master/IP:port failed on local exception

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

1: This problem has been discussed to death, but I'm recording it here as well so I can refresh my memory later: SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/E:/360Downloads/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J:
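A common workaround for the winutils.exe error named in the title (a sketch, not taken from this article) is to point the hadoop.home.dir system property, or the HADOOP_HOME environment variable, at a directory whose bin folder actually contains winutils.exe before any Hadoop filesystem code runs; the path below reuses the hadoop-2.4.1 directory from the log above and is only an assumption.

// Hypothetical workaround sketch: on Windows, Hadoop's Shell utility resolves
// <hadoop.home.dir>\bin\winutils.exe; when neither the property nor HADOOP_HOME
// is set, it reports the "null\bin\winutils.exe" path seen in the error message.
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // The directory must really contain bin\winutils.exe (assumed location).
        System.setProperty("hadoop.home.dir", "E:/360Downloads/hadoop-2.4.1");
        // ...then create the Hadoop Configuration / FileSystem as usual.
    }
}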

Hadoop error: java.io.IOException: Incompatible clusterIDs after reformatting the namenode

Error: java.io.IOException: Incompatible clusterIDs in /data/dfs/data: namenode clusterID = CID-d1448b9e-da0f-499e-b1d4-78cb18ecdebb; datanode clusterID = CID-ff0faa40-2940-4838-b321-98272eb0dee3 Cause: every namenode format creates a new namenodeId, while the data directory contains the
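To make the cause above concrete, here is a small sketch (the directory paths are assumptions, not from the article) that reads the clusterID recorded in the NameNode's and the DataNode's VERSION files; after a fresh namenode format the two values diverge, which is exactly what the Incompatible clusterIDs message reports.

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Objects;
import java.util.Properties;

public class ClusterIdCheck {
    static String clusterId(String versionFile) throws IOException {
        Properties p = new Properties();
        try (FileInputStream in = new FileInputStream(versionFile)) {
            p.load(in); // VERSION is a plain key=value file
        }
        return p.getProperty("clusterID");
    }

    public static void main(String[] args) throws IOException {
        String nn = clusterId("/data/dfs/name/current/VERSION"); // assumed NameNode metadata dir
        String dn = clusterId("/data/dfs/data/current/VERSION"); // data dir from the error message
        System.out.println("namenode clusterID = " + nn);
        System.out.println("datanode clusterID = " + dn);
        System.out.println(Objects.equals(nn, dn)
                ? "clusterIDs match"
                : "clusterIDs differ: sync the DataNode VERSION file or clear the data dir");
    }
}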

hive -- java.io.IOException: Cannot create an instance of InputSplit

Blog category: hadoop, hbase, hive. Running in hive: select * from ht_custmer; works fine; running: select * from ht_customer where b

Handling Caused by: java.io.IOException: Filesystem closed

org.apache.hadoop.hive.ql.metadata.HiveException: Unable to rename output from: hdfs://nameservice/user/hive/warehouse/om_dw.db/mac_wifi_day_data/tid=CYJOY/.hive-staging_hive_2016-01-20_10-19-09_200_1283758166994658237-1/_task_tmp.-ext-10002/c_date=2

java.io.IOException: Timed out waiting 20000ms for a quorum of nodes to respond

16-11-14 21:23:41,540 FATAL org.apache.hadoop.hdfs.server.namenode.FSEditLog: Error: starting log segment 4234 failed for required journal (JournalAndStream(mgr=QJM to [192.168.58.183:8485, 192.168.58.181:8485, 192.168.58.182:8485], stream=null))java

java.io.IOException keeps appearing in the Eclipse console

The Eclipse console keeps showing java.io.IOException: An established connection was aborted by the software in your host machine. Details below. [2013-09-02 17:24:14 - ddmlib] An established connection was aborted by the software in your host machine. java.io.IOException: An established connection was aborted by the software in your host machine. at sun.nio.ch.SocketDispatcher.write0(Native Method) at sun.nio.ch.SocketDispatcher.writ

JSP tags - How to resolve a java.io.IOException: tmpFile.renameTo(classFile) failed exception thrown in JSP

Problem description: How to resolve a java.io.IOException: tmpFile.renameTo(classFile) failed exception thrown in JSP. Solution: Are you sure it is thrown by the JSP rather than by the console? Solution 2: java.io.IOException: tmpFile.renameTo(classFile) failed java.io.IOException:

Execute failed: java.io.IOException: Cannot run program "sdk-linux/build-tools/22.0.0/aapt": error=2

When compiling and packaging an APK with ant on Linux, the following error appeared (with its solution): 1. /usr/local/android-sdk-linux/tools/ant/build.xml:698: Execute failed: java.io.IOException: Cannot run program "/usr/local/android-sdk-linux/build-tools/22.0.0/aapt": error=2, No such file or directory BUILD

java.io.IOException: Server returned HTTP response code: 505 for URL

Problem description: I ran into a very strange problem. The code is as follows: URL url = new URL(urlStr); URLConnection hpCon = url.openConnection(); InputStream in = hpCon.getInputStream(); The URL I then send is: http://localhost:8000/account/accountTo.query?param=SR4A V12f and it reports: java.io.IOException: Server returned HTTP
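The space inside "SR4A V12f" is the likely culprit: an unencoded space in the request line can make the server misparse everything after it as the HTTP version, and some servers then answer with 505. Below is a minimal sketch (reusing the URL and code from the question) that URL-encodes the query value before opening the connection.

import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import java.net.URLEncoder;

public class EncodedQuery {
    public static void main(String[] args) throws Exception {
        String value = "SR4A V12f";
        // Encode the parameter so the space becomes '+', keeping the request line valid.
        String urlStr = "http://localhost:8000/account/accountTo.query?param="
                + URLEncoder.encode(value, "UTF-8");
        URL url = new URL(urlStr);
        URLConnection hpCon = url.openConnection();
        try (InputStream in = hpCon.getInputStream()) {
            System.out.println("Opened response stream from " + urlStr);
        }
    }
}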