Code - Java connection to the Hadoop HDFS file system throws an error

Problem Description

Java connecting to the Hadoop HDFS file system reports an error (bounty: 10C)
Error message:
java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "localhost.localdomain/127.0.0.1"; destination host is: "172.16.6.57":9000;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
at org.apache.hadoop.ipc.Client.call(Client.java:1229)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at $Proxy9.create(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at $Proxy9.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:193)
at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1324)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1343)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1255)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1212)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:276)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:265)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:781)
at com.zk.hdfs.FileCopyToHdfs.uploadToHdfs(FileCopyToHdfs.java:44)
at com.zk.hdfs.FileCopyToHdfs.main(FileCopyToHdfs.java:21)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.
at com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
at com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
at com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
at com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
at com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
at com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)

The code was found online:

package com.zk.hdfs;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class FileCopyToHdfs {

    public static void main(String[] args) throws Exception {
        try {
            uploadToHdfs();
            // deleteFromHdfs();
            // getDirectoryFromHdfs();
            // appendToHdfs();
            // readFromHdfs();
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        } finally {
            System.out.println("SUCCESS");
        }
    }
    /** Upload a file to HDFS. */
    public static void uploadToHdfs() throws FileNotFoundException, IOException {
        String localSrc = "e:/test.txt";
        String dst = "hdfs://172.16.6.57:9000/user/abc/zk/test1.txt";
        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        OutputStream out = fs.create(new Path(dst), new Progressable() {
            public void progress() {
                System.out.print(".");
            }
        });
        IOUtils.copyBytes(in, out, 4096, true);
    }

}
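The main method above also references commented-out helpers such as getDirectoryFromHdfs() and readFromHdfs(), whose bodies are not included in the post. Purely as an illustration of the same FileSystem API, a directory-listing helper could look roughly like the sketch below; the method body, the path, and the extra org.apache.hadoop.fs.FileStatus import are my own assumptions, not part of the original code.

    /** List the entries under an HDFS directory (sketch; the path is an assumption). */
    public static void getDirectoryFromHdfs() throws IOException {
        String dir = "hdfs://172.16.6.57:9000/user/abc/zk";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dir), conf);
        // listStatus returns one FileStatus per child entry of the directory
        for (FileStatus status : fs.listStatus(new Path(dir))) {
            System.out.println(status.getPath() + (status.isDirectory() ? " (dir)" : ""));
        }
        fs.close();
    }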

It keeps failing with this connection error and I cannot find anything useful online. Could someone experienced please help?

Solution

A Java implementation for accessing the Hadoop HDFS file system
Hadoop's HDFS file system
Uploading and downloading files on the Hadoop HDFS file system via the Java FileSystem API
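The recommendations above all revolve around the FileSystem API for upload and download. For reference, a minimal download counterpart to the uploadToHdfs() method in the question might look like the sketch below; it reuses the same imports as the class above, and the file path and namenode address are assumptions copied from the question, not taken from the linked articles.

    /** Read an HDFS file and copy it to stdout (sketch; the path is an assumption). */
    public static void readFromHdfs() throws IOException {
        String src = "hdfs://172.16.6.57:9000/user/abc/zk/test1.txt";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(src), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(src));                    // open the remote file
            IOUtils.copyBytes(in, System.out, 4096, false); // stream it to stdout, keep streams open
        } finally {
            IOUtils.closeStream(in);                        // close quietly
            fs.close();
        }
    }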

Solution 2:
Is the address in your connection URL the same as the machine's actual IP?
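One quick way to act on this suggestion is to check, from the machine running the Java client, whether 172.16.6.57:9000 (the host and port from the hdfs:// URI in the question) accepts TCP connections at all. A minimal sketch; the class name is mine and the timeout is arbitrary:

import java.net.InetSocketAddress;
import java.net.Socket;

public class CheckNamenodePort {
    public static void main(String[] args) {
        // Host and port taken from the hdfs:// URI in the question
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress("172.16.6.57", 9000), 5000);
            System.out.println("TCP connection to 172.16.6.57:9000 succeeded");
        } catch (Exception e) {
            System.out.println("Cannot reach 172.16.6.57:9000: " + e);
        }
    }
}

Note that a successful TCP connect only proves that something is listening on that port; it does not prove that the listener speaks the HDFS RPC protocol the client expects.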

Solution 3:
local host is: "localhost.localdomain/127.0.0.1"; destination host is: "172.16.6.57":9000; this looks like an IP/connection-address problem.
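Following up on Solution 3: the stack trace shows the client does reach a process at 172.16.6.57:9000, but the RPC response cannot be parsed as a protobuf message. In my experience (an assumption on my part, not something stated in the thread) that usually means either that 9000 is not the namenode RPC port configured in the cluster's core-site.xml, or that the client-side Hadoop jars do not match the cluster version. A small sketch for double-checking which address the client actually resolves, with the property value to be compared against the cluster's core-site.xml:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckDefaultFs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Must match fs.defaultFS (or fs.default.name on older releases) in core-site.xml;
        // 172.16.6.57:9000 is taken from the question and is an assumption here.
        conf.set("fs.defaultFS", "hdfs://172.16.6.57:9000");
        FileSystem fs = FileSystem.get(URI.create("hdfs://172.16.6.57:9000"), conf);
        System.out.println("Client resolved namenode URI: " + fs.getUri());
        System.out.println("Root directory exists: " + fs.exists(new Path("/")));
        fs.close();
    }
}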

Date: 2025-01-24 05:41:33
