Problem description
Hadoop version: 2.4.0. JDK: 1.7.0.

Code inside the map method:

    FSDataOutputStream out = fs.create(tmpPath, true);
    if (!ftp.download(src_file, out)) {
        log.error(10014, "Failed to download file " + src_file + ", collection point: " + collectId);
        out.close();
    }

The error reported:

    org.apache.commons.net.io.CopyStreamException: IOException caught while copying.
        at org.apache.commons.net.io.Util.copyStream(Util.java:135)
        at org.apache.commons.net.ftp.FTPClient._retrieveFile(FTPClient.java:1695)
        at org.apache.commons.net.ftp.FTPClient.retrieveFile(FTPClient.java:1669)
        at com.eshore.icps.ftp.Ftp.download(Ftp.java:119)
        at com.eshore.icps.collect.CollectMapper.map(CollectMapper.java:228)
        at com.eshore.icps.collect.CollectMapper.run(CollectMapper.java:99)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
    Caused by: java.nio.channels.ClosedChannelException
        at org.apache.hadoop.hdfs.DFSOutputStream.checkClosed(DFSOutputStream.java:1528)
        at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:98)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:58)
        at java.io.DataOutputStream.write(DataOutputStream.java:107)
        at org.apache.commons.net.io.Util.copyStream(Util.java:123)
        ... 12 more

Some of the downloads do succeed. For example, out of 10 files to be downloaded, 2 complete successfully; then, when the 3rd one starts, the transfer stalls for a while and this error is thrown. It usually happens with large files, and the transfer speed is around 300 KB/s.
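The `Caused by` frame (`DFSOutputStream.checkClosed`) indicates that the HDFS output stream had already been closed by the time commons-net's copy loop tried to write the next buffer to it. As a point of reference, the same exception class can be reproduced with plain JDK classes by writing to a channel after closing it. This is only a hypothetical demo (`ClosedChannelDemo` is not from the original code, and it uses `java.nio.FileChannel` in place of `DFSOutputStream`):

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.ClosedChannelException;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Hypothetical demo class: reproduces the "Caused by" exception with plain JDK classes.
public class ClosedChannelDemo {
    // Closes a write channel, then writes to it; returns the simple name
    // of the exception that the write throws.
    static String tryWriteAfterClose() throws IOException {
        Path tmp = Files.createTempFile("demo", ".bin");
        try {
            FileChannel ch = FileChannel.open(tmp, StandardOpenOption.WRITE);
            ch.close(); // the stream is closed before the copy loop writes to it
            ch.write(ByteBuffer.wrap(new byte[]{1, 2, 3}));
            return "no exception";
        } catch (ClosedChannelException e) {
            return e.getClass().getSimpleName();
        } finally {
            Files.deleteIfExists(tmp);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(tryWriteAfterClose());
    }
}
```

This only mirrors the exception class, not the original trigger; in the map code above, the question is what closed the `FSDataOutputStream` (for example, a previous `out.close()` on a reused stream, or the stream's `FileSystem` being closed elsewhere) before `ftp.download` finished writing.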