Building hadoop-2.6.0 on 64-bit Ubuntu fails; I've searched online for a long time with no fix. Help needed!

Problem description

Building hadoop-2.6.0 from source on 64-bit Ubuntu fails. I've looked around online for a long time without finding a solution. The build stops with:
[exec] CMake Error at /usr/local/share/cmake-2.6/Modules/FindPackageHandleStandardArgs.cmake:52 (MESSAGE):
[exec] Could NOT find ZLIB
[exec] Call Stack (most recent call first):
[exec] /usr/local/share/cmake-2.6/Modules/FindZLIB.cmake:22 (FIND_PACKAGE_HANDLE_STANDARD_ARGS)
[exec] CMakeLists.txt:107 (find_package)

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...... @ 4:139 in /home/cj/workspace/hadoop-2.6.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems you can resume the build with the command
[ERROR] mvn -rf :hadoop-common

The failure appears to stem from the "Could NOT find ZLIB" error above.
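CMake's FindZLIB module reports that error when the zlib development headers are absent. A minimal probe, assuming a Debian/Ubuntu layout where the header lives at /usr/include/zlib.h and is provided by the zlib1g-dev package:

```shell
# Probe for the header that CMake's FindZLIB module looks for.
# On Debian/Ubuntu it is provided by the zlib1g-dev package.
if [ -e /usr/include/zlib.h ]; then
  msg="zlib headers found"
else
  msg="zlib headers missing - try: sudo apt-get install zlib1g-dev"
fi
echo "$msg"
```

If the header is missing, installing zlib1g-dev and re-running mvn is the usual first step; the package name is an assumption for Debian-family systems.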

Solution 1:

Try reinstalling findbugs.
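The native build profile also looks for a few tools through environment variables, FindBugs among them. A small check, before re-running the build, that FINDBUGS_HOME points somewhere real (the example path is purely illustrative):

```shell
# Check whether FINDBUGS_HOME points at an existing FindBugs install.
# The suggested path is an example only - adjust to your system.
if [ -n "$FINDBUGS_HOME" ] && [ -d "$FINDBUGS_HOME" ]; then
  status="FINDBUGS_HOME ok: $FINDBUGS_HOME"
else
  status="FINDBUGS_HOME unset - e.g. export FINDBUGS_HOME=/usr/local/findbugs"
fi
echo "$status"
```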

Solution 2:
OP, how did you end up solving this?

Solution 3:
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.381 s]
[INFO] Apache Hadoop Project POM .......................... FAILURE [ 0.416 s]
[INFO] Apache Hadoop Annotations .......................... SKIPPED
[INFO] Apache Hadoop Assemblies ........................... SKIPPED
[INFO] Apache Hadoop Project Dist POM ..................... SKIPPED
[INFO] Apache Hadoop Maven Plugins ........................ SKIPPED
[INFO] Apache Hadoop MiniKDC .............................. SKIPPED
[INFO] Apache Hadoop Auth ................................. SKIPPED
[INFO] Apache Hadoop Auth Examples ........................ SKIPPED
[INFO] Apache Hadoop Common ............................... SKIPPED
[INFO] Apache Hadoop NFS .................................. SKIPPED
[INFO] Apache Hadoop KMS .................................. SKIPPED
[INFO] Apache Hadoop Common Project ....................... SKIPPED
[INFO] Apache Hadoop HDFS ................................. SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.966 s
[INFO] Finished at: 2015-11-22T09:13:36+08:00
[INFO] Final Memory: 46M/361M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /usr/local/hadoop-2.6.0-src/hadoop-project/target/antrun/build-main.xml (No such file or directory) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems you can resume the build with the command
[ERROR] mvn -rf :hadoop-project
I don't know how the OP solved this either. Many thanks in advance.

Solution 4:

This usually happens because files the build needs were not fully downloaded: the build fetches some artifacts from the network automatically, and your download may have been interrupted partway through. It is also possible that the Hadoop source archive itself is corrupted or incomplete; download it again and retry. If nothing else works, here is a link. The native libraries on the official site have since switched from 32-bit to 64-bit, so where you previously had to compile the 64-bit libraries yourself, it is now the 32-bit ones that need building. The link below is a 32-bit build compiled by someone else.
http://pan.baidu.com/s/1sjsN0T3
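To check whether a given native library (one from a link like the above, or the one bundled with an official release) was built for 32-bit or 64-bit, you can read the ELF class byte directly. The library path below is illustrative; /bin/sh is used only as a stand-in binary:

```shell
# Byte 5 of an ELF file (EI_CLASS) is 01 for 32-bit, 02 for 64-bit.
# /bin/sh is a stand-in; point 'lib' at e.g.
# $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 instead.
lib=/bin/sh
class=$(od -An -tx1 -j4 -N1 "$lib" | tr -d ' ')
case "$class" in
  01) echo "32-bit ELF" ;;
  02) echo "64-bit ELF" ;;
  *)  echo "not an ELF file" ;;
esac
```

Loading a 32-bit libhadoop.so into a 64-bit JVM (or vice versa) silently falls back to the built-in Java classes or fails outright, so this check is worth doing before suspecting the build itself.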

Posted: 2024-10-22 06:43:17
