Compiling Hadoop 2.6.0 on 64-bit CentOS

The hadoop-2.6.0.tar.gz release tarball was compiled on a 32-bit machine, so a 64-bit machine fails when loading the native .so libraries, with errors such as:

java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V

Hadoop therefore needs to be recompiled from source.
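You can confirm the architecture mismatch yourself before rebuilding; a quick sketch (the install path /usr/local/hadoop-2.6.0 is an assumption, adjust it to your layout):

```shell
# Print the machine architecture; x86_64 indicates a 64-bit host.
arch=$(uname -m)
echo "machine: $arch"

# Path to the bundled native library (assumed install location, adjust as needed).
lib=/usr/local/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0
if [ -f "$lib" ]; then
    # "ELF 32-bit" in this output on an x86_64 host confirms the need to rebuild.
    file "$lib"
fi
```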

1. Build environment

yum install cmake lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst

2. Install the JDK (download JDK 1.7; only 1.7 works, other versions cause build errors)
Download page: http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html

tar -zxvf jdk-7u75-linux-x64.tar.gz -C /usr/local

export JAVA_HOME=/usr/local/jdk1.7.0_75
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export PATH=$PATH:$JAVA_HOME/bin
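A quick sanity check of the JDK setup before moving on, using the paths from the exports above:

```shell
# Re-export the variables from the steps above, then check that java resolves.
export JAVA_HOME=/usr/local/jdk1.7.0_75
export PATH=$PATH:$JAVA_HOME/bin
if [ -x "$JAVA_HOME/bin/java" ]; then
    "$JAVA_HOME/bin/java" -version   # should report java version "1.7.0_75"
else
    echo "JAVA_HOME does not point at a JDK: $JAVA_HOME"
fi
```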

3. Install protobuf

Download protobuf-2.5.0; newer versions do not work and will break the Hadoop build.
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz    (or download from Baidu Yun: http://yun.baidu.com/share/link?shareid=830873155&uk=3573928349)

tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install

protoc --version
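The version check above can be scripted so that a wrong protoc is caught before the long Hadoop build even starts; a sketch:

```shell
# Extract the version number from `protoc --version` (output: "libprotoc 2.5.0").
ver=$(protoc --version 2>/dev/null | awk '{print $2}')
if [ "$ver" = "2.5.0" ]; then
    echo "protoc 2.5.0 found, OK"
else
    echo "protoc is '${ver:-missing}'; Hadoop 2.6.0 requires exactly 2.5.0" >&2
fi
```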

4. Install Ant

wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
tar -zxvf apache-ant-1.9.4-bin.tar.gz -C /usr/local

vi /etc/profile
export ANT_HOME=/usr/local/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin

5. Install Maven

wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.3.1/binaries/apache-maven-3.3.1-bin.tar.gz

tar -zxvf apache-maven-3.3.1-bin.tar.gz -C /usr/local

vi /etc/profile
export MAVEN_HOME=/usr/local/apache-maven-3.3.1
export PATH=$PATH:$MAVEN_HOME/bin

Edit the Maven configuration file:
vi /usr/local/apache-maven-3.3.1/conf/settings.xml

Switch the Maven repository to a mirror by adding the following inside <mirrors></mirrors>:

   <mirror>
     <id>nexus-osc</id>
     <mirrorOf>*</mirrorOf>
     <name>Nexusosc</name>
     <url>http://maven.oschina.net/content/groups/public/</url>
   </mirror>

Inside <profiles></profiles>, add:

<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>

Run the following in the shell so the environment variables take effect:
source /etc/profile
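After sourcing /etc/profile, both build tools should resolve from PATH; a quick check using the versions installed above:

```shell
# Verify that Ant and Maven are on PATH before starting the build.
export ANT_HOME=/usr/local/apache-ant-1.9.4
export MAVEN_HOME=/usr/local/apache-maven-3.3.1
export PATH=$PATH:$ANT_HOME/bin:$MAVEN_HOME/bin

for tool in ant mvn; do
    if command -v "$tool" >/dev/null 2>&1; then
        "$tool" -version
    else
        echo "$tool not found on PATH" >&2
    fi
done
```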

6. Build Hadoop 2.6.0

wget http://mirror.bit.edu.cn/apache/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz
cd hadoop-2.6.0-src
mvn package -DskipTests -Pdist,native -Dtar
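On machines with little RAM, the long multi-module build can exhaust Maven's default heap; raising it first is a common precaution (the sizes below are an assumption, tune them to your machine):

```shell
# Give Maven extra heap and PermGen space (JDK 7 still uses PermGen).
export MAVEN_OPTS="-Xmx512m -XX:MaxPermSize=128m"
echo "MAVEN_OPTS=$MAVEN_OPTS"

# Then run the build exactly as above:
#   cd hadoop-2.6.0-src
#   mvn package -DskipTests -Pdist,native -Dtar
```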

If this is the first time Maven is run, it will print many download log lines such as:

Downloading: http://maven.oschina.net/...

Scanning for projects...

...

[INFO] Apache Hadoop Main ................................. SUCCESS [  4.590 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  3.503 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  5.870 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.540 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  3.921 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  7.731 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  6.805 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  9.008 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  6.991 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:12 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 16.557 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 24.476 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.115 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [05:09 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 40.145 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 15.876 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  9.236 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.125 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.129 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [02:49 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [01:01 min]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.099 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 25.019 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 33.655 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  5.761 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 13.714 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 41.930 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 13.364 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 17.408 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.042 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  5.131 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  3.710 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.107 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 12.531 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  7.781 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.116 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 47.915 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 38.104 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  9.073 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [01:01 min]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 18.149 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  9.002 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  3.222 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 13.224 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  6.571 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  9.781 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 16.254 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  5.302 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 13.760 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  8.858 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  6.252 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  4.276 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  6.206 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  1.945 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 12.239 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 38.137 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 13.213 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.169 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 13.206 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 15.248 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.162 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:09 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:19 min
[INFO] Finished at: 2015-03-26T17:54:10+08:00
[INFO] Final Memory: 106M/402M
[INFO] ------------------------------------------------------------------------

After the long wait for the build to finish, the result is packaged under hadoop-dist/target:

#ll

total 528824
drwxr-xr-x 2 root root      4096 Mar 26 17:53 antrun
-rw-r--r-- 1 root root      1874 Mar 26 17:53 dist-layout-stitching.sh
-rw-r--r-- 1 root root       647 Mar 26 17:53 dist-tar-stitching.sh
drwxr-xr-x 9 root root      4096 Mar 26 17:53 hadoop-2.6.0
-rw-r--r-- 1 root root 180222548 Mar 26 17:53 hadoop-2.6.0.tar.gz
-rw-r--r-- 1 root root      2777 Mar 26 17:53 hadoop-dist-2.6.0.jar
-rw-r--r-- 1 root root 361254421 Mar 26 17:54 hadoop-dist-2.6.0-javadoc.jar
drwxr-xr-x 2 root root      4096 Mar 26 17:53 javadoc-bundle-options
drwxr-xr-x 2 root root      4096 Mar 26 17:53 maven-archiver
drwxr-xr-x 2 root root      4096 Mar 26 17:53 test-dir

The compiled files are also available on Baidu Yun.

Finally, overwrite the files under your existing Hadoop installation's lib/native directory with the newly built native libraries, and you are done.
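That last step can be sketched as follows; $HADOOP_HOME pointing at your existing Hadoop installation is an assumption:

```shell
# Copy the freshly built 64-bit native libraries over the bundled 32-bit ones.
src=hadoop-dist/target/hadoop-2.6.0/lib/native
if [ -d "$src" ] && [ -d "$HADOOP_HOME/lib/native" ]; then
    cp -r "$src"/* "$HADOOP_HOME/lib/native/"
    echo "native libraries replaced"
fi
# Afterwards `hadoop checknative -a` should report the hadoop native library as loaded.
```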

Date: 2024-09-30 11:35:38
