I. Prepare the build tools
1. Download JDK 1.7, Maven 3.2.1, and Ant 1.9.4 from their official websites, extract them, and set the environment variables so they can be used.
Environment variables are set as follows:
(1) Edit /etc/profile.
(2) At the end of the file, add:
export JAVA_HOME=/home/spark/jdk1.7
export MAVEN_HOME=/home/spark/apache-maven-3.2.1
export ANT_HOME=/home/spark/apache-ant-1.9.4
export PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
(3) Run source /etc/profile to apply the changes.
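After re-sourcing the profile, it is worth confirming that each tool actually resolves from the updated PATH before moving on. A minimal sketch (it only checks that the commands are found, not their versions):

```shell
# Confirm java, mvn, and ant are all reachable from the updated PATH;
# prints the resolved path for each tool, or a warning if it is missing
for tool in java mvn ant; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool -> $(command -v "$tool")"
  else
    echo "WARNING: $tool not found on PATH" >&2
  fi
done
```

If any tool is reported missing, re-check the export lines above and re-run source /etc/profile.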
2. Download protobuf 2.5.0 from its official site, unzip it, change into the extracted directory, and run the following commands in order:
./configure
make
make check
make install
3. Install the remaining build dependencies; I installed them with yum:
yum install cmake openssl-devel autoconf automake libtool
(The operating system may already ship with some of these; check with the package manager before installing.)
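As noted above, some of these packages may already be present. On a yum-based system you can check first with rpm; a sketch, using the package names from the yum command above:

```shell
# Report which of the required build dependencies are not yet
# installed; `rpm -q` exits non-zero for packages that are absent
for pkg in cmake openssl-devel autoconf automake libtool; do
  rpm -q "$pkg" >/dev/null 2>&1 || echo "missing: $pkg"
done
```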
II. Download and compile the Hadoop source
1. Go to the official site http://hadoop.apache.org/, find the download link, and download the source package.
2. Extract and compile:
tar zxvf hadoop-2.4.0-src.tar.gz
cd hadoop-2.4.0-src
mvn package -Pdist,native -DskipTests -Dtar
After roughly half an hour (depending on hardware and network) you should see BUILD SUCCESS, which means the compile succeeded; the compiled files are under hadoop-2.4.0-src/hadoop-dist/target.
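Before copying the build output anywhere, it is worth confirming the distribution tarball was actually produced, since BUILD SUCCESS can scroll past warnings. A minimal sketch, assuming you are still inside hadoop-2.4.0-src:

```shell
# Verify the distribution tarball produced by `mvn package -Dtar`
DIST=hadoop-dist/target/hadoop-2.4.0.tar.gz
if [ -f "$DIST" ]; then
  echo "build artifact present: $DIST"
else
  echo "build artifact missing: $DIST" >&2
fi
```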
III. References
http://www.cnblogs.com/shishanyuan/p/4164104.html
http://www.aboutyun.com/thread-8130-1-1.html