Scenario: the host machine is a Mac Pro with two virtual machines installed, both running Ubuntu.
Configuring the JDK on Ubuntu
1. Download the JDK from the Oracle (formerly Sun) website.
2. Extract the downloaded archive:
lixiaojiao@ubuntu:~/software$ tar -zxvf jdk-7u79-linux-x64.tar.gz
3. Configure the Java environment variables:
lixiaojiao@ubuntu:~/software$ vi ~/.bashrc
Jump to the end of the file

and append:
export JAVA_HOME=/home/lixiaojiao/software/jdk1.7.0_79
export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib
export PATH=.:$PATH:$JAVA_HOME/bin

Save and exit.
4. The changes do not take effect immediately; apply them with:
lixiaojiao@ubuntu:~/software$ source ~/.bashrc
5. Verify the configuration; if the Java version information is printed, the setup succeeded:
lixiaojiao@ubuntu:~/software$ java -version
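The exact build numbers vary, but for this JDK the output should look roughly like:
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)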

Configuring Hadoop 2.6.1 on the Mac (the Java environment was configured earlier)
1. Extract the downloaded Hadoop archive:
lixiaojiaodeMacBook-Pro:zipFiles lixiaojiao$ tar -zxvf hadoop-2.6.1.tar.gz
2. I keep everything cloud-computing related under a cloudcomputing directory; the directory structure looks like this:

3. Set up passwordless SSH login:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ssh-keygen -t rsa -P ""
Then run:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
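Not part of the original walkthrough, but if ssh still prompts for a password afterwards, overly loose permissions on ~/.ssh are a common culprit; the standard OpenSSH fix is:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys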
Verify that it works:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ssh localhost
This attempt failed.

The cause: Remote Login (the macOS SSH server) was not enabled.
Enable it under System Preferences -> Sharing -> Remote Login.
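Alternatively (not in the original steps), Remote Login can be enabled from a terminal with the built-in systemsetup tool, which requires admin rights:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ sudo systemsetup -setremotelogin on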

Run the command again:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ssh localhost
This time the login succeeds.

4. Switch to the etc directory and inspect the configuration files:
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ cd hadoop-2.6.1/etc/hadoop/

5. Edit the configuration files. All of the edits below are made in the /Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/etc/hadoop directory.
(1) Configure core-site.xml:
lixiaojiaodeMacBook-Pro:hadoop lixiaojiao$ vi core-site.xml
Add the following between the <configuration> and </configuration> tags:
<property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
</property>
(2) Configure yarn-site.xml
Add the following configuration:
<property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
</property>
<property>
    <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
(3) Create and configure mapred-site.xml: copy mapred-site.xml.template in this directory to mapred-site.xml and add the configuration below.
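The copy itself is a plain cp run in the same etc/hadoop directory, for example:
lixiaojiaodeMacBook-Pro:hadoop lixiaojiao$ cp mapred-site.xml.template mapred-site.xml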
<property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
</property>
(4) Configure hdfs-site.xml. First create the directories hdfs/name and hdfs/data under /Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/, then add the configuration below.
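One way to create both directories (the prompt assumes you are in the hadoop-2.6.1 root):
lixiaojiaodeMacBook-Pro:hadoop-2.6.1 lixiaojiao$ mkdir -p hdfs/name hdfs/data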
<property>
    <name>dfs.replication</name>
    <value>1</value>
</property>
<property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/hdfs/name</value>
</property>
<property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/Users/lixiaojiao/software/cloudcomputing/hadoop-2.6.1/hdfs/data</value>
</property>
(5) Format HDFS:
lixiaojiaodeMacBook-Pro:bin lixiaojiao$ ./hdfs namenode -format
The output should report that the NameNode was formatted successfully.


(6) Start Hadoop
Switch to the sbin directory:
lixiaojiaodeMacBook-Pro:bin lixiaojiao$ cd ../sbin/
Run:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-dfs.sh

Then run:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-yarn.sh
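Before checking the web pages, jps (bundled with the JDK) gives a quick sanity check; on a working pseudo-distributed setup it should list roughly NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ jps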

Open http://localhost:50070/ in a browser to see the HDFS administration page.

Open http://localhost:8088/ to see the Hadoop application management page (the YARN ResourceManager UI).

When running the following command from step (6):
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-dfs.sh
these messages appeared:
computing/hadoop-2.6.1/logs/hadoop-lixiaojiao-secondarynamenode-lixiaojiaodeMacBook-Pro.local.out
2015-10-18 10:08:43.887 java[1871:37357] Unable to load realm info from SCDynamicStore
15/10/18 10:08:43 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$
原因?yàn)楣俜教峁┑膌ib目錄中.so文件是在32位系統(tǒng)下編譯的,如果是但是本人的mac機(jī)器是64位系統(tǒng),需要自己下載源碼在64位上重新編譯,由于本人下載源碼嘗試了很久也沒成功,最終放棄了,下載了牛人編譯好的64位打包程序,地址位http://yun.baidu.com/s/1c0rfIOo#dir/path=%252Fbuilder,并下載這個(gè)正常的32位hadoop程序包,http://www.aboutyun.com/thread-6658-1-1.html,下載成功后,將下載的64位build文件中的native覆蓋掉lib目錄下的native文件,并重新按照上面的部分進(jìn)行配置。
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-all.sh
When re-running the command above, the following problem appeared:
lixiaojiaodeMacBook-Pro:sbin lixiaojiao$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
2015-10-19 21:18:29.414 java[5782:72819] Unable to load realm info from SCDynamicStore
The workarounds suggested online still did not help; in the end the only fix was to switch JDKs and reinstall:
#export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
Verify that everything works:
Create the input directory:
lixiaojiaodeMacBook-Pro:hadoop-2.2.0 lixiaojiao$ hadoop fs -mkdir -p input
Upload a local file to the HDFS file system:

lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ hadoop fs -copyFromLocal README.txt input
This ran into a new problem.

The fix: change the host in fs.default.name to 127.0.0.1.
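In other words, the core-site.xml property from step (1) becomes:
<property>
    <name>fs.default.name</name>
    <value>hdfs://127.0.0.1:9000</value>
</property>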
Switch to the share/hadoop/mapreduce directory and run the following:
hadoop jar hadoop-mapreduce-examples-2.2.0.jar wordcount input output


Run the following to check whether the output directory was generated:
hadoop fs -ls


Run the following to view the actual results:
hadoop fs -cat output/*
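Each line of the wordcount result is a word, a tab, and its count, along these lines (illustrative values, not actual output):
Apache	1
Hadoop	4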

To save effort, I then copied the configured Hadoop directory straight to the other Ubuntu systems with scp (SSH must be enabled on the Ubuntu machines):
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ scp -r hadoop-2.2.0 lixiaojiao@192.168.31.126:/home/lixiaojiao/software
lixiaojiaodeMacBook-Pro:cloudcomputing lixiaojiao$ scp -r hadoop-2.2.0 lixiaojiao@192.168.31.218:/home/lixiaojiao/software
Then run the wordcount program above on each machine to verify that everything works.