1. Configure an SSH key for local login
Note: in standalone (local) mode Hadoop does not communicate over SSH, so the following steps can be skipped; pseudo-distributed mode, however, starts its daemons over SSH to localhost, so the key setup below is needed:
[root@localhost hadoop-1.2.1]# which ssh
/usr/bin/ssh
[root@localhost hadoop-1.2.1]# which ssh-keygen
/usr/bin/ssh-keygen
[root@localhost hadoop-1.2.1]# which sshd
/usr/sbin/sshd
[root@localhost hadoop-1.2.1]# ssh-keygen -t rsa
Generating public/private rsa key pair.
Enter file in which to save the key (/root/.ssh/id_rsa):
/home/hadoop/hadoop-1.2.1/.ssh/id_rsa    ---- the .ssh directory must be created under the Hadoop directory beforehand
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/hadoop/hadoop-1.2.1/.ssh/id_rsa.
Your public key has been saved in /home/hadoop/hadoop-1.2.1/.ssh/id_rsa.pub.
The key fingerprint is:
51:67:a7:2d:58:09:60:eb:3e:f8:18:bf:8c:56:96:a0 root@localhost.localdomain
The key's randomart image is:
+--[ RSA 2048]----+
| o.o.+.. |
| . o =.+ |
| o . o . |
| .. . . |
| . .S. |
| E o+ |
| ooo |
| .B . |
| .o =. |
+-----------------+
This generates the private key id_rsa and the public key id_rsa.pub under /home/hadoop/hadoop-1.2.1/.ssh/
[root@localhost hadoop-1.2.1]# cd /home/hadoop/hadoop-1.2.1/.ssh
[root@localhost .ssh]# ls -l
total 8
-rw-------. 1 root root 1675 Nov 25 20:58 id_rsa
-rw-r--r--. 1 root root 408 Nov 25 20:58 id_rsa.pub
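The interactive session above can also be reproduced non-interactively; a minimal sketch, assuming the tutorial's non-default key location under the Hadoop directory (the KEYDIR default here is an illustrative stand-in, adjust it to your layout):

```shell
# Non-interactive equivalent of the ssh-keygen session above (sketch).
# KEYDIR stands in for /home/hadoop/hadoop-1.2.1/.ssh; override as needed.
KEYDIR="${KEYDIR:-$HOME/hadoop-1.2.1/.ssh}"
mkdir -p "$KEYDIR"
chmod 700 "$KEYDIR"
# -N '' sets an empty passphrase, -f names the output files, -q is quiet.
ssh-keygen -t rsa -N '' -f "$KEYDIR/id_rsa" -q
ls -l "$KEYDIR"
```

This produces the same id_rsa / id_rsa.pub pair without any prompts, which is convenient when scripting the setup.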
Edit the sshd service configuration file:
[root@localhost .ssh]# vi /etc/ssh/sshd_config
Make sure the following directives are present and uncommented:
RSAAuthentication yes
PubkeyAuthentication yes
AuthorizedKeysFile .ssh/authorized_keys
Save, exit, and restart the sshd service:
[root@localhost .ssh]# service sshd restart
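After saving, it is worth sanity-checking that all three directives are active (i.e. not commented out). A self-contained sketch, run here against a scratch copy rather than the live /etc/ssh/sshd_config:

```shell
# Sketch: the three directives as they should read after editing,
# written to a scratch file so the check is self-contained.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
RSAAuthentication yes
PubkeyAuthentication yes
AuthorizedKeysFile .ssh/authorized_keys
EOF
# Count lines that start with a directive name (uncommented): expect 3.
grep -cE '^(RSAAuthentication|PubkeyAuthentication|AuthorizedKeysFile)' "$CONF"
```

Point the same grep at /etc/ssh/sshd_config to check the real file; a count below 3 means a directive is missing or still commented.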
Append the SSH public key to the file /home/hadoop/hadoop-1.2.1/.ssh/authorized_keys:
[root@localhost .ssh]# ls
id_rsa id_rsa.pub
[root@localhost .ssh]# cat id_rsa.pub >> authorized_keys
For a non-root user, ~/.ssh/authorized_keys must have permission 644, the ~/.ssh directory 700, and the /home/hadoop home directory 700; correct permissions are a precondition for successful authentication.
[root@localhost .ssh]# chmod 644 authorized_keys
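The permission requirements above can be applied and verified in one go. A sketch, using a BASE variable as a stand-in for /home/hadoop (wrong modes are the most common reason public-key authentication silently falls back to passwords):

```shell
# Sketch: apply the modes the text requires, then print them to verify.
# BASE stands in for the /home/hadoop home directory.
BASE="${BASE:-$HOME/hadoop-demo}"
mkdir -p "$BASE/.ssh"
touch "$BASE/.ssh/authorized_keys"
chmod 700 "$BASE"                          # home directory: 700
chmod 700 "$BASE/.ssh"                     # .ssh directory: 700
chmod 644 "$BASE/.ssh/authorized_keys"     # authorized_keys: 644
stat -c '%a %n' "$BASE" "$BASE/.ssh" "$BASE/.ssh/authorized_keys"
```

`stat -c '%a'` (GNU coreutils, so available on Fedora) prints the octal mode, making it easy to eyeball mistakes.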
[root@localhost .ssh]# ssh 192.168.11.129
The authenticity of host '192.168.11.129 (192.168.11.129)' can't be established.
RSA key fingerprint is 9c:7a:90:bf:ab:7a:22:fb:5c:7b:2c:7c:06:eb:a6:ee.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '192.168.11.129' (RSA) to the list of known hosts.
root@192.168.11.129's password:
Last login: Sun Nov 23 19:10:12 2014 from 192.168.11.1
2. Install a Java runtime (required before installing the Hadoop services)
[root@localhost ~]# java -version
java version "1.8.0_25"
Java(TM) SE Runtime Environment (build 1.8.0_25-b17)
Java HotSpot(TM) Client VM (build 25.25-b02, mixed mode, sharing)
If Java is not installed, download the package jre-7u71-linux-i586.tar.gz (or the matching .rpm) following http://java.com/zh_CN/download/help/linux_install.xml. The tarball unpacks into a jre1.7.0_71 directory under the current directory. Either extract the tarball or install the RPM:
tar zxvf jre-7u71-linux-i586.tar.gz
rpm -ivh jre-7u71-linux-i586.rpm
(The transcript above and the JAVA_HOME below use jre1.8.0_25; substitute whichever version you actually install.)
Set the Java environment variables (on a development machine it is best to make them available to every user's shell). Open /etc/profile in a text editor and append at the end:
export JAVA_HOME=/usr/java/jre1.8.0_25
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
Notes:
a. Replace /usr/java/jre1.8.0_25 with your actual JRE installation directory.
b. On Linux, entries in these variables are separated by a colon ":".
c. $PATH, $CLASSPATH and $JAVA_HOME on the right-hand side reference the variables' previous values; overwriting the old value instead of extending it is a common mistake.
d. Do not drop the current directory "." from CLASSPATH; losing it is another common mistake.
e. export turns the three variables into environment variables visible to child processes.
f. Variable names are strictly case-sensitive.
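The notes above can be illustrated in a few lines. A sketch of how the three export lines compose, using the tutorial's paths; note that $PATH on the right-hand side preserves the old value instead of clobbering it:

```shell
# Sketch: the /etc/profile additions, with a quick check of the result.
JAVA_HOME=/usr/java/jre1.8.0_25
PATH=$JAVA_HOME/bin:$PATH        # prepend, keeping the previous PATH
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export JAVA_HOME PATH CLASSPATH
# The first PATH entry is now the JRE's bin directory:
echo "$PATH" | cut -d: -f1      # prints /usr/java/jre1.8.0_25/bin
```

If you had written `PATH=$JAVA_HOME/bin` instead, every previously available command would disappear from the shell, which is exactly the mistake note (c) warns about.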
To make the changes take effect immediately: source /etc/profile
To check that they took effect: java -version
3. Install Hadoop
Download hadoop-1.2.1.tar.gz from the Hadoop website and upload it to /home/hadoop on the server.
1. [root@localhost ~]$ tar -xzf hadoop-1.2.1.tar.gz
2. [root@localhost ~]$ rm -rf hadoop-1.2.1.tar.gz
3. [root@localhost ~]$ cd hadoop-1.2.1/conf/
4. [root@localhost conf]$ vi hadoop-env.sh
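The usual edit inside conf/hadoop-env.sh is to uncomment its JAVA_HOME line and point it at the JRE installed above. A sketch, done with sed on a scratch file so it is self-contained (the assumed commented-out line mirrors what Hadoop 1.x ships; run the sed against the real hadoop-env.sh instead):

```shell
# Sketch: uncomment and set JAVA_HOME in hadoop-env.sh non-interactively.
# A scratch copy stands in for conf/hadoop-env.sh here.
F=$(mktemp)
echo '# export JAVA_HOME=/usr/lib/j2sdk1.5-sun' > "$F"
sed -i 's|^# export JAVA_HOME=.*|export JAVA_HOME=/usr/java/jre1.8.0_25|' "$F"
grep '^export JAVA_HOME' "$F"   # prints export JAVA_HOME=/usr/java/jre1.8.0_25
```

Setting JAVA_HOME here matters because the Hadoop start scripts do not inherit it reliably from /etc/profile when launched over SSH.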
More: Debugging a Fedora VM's Hadoop + HBase pseudo-distributed setup from Eclipse on Windows 7 (Part 2)
https://www.002pc.com/diannaojichu/898.html
