
Error starting namenode in Hadoop 2.4.1

Posted 2020-03-08 05:45

Question:

When I try to start dfs using:

start-dfs.sh

I get an error saying:

14/07/03 11:03:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [OpenJDK 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. localhost]
sed: -e expression #1, char 6: unknown option to `s'
Server: ssh: Could not resolve hostname Server: Name or service not known
-c: Unknown cipher type 'cd'
stack: ssh: Could not resolve hostname stack: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
might: ssh: Could not resolve hostname might: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
localhost: namenode running as process 4463. Stop it first.
library: ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
<libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known
to: ssh: connect to host to port 22: Connection refused
OpenJDK: ssh: Could not resolve hostname OpenJDK: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
link: ssh: Could not resolve hostname link: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
localhost: datanode running as process 4561. Stop it first.
Starting secondary namenodes [OpenJDK 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. 0.0.0.0]
sed: -e expression #1, char 6: unknown option to `s'
OpenJDK: ssh: Could not resolve hostname OpenJDK: Name or service not known
-c: Unknown cipher type 'cd'
VM: ssh: Could not resolve hostname VM: Name or service not known
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is dd:64:53:7e:c0:62:40:c0:63:2b:5c:6d:1e:b6:cd:23.
Are you sure you want to continue connecting (yes/no)? might: ssh: Could not resolve hostname might: Name or service not known
Server: ssh: Could not resolve hostname Server: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
disabled: ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
will: ssh: Could not resolve hostname will: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
that: ssh: Could not resolve hostname that: Name or service not known
highly: ssh: Could not resolve hostname highly: Name or service not known
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
guard: ssh: Could not resolve hostname guard: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
to: ssh: connect to host to port 22: Connection refused
link: ssh: Could not resolve hostname link: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
'-z: ssh: Could not resolve hostname '-z: Name or service not known
you: ssh: Could not resolve hostname you: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
now.: ssh: Could not resolve hostname now.: Name or service not known
<libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known
or: ssh: Could not resolve hostname or: Name or service not known
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known
it: ssh: Could not resolve hostname it: Name or service not known
^C0.0.0.0: Host key verification failed.
^C

My core-site.xml file contains this:

<configuration>
    <property>
       <name>fs.default.name</name>
       <value>hdfs://localhost:9000</value>
    </property>
</configuration>

My .profile file (which I use in place of .bashrc) contains these lines:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

And I can ssh to my localhost without any problem:

ssh localhost

Welcome to Linux Mint 16 Petra (GNU/Linux 3.11.0-12-generic x86_64)

Welcome to Linux Mint
 * Documentation:  http://www.linuxmint.com

Last login: Wed Jul  2 16:51:15 2014 from localhost

Answer 1:

Stop the JVM from printing the stack guard warning to stdout/stderr, because this warning is what breaks the HDFS start script: the script ends up treating each word of the warning as a hostname and tries to ssh to it, which is exactly the flood of "Could not resolve hostname" errors above.


Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"


(This solution was found on Sumit Chawla's blog.)



Answer 2:

Edit your .bashrc file and add the following lines:

export HADOOP_HOME=path_to_your_hadoop_folder
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

And although ssh should already be working based on what you said, set up the keys again just in case:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
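To confirm the key-based login really is password-free, a quick test (BatchMode makes ssh fail instead of prompting for a password):

ssh -o BatchMode=yes localhost 'echo passwordless ssh works'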


Answer 3:

It seems like you haven't added the $HADOOP_INSTALL line to your .profile file that points to your main Hadoop folder. As Balduz suggests, HADOOP_HOME will work in place of the $HADOOP_INSTALL variable. I would use his suggestion, but you can also fix it by adding:

export HADOOP_INSTALL=/path/to/hadoop/
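Then reload the profile and check that the variable resolves (replace the placeholder with your actual Hadoop directory):

source ~/.profile
echo $HADOOP_INSTALL    # should print your Hadoop folder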


Answer 4:

Please check your HADOOP_CONF_DIR (most likely set in your .bashrc). It should point to $HADOOP_HOME/etc/hadoop.
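For example (a sketch, assuming HADOOP_HOME is set as in the earlier answers):

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
ls $HADOOP_CONF_DIR/core-site.xml    # should exist if the path is correct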