Problem
I’m configuring Hadoop on a CentOS server right now. I receive the following error when I execute start-dfs.sh or stop-dfs.sh:
I’m using Hadoop 2.2.0.
A web search yielded the following link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html
However, on Hadoop 2.x, the contents of the /native/ directory appear to be different, so I’m not sure what to do.
In hadoop-env.sh, I’ve additionally included the following two environment variables:
Any ideas?
Asked by Olshansk
Solution #1
I’m assuming you’re running Hadoop on 64-bit CentOS. The warning comes from the native Hadoop library, $HADOOP_HOME/lib/native/libhadoop.so.1.0.0, which was compiled for 32-bit.
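You can confirm the library’s bitness yourself; a quick check (the path below assumes a typical install and should be adjusted to match yours):

```shell
# Report the ELF class of the shared object: "ELF 32-bit" vs "ELF 64-bit"
file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0
```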
It’s merely a heads-up, and it won’t affect Hadoop’s functionality.
If you do want to eliminate the warning, download the Hadoop source code and recompile libhadoop.so.1.0.0 on a 64-bit system, then replace the 32-bit library with it.
For Ubuntu, here are the steps for recompiling source code:
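As a rough sketch of such a rebuild (assuming Maven, a JDK, protobuf, cmake, and the zlib headers are already installed; version numbers and paths are illustrative, not the linked steps themselves):

```shell
# Unpack the Hadoop source release that matches your install
cd hadoop-2.2.0-src
# Build the distribution with the native profile enabled
mvn package -Pdist,native -DskipTests -Dtar
# Replace the shipped 32-bit libraries with the freshly built 64-bit ones
cp hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/
```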
Good luck.
Answered by zhutoulala
Solution #2
Simply append the word native to your HADOOP_OPTS as follows:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
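To check whether the native library is actually being picked up after this change, recent Hadoop 2.x releases ship a checknative utility (it may not be present in the very earliest 2.x versions):

```shell
# Lists each native component (hadoop, zlib, snappy, ...) and whether it loaded
hadoop checknative -a
```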
PS: Thanks to Searene.
Answered by Hoai-Thu Vuong
Solution #3
The solution is debatable… On CentOS 6.6 64-bit, I recently installed Hadoop 2.6 from a tarball. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:
/opt/hadoop/lib/native/libhadoop.so.1.0.0
And I know it is 64-bit:
[hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
linux-vdso.so.1 => (0x00007fff43510000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)
Unfortunately, while I was focused on “Is this library 32 or 64 bit?” I mistakenly disregarded the answer there in front of me:
`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
Lesson learned. At the very least, the rest of this enabled me to silence the warning. I kept trying to specify the library location via the HADOOP_OPTS environment variable, as suggested in the previous answers, but to no effect. So I went over the source code. The module that emits the error (util.NativeCodeLoader) provides a hint:
15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
So, let’s check what it does here:
http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/
Ah, there’s some debug-level logging available; let’s enable it and see if we get some extra help. To do this, add the following line to the $HADOOP_CONF_DIR/log4j.properties file:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
Then I ran a command that generates the original warning, such as stop-dfs.sh, and got this:
15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
And the answer is right there in this snippet of the debug message (the same thing the earlier ldd command ‘tried’ to tell me):
`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
What GLIBC version do I have? Here’s a quick way to figure it out:
[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12
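The same comparison can be scripted; a small sketch (assuming GNU binutils is on the PATH, and using the library path from this install) that lists every GLIBC symbol version the shared object requires, for comparison against the `ldd --version` output:

```shell
# Extract the GLIBC version requirements from the dynamic symbol table,
# deduplicated and sorted by version number
objdump -T /opt/hadoop/lib/native/libhadoop.so.1.0.0 \
  | grep -o 'GLIBC_[0-9.]*' | sort -Vu
```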
So my GLIBC is 2.12, but the library requires 2.14, and I can’t upgrade my OS’s GLIBC. That leaves two options: build the native libraries from source on my own operating system, or suppress the warning and disregard it for the time being. I chose to silence the obnoxious warning for now (though I do want to build from sources in the future), using the same logging parameter that got us the debug message, but this time set to ERROR level:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
I hope this demonstrates to people that one of the major advantages of open source software is that you can figure things out by following some easy logical steps.
Answered by chromeeagle
Solution #4
The same thing happened to me. It can be fixed by adding the following lines to your .bashrc file:
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
Answered by Neeraj
Solution #5
In my case, after building Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. The issue persisted. Then I realized Hadoop was looking in hadoop/lib, not hadoop/lib/native. So I simply copied everything from the native directory up to its parent, and the warning went away.
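Under that diagnosis, the fix amounts to a single copy (the path assumes a standard layout; adjust for your install):

```shell
# Copy the native libraries up one level so a configuration that points at
# hadoop/lib (rather than hadoop/lib/native) can still find them
cp $HADOOP_HOME/lib/native/* $HADOOP_HOME/lib/
```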
Answered by koti
Post is based on https://stackoverflow.com/questions/19943766/hadoop-unable-to-load-native-hadoop-library-for-your-platform-warning