Hadoop Fails to Start

Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

1. Turn on DEBUG logging and look for the cause of the error:
$ export HADOOP_ROOT_LOGGER=DEBUG,console    # print DEBUG output to the console
$ hadoop fs -ls /
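If you only need DEBUG output for a single command, you can also set the variable inline instead of exporting it, which avoids leaving verbose logging enabled in your shell:
$ HADOOP_ROOT_LOGGER=DEBUG,console hadoop fs -ls /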
The log output:
 
18/04/29 21:38:44 DEBUG util.Shell: setsid exited with exit code 0
18/04/29 21:38:44 DEBUG conf.Configuration: parsing URL jar:file:/usr/local/hadoop-2.9.0/share/hadoop/common/hadoop-common-2.9.0.jar!/core-default.xml
18/04/29 21:38:44 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@54b24c03
18/04/29 21:38:44 DEBUG conf.Configuration: parsing URL file:/usr/local/hadoop-2.9.0/etc/hadoop/core-site.xml
18/04/29 21:38:44 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@7bc9a682
18/04/29 21:38:44 DEBUG core.Tracer: sampler.classes = ; loaded no samplers
18/04/29 21:38:44 DEBUG core.Tracer: span.receiver.classes = ; loaded no span receivers
18/04/29 21:38:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of successful kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
18/04/29 21:38:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of failed kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
18/04/29 21:38:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[GetGroups], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
18/04/29 21:38:45 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Renewal failures since startup], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
18/04/29 21:38:45 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Renewal failures since last successful login], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
18/04/29 21:38:45 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
18/04/29 21:38:45 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
18/04/29 21:38:45 DEBUG security.Groups:  Creating new Groups object
18/04/29 21:38:45 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
18/04/29 21:38:45 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/local/hadoop-2.9.0/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /usr/local/hadoop-2.9.0/lib/native/libhadoop.so.1.0.0)
18/04/29 21:38:45 DEBUG util.NativeCodeLoader: java.library.path=/usr/local/hadoop-2.9.0/lib/native
18/04/29 21:38:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/04/29 21:38:45 DEBUG util.PerformanceAdvisory: Falling back to shell based
18/04/29 21:38:45 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
18/04/29 21:38:45 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
18/04/29 21:38:45 DEBUG security.UserGroupInformation: hadoop login
18/04/29 21:38:45 DEBUG security.UserGroupInformation: hadoop login commit
18/04/29 21:38:45 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
18/04/29 21:38:45 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: hadoop" with name hadoop
18/04/29 21:38:45 DEBUG security.UserGroupInformation: User entry: "hadoop"
18/04/29 21:38:45 DEBUG security.UserGroupInformation: Assuming keytab is managed externally since logged in from subject.
18/04/29 21:38:45 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
18/04/29 21:38:45 DEBUG core.Tracer: sampler.classes = ; loaded no samplers
18/04/29 21:38:45 DEBUG core.Tracer: span.receiver.classes = ; loaded no span receivers
18/04/29 21:38:45 DEBUG fs.FileSystem: Loading filesystems
18/04/29 21:38:45 DEBUG fs.FileSystem: file:// = class org.apache.hadoop.fs.LocalFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/common/hadoop-common-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/common/hadoop-common-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: ftp:// = class org.apache.hadoop.fs.ftp.FTPFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/common/hadoop-common-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: har:// = class org.apache.hadoop.fs.HarFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/common/hadoop-common-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/common/hadoop-common-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/common/hadoop-common-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: hftp:// = class org.apache.hadoop.hdfs.web.HftpFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: hsftp:// = class org.apache.hadoop.hdfs.web.HsftpFileSystem from /usr/local/hadoop-2.9.0/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.9.0.jar
18/04/29 21:38:46 DEBUG fs.FileSystem: Looking for FS supporting hdfs
18/04/29 21:38:46 DEBUG fs.FileSystem: looking for configuration option fs.hdfs.impl
18/04/29 21:38:46 DEBUG fs.FileSystem: Looking in service filesystems for implementation class
18/04/29 21:38:46 DEBUG fs.FileSystem: FS for hdfs is class org.apache.hadoop.hdfs.DistributedFileSystem
18/04/29 21:38:46 DEBUG impl.DfsClientConf: dfs.client.use.legacy.blockreader.local = false
18/04/29 21:38:46 DEBUG impl.DfsClientConf: dfs.client.read.shortcircuit = false
18/04/29 21:38:46 DEBUG impl.DfsClientConf: dfs.client.domain.socket.data.traffic = false
18/04/29 21:38:46 DEBUG impl.DfsClientConf: dfs.domain.socket.path =
18/04/29 21:38:46 DEBUG hdfs.DFSClient: Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0
18/04/29 21:38:46 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
18/04/29 21:38:46 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@4977b5
18/04/29 21:38:46 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@42583a89
18/04/29 21:38:47 DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
18/04/29 21:38:47 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
18/04/29 21:38:47 DEBUG ipc.Client: The ping interval is 60000 ms.
18/04/29 21:38:47 DEBUG ipc.Client: Connecting to master/192.168.181.170:9000
18/04/29 21:38:47 DEBUG ipc.Client: IPC Client (136930299) connection to master/192.168.181.170:9000 from hadoop: starting, having connections 1
18/04/29 21:38:47 DEBUG ipc.Client: IPC Client (136930299) connection to master/192.168.181.170:9000 from hadoop sending #0 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
18/04/29 21:38:47 DEBUG ipc.Client: IPC Client (136930299) connection to master/192.168.181.170:9000 from hadoop got value #0
18/04/29 21:38:47 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 97ms
18/04/29 21:38:47 DEBUG ipc.Client: IPC Client (136930299) connection to master/192.168.181.170:9000 from hadoop sending #1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getListing
18/04/29 21:38:47 DEBUG ipc.Client: IPC Client (136930299) connection to master/192.168.181.170:9000 from hadoop got value #1
18/04/29 21:38:47 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 8ms
18/04/29 21:38:47 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@42583a89
18/04/29 21:38:47 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@42583a89
18/04/29 21:38:47 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@42583a89
18/04/29 21:38:47 DEBUG ipc.Client: Stopping client
18/04/29 21:38:47 DEBUG ipc.Client: IPC Client (136930299) connection to master/192.168.181.170:9000 from hadoop: closed
18/04/29 21:38:47 DEBUG ipc.Client: IPC Client (136930299) connection to master/192.168.181.170:9000 from hadoop: stopped, remaining connections 0
18/04/29 21:38:47 DEBUG util.ShutdownHookManager: ShutdownHookManger complete shutdown.
The log shows that version GLIBC_2.14 cannot be found (the util.NativeCodeLoader lines above: libhadoop.so.1.0.0 requires GLIBC_2.14, but /lib64/libc.so.6 does not provide it).
$ strings /lib64/libc.so.6 | grep GLIBC     # list the GLIBC versions the system provides; here the highest is 2.12
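You can also confirm what the Hadoop native library itself demands. A quick check with objdump (assuming the native library path shown in the log above):
$ objdump -T /usr/local/hadoop-2.9.0/lib/native/libhadoop.so.1.0.0 | grep GLIBC_2.14
Any symbol tagged GLIBC_2.14 in the output confirms the requirement.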
2. Download and extract glibc-2.14.
1) Download from http://ftp.gnu.org/gnu/glibc/; the file you want is glibc-2.14.tar.gz.
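For example, fetching it from the shell into ~/Downloads (assuming wget is available):
$ wget -P ~/Downloads http://ftp.gnu.org/gnu/glibc/glibc-2.14.tar.gz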
2) Extract it to any path you like; mine is /usr/local:
$ sudo tar -zxf ~/Downloads/glibc-2.14.tar.gz -C /usr/local
3. Create a separate build directory under the glibc source tree (glibc refuses to be configured directly inside its source directory):
$ cd glibc-2.14
$ mkdir build
$ cd build
4. Run configure, then build and install:
$ ../configure --prefix=/opt/glibc-2.14
$ sudo make -j4    # this step can take a while
$ sudo make install
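Once the install finishes, you can verify the new build by executing its libc directly; glibc prints its version banner when run as a program:
$ /opt/glibc-2.14/lib/libc.so.6
The first line of output should report version 2.14.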
5. Configure the new glibc and repoint the libc symlink.
$ sudo cp /etc/ld.so.c* /opt/glibc-2.14/etc/
cp: omitting directory `/etc/ld.so.conf.d'
The warning is harmless: cp skips the /etc/ld.so.conf.d directory; add -r if you also want that directory copied.
Be careful with the next command: /lib64/libc.so.6 is used by nearly every binary on the system, and a broken or incompatible link can leave the machine unable to run even basic commands, so keep a root shell open until you have verified the result.
$ sudo ln -sf /opt/glibc-2.14/lib/libc-2.14.so /lib64/libc.so.6
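For extra safety, you can test that the freshly built libc actually runs binaries on this machine before (or right after) switching the link. A minimal check, assuming the build landed in /opt/glibc-2.14:
$ /opt/glibc-2.14/lib/ld-2.14.so --library-path /opt/glibc-2.14/lib /bin/true && echo "glibc-2.14 OK"
If this prints the confirmation, the new libc is usable; if it fails, do not repoint /lib64/libc.so.6.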
6. Check the supported versions again:
$ strings /lib64/libc.so.6 | grep GLIBC
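GLIBC_2.14 should now appear in the list. As a final check, re-run the command from step 1; the WARN util.NativeCodeLoader message should no longer appear:
$ hadoop fs -ls /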
