
HDP on Linux – Installation Forum

Installing HDP – Core Dump

  • #14201

    I am trying to install HDP using the manual steps. When I execute the command to format the HDFS filesystem, I receive a core dump.

    Executing this command: /usr/lib/hadoop/bin/hadoop namenode -format
    Errors with the following:
    # A fatal error has been detected by the Java Runtime Environment:
    #
    #  SIGBUS (0x7) at pc=0x00007f3265010e38, pid=33045, tid=139854567274240
    #
    # JRE version: 6.0_31-b04
    # Java VM: Java HotSpot(TM) 64-Bit Server VM (20.6-b01 mixed mode linux-amd64 compressed oops)
    # Problematic frame:
    # Segmentation fault (core dumped)

    Contents of /etc/hadoop/conf/hadoop-env.sh
    List of parameters for the namenode:
    export HADOOP_NAMENODE_OPTS="-server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC -XX:ErrorFile=/var/log/hadoop/$USER/hs_err_pid%p.log -XX:NewSize=640m -XX:MaxNewSize=128m -Xloggc:/var/log/hadoop/$USER/gc.log-`date +'%Y%m%d%H%M'` -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xms1G -Xmx1G -Dhadoop.security.logger=INFO,DRFAS -Dhdfs.audit.logger=INFO,DRFAAUDIT ${HADOOP_NAMENODE_OPTS}"
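A note on the -XX:ErrorFile setting above: the JVM expands %p to the pid of the crashing process, so a quick way to look for crash reports matching that pattern is something like the following (the glob is assumed from the config above, not from the thread):

```shell
# List the most recent JVM fatal-error reports, newest first.
# The path matches the -XX:ErrorFile pattern /var/log/hadoop/$USER/hs_err_pid%p.log.
ls -lt /var/log/hadoop/*/hs_err_pid*.log 2>/dev/null | head -5
```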

    strace output:
    open("/usr/jdk64/jdk1.6.0_31/bin/../jre/lib/amd64/jli/libm.so.6", O_RDONLY) = -1 ENOENT (No such file or directory)
    open("/usr/jdk64/jdk1.6.0_31/jre/lib/amd64/server/libm.so.6", O_RDONLY) = -1 ENOENT (No such file or directory)
    open("/usr/jdk64/jdk1.6.0_31/jre/lib/amd64/libm.so.6", O_RDONLY) = -1 ENOENT (No such file or directory)
    open("/etc/ld.so.cache", O_RDONLY)      = 3
    fstat(3, {st_mode=S_IFREG|0644, st_size=40371, ...}) = 0
    mmap(NULL, 40371, PROT_READ, MAP_PRIVATE, 3, 0) = 0x7f5ce1d5e000
    close(3)                                = 0
    open("/lib64/libm.so.6", O_RDONLY)      = 3
    read(3, "\177ELF\2\1\1\3\3>\1\240>y;"..., 832) = 832
    fstat(3, {st_mode=S_IFREG|0755, st_size=598800, ...}) = 0
    mmap(0x3b79000000, 2633944, PROT_READ|PROT_EXEC, MAP_PRIVATE|MAP_DENYWRITE, 3, 0) = 0x3b79000000
    mprotect(0x3b79083000, 2093056, PROT_NONE) = 0
    mmap(0x3b79282000, 8192, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_FIXED|MAP_DENYWRITE, 3, 0x82000) = 0x3b79282000
    close(3)                                = 0
    mprotect(0x3b79282000, 4096, PROT_READ) = 0
    munmap(0x7f5ce1d5e000, 40371)           = 0
    mmap(NULL, 1052672, PROT_READ|PROT_WRITE|PROT_EXEC, MAP_PRIVATE|MAP_ANONYMOUS|MAP_STACK, -1, 0) = 0x7f5ce0f44000
    mprotect(0x7f5ce0f44000, 4096, PROT_NONE) = 0
    clone(child_stack=0x7f5ce1043ff0, flags=CLONE_VM|CLONE_FS|CLONE_FILES|CLONE_SIGHAND|CLONE_THREAD|CLONE_SYSVSEM|CLONE_SETTLS|CLONE_PARENT_SETTID|CLONE_CHILD_CLEARTID, parent_tidptr=0x7f5ce10449d0, tls=0x7f5ce1044700, child_tidptr=0x7f5ce10449d0) = 41345
    futex(0x7f5ce10449d0, FUTEX_WAIT, 41345, NULL#

  • Author
    Replies
  • #14205
    Larry Liu
    Moderator

    Hi, Kirk,

    Thanks for trying HDP.

    Can you please provide the following information?

    1. Which version of HDP are you trying?
    2. Provide the following logs:

    /var/log/hadoop/$USER/hs_err_pid%p.log
    /var/log/hadoop/$USER/gc.log

    Please follow the instructions below to upload the logs to us.
    http://hortonworks.com/community/forums/topic/hmc-installation-support-help-us-help-you/

    Thanks

    Larry

    #14206

    HDP 1.2

    /var/log/hadoop/hdfs/gc.log is created for every failed attempt, but is empty. No other log files exist.

    #14209
    Larry Liu
    Moderator

    Hi, Kirk

    Can you please get the namenode log?

    Larry

    #14211

    > more /var/log/hadoop/hdfs/hadoop-hdfs-namenode-azc01.out

    #
    # A fatal error has been detected by the Java Runtime Environment:
    #
    # SIGBUS (0x7) at pc=0x00007f4a60ceee38, pid=21833, tid=139957552826112
    #
    # JRE version: 6.0_31-b04
    # Java VM: Java HotSpot(TM) 64-Bit Server VM (20.6-b01 mixed mode linux-amd64 compressed oops)
    # Problematic frame:
    #

    #14212
    Sasha J
    Moderator

    Just found this article:

    I had the following happen for every new java process on one of my servers the other day:

    server:~$ java
    #
    # A fatal error has been detected by the Java Runtime Environment:
    #
    # SIGBUS (0x7) at pc=0x00007f3e0c5aad9b, pid=17280, tid=139904457242368
    #
    # JRE version: 6.0_24-b07
    # Java VM: Java HotSpot(TM) 64-Bit Server VM (19.1-b02 mixed mode linux-amd64 compressed oops)
    # Problematic frame:
    # C [libc.so.6+0x7ed9b] memset+0xa5b
    #
    # An error report file with more information is saved as:
    # /home/user/hs_err_pid17280.log
    Segmentation fault
    Turns out this is Java's way of telling you that the /tmp directory is full. It's trying to mmap some performance/hotspot-related file in /tmp, which succeeds, but when it then tries to access that mapped area, it gets the SIGBUS signal.

    http://efod.se/blog/archive/2011/05/02/java-sigbus

    Check if your /tmp has space…
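Sasha's suggestion can be scripted as a quick pre-flight check. A minimal sketch, assuming GNU df with --output support and a hypothetical 95% threshold (neither is from the thread):

```shell
# Check /tmp usage before starting the JVM (hypothetical 95% threshold).
# A full /tmp lets the JVM's mmap of its perf-data file succeed, but touching
# the mapped pages later faults with SIGBUS, as described in the linked article.
tmp_use=$(df --output=pcent /tmp | tail -1 | tr -dc '0-9')
if [ "${tmp_use:-100}" -ge 95 ]; then
  echo "WARNING: /tmp is ${tmp_use}% full; JVM startup may die with SIGBUS"
else
  echo "/tmp usage is ${tmp_use}% -- OK"
fi
```

If cleaning /tmp is not an option, another known workaround is starting the JVM with -XX:-UsePerfData so it never maps the perf-data file in the first place.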

    Thank you!
    Sasha

    #14213
    Sasha J
    Moderator

    One more article on the same problem:

    http://bugs.sun.com/view_bug.do?bug_id=6563308
    6563308 : Java VM dies with SIGBUS when temp directory is full on linux

    Sasha

    #14276

    /tmp has space and is writable. It is only 2% full.

    #14288
    Sasha J
    Moderator

    Did you try formatting again and grabbing the full log for the namenode?

    #14323

    Issue resolved; it was a Java incompatibility issue. Thanks!
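For readers hitting the same wall: a "Java incompatibility" like this usually means the Hadoop scripts resolved a different JDK than expected. A minimal sanity check, using the JDK path seen in the strace output earlier as an assumed default:

```shell
# Print the version of the JVM Hadoop will actually run.
# /usr/jdk64/jdk1.6.0_31 is assumed from this thread; adjust JAVA_HOME as needed.
JAVA_BIN="${JAVA_HOME:-/usr/jdk64/jdk1.6.0_31}/bin/java"
if [ -x "$JAVA_BIN" ]; then
  "$JAVA_BIN" -version 2>&1 | head -1
else
  echo "no executable java at $JAVA_BIN -- check JAVA_HOME in hadoop-env.sh"
fi
```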

    #14324
    Larry Liu
    Moderator

    Hi, Kirk,

    This is great news.

    Thanks

    Larry

The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
