HDP on Linux – Installation Forum

hadoop-env.sh Xmx …

  • #24626
    Vladislav Pernin

Manual RPM installation.

In /etc/hadoop/conf/hadoop-env.sh, HADOOP_NAMENODE_OPTS contains conflicting values:
the same line sets -Xmx1000m and -XX:NewSize=200m -XX:MaxNewSize=640m, and then -Xms256m -Xmx256m.
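A quick way to see the conflict is to extract every -Xmx flag from the line. This is only a sketch: the sample line below reproduces the values reported above, while the real file lives at /etc/hadoop/conf/hadoop-env.sh. (When both flags reach the JVM, the last one on the command line wins, so the NameNode would end up with a 256m heap here.)

```shell
#!/bin/sh
# Sample line reproducing the reported HADOOP_NAMENODE_OPTS values;
# on a real system, grep /etc/hadoop/conf/hadoop-env.sh instead.
line='export HADOOP_NAMENODE_OPTS="-Xmx1000m -XX:NewSize=200m -XX:MaxNewSize=640m -Xms256m -Xmx256m"'

# Print each max-heap flag on its own line to expose the conflict.
printf '%s\n' "$line" | grep -o -- '-Xmx[0-9][0-9]*[mg]'
```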

    Another problem: the full command line of the NameNode shows many duplicated options:
    -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true
    -Djava.library.path=/usr/lib/hadoop/lib/native -Djava.library.path=/usr/lib/hadoop/lib/native
    -server -XX:ParallelGCThreads=8 -XX:+UseConcMarkSweepGC

    The configuration file seems to be sourced multiple times by the startup scripts, and since the options are concatenated each time, we end up with many duplicates.
    The startup script /usr/lib/hadoop/sbin/hadoop-daemon.sh calls /usr/lib/hadoop/libexec/hadoop-config.sh, which adds parameters such as JAVA_HEAP_MAX=-Xmx1000m and sources /etc/hadoop/conf/hadoop-env.sh.
    /etc/hadoop/conf/hadoop-env.sh is then sourced again …
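One way to make repeated sourcing harmless is to append an option only if it is not already present. This is a sketch of the idea, not the actual HDP fix; the append_opt helper is hypothetical:

```shell
#!/bin/sh
# Hypothetical guard for hadoop-env.sh: append a JVM option to
# HADOOP_NAMENODE_OPTS only if it is not already in the string, so
# sourcing the file a second time does not duplicate options.
append_opt() {
  case " $HADOOP_NAMENODE_OPTS " in
    *" $1 "*) ;;   # option already present: do nothing
    *) HADOOP_NAMENODE_OPTS="${HADOOP_NAMENODE_OPTS:+$HADOOP_NAMENODE_OPTS }$1" ;;
  esac
}

HADOOP_NAMENODE_OPTS=""
append_opt "-Djava.net.preferIPv4Stack=true"
append_opt "-Djava.net.preferIPv4Stack=true"   # second call is a no-op
append_opt "-server"
echo "$HADOOP_NAMENODE_OPTS"
```

With this guard, re-sourcing the file leaves each option in the variable exactly once; an export-time guard variable (e.g. a "file already sourced" flag) would be another common pattern.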


  • Author
  • #25362
    Larry Liu

    Hi, Vladislav

    Thanks for trying HDP 2.0.

    I am looking into it now. It seems the default values don’t make sense.


