HDP on Linux – Installation Forum

hadoop-env.sh and hbase-env.sh

  • #11839

    The JAVA_HOME variable in hadoop-env.sh and hbase-env.sh does not get set correctly for some of the nodes in my cluster install. I am not using the service allocation offered by the install process; I am trying to customize the “master services” allocations during my setup.

    When will a new release of hmc / the installer be available?
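
    Until a fixed installer ships, one workaround is to set JAVA_HOME by hand on each affected node. A minimal sketch, assuming the JDK lives under /usr/jdk64/jdk1.6.0_31 (a hypothetical path; point it at your actual Java install):

    # Append to hadoop-env.sh and hbase-env.sh on each node where the
    # installer left JAVA_HOME unset, then restart the affected services.
    # The path below is an example only; use the JDK actually on the node.
    export JAVA_HOME=/usr/jdk64/jdk1.6.0_31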


  • #12015
    Jerry Lam

    I have the same problem. I have 6 nodes. The nodes that act as datanodes have JAVA_HOME set to undefined. It seems the installer doesn’t work well: it didn’t propagate the configuration to the datanodes. Is anyone working on this?
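
    A quick way to spot which nodes the installer missed is to grep the env files across the cluster. A sketch only; the hostnames and config path here are assumptions, substitute your own:

    # Check JAVA_HOME on every node (example hostnames for a 6-node cluster)
    for h in hadoop01 hadoop02 hadoop03 hadoop04 hadoop05 hadoop06; do
        echo "== $h =="
        ssh "$h" 'grep JAVA_HOME /etc/hadoop/conf/hadoop-env.sh'
    done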

    Jeff Sposetti

    We are working on an issue where, when you modify the default recommended master layout, configs do not get pushed to all nodes during install (i.e. you see some property values not set). That results in services on those nodes not functioning.

    We will correct this issue in our next update to HDP 2.0 Alpha.

    Jerry Lam

    Hi Jeff:

    I’m trying out the manual mode, as I don’t want to mess with the installer if it doesn’t work.
    I followed the instructions thoroughly. At one point I need to “yum install hadoop-sbin.i386”, but there is no package for it. Do you know why?
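
    One way to confirm what the repository actually carries is to query yum directly (standard yum commands; the names below are just search terms):

    yum search hadoop-sbin          # does any enabled repo advertise it?
    yum list available 'hadoop*'    # every hadoop package yum can see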

    Jerry Lam

    These are the packages that were successfully installed:

    [root@hadoop04 ~]# rpm -qa | grep hadoop
    [root@hadoop04 ~]#
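
    An empty rpm -qa listing usually means the install never completed or the repository isn’t registered, so it may be worth checking the repo setup first:

    yum repolist enabled    # is the HDP repo in the enabled list?
    ls /etc/yum.repos.d/    # was the HDP .repo file dropped in place?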

    Jerry Lam

    The documentation needs to be updated. The command below doesn’t work.


    Install the Hadoop RPMs

    From a terminal window, type:
    yum -y install hadoop hadoop-libhdfs hadoop-libhdfs.i386 mysql-connector-java hadoop-native hadoop-native.i386 hadoop-pipes hadoop-pipes.i386 hadoop-sbin.i386 openssl openssl.i686
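
    On a pure x86_64 host the .i386/.i686 variants may simply be absent from the repo; a sketch of the same install without them (this assumes 64-bit builds exist in your repo, and hadoop-sbin may be missing entirely, as noted above):

    yum -y install hadoop hadoop-libhdfs mysql-connector-java \
        hadoop-native hadoop-pipes hadoop-sbin openssl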
