HDP on Linux – Installation Forum

HDP Installer

  • #11105
    Rajan Ananthan

    I am using the gsInstaller to deploy a single-node cluster on a CentOS 6.3 virtual machine using VMware Player.

    On Step 7 of the tutorial, I receive the following errors/failures when running the gsInstaller.sh script, per the log file:

    Starting All Hadoop Services
    CMD -> su - hdfs -c 'yes Y | hadoop --config /etc/hadoop/conf namenode -format';
    In nodes: localhost
    CMD -> su - hdfs -c '/usr/lib/hadoop/bin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode '
    In nodes: localhost
    localhost: starting namenode, logging to /var/lib/hadoop/log/hdfs/hadoop-hdfs-namenode-localhost.localdomain.out
    localhost: /usr/lib/hadoop/libexec/../bin/hadoop: line 320: /usr/hadoop-jdk1.6.0_31/bin/java: Permission denied

    Waiting 600 seconds for namenode to come out of safe mode
    on localhost
    on localhost running ssh -o ConnectTimeOut=3 -q -t root@localhost "su - hdfs -c ' hadoop --config /etc/hadoop/conf dfsadmin -safemode get'"
    Output from gwhost safe mode command: /usr/lib/hadoop/bin/hadoop: line 320: /usr/hadoop-jdk1.6.0_31/bin/java: Permission denied

    /usr/lib/hadoop/bin/hadoop: line 390: /usr/hadoop-jdk1.6.0_31/bin/java: Permission denied
    /usr/lib/hadoop/bin/hadoop: line 390: exec: /usr/hadoop-jdk1.6.0_31/bin/java: cannot execute: Permission denied

    There are several attempts to connect, but they all fail.


  • #11106
    Sasha J

    Do you have the JDK installed on your system?
    Why are you using gsInstaller?
    HMC is the way to go; please use it instead.

    Rajan Ananthan

    Yes, I have the JDK installed. Less is hidden behind the UI, and I was told it is convenient for a single-node cluster install.

    Sasha J

    Based on the error message, your JDK is not functioning.
    And what do you mean by "less is hidden behind the UI", and who told you that?
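    A minimal sketch of how the "Permission denied" on the java binary could be diagnosed, assuming the JDK path reported in the log (`/usr/hadoop-jdk1.6.0_31`); adjust the path if your JDK lives elsewhere:

    ```shell
    # Path reported in the error log (an assumption based on the log above)
    JAVA_BIN=/usr/hadoop-jdk1.6.0_31/bin/java

    # 1. Does the binary exist, and what are its permissions?
    ls -l "$JAVA_BIN"

    # 2. Is it executable by the hdfs user (the account the daemons run as)?
    su - hdfs -c "test -x $JAVA_BIN && echo executable || echo 'not executable'"

    # 3. If the execute bit is missing, restore it on the JDK bin directory
    chmod -R a+rx "$(dirname "$JAVA_BIN")"

    # 4. Verify the JVM actually runs
    "$JAVA_BIN" -version
    ```

    Note that "Permission denied" can also occur with the execute bit set, e.g. if the JDK sits on a filesystem mounted `noexec` (common with VM shared folders); `mount | grep noexec` would reveal that.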

