
HDP on Linux – Installation Forum

Securing Cluster after HMC Installation

  • #13077

I stopped all services, generated all keytabs, and pushed them to all the nodes. I modified all XML file templates (*.erb files) on the master to ensure each node uses the appropriate principals and keytab files.
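For context, a keytab workflow like the one described above would typically look something like the following. This is a hypothetical sketch, not the poster's actual commands: the realm (HADOOP.LOCAL), principal names, and keytab path are taken from the logs and listings later in this thread, and the commands assume you are on the KDC host as root.

```shell
# Create a per-host DataNode principal with a random key
kadmin.local -q "addprinc -randkey dn/slave1.hadoop.local@HADOOP.LOCAL"

# Export the principal's key into a keytab file
kadmin.local -q "ktadd -k /tmp/dn.service.keytab dn/slave1.hadoop.local@HADOOP.LOCAL"

# Push the keytab to the node and lock down ownership and permissions
scp /tmp/dn.service.keytab root@slave1.hadoop.local:/etc/security/keytabs/
ssh root@slave1.hadoop.local \
  "chown hdfs:hadoop /etc/security/keytabs/dn.service.keytab && \
   chmod 400 /etc/security/keytabs/dn.service.keytab"
```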

I am unable to get my datanodes to start; they fail with the error “Cannot start secure cluster without privileged resources”:

    2012-12-21 08:44:35,288 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
    2012-12-21 08:44:35,329 INFO org.apache.hadoop.metrics2.impl.MetricsSinkAdapter: Sink ganglia started
    2012-12-21 08:44:35,541 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
    2012-12-21 08:44:35,546 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
    2012-12-21 08:44:35,546 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
    2012-12-21 08:44:35,742 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
    2012-12-21 08:44:36,031 INFO Asked the TGT renewer thread to terminate
    2012-12-21 08:44:36,542 INFO Login successful for user dn/slave1.hadoop.local@HADOOP.LOCAL using keytab file /etc/security/keytabs/dn.service.keytab
    2012-12-21 08:44:36,543 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.lang.RuntimeException: Cannot start secure cluster without privileged resources.
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(

    2012-12-21 08:44:36,551 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
    SHUTDOWN_MSG: Shutting down DataNode at slave1.hadoop.local/

Does anyone know how to resolve this problem? I know the service is trying to bind to ports below 1024; I have tried leaving them at the regular 50000+ ports, but I still encounter the same problem.
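For readers hitting the same error: in Hadoop 1.x, the “privileged resources” are ports below 1024. A Kerberized DataNode refuses to start unless its data-transfer and HTTP ports are privileged (and the daemon is launched through jsvc as root so it can bind them). A sketch of the relevant hdfs-site.xml properties, using the stock Hadoop 1.x property names and commonly used port values rather than the poster's actual configuration:

```xml
<!-- Sketch: a secure DataNode must bind privileged ports (< 1024) -->
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:1019</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:1022</value>
</property>
```

With the default 50010/50075-style ports left in place, the DataNode fails with exactly the RuntimeException shown in the log above.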

  • #13083
    Larry Liu

    Hi, Rishab

    I have a few questions for you.

    1. How did you start datanode? From HMC UI or command line?
    2. What XML file templates did you edit?

    Can you please provide the system logs (/var/log) and relevant kerberos logs?

One thing you can try is to start the datanode from the command line as root.
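Larry's suggestion could be sketched as below. This assumes a stock HDP 1.x layout; the hadoop-daemon.sh path, conf directory, and log file name may differ on your install.

```shell
# The secure DataNode starts as root (to bind privileged ports) and then
# drops privileges to this user
export HADOOP_SECURE_DN_USER=hdfs

# Start the DataNode from the command line as root
/usr/lib/hadoop/bin/hadoop-daemon.sh --config /etc/hadoop/conf start datanode

# Check the DataNode log for the "privileged resources" error
tail -n 50 /var/log/hadoop/hdfs/hadoop-hdfs-datanode-$(hostname).log
```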




I turned off all services using the HMC interface. I am only trying to start HDFS back up.

#1 – I started HDFS using the HMC Manage Cluster page.
#2 – For HDFS, I edited the core-site.xml.erb and hdfs-site.xml.erb files in the puppet/master/templates directory.

I am uploading “Securing Cluster after HMC Insallation.rar”, which contains all logs, to your FTP site.


    Larry Liu

    Hi, Rishab

    From the namenode log file, I found the following:

    Connection from for protocol org.apache.hadoop.hdfs.server.protocol.NamenodeProtocol is unauthorized for user nn/slave1.hadoop.local@HADOOP.LOCAL

It looks like your nn is not set up to allow access from the data node. Can you please confirm?
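One thing worth noting in that log line: the rejected user is nn/slave1.hadoop.local@HADOOP.LOCAL, i.e. a NameNode principal on a slave host, which hints at a principal/host mismatch. A hedged sketch of the stock Hadoop 1.x principal and keytab properties in hdfs-site.xml (the _HOST token is substituted with each node's own hostname at startup):

```xml
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>nn/_HOST@HADOOP.LOCAL</value>
</property>
<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/etc/security/keytabs/nn.service.keytab</value>
</property>
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>dn/_HOST@HADOOP.LOCAL</value>
</property>
<property>
  <name>dfs.datanode.keytab.file</name>
  <value>/etc/security/keytabs/dn.service.keytab</value>
</property>
```

If a DataNode's templates hard-code the wrong principal for its host, it will authenticate as a principal the NameNode does not expect, producing exactly this "unauthorized" message.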




Larry – the firewall (iptables) is disabled (both IPv6 and IPv4) on all nodes. SELinux is also disabled on all nodes as well. All hosts also resolve forward and reverse DNS for all other nodes.

    Larry Liu

Hi, Rishab,

Can you please list the contents of /etc/security/keytabs on all hosts?

Please refer to the following documentation to verify that Kerberos is set up correctly.
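One way to sanity-check the keytabs Larry asks about, using the paths and realm that appear elsewhere in this thread (adjust the principal and realm to your own cluster):

```shell
# List the principals and key versions stored in a keytab
klist -kt /etc/security/keytabs/dn.service.keytab

# Confirm the KDC will actually issue a ticket for that principal
kinit -kt /etc/security/keytabs/dn.service.keytab dn/$(hostname -f)@HADOOP.LOCAL
klist      # should now show a valid TGT for the dn/... principal
kdestroy   # discard the test ticket
```

If kinit fails here, the problem is in the keytab or KDC, not in Hadoop.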




    Larry – Here are the keytabs on each host:

    Master Node:
    [root@master ~]# ls /etc/security/keytabs/
    dn.service.keytab jt.service.keytab rs.service.keytab
    hive.service.keytab nn.service.keytab spnego.service.keytab
    hm.service.keytab oozie.service.keytab tt.service.keytab

    Slave1 Node:
    [root@slave1 ~]# ls /etc/security/keytabs/
    dn.service.keytab nn.service.keytab spnego.service.keytab
    hive.service.keytab oozie.service.keytab tt.service.keytab
    jt.service.keytab rs.service.keytab

    Slave2 Node:
    [root@slave2 ~]# ls /etc/security/keytabs/
    dn.service.keytab rs.service.keytab tt.service.keytab
    hive.service.keytab spnego.service.keytab

I used the script from the gsInstaller package to set up my Kerberos server and generate keytabs. When I try to set up the keytabs manually myself, I still encounter the same issue.

    Seth Lyubich

    Hi Rishab,

Can you please clarify your issue? Are you trying to secure your HMC-installed cluster using the script from gsInstaller? I just want to make sure that we completely understand.



Seth – I used the gsInstaller script to generate and push the appropriate keytabs to the /etc/security/keytabs directory.

    I modified the core-site.xml.erb and hdfs-site.xml.erb files on the master by hand. My namenode will start up, but none of the datanodes will start due to the error in the initial post.
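For reference, the security switches that a core-site.xml (and hence the core-site.xml.erb template) normally has to carry in a Kerberized Hadoop 1.x cluster are the two below. This is a sketch of the standard property names, not the poster's exact file:

```xml
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```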


    Hi Rishab,

From your reply, it looks like you have mixed the HMC installer and the gsInstaller. If so, this is something that we really don’t support. If you need a secure cluster, I recommend that you re-install with gsInstaller. You can download the gsInstaller scripts here:

    Thanks for continuing to use HDP.
