Kerberos Implementation – Datanodes won't start

This topic contains 1 reply, has 1 voice, and was last updated by  Jan Peters 6 months, 2 weeks ago.

  • Creator
  • #57820

    Jan Peters

    I’m not a Kerberos wizard, so I’m on a bit of a learning curve. I’ve followed all of the Kerberos instructions in the HDP 2.1 documentation and have run into an issue where my datanodes won’t start (3-node cluster). If I roll back all of the XML files to the non-Kerberos versions, I can start everything from the command line. When I shut down the cluster and roll in the Kerberos versions of the XML files, I’m able to start the namenode, but all of the datanodes refuse to start, and the only clue I have is as follows:
    2014-07-24 11:04:22,181 INFO datanode.DataNode ( - registered UNIX signal handlers for [TERM, HUP, INT]
    2014-07-24 11:04:22,399 WARN common.Util ( - Path /opt/hadoop/hdfs/dn should be specified as a URI in configuration files. Please update hdfs configuration.
    2014-07-24 11:04:23,055 INFO security.UserGroupInformation ( - Login successful for user dn/cvm0930.dg.local@DGKERB.COM using keytab file /etc/security/keytabs/dn.service.keytab
    2014-07-24 11:04:23,210 INFO impl.MetricsConfig ( - loaded properties from
    2014-07-24 11:04:23,274 INFO impl.MetricsSystemImpl ( - Scheduled snapshot period at 60 second(s).
    2014-07-24 11:04:23,274 INFO impl.MetricsSystemImpl ( - DataNode metrics system started
    2014-07-24 11:04:23,279 INFO datanode.DataNode (<init>(269)) - File descriptor passing is enabled.
    2014-07-24 11:04:23,283 INFO datanode.DataNode (<init>(280)) - Configured hostname is cvm0932.dg.local
    2014-07-24 11:04:23,284 FATAL datanode.DataNode ( - Exception in secureMain
    java.lang.RuntimeException: Cannot start secure cluster without privileged resources.
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(
    2014-07-24 11:04:23,287 INFO util.ExitUtil ( - Exiting with status 1
    2014-07-24 11:04:23,289 INFO datanode.DataNode ( - SHUTDOWN_MSG:

    Thanks in advance for your collective help with this.
    Mit freundlichen Grüßen (with friendly greetings),
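
    For reference on the FATAL line above: "Cannot start secure cluster without privileged resources" generally means the DataNode was not handed privileged ports. In Hadoop 2.x secure mode, the DataNode must be started as root (via jsvc) so it can bind data-transfer and HTTP ports below 1024. A sketch of the usual settings — the port numbers 1019/1022 are the values commonly shown in HDP documentation, not taken from this cluster:

    ```xml
    <!-- hdfs-site.xml: in secure mode the DataNode must bind privileged (<1024) ports -->
    <property>
      <name>dfs.datanode.address</name>
      <value>0.0.0.0:1019</value>
    </property>
    <property>
      <name>dfs.datanode.http.address</name>
      <value>0.0.0.0:1022</value>
    </property>
    ```

    In addition, hadoop-env.sh typically needs `export HADOOP_SECURE_DN_USER=hdfs` (and a valid `JSVC_HOME`), and the datanode process itself must be launched as root rather than as the hdfs user.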


The forum ‘HDP 2.1 Technical Preview’ is closed to new topics and replies.

  • Author
  • #60438

    Jan Peters

    OK, so we sorted this issue out: it turned out we weren’t paying attention to permissions and ownership in the Linux file system, or to which user was starting the various cluster processes. Yeah, this is sorta Hadoop 101 stuff, but it will bust your chops if you don’t pay attention to the details.

    We dug through a lot of items and found that a number of directory structures (logging included) were sensitive to permission/ownership issues. We had the Kerberos implementation itself done correctly, with the keytab files and such, but that’s only a fraction of the journey. To summarize the rest of our learnings: *who* started a process mattered in getting the datanodes running. We also determined that the Linux container-executor was not properly configured, both in container-executor.cfg and in yarn-site.xml. Additionally, when we changed ownership on the container-executor binary, its permissions (specifically the setuid/setgid bits) were reset and had to be restored.

    Once we sifted through all the details, we were finally able to get our 3-node cluster running and to run DgSecure discovery/masking tasks against it. I hope this posting (although never responded to) will help others through the Kerberizing process.
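
    For anyone hitting the same container-executor problem, here is roughly the shape of the fix — the paths below are the usual HDP 2.1 locations and the group name `hadoop` is an assumption, so check your own layout:

    ```shell
    # Restore ownership and the setuid/setgid bits on the LinuxContainerExecutor
    # binary (path assumed from a typical HDP 2.1 install; adjust as needed).
    chown root:hadoop /usr/lib/hadoop-yarn/bin/container-executor
    chmod 6050 /usr/lib/hadoop-yarn/bin/container-executor
    # Expected: ---Sr-s--- root hadoop  (setuid+setgid, group-executable only)
    ls -l /usr/lib/hadoop-yarn/bin/container-executor

    # container-executor.cfg must be root-owned, and the group here must match
    # yarn.nodemanager.linux-container-executor.group in yarn-site.xml.
    cat > /etc/hadoop/conf/container-executor.cfg <<'EOF'
    yarn.nodemanager.linux-container-executor.group=hadoop
    banned.users=hdfs,yarn,mapred,bin
    min.user.id=1000
    EOF
    ```

    If the binary’s owner or those permission bits are wrong, the NodeManager refuses to launch containers, which matches the "who starts the process matters" lesson above.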
    Mit freundlichen Grüßen (with friendly greetings),

Viewing 1 reply (of 1 total)