Most services crashed on HDP 2.0 with HA

This topic contains 1 reply, has 1 voice, and was last updated by  Alexander Rohvarger 3 months, 4 weeks ago.

  • Creator
    Topic
  • #46112

    Running HDP 2.0 with HA enabled.

    I added a parameter to core-site.xml via Ambari and, honestly, don’t even remember whether I hit save. I then restarted HDFS through Ambari so the change would take effect, and pretty much everything crashed.

    I can access Hue, but both the file browser and running Hive scripts show the error:
    “Operation category READ is not supported in state standby (error 403)”

    Running a Hive script from the shell gives the same error, although the “hadoop fs -ls” command still works from the shell. I restarted all the services via Ambari, but nothing changed. Ambari shows that all the hosts are fine.
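
    For reference, one way to check which NameNode is currently active and which is standby (assuming the HA service IDs configured under dfs.ha.namenodes are nn1 and nn2, which is what Ambari usually generates; yours may differ):

        hdfs haadmin -getServiceState nn1    # prints "active" or "standby"
        hdfs haadmin -getServiceState nn2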


  • Author
    Replies
  • #46113

    The issue has been fixed. It turns out that when HDFS was restarted, the active and standby states of the NameNodes were switched. I shut down the active NameNode via Ambari, and they switched again. Everything seems to be back to normal, but the HA configuration may need to be looked into further, since the failover did not go smoothly.
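
    If the NameNodes end up in the wrong states again, a more controlled way to swap them than shutting one down (assuming manual failover and the same hypothetical service IDs nn1 and nn2; if automatic failover via ZKFC is enabled, haadmin refuses a manual -failover and the ZKFCs coordinate transitions themselves) would be something like:

        hdfs haadmin -failover nn2 nn1    # fail over from nn2 to nn1, making nn1 active and nn2 standby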
