
HDP on Linux – Installation Forum

Can't handle ServiceComponentHostEvent event at current state

  • #28883

    Installation is consistently failing with yellow dashes. The first error in ambari-server.log is

    15:00:49,926 ERROR ServiceComponentHostImpl:721 - Can't handle ServiceComponentHostEvent event at current state, serviceComponentName=SQOOP, hostName=msh-hdpslave101, currentState=INSTALLED, eventType=HOST_SVCCOMP
    15:00:49,927 WARN HeartBeatHandler:233 - State machine exception
    org.apache.ambari.server.state.fsm.InvalidStateTransitionException: Invalid event: HOST_SVCCOMP_OP_IN_PROGRESS at INSTALLED
    at org.apache.ambari.server.state.fsm.StateMachineFactory.doTransition(
    at org.apache.ambari.server.state.fsm.StateMachineFactory.access$300(
    at org.apache.ambari.server.state.fsm.StateMachineFactory$InternalStateMachine.doTransition(

    There are no errors in ambari-agent.log on the aforementioned host, msh-hdpslave101.

    Registration is successful and several components install successfully before this error appears. After this point, all components on all hosts fail to install with yellow dashes. Attempting a retry does not allow the installation to proceed any further.

    I am following a combination of the documentation and . All firewalling is disabled. This particular disk image (based on RHEL 6) has all of the prerequisites configured and was previously used to successfully spin up a cluster. This problem has been encountered twice, each time with fresh hosts.

    I can post any additional logs as well as teardown/rebuild the cluster to assist in diagnosis. Thanks!

    Red Hat Enterprise Linux Server release 6.4 (Santiago)
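    For what it's worth, a quick way to see every component that hit this transition error is to grep the server log; a minimal sketch, assuming Ambari's default log location:

```shell
# List every invalid state transition recorded by Ambari, including the
# component and host names, from the default server log path.
grep -E "InvalidStateTransitionException|Can't handle ServiceComponentHostEvent" \
    /var/log/ambari-server/ambari-server.log
```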

  • Author
  • #28896

    Hi Matt,

    Thanks for trying out HDP/Ambari. Which component is failing to install? Also, can you post the complete ambari-server.log to our FTP site (user/pass = dropoff/horton)? Try to name it something unique to you so that we can tell it’s your log. You won’t be able to see any files there, just upload.



    Thanks for your quick response. I have uploaded msh-ServiceComponentHostEvent-ambari-server.log. It is hard to tell specifically what service fails at this point, because I hit retry and that removed all entries that had succeeded.

    On the first master node, there are orange dashes by Pig, Sqoop, TaskTracker, and the ZooKeeper client and server.

    On the second master, the dashes are with MySQL Server, Oozie client and server, Pig, SNameNode, Sqoop, TaskTracker, WebHCat, and the ZooKeeper client and server.

    One slave node failed on everything, it looks like, and the other two are just missing Sqoop, TaskTracker, and the ZooKeepers.

    My Nagios node is separate, and it completely failed as well.


    Given the scope of the installation failure, let me post a few sanity checks on the network.

    The network:

    [root@msh-ambarimaster101 ~]# cat /etc/hosts
    localhost.localdomain localhost
    ::1 localhost6.localdomain6 localhost6
    ..external ips removed.. msh-ambarimaster101 msh-nagios101 msh-hdpmaster101 msh-hdpmaster102 msh-hdpslave101 msh-hdpslave102 msh-hdpslave103

    Keyed ssh and consistent /etc/hosts files:

    [root@msh-ambarimaster101 ~]# md5sum /etc/hosts
    63eaa9d6000ce0ca1916cee345cd5406 /etc/hosts
    [root@msh-ambarimaster101 ~]# ssh msh-hdpmaster101 md5sum /etc/hosts
    63eaa9d6000ce0ca1916cee345cd5406 /etc/hosts

    Local firewalls disabled:

    [root@msh-ambarimaster101 ~]# chkconfig --list iptables
    iptables 0:off 1:off 2:off 3:off 4:off 5:off 6:off
    [root@msh-ambarimaster101 ~]# service iptables status
    iptables: Firewall is not running.

    [root@msh-ambarimaster101 ~]# ssh msh-hdpmaster101 chkconfig --list iptables
    iptables 0:off 1:off 2:off 3:off 4:off 5:off 6:off
    [root@msh-ambarimaster101 ~]# ssh msh-hdpmaster101 service iptables status
    iptables: Firewall is not running.
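    These per-host checks can be scripted in a single pass; a minimal sketch, using the hostnames from this thread (adjust for your own cluster):

```shell
#!/bin/bash
# Compare the /etc/hosts checksum and firewall state on every cluster node
# against the Ambari master's copy. Hostnames are the ones used in this thread.
HOSTS="msh-nagios101 msh-hdpmaster101 msh-hdpmaster102 \
msh-hdpslave101 msh-hdpslave102 msh-hdpslave103"

REF=$(md5sum /etc/hosts | awk '{print $1}')
for h in $HOSTS; do
    SUM=$(ssh "$h" md5sum /etc/hosts | awk '{print $1}')
    [ "$SUM" = "$REF" ] || echo "/etc/hosts MISMATCH on $h: $SUM"
    ssh "$h" service iptables status
done
```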


    After lunch I hit retry again and the installation completed. There were no configuration changes between the attempts, so I guess this points to either an intermittent system/network issue or perhaps Ambari installing things in the wrong order (and needing a certain number of retry clicks to get through it all). I would really like to know your process for debugging this sort of problem so I can fix it going forward.


    Hi Matt,

    I haven’t had the chance to look over the log you posted yet, but one thing worth checking: when installing on EC2, you need to make sure that the AWS security policy allows the boxes to connect to each other, in addition to turning off the local firewalls.



    Tedr, I gave up on figuring out the ports a while back and opened up all traffic between systems in the aws security policy.
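    For anyone hitting the same wall, "open all traffic between systems" can be expressed as a single self-referencing rule; a sketch with the AWS CLI, where the group name "hadoop-cluster" is a placeholder (and this CLI syntax postdates the tooling available at the time of this thread):

```shell
# Allow all protocols and ports between instances that belong to the same
# security group ("hadoop-cluster" is a placeholder name) by adding an
# ingress rule whose source is the group itself.
aws ec2 authorize-security-group-ingress \
    --group-name hadoop-cluster \
    --protocol all \
    --source-group hadoop-cluster
```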

    I have tried adding a few nodes, and they all failed at the same point. It appears to be Sqoop again, just like in the very first error message.

    [check] DataNode install
    [check] Ganglia Monitor install
    [check] HBase Client install
    [check] HBase RegionServer install
    [check] HCat install
    [check] HDFS Client install
    [check] Hive Client install
    [check] MapReduce Client install
    [check] Oozie Client install
    [check] Pig install
    [dash] Sqoop install
    [dash] TaskTracker install
    [dash] ZooKeeper Client install

    Retrying resulted in success, but Sqoop is no longer in the list of items to install:

    [check] DataNode start
    [check] Ganglia Monitor start
    [check] HBase RegionServer start
    [check] TaskTracker start

    I don't think I need Sqoop at this time, but it looks like it never gets installed.
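    Since Ambari dropped Sqoop from the retry list, one workaround is to check for and install the package by hand on each affected node; a minimal sketch, where the host names are from this thread and the package name should be verified against your HDP repo (e.g. with `yum search sqoop`):

```shell
# On each node where Ambari skipped Sqoop, install the client package
# directly if it is not already present.
for h in msh-hdpslave101 msh-hdpslave102 msh-hdpslave103; do
    ssh "$h" "rpm -q sqoop || yum -y install sqoop"
done
```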
