
HDP on Windows – Installation Forum

Install OK, but trouble validating

  • #26848
    John Bunch

    I need some help troubleshooting this. I can start all services manually on all nodes, but when I run start_remote_hdp_services.cmd, I get the following error:

    Master nodes: start
    0 Master nodes successfully started.
    2 Master nodes failed to start.

    PSComputerName Service Message Status
    -------------- ------- ------- ------
    Connecting to re...
    Connecting to re...

    StartStop-HDPservices : Manually start services on Master nodes then retry
    full cluster start. Exiting.
    At D:\hdp\hadoop\manage_remote_hdp_services.ps1:187 char:26
    + if ($mode -eq "start") { StartStop-HDPservices($mode) }
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorExcep
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorExceptio

    I also find this entry in the hadoop-datanode-HADOOP5.log file:

    2013-06-03 09:41:28,818 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Call to failed on local exception: An existing connection was forcibly closed by the remote host
    at org.apache.hadoop.ipc.Client.wrapException(
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(
    at com.sun.proxy.$Proxy5.sendHeartbeat(Unknown Source)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.offerService(
    at Source)
    Caused by: An existing connection was forcibly closed by the remote host
    at Method)
    (leaving out the rest)

    Here is clusterproperties.txt:

    #Log directory

    #Data directory


    #Database host

    #Hive properties

    #Oozie properties

    I’ve double-checked firewall config and eliminated that as a cause.
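    A quick way to double-check reachability from a worker node, beyond disabling the firewall, is a raw TCP probe to the master. This is only a sketch: the host name and port below are placeholders, to be replaced with the NameNode FQDN and the IPC port from fs.default.name in core-site.xml.

```powershell
# Placeholder host/port: substitute the NameNode FQDN and the IPC port
# configured in fs.default.name in core-site.xml.
$client = New-Object System.Net.Sockets.TcpClient
try {
    $client.Connect('NAMENODE-HOST', 8020)
    Write-Host 'TCP connection succeeded'
}
catch {
    Write-Host "TCP connection failed: $_"
}
finally {
    $client.Close()
}
```

    Using System.Net.Sockets.TcpClient directly avoids depending on newer cmdlets that may not be present on Windows Server 2012.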

  • #26850
    John Bunch

    Windows Server 2012 on all nodes, BTW.


    Hi John,

    A quick question here, I don’t think that it is the cause of your issue, but why is the domain for the secondary name node different from all the rest?


    John Bunch


    It isn’t. That’s a mistake in the above text. I was trying to obfuscate the actual domain name but missed a line. Please don’t tell anyone!

    Any idea on what’s causing the error?


    Hi BlackMamba,

    A web search for the error in the log points to either a firewall issue or an SSL certificate issue. Check that neither of these is the cause.


    John Bunch

    Does not appear to be a firewall issue – if I stop Windows Firewall on all nodes I get the same error.

    I’m not sure how to check for an SSL certificate issue. I have not installed an SSL certificate or made any modifications to Apache – nothing is changed from the installation.

    Seth Lyubich

    Hi John,

    I saw a similar issue in HDP 1.1. Can you please try to start the services using the start_local_hdp_service.cmd script instead?

    I also wanted to note that HDP 1.3 is out, which you can try now.

    Hope this helps,


    John Bunch

    Update on this issue:

    The above appears to be due to a PowerShell security issue when installing HDP on standalone (non-domain-member) Windows servers. There may be a step or two missing in the installation instructions in section 5.5. I added each machine to a domain and set Group Policy for the domain (as directed by section 5.6), and the above issue disappeared. Also, I installed version 1.3 and replicated the behavior.
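    For standalone (workgroup) servers where joining a domain is not an option, PowerShell remoting between non-domain machines typically also requires the cluster hosts to be listed in the WinRM TrustedHosts setting on the node that launches the remote script. A hedged sketch, with placeholder host names:

```powershell
# Run from an elevated PowerShell prompt on the node that launches
# start_remote_hdp_services.cmd. Host names below are placeholders.
Enable-PSRemoting -Force
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'HADOOP1,HADOOP2,HADOOP5' -Force
Get-Item WSMan:\localhost\Client\TrustedHosts   # verify the setting took effect
```

    Note that TrustedHosts disables mutual authentication for the listed machines, so it should be scoped to exactly the cluster nodes rather than set to a wildcard.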


    Hi,
    I am new to Hadoop, sorry for this question, but I’m wondering whether the Active Directory domain name should be the same for all machines.
    For example, I want to configure a cluster with 3 nodes: Node1, Node2, Node3.
    Should I use the same domain name for all hosts, so I get , and
    or should I create a different domain name for each machine?



    Can anyone answer me? Please help.

    Ivan Malamen

    Please check that WinRM and remote PowerShell execution are enabled on all of the nodes. The remote start script uses WinRM to connect to all of the nodes.
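    A few commands that can confirm this, assuming a default WinRM configuration; the node name below is a placeholder taken from the log file name earlier in the thread:

```powershell
# On each node: confirm the WinRM service is running.
Get-Service WinRM

# From the controlling node: verify a remote node's WinRM listener responds.
Test-WSMan -ComputerName HADOOP5

# End-to-end check that remote command execution works.
Invoke-Command -ComputerName HADOOP5 -ScriptBlock { hostname }
```

    If Test-WSMan succeeds but Invoke-Command fails, the problem is usually authentication (workgroup machines need TrustedHosts or CredSSP) rather than the listener itself.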

The forum ‘HDP on Windows – Installation’ is closed to new topics and replies.
