HDP on Windows – Installation Forum

Running start_remote_hdp_services.cmd – cluster won't start

  • #16167

    Have installed on 4 machines, all successful (now :)).

    From cluster.properties (in c:\hdp)
    #Hosts
    NAMENODE_HOST=demon1.rosie.torver.net
    SECONDARY_NAMENODE_HOST=demon1.rosie.torver.net
    JOBTRACKER_HOST=demon1.rosie.torver.net
    HIVE_SERVER_HOST=demon1.rosie.torver.net
    OOZIE_SERVER_HOST=demon1.rosie.torver.net
    TEMPLETON_HOST=demon1.rosie.torver.net
    SLAVE_HOSTS=demon2.rosie.torver.net, demon3.rosie.torver.net, demon4.rosie.torver.net

    Have started CMD (run as administrator) on demon1.rosie.torver.net; the cluster won’t start….

    C:\hdp>start_remote_hdp_services.cmd
    Master nodes: start demon1.rosie.torver.net
    1 Master nodes successfully started.
    1 Master nodes failed to start.

    PSComputerName Service Message Status
    -------------- ------- ------- ------
    demon1.rosie.tor… Cannot bind argu…

    StartStop-HDPServices : Manually start services on Master nodes then retry full
    cluster start. Exiting.
    At C:\hdp\manage_remote_hdp_services.ps1:177 char:47
    + if ($mode -eq "start") { StartStop-HDPservices <<<


  • #16168

    Sorry, some of it was missed off; here is the complete output from PowerShell…

    C:\hdp>start_remote_hdp_services.cmd
    Master nodes: start demon1.rosie.torver.net
    1 Master nodes successfully started.
    1 Master nodes failed to start.

    PSComputerName Service Message Status
    -------------- ------- ------- ------
    demon1.rosie.tor… Cannot bind argu…

    StartStop-HDPServices : Manually start services on Master nodes then retry full
    cluster start. Exiting.
    At C:\hdp\manage_remote_hdp_services.ps1:177 char:47
    + if ($mode -eq "start") { StartStop-HDPservices <<<

    #16169

    It still got cut off:
    cluster start. Exiting.
    At C:\hdp\manage_remote_hdp_services.ps1:177 char:47
    + if ($mode -eq "start") { StartStop-HDPservices <<<

    #16170

    Sorry for the multiple attempts – the post keeps getting truncated. Below is the output from HDPINST.txt, and it looks like the install didn't actually work – any ideas?

    CAQuietExec: HADOOP: Giving user/group "DEMON1\hadoop" full permissions to "c:\hadoop\data\h
    CAQuietExec: dfs\mapred"
    CAQuietExec: HADOOP: icacls "c:\hadoop\data\hdfs\mapred" /grant DEMON1\hadoop:(OI)(CI)F
    CAQuietExec: processed file: c:\hadoop\data\hdfs\mapred
    CAQuietExec: Successfully processed 1 files; Failed processing 0 files
    CAQuietExec: processed file: c:\hadoop\data\hdfs\mapred
    CAQuietExec: Successfully processed 1 files; Failed processing 0 files
    CAQuietExec: HADOOP: Install of Hadoop Core, HDFS, MapRed completed successfully
    CAQuietExec: InstallService : Cannot bind argument to parameter 'services' because it is an
    CAQuietExec: empty string.
    CAQuietExec: At C:\HadoopInstallFiles\HadoopPackages\hdp-1.1.0-winpkg\resources\hadoop-1.1.0
    CAQuietExec: -SNAPSHOT.winpkg\scripts\install.ps1:159 char:19
    CAQuietExec: + InstallService <<<< $NodeInstallRoot $serviceCredential "$roles_to_start
    CAQuietExec: "
    CAQuietExec: + CategoryInfo : InvalidData: (:) , ParameterBind
    CAQuietExec: ingValidationException
    CAQuietExec: + FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAl
    CAQuietExec: lowed,InstallService
    CAQuietExec:
    CAQuietExec:
    CAQuietExec:

    #16172
    Yi Zhang
    Moderator

    Hi Tony,

    Can you run start_local_hdp_services.cmd on the name node?

    Thanks,

    Yi.

    #16175

    Hi Yi,

    DEMON1 is my master name node; when I try, I get…

    C:\hdp>start_local_hdp_services
    Start-Service : Cannot bind argument to parameter 'Name' because it is null.
    At C:\hdp\manage_local_hdp_services.ps1:52 char:35
    + $foo = Start-Service -Name <<<

    I had a similar problem in the install log, with a similar error:

    CAQuietExec: HADOOP: Install of Hadoop Core, HDFS, MapRed completed successfully
    CAQuietExec: InstallService : Cannot bind argument to parameter 'services' because it is an
    CAQuietExec: empty string.
    CAQuietExec: At C:\HadoopInstallFiles\HadoopPackages\hdp-1.1.0-winpkg\resources\hadoop-1.1.0
    CAQuietExec: -SNAPSHOT.winpkg\scripts\install.ps1:159 char:19
    CAQuietExec: + InstallService <<<< $NodeInstallRoot $serviceCredential "$roles_to_start
    CAQuietExec: "
    CAQuietExec: + CategoryInfo : InvalidData: (:) , ParameterBind
    CAQuietExec: ingValidationException
    CAQuietExec: + FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAl
    CAQuietExec: lowed,InstallService
    CAQuietExec:

    Many thanks,
    Tony.

    #16232

    Hi Yi,

    I've uploaded the install log and details of my environment etc. to your FTP drop-off area.

    I'm at a loss now – I just cannot get this working, and I've followed all the instructions.

    Many thanks,
    Tony.

    #16252
    Larry Liu
    Moderator

    Hi, Tony

    Thanks for providing the log to us. We are looking into this issue.

    Larry

    #16256
    Larry Liu
    Moderator

    Hi, Tony

    Can you please verify that the FQDN is the same as the computer name on all nodes? Basically, the FQDN you put in clusterproperties.txt should match the computer name.
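    One quick way to check this (a rough sketch in PowerShell; Win32_ComputerSystem is standard WMI, and the example value shown is just your master node):

    $cs = Get-WmiObject Win32_ComputerSystem
    "Computer name : $($cs.Name)"
    "FQDN          : $($cs.Name).$($cs.Domain)"   # should match clusterproperties.txt, e.g. demon1.rosie.torver.net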

    Thanks

    Larry

    #16257
    Yi Zhang
    Moderator

    Hi Tony,

    Your log suggests that the start script can't find the services. I'm wondering whether they show up in services.msc.

    If you run Get-Service -DisplayName 'Apache Hadoop*' in PowerShell, do you see a list of Hadoop services? If there are such Apache Hadoop services, can you start them manually? For example, Start-Service 'Apache Hadoop namenode' on the name node.
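    Something along these lines (a rough sketch; it assumes the services were registered with display names starting with 'Apache Hadoop', as in a default HDP install):

    # List whatever HDP services are registered, then try to start each one
    Get-Service -DisplayName 'Apache Hadoop*' | ForEach-Object {
        Write-Host "Starting $($_.DisplayName)"
        Start-Service -Name $_.Name
    }

    If Get-Service returns nothing at all, the services were never registered, which would line up with the empty-string error from InstallService in your install log.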

    Your clusterproperties.txt shows that most of your Hadoop services are installed on demon5; do you have a problem running the start script on that node?

    Thanks,

    Yi

    #16270

    Hi Yi, Larry.

    I've found the problem, and this one might catch a lot of folk out – it's an IPv6 issue. By default, when talking to the same host, Windows will use IPv6 (pinging DEMON1 or DEMON1.rosie.torver.net gives ::1, which is the IPv6 localhost address).

    The resolution is to put entries in the c:\windows\system32\drivers\etc\hosts file that give the local host an IPv4 address.

    Bizarre! Anyway, having done the above, the services now appear. I've checked a couple of times, removing HDP and reinstalling both with and without the hosts entries, and it does the trick. The entries I added:

    10.0.2.21 DEMON1
    10.0.2.21 DEMON1.rosie.torver.net
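    A quick way to confirm the change from PowerShell (the name and address are just the ones from my hosts file above):

    # Should now return 10.0.2.21 rather than ::1
    [System.Net.Dns]::GetHostAddresses('DEMON1.rosie.torver.net')
    ping -n 1 DEMON1   # same check from the command line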

    Thanks for your help. I'm going to make sure it's all working on the one node first (I've set clusterproperties to all be the same DEMON1 machine).

    Tony.

    #16278
    Larry Liu
    Moderator

    Hi Tony

    Great that you figured it out. Were you using DNS instead of the hosts file before? Just wondering.

    Thanks

    Larry

    #16280
    Yi Zhang
    Moderator

    Hi Tony,

    I recommend setting IPv4 preference over IPv6 on your Windows boxes.
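    For example, one common way to do that is the DisabledComponents registry value (a sketch only; 0x20 tells Windows to prefer IPv4 over IPv6 in its prefix policy, and a reboot is needed for it to take effect):

    # Prefer IPv4 over IPv6 (run as administrator, then reboot)
    New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip6\Parameters' `
        -Name 'DisabledComponents' -PropertyType DWord -Value 0x20 -Force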

    Yi.

    #16281

    I was using DNS before; now I'm hardcoding the names in the local HOSTS file.

    I did remove IPv6 from the network properties for each connection as well, so I'm not sure why it was still using IPv6 for localhost (before using the HOSTS file).

    Odd one – perhaps you guys can update the documentation pre-reqs?

    #16305
    Yi Zhang
    Moderator

    Hi Tony,

    You are right that Windows users may bump into this problem more often.
    Current Hadoop does not support IPv6; we will put that into the documentation.

    Thanks for your feedback!

    Yi.

