HDP on Windows – Installation Forum

HBase Master & HBase RegionServer services won't start

  • #39639
    Jan Leonhard
    Member

    Hi there!

    I seem to be lacking some configuration.

    I have modified the config-file (clusterproperties):
    HBASE_MASTER=odi-mon
    HBASE_REGIONSERVERS=odi-mon

    But 'start_local_hdp_services.cmd' seems to try to start the services on computer '.'.

    Why doesn’t it use ‘odi-mon’?

    Kind regards
    Jan

    Here is the full report:

    c:\hdp\hadoop>start_local_hdp_services.cmd
    starting namenode
    starting secondarynamenode
    starting datanode
    starting jobtracker
    starting historyserver
    starting tasktracker
    starting zkServer
    starting master
    Start-Service : Service 'Apache Hadoop Hbase master (master)' cannot be started due to the following error: Cannot start service master on computer '.'.
    At C:\hdp\hadoop\manage_local_hdp_services.ps1:77 char:29
    + $foo = Start-Service <<<< -Name $serviceName.Name -ErrorAction Continue
    + CategoryInfo : OpenError: (System.ServiceProcess.ServiceController:ServiceController) [Start-Service], ServiceCommandException
    + FullyQualifiedErrorId : CouldNotStartService,Microsoft.PowerShell.Commands.StartServiceCommand

    starting regionserver
    Start-Service : Service 'Apache Hadoop Hbase regionserver (regionserver)' cannot be started due to the following error: Cannot start service regionserver on computer '.'.
    At C:\hdp\hadoop\manage_local_hdp_services.ps1:77 char:29
    + $foo = Start-Service <<<< -Name $serviceName.Name -ErrorAction Continue
    + CategoryInfo : OpenError: (System.ServiceProcess.ServiceController:ServiceController) [Start-Service], ServiceCommandException
    + FullyQualifiedErrorId : CouldNotStartService,Microsoft.PowerShell.Commands.StartServiceCommand

    starting hwi
    starting hiveserver
    starting hiveserver2
    starting metastore
    starting derbyserver
    starting templeton
    starting oozieservice
    Sent all start commands.
    total services
    16
    running services
    14
    not yet running services
    2
    Failed_Start master regionserver
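While troubleshooting, it can help to check the state of the two failed services directly. On Windows, `sc query master` prints a STATE line; the small parser below is a sketch in Python (the output format is assumed from standard `sc query` output, and the helper name is ours, not part of HDP):

```python
import re

def parse_sc_state(sc_output):
    """Extract the textual state (e.g. 'RUNNING', 'STOPPED') from
    the output of `sc query <name>`; return None if no STATE line
    is found."""
    match = re.search(r"STATE\s*:\s*\d+\s+(\w+)", sc_output)
    return match.group(1) if match else None

# Example against a typical `sc query` transcript:
sample = """SERVICE_NAME: master
        TYPE               : 10  WIN32_OWN_PROCESS
        STATE              : 1  STOPPED
"""
print(parse_sc_state(sample))
```

In a real session you would feed it `subprocess.run(["sc", "query", "master"], capture_output=True, text=True).stdout` instead of the sample string.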


  • #39695
    Seth Lyubich
    Moderator

    Hi Jan,

    Please see http://hortonworks.com/community/forums/topic/hdp-1-3-flumeagent-and-hbase-services-will-not-start/

    Please try the steps from the forum thread above:

    1) Uninstall HDP 1.3 from the server
    2) Remove all old files from the install folders
    3) From the PATH environment variable, remove all entries containing parentheses, such as C:\Program Files (x86)\SQL
    4) Reinstall HDP 1.3
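For step 3, a quick way to list the suspect entries before editing anything is to scan PATH for parentheses. This is a sketch in Python; the helper name is our own, not part of HDP:

```python
import os

def suspect_path_entries(path_value):
    """Return the PATH entries containing parentheses, which the HDP
    service scripts reportedly fail to handle
    (e.g. 'C:\\Program Files (x86)\\SQL')."""
    return [entry for entry in path_value.split(";")
            if "(" in entry or ")" in entry]

if __name__ == "__main__":
    # Print the entries you would need to remove on this machine.
    for entry in suspect_path_entries(os.environ.get("PATH", "")):
        print(entry)
```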

    Hope this helps,

    Thanks,
    Seth

    #42878
    Sorna Lingam
    Member

    Hi Jan Leonhard,

    Have you found a solution? I am also having the same problem.

    Thanks

    #42879
    Sorna Lingam
    Member

    Hi

    As mentioned, I have edited the PATH variable, which is now:

    "C:\Program Files\Intel\iCLS Client\;C:\Perl64\site\bin;C:\Perl64\bin;%SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem;%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files\TortoiseSVN\bin;C:\Python33"

    But I am still getting the error.

    #43032
    Sorna Lingam
    Member

    Hi

    I have solved the issue. My PATH variable is now:

    "C:\Program Files\Intel\iCLS Client\;C:\Perl64\site\bin;C:\Perl64\bin;%SystemRoot%\system32;%SystemRoot%;%SystemRoot%\System32\Wbem;%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files\TortoiseSVN\bin;C:\Python33"

    You have to remove the entries containing parentheses "()" from the PATH variable.
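The fix above can be reproduced programmatically by filtering out every PATH entry that contains parentheses and rejoining the rest. This is a Python sketch with a name of our own choosing; updating the variable itself still has to be done in the Windows System Properties dialog or via setx:

```python
def clean_path(path_value):
    """Drop PATH entries containing '(' or ')' and rejoin the rest,
    mirroring the manual fix of removing parenthesized entries."""
    kept = [entry for entry in path_value.split(";")
            if "(" not in entry and ")" not in entry]
    return ";".join(kept)

# Example: the Intel(R) entry is dropped, everything else is kept.
print(clean_path(r"C:\Perl64\bin;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Python33"))
```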

