HDP on Linux – Installation Forum

Typo in Install Doc Command

  • #32511
    hoondong kim
    Member

    There is a typo in a command.

    In CDH 1.3 Install Manual -> Chapter 4. Troubleshooting Ambari Deployments -> 2. Quick Checks,
    below

    mysqld -h $FQDN_for_MySQL_server -u $FQDN_for_HCatalog_Server -p

    must be changed to

    mysql -h $FQDN_for_MySQL_server -u $FQDN_for_HCatalog_Server -p

    …..

    ——————————————————————————————————————————
    2. Quick Checks

    Make sure all the appropriate services are running. If you have access to Ambari Web, use the Services View to check the status of each component. If you do not have access to Manage Services, you must start and stop the services manually.

    If the first HDFS put command fails to replicate the block, the clocks in the nodes may not be synchronized. Make sure that Network Time Protocol (NTP) is enabled for your cluster.
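The clock-skew check above can be partly scripted. The helper below is a hypothetical sketch (not from the doc): it flags an offset, such as the `offset` column that `ntpq -p` reports per time source, that exceeds a chosen tolerance.

```shell
# Hypothetical helper: flag a clock offset (in ms) that exceeds a safe tolerance.
# Feed it the 'offset' column from `ntpq -p` output on each node.
check_offset() {
    offset_ms=$1
    limit_ms=${2:-1000}
    # awk handles the floating-point comparison; prints OK or SKEWED.
    awk -v o="$offset_ms" -v l="$limit_ms" \
        'BEGIN { print (o < l && o > -l) ? "OK" : "SKEWED" }'
}
check_offset 3.214      # well within tolerance
check_offset 4812.9     # nearly 5 seconds off
```

The 1000 ms default tolerance is an assumption for illustration; pick whatever skew your cluster can tolerate.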

    If HBase does not start, check whether its slaves are running 64-bit JVMs. Ambari requires that all hosts run on 64-bit machines.

    Make sure umask is set to 0022.
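The umask check above can be scripted as below. A mask of 0022 yields 644 permissions on new files and 755 on new directories, which the Hadoop services expect.

```shell
# Check the current umask and correct it if needed.
if [ "$(umask)" != "0022" ]; then
    umask 0022   # affects only the current shell; set it in /etc/profile to persist
fi
umask
```

Note that `umask` set this way applies only to the current session; making it permanent for service accounts requires a profile-level setting.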

    Make sure the HCatalog host can access the MySQL server. From a shell try:

    mysqld -h $FQDN_for_MySQL_server -u $FQDN_for_HCatalog_Server -p


  • #32512
    hoondong kim
    Member

    I found one more typo, in an option….

    mysqld -h $FQDN_for_MySQL_server -u $FQDN_for_HCatalog_Server -p

    must be changed to

    mysql -h $FQDN_for_MySQL_server -u $USER_of_HCatalog_Server -p
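With both fixes applied, the check would use the `mysql` client (not the `mysqld` server daemon, which takes no `-h`/`-u` login options). A sketch with hypothetical placeholder values — substitute your own MySQL host and HCatalog database user:

```shell
# Hypothetical values for illustration; replace with your cluster's actual names.
FQDN_for_MySQL_server="mysql01.example.com"
USER_of_HCatalog_Server="hcat"

# Build the corrected client invocation: 'mysql', not the 'mysqld' daemon.
cmd="mysql -h $FQDN_for_MySQL_server -u $USER_of_HCatalog_Server -p"
echo "$cmd"   # run this on the HCatalog host; -p prompts for the password
```

If the connection is refused, check that the MySQL server allows remote logins for that user from the HCatalog host.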

    ……….

    #32513
    hoondong kim
    Member

    And… One more

    chkconfig --level 35 mysql on

    must be changed to

    chkconfig --level 35 mysqld on
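After running the corrected command, `chkconfig --list mysqld` should show runlevels 3 and 5 set to on. A minimal sketch of checking that output — the sample line below is illustrative, not live output:

```shell
# Sample line in the format `chkconfig --list <service>` prints (illustrative).
line="mysqld          0:off   1:off   2:off   3:on    4:off   5:on    6:off"

# Count runlevels 3 and 5 that read "on"; both must be enabled for MySQL
# to start automatically in multi-user (3) and graphical (5) runlevels.
echo "$line" | awk '{ n=0; for (i=2; i<=NF; i++) if ($i=="3:on" || $i=="5:on") n++; print (n==2) ? "enabled" : "not-enabled" }'
```

On a live host you would pipe the real `chkconfig --list mysqld` output through the same check instead of the sample string.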

    #32871
    Robert
    Participant

    Hi hoondong ,
    Thank you for the feedback. We have notified the product team.

    Kind Regards,
    Robert

