Errors while deploying HDP using Ambari

This topic contains 8 replies, has 3 voices, and was last updated by tedr 1 year, 12 months ago.

  • Creator
    Topic #28992

    dev patra
    Member

    Hi,

    I’m getting the following two errors while installing Hadoop using Ambari 1.2.3 with only HDFS, MapReduce, and Ganglia.
    1. Both the JobTracker and the TaskTracker start on the server node, but MapReduce fails to start because of a failed smoke test. This is what I see in the logs:

    notice: /Stage[2]/Hdp-hadoop::Mapred::Service_check/Hdp-hadoop::Exec-hadoop[mapred::service_check::run_wordcount]/Hdp::Exec[hadoop --config /etc/hadoop/conf jar /usr/lib/hadoop//hadoop-examples.jar wordcount mapredsmokeinput mapredsmokeoutput]/Exec[hadoop --config /etc/hadoop/conf jar /usr/lib/hadoop//hadoop-examples.jar wordcount mapredsmokeinput mapredsmokeoutput]/returns: 13/07/09 14:06:28 INFO ipc.Client: Retrying connect to server: has031.stg.ams1.spil/112.26.26.31:50300. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)
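    For reference, a quick way to confirm whether the JobTracker is actually listening on that port (a minimal check run on the master node; the ports assume the stock HDP 1.x defaults):

    # Confirm the JobTracker is bound to its RPC port (50300) and web UI port (50030)
    # (port numbers assume default HDP 1.x settings)
    netstat -tlnp | grep -E '50300|50030'

    # Test basic reachability from the node that runs the smoke test
    telnet has031.stg.ams1.spil 50300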

    2. When I look into ambari-server.log, I also see the following error records:

    ERROR JMXPropertyProvider:311 - Caught exception getting JMX metrics : spec=http://has031.stg.ams1.spil:50030/jmx
    14:24:40,919 ERROR JMXPropertyProvider:311 - Caught exception getting JMX metrics : spec=http://has031.stg.ams1.spil:50060/jmx
    14:24:42,789 INFO HeartBeatHandler:108 - Received heartbeat from host, hostname=has031.stg.ams1.spil, currentResponseId=160, receivedResponseId=160
    14:24:42,790 INFO AgentResource:109 - Sending heartbeat response with response id 161
    14:24:52,894 INFO HeartBeatHandler:108 - Received heartbeat from host, hostname=has031.stg.ams1.spil, currentResponseId=161, receivedResponseId=161
    14:24:52,895 INFO AgentResource:109 - Sending heartbeat response with response id 162
    14:24:54,341 ERROR GangliaPropertyProvider:448 - Caught exception getting Ganglia metrics : spec=http://has031.stg.ams1.spil/cgi-bin/rrd.py?c=HDPJobTracker,HDPHBaseMaster,HDPSlaves,HDPNameNode&h=has031.stg.ams1.spil&m=load_one,disk_total,disk_free&e=now&pt=true
    java.io.IOException: Server returned HTTP response code: 500 for URL: http://has031.stg.ams1.spil/cgi-bin/rrd.py?c=HDPJobTracker,HDPHBaseMaster,HDPSlaves,HDPNameNode&h=has031.stg.ams1.spil&m=load_one,disk_total,disk_free&e=now&pt=true
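    These endpoints can be probed directly to separate an Ambari-side problem from a daemon-side one (a minimal sketch; the URLs are taken from the log lines above, and the rrd.py query string is simplified to a single cluster and metric):

    # Reproduce the JMX fetches Ambari is attempting
    # (JobTracker web UI on 50030, TaskTracker on 50060)
    curl -s http://has031.stg.ams1.spil:50030/jmx | head
    curl -s http://has031.stg.ams1.spil:50060/jmx | head

    # Hit the Ganglia rrd.py script directly and print only the HTTP status code;
    # a 500 here points at the gmetad/httpd side rather than at Ambari itself
    curl -s -o /dev/null -w '%{http_code}\n' 'http://has031.stg.ams1.spil/cgi-bin/rrd.py?c=HDPSlaves&h=has031.stg.ams1.spil&m=load_one&e=now&pt=true'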

    So, currently on my Hadoop master, only HDFS is running properly. Ganglia, though shown as ‘okay’ in the UI, is throwing exceptions, and MapReduce is failing to start.
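    In case it helps narrow things down, the JobTracker’s own log should show any bind or startup errors (the path below is the usual HDP 1.x default and may differ on this cluster):

    # Inspect the tail of the JobTracker log for startup/bind failures
    # (log path assumes a default HDP 1.x install)
    tail -n 100 /var/log/hadoop/mapred/hadoop-mapred-jobtracker-*.log
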
    Kindly give me some pointers to resolve these issues.

    Best Regards,
    Dev

The topic ‘Errors while deploying HDP using Ambari’ is closed to new replies.
