HDP on Linux – Installation: Errors while deploying HDP using Ambari

This topic contains 8 replies, has 3 voices, and was last updated by tedr 1 year, 4 months ago.

  • Creator
  • #28992

    dev patra
    Member

    Hi,

    I’m getting the following two errors while installing Hadoop using Ambari 1.2.3 with only HDFS, MapReduce, and Ganglia.
    1. Both the JobTracker and the TaskTracker start on the server node, but MapReduce fails to start because of a failed smoke test (a quick connectivity check is sketched after both logs below). This is what I see in the logs:

    notice: /Stage[2]/Hdp-hadoop::Mapred::Service_check/Hdp-hadoop::Exec-hadoop[mapred::service_check::run_wordcount]/Hdp::Exec[hadoop --config /etc/hadoop/conf jar /usr/lib/hadoop//hadoop-examples.jar wordcount mapredsmokeinput mapredsmokeoutput]/Exec[hadoop --config /etc/hadoop/conf jar /usr/lib/hadoop//hadoop-examples.jar wordcount mapredsmokeinput mapredsmokeoutput]/returns: 13/07/09 14:06:28 INFO ipc.Client: Retrying connect to server: has031.stg.ams1.spil/112.26.26.31:50300. Already tried 20 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)

    2. When I look into ambari-server.log, I also see the following error records (the failing endpoints can be probed directly, as sketched below):

    ERROR JMXPropertyProvider:311 – Caught exception getting JMX metrics : spec=http://has031.stg.ams1.spil:50030/jmx
    14:24:40,919 ERROR JMXPropertyProvider:311 – Caught exception getting JMX metrics : spec=http://has031.stg.ams1.spil:50060/jmx
    14:24:42,789 INFO HeartBeatHandler:108 – Received heartbeat from host, hostname=has031.stg.ams1.spil, currentResponseId=160, receivedResponseId=160
    14:24:42,790 INFO AgentResource:109 – Sending heartbeat response with response id 161
    14:24:52,894 INFO HeartBeatHandler:108 – Received heartbeat from host, hostname=has031.stg.ams1.spil, currentResponseId=161, receivedResponseId=161
    14:24:52,895 INFO AgentResource:109 – Sending heartbeat response with response id 162
    14:24:54,341 ERROR GangliaPropertyProvider:448 – Caught exception getting Ganglia metrics : spec=http://has031.stg.ams1.spil/cgi-bin/rrd.py?c=HDPJobTracker,HDPHBaseMaster,HDPSlaves,HDPNameNode&h=has031.stg.ams1.spil&m=load_one,disk_total,disk_free&e=now&pt=true
    java.io.IOException: Server returned HTTP response code: 500 for URL: http://has031.stg.ams1.spil/cgi-bin/rrd.py?c=HDPJobTracker,HDPHBaseMaster,HDPSlaves,HDPNameNode&h=has031.stg.ams1.spil&m=load_one,disk_total,disk_free&e=now&pt=true
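
    For reference, this is roughly how the JobTracker connectivity from error 1 can be checked directly on the master. The hostname and port 50300 are taken from the retry message above; the JobTracker log path is my assumption of the usual HDP 1.x default and may differ on your install:

    # Check whether anything is listening on the JobTracker RPC port from the retry message
    netstat -tlnp | grep 50300

    # Ask the JobTracker for running jobs; this fails quickly if the RPC port is unreachable
    hadoop --config /etc/hadoop/conf job -list

    # Look for startup errors in the JobTracker log (assumed HDP 1.x default path; adjust as needed)
    tail -n 100 /var/log/hadoop/mapred/hadoop-mapred-jobtracker-*.log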
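
    The failing JMX and Ganglia endpoints from error 2 can be probed the same way. The URLs are copied from ambari-server.log, with the Ganglia query trimmed to a single cluster and metric; the Apache error-log path assumes a stock RHEL/CentOS httpd:

    # Poll the JMX endpoints Ambari queries; connection refused here is consistent with the
    # JobTracker (50030) and TaskTracker (50060) web UIs not being up
    curl -v http://has031.stg.ams1.spil:50030/jmx
    curl -v http://has031.stg.ams1.spil:50060/jmx

    # Reproduce the HTTP 500 from rrd.py, then check Apache's error log for the traceback
    curl -v 'http://has031.stg.ams1.spil/cgi-bin/rrd.py?c=HDPSlaves&h=has031.stg.ams1.spil&m=load_one&e=now&pt=true'
    tail -n 50 /var/log/httpd/error_log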

    So, currently on my Hadoop master only HDFS is running properly. Ganglia is shown as ‘okay’ on the UI but is throwing exceptions, and MapReduce is failing to start.
    Kindly give me some pointers to resolve these issues.

    Best Regards,
    Dev


The topic ‘Errors while deploying HDP using Ambari’ is closed to new replies.
