
HDP on Linux – Installation Forum

Error while Installing hosts

  • #16839
    Adam Ocsvari
    Member

    Currently I’m using Scientific Linux 6, which should be a modified RHEL 6.
    When I’m doing the web-based install, after the second step, my nodes are installed but not registered.
    I can see 2 things in the log file:
    22:14:14,113 WARN HeartBeatHandler:331 – Received registration request from host with non matching os type, hostname=master, serverOsType=redhat6, agentOstype=scientific6
    (This is funny, because ALL my servers, including the Ambari server, are the same scientific6 images.)

    But the real problem is maybe this:
    22:36:36,982 WARN nio:651 – javax.net.ssl.SSLException: Received fatal alert: unknown_ca
    22:36:44,687 WARN nio:651 – javax.net.ssl.SSLHandshakeException: null cert chain
    22:36:45,034 WARN nio:651 – javax.net.ssl.SSLProtocolException: handshake alert: no_certificate

    All my servers have the same hosts file, with fixed IP-name pairs. Ping works everywhere, as does password-less SSH with root.
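
    For reference, the hosts file looks roughly like this on every node (only the slave address is the real one; the others here are placeholders):
    10.9.2.191 ambari
    10.9.2.192 master
    10.9.2.193 slave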

  • #16845
    abdelrahman
    Moderator

    Hi Adam,

    I will be happy to help resolve this issue. Please use FQDNs in the Ambari installation process. The Ambari server may need to be reset by running the following commands:
    ambari-server stop
    ambari-server reset
    ambari-server start
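
    On each agent host, a restart of the agent re-triggers registration, something like:
    ambari-agent stop
    ambari-agent start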

    Once the Ambari server is up, please try to register again. If the problem persists, take a copy of the error messages and paste them here. Hope this helps.

    Thanks
    -Abdelrhman

    #16935
    Adam Ocsvari
    Member

    FQDN: All my servers have a hostname-IP mapping in the hosts file (each file contains all the servers).
    Hostnames: ambari, master, slave
    I did the server stop, reset, start... still the same.

    My first message contains all the non-INFO parts of the log.
    Here is what the UI displays as the log:

    Registration log for master
    STDOUT
    Verifying Python version compatibility…
    Using python /usr/bin/python2.6
    Checking for previously running Ambari Agent…
    ERROR: ambari-agent already running
    Check /var/run/ambari-agent/ambari-agent.pid for PID.
    ('hostname: ok slave
    ip: ok 10.9.2.193
    cpu: ok QEMU Virtual CPU version 1.0
    memory: ok 0.97353 GB
    disks: ok
    Filesystem Size Used Avail Use% Mounted on
    /dev/mapper/vg_sclinux-lv_root
    27G 1021M 25G 4% /
    tmpfs 499M 0 499M 0% /dev/shm
    /dev/vda1 485M 52M 408M 12% /boot
    IGLZSB@store.cloud.ik.bme.hu:home
    193G 2.0G 181G 2% /home/cloud/sshfs
    os: ok Scientific Linux release 6.3 (Carbon)
    iptables: ok
    Chain INPUT (policy ACCEPT 4635 packets, 16M bytes)
    pkts bytes target prot opt in out source destination

    Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
    pkts bytes target prot opt in out source destination

    Chain OUTPUT (policy ACCEPT 5407 packets, 833K bytes)
    pkts bytes target prot opt in out source destination
    selinux: ok SELINUX=disabled
    SELINUXTYPE=targeted
    yum: ok yum-3.2.29-30.el6.noarch
    rpm: ok rpm-4.8.0-27.el6.x86_64
    openssl: ok openssl-1.0.0-27.el6_4.2.x86_64
    curl: ok curl-7.19.7-26.el6_2.4.x86_64
    wget: ok wget-1.12-1.4.el6.x86_64
    net-snmp: UNAVAILABLE
    net-snmp-utils: UNAVAILABLE
    ntpd: UNAVAILABLE
    ruby: UNAVAILABLE
    puppet: UNAVAILABLE
    nagios: UNAVAILABLE
    ganglia: UNAVAILABLE
    passenger: UNAVAILABLE
    hadoop: UNAVAILABLE
    yum_repos: ok
    HDP-UTILS-1.1.0.15 Hortonworks Data Platform Utils Version – HDP-UTILS-1. 52
    zypper_repos: UNAVAILABLE
    ', None)
    (' File "/usr/lib64/python2.6/socket.py", line 567, in create_connection
    raise error, msg
    error: [Errno 111] Connection refused

    [….]

    INFO 2013-03-08 08:42:45,442 security.py:49 – SSL Connect being called.. connecting to the server
    INFO 2013-03-08 08:42:45,444 Controller.py:103 – Unable to connect to: https://ambari:8441/agent/v1/register/slave
    Traceback (most recent call last):
    File "/usr/lib/python2.6/site-packages/ambari_agent/Controller.py", line 88, in registerWithServer
    response = self.sendRequest(self.registerUrl, data)
    File "/usr/lib/python2.6/site-packages/ambari_agent/Controller.py", line 239, in sendRequest

    #16938
    tedr
    Member

    Hi Adam,

    What do you get on the hosts when you type "hostname -f"? That is what you should enter as the hosts to install to. Also, please check that iptables (or any firewall) and SELinux are turned off/disabled.
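
    For example, something like this on each node (typical RHEL 6-style commands; details may differ slightly on Scientific Linux):
    hostname -f                        # should print the FQDN you enter in Ambari
    service iptables stop && chkconfig iptables off
    setenforce 0                       # SELinux off until reboot
    sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config   # persist across reboots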

    Thanks,
    Ted.

    #16982
    Adam Ocsvari
    Member

    Hi Ted!

    I double-checked everything:
    hostname -f is correct on every server.
    SELinux and iptables are off everywhere.

    Any other idea?

    Adam

    #16991
    Jeff Sposetti
    Moderator

    I think that since your agent is returning “agentOstype=scientific6”, and that is an unsupported OS, that’s why your agent registration is failing.

    I filed this JIRA to make this logging more informative / less cryptic. If you would like to see support added for Scientific Linux 6, please file a JIRA improvement as well.

    https://issues.apache.org/jira/browse/AMBARI-1605
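
    To see the raw distribution string the agent starts from, a quick check on any node is (the normalization to "scientific6" happens inside the agent, so this only shows the input):
    python -c 'import platform; print platform.linux_distribution()'
    # on Scientific Linux 6.3 this prints something like ('Scientific Linux', '6.3', 'Carbon')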

    #23924

    Hello,
    Is there any resolution to this thread? Is there a way to install HDP on SLC6? I get the same problem connecting the Ambari agent and server, and I saw that the JIRA issue for the unsupported OS is not resolved.

    Cheers,
    Alex

    INFO 2013-04-30 20:10:17,201 security.py:49 – SSL Connect being called.. connecting to the server
    INFO 2013-04-30 20:10:17,398 Controller.py:103 – Unable to connect to: https://pc-atd-cc-01.cern.ch:8441/agent/v1/register/pc-atd-cc-01.cern.ch
    Traceback (most recent call last):
    File "/usr/lib/python2.6/site-packages/ambari_agent/Controller.py", line 89, in registerWithServer
    ret = json.loads(response)
    File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
    File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
    raise ValueError("No JSON object could be decoded")
    ValueError: No JSON object could be decoded

    #24069
    Larry Liu
    Moderator

    Hi Adam and Alex,

    I can propose a workaround:

    1. Remove any release information files under /etc. The file should be something like /etc/scientific-release, but I am not quite sure of the exact name.
    2. Create the following 2 files:

    /etc/redhat-release
    /etc/issue

    You can copy the content from a Red Hat-style release file. On my CentOS 6.3 sandbox, I have the following content:
    [root@sandbox ~]# cat /etc/redhat-release
    CentOS release 6.3 (Final)
    [root@sandbox ~]# cat /etc/issue
    CentOS release 6.3 (Final)
    Kernel \r on an \m
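
    Put together as a sketch (remember the scientific-release filename is a guess, so back files up rather than delete them, and restart the agent afterwards):
    mv /etc/scientific-release /etc/scientific-release.bak    # exact filename may differ
    cp /etc/issue /etc/issue.bak
    echo 'CentOS release 6.3 (Final)' > /etc/redhat-release
    printf 'CentOS release 6.3 (Final)\nKernel \\r on an \\m\n' > /etc/issue
    ambari-agent stop
    ambari-agent start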

    Please let me know if it works.

    Thanks
    Larry

