HDP on Linux – Installation Forum

Inbound-outbound ports

  • #42420
    Ardavan Moinzadeh
    Participant

    Could someone tell me what range of ports needs to be open for inbound and outbound connections, please?

  • #42423
    Dave
    Moderator

    Hi Ardavan,

    Which components? All of the ports are configurable.
    Thanks

    Dave

    #42425
    Ardavan Moinzadeh
    Participant

    ALL of them. 😀

    #42427
    Ardavan Moinzadeh
    Participant

    Most of the daemons are up and running; however, there are errors when running the smoke tests. Basically the master daemons can send commands to the slave nodes, but the slave nodes cannot connect back to the master.
    I'm getting this kind of error from the different smoke tests:
    13/10/30 02:18:59 INFO ipc.Client: Retrying connect to server: bdde-l4.novalocal/X.X.X:8020. Already tried 29 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)

    Even after I open port 8020, I am still getting the same error.

    #42439
    Dave
    Moderator

    Hi Ardavan,

    Is the NameNode running, and is that server listening on port 8020?
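
    As a quick sanity check (just a sketch; the exact commands depend on your Linux distribution), you could run something like this on the NameNode host and on one of the slaves:

        # On the NameNode host: check that the NameNode process is up
        ps -ef | grep -i '[N]ameNode'

        # ...and that something is listening on the RPC port (8020 by default)
        netstat -tlnp | grep :8020

        # From a slave node: check that the port is reachable through any firewall
        telnet bdde-l4.novalocal 8020

    If the telnet connection is refused or times out even though netstat shows a listener, a firewall between the nodes is the usual suspect.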

    Thanks

    Dave

    #42489
    Ardavan Moinzadeh
    Participant

    I checked, and this port was blocked from outside.

    But in general, can you give me a list of ports that need to be opened in order for the NM, DD and SNN to communicate?

    #42567
    Dave
    Moderator

    Hi Ardavan,

    You will find all the default ports listed in the HDP documentation.
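
    For HDFS specifically, the commonly documented defaults are roughly: 8020 (NameNode IPC), 50070 (NameNode web UI), 50010/50020/50075 (DataNode data transfer, IPC and web UI) and 50090 (Secondary NameNode web UI). A rough sketch for opening them with iptables on a RHEL/CentOS-style host (check the documentation for your HDP release before applying, since all of these are configurable):

        # Allow the commonly documented HDFS default ports
        for port in 8020 50070 50010 50020 50075 50090; do
            iptables -I INPUT -p tcp --dport "$port" -j ACCEPT
        done
        # Persist the rules (RHEL/CentOS 6 style)
        service iptables save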

    Thanks

    Dave
