HDP on Linux – Installation: Eclipse not able to connect to Hadoop AWS Cluster

This topic contains 10 replies, has 3 voices, and was last updated by  tedr 10 months, 1 week ago.

  • Creator
    Topic
  • #27082

    deepak kumar
    Member

    I was able to install HDP on AWS and get Hadoop running.
    Now I am trying to connect from Eclipse Europa, using the hadoop-0.18.0 Eclipse plugin, to the Hadoop cluster on AWS.
    But every time it gives me the error “Connection timed out”.
    I believe I have the parameters correct, but let me know if there is somewhere else I should cross-check them.
    The installed Hadoop version displayed on the NameNode HTTP page is 1.2.0.1.3.0.0-107.
    Also, is there any way to install a desired Hadoop version using HDP? It seems HDP installs its own version.

Viewing 10 replies - 1 through 10 (of 10 total)


  • Author
    Replies
  • #27389

    tedr
    Moderator

    Hi Deepak,

    Try entering deepak.kumar as the Eclipse plugin user.

    Thanks,
    Ted.

    #27373

    deepak kumar
    Member

    No, I entered root as the Eclipse plugin user.
    I am not sure if it is picking up deepak.kumar from my laptop, as that is the user I am logged in as on the laptop where Eclipse is running.

    I even tried creating the user deepak.kumar on the Amazon NameNode server, but the issue still persists.

    Thanks
    Deepak

    #27256

    tedr
    Moderator

    Hi Deepak,

    Is “Deepak.Kumar…” the user you entered in the plugin configuration for it to use?

    Thanks,
    Ted.

    #27197

    deepak kumar
    Member

    Thanks for your reply, Sasha.
    My desktop authenticates on the domain, and the user name is Deepak.Kumar.XXX.com.
    I created the user Deepak.Kumar on the DFS master as well as the Map/Reduce master, but I am still having the same issue.
    I even assigned the root group to the user, and added other groups such as mapred/hdfs to this user ID.
    Am I doing something wrong here?

    #27196

    Sasha J
    Moderator

    Deepak,
    An AccessControlException usually means that your user account on the cluster side and on your desktop side are not the same.
    As a result, your desktop cannot get the data from the cluster (similar to a permission denial on Linux).

    Please make sure the user account you have on the desktop side is a valid user account on the cluster side.

    Thank you!
    Sasha
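
    One minimal way to make the client-side identity match is sketched below, assuming simple (non-Kerberos) authentication and that the cluster account is named deepak.kumar — both are assumptions:

    ```shell
    # Sketch, assuming simple (non-Kerberos) authentication. Hadoop 1.x
    # clients normally identify as the OS login user; exporting
    # HADOOP_USER_NAME before starting the client overrides that identity.
    # The user name below is an assumption for this thread.
    export HADOOP_USER_NAME=deepak.kumar
    echo "client will identify as: $HADOOP_USER_NAME"
    ```

    The same account still has to exist on the cluster side (e.g. created with `useradd` on the NameNode) and own its HDFS home directory, or the AccessControlException will persist.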

    #27195

    deepak kumar
    Member

    I was able to connect, and the DFS is listed in Eclipse, but not all of it.
    I am getting this error:
    org.apache.hadoop.security.AccessControlException
    How do I get rid of this error?

    #27186

    deepak kumar
    Member

    I got the jar from /usr/lib/hadoop/contrib, but am still unable to connect.
    I followed the link that you shared, and it didn’t help either.
    Do we need to add the .pem of all the nodes from the AWS cluster to Eclipse?
    How can I cross-verify details such as the DFS master IP/port and the Map/Reduce master host/port?
    Right now I am taking them from the web GUIs of MapReduce and HDFS.
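
    The authoritative place to cross-check those values is the cluster’s own configuration files rather than the web GUI: the DFS master comes from fs.default.name in core-site.xml and the Map/Reduce master from mapred.job.tracker in mapred-site.xml (on an HDP node they live under /etc/hadoop/conf). A sketch using an illustrative stand-in file, since the hostname and port here are assumptions:

    ```shell
    # Illustrative stand-in for a cluster's core-site.xml; on a real HDP
    # node you would read /etc/hadoop/conf/core-site.xml instead.
    cat > /tmp/core-site-sample.xml <<'EOF'
    <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://namenode.example.com:8020</value>
      </property>
    </configuration>
    EOF

    # Extract the host:port the Eclipse plugin's "DFS Master" fields must match:
    grep -A1 'fs.default.name' /tmp/core-site-sample.xml | grep -o 'hdfs://[^<]*'
    # prints hdfs://namenode.example.com:8020
    ```

    The same grep pattern against mapred-site.xml (key mapred.job.tracker) gives the Map/Reduce master host and port.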

    #27125

    tedr
    Moderator

    Hi Deepak,

    There are many sites on the internet that talk about setting up the Eclipse plugin; a quick Google search found this one: http://cat123vn.wordpress.com/2011/02/27/eclipse-connect-hadoop-server-write-mapreduce-application/
    By the way, we provide a plugin for the version of Hadoop in our distribution; the jar file is located in /usr/lib/hadoop/contrib.
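
    Getting that jar onto the desktop can be sketched as below; the jar name, key file, and Eclipse path are all assumptions, and the last few commands only simulate the copy locally with a placeholder file:

    ```shell
    # Sketch, with assumed names/paths: pull the HDP-supplied plugin jar
    # from a cluster node into the local Eclipse plugins directory, e.g.:
    #   scp -i cluster-key.pem root@<namenode>:/usr/lib/hadoop/contrib/*eclipse-plugin*.jar ~/eclipse/plugins/
    # The commands below simulate that copy with a placeholder jar.
    mkdir -p /tmp/contrib /tmp/eclipse/plugins
    touch /tmp/contrib/hadoop-eclipse-plugin-1.2.0.jar   # stand-in for the real jar
    cp /tmp/contrib/*eclipse-plugin*.jar /tmp/eclipse/plugins/
    ls /tmp/eclipse/plugins
    ```

    Eclipse picks the plugin up from its plugins directory on the next restart.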

    Thanks,
    Ted.

    #27109

    deepak kumar
    Member

    Thanks for the reply, Ted.
    I have replaced the 0.18.0 plugin with the 1.2.0 plugin version.
    I am still not able to connect to the HDP cluster from Eclipse.
    Is there any document or blog about this issue?

    Thanks
    Deepak

    #27106

    tedr
    Moderator

    Hi Deepak,

    To start with, you will probably need to download a later version of the Hadoop plugin for Eclipse. HDP is a Hadoop 1.x stack, and using a plugin built for 0.18.x may cause problems.

    Thanks,
    Ted.
