How to configure a Hadoop Development Environment?

This topic contains 3 replies, has 3 voices, and was last updated by tedr 11 months, 1 week ago.

  • Creator
    Topic
  • #25728

    suaroman
    Participant

    I have a 3-node cluster set up. All smoke tests run fine for all services. All samples seem to work fine too (WordCount, etc.).

    I have a machine that I would like to use as a development machine (e.g. a gateway into my cluster).

    Can someone tell me how to set up Eclipse so I can compile/build Java MR programs to run in my cluster?

    Thanks

Viewing 3 replies - 1 through 3 (of 3 total)

The topic ‘How to configure a Hadoop Development Environment ?’ is closed to new replies.

  • Author
    Replies
  • #26007

    tedr
    Moderator

    Hi suaroman,

    There is an Eclipse plugin jar located in ‘/usr/lib/hadoop/contrib/eclipse-plugin’ that you should supposedly be able to just copy into the plugins directory of your Eclipse installation. I am looking for some documentation on how to use it.
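    If that jar is present on your gateway machine, the copy step might look like the sketch below. The exact jar filename and the Eclipse install location (`~/eclipse` here) are assumptions, so check what actually exists on your system first:

    ```shell
    # See what the distribution actually ships (jar name varies by version)
    ls /usr/lib/hadoop/contrib/eclipse-plugin/

    # Copy the plugin into Eclipse's plugins directory
    # (~/eclipse is an assumed install path -- adjust for your machine)
    cp /usr/lib/hadoop/contrib/eclipse-plugin/hadoop-eclipse-plugin-*.jar ~/eclipse/plugins/

    # Restart Eclipse with -clean so it rescans its plugins directory
    ~/eclipse/eclipse -clean
    ```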

    Thanks,
    Ted.

    #25946

    suaroman
    Participant

    Thanks for the reply. I already have Eclipse and a spare machine set up.
    The machine connects to the HDP cluster, and I am currently able to build and run Hive and Pig jobs without any problems.

    I’m now interested in learning to develop Java MR programs, but I’m uncertain how to configure Eclipse properly to work with my HDP cluster. I never expected finding information like this to be so difficult; I figured basic information like ‘how to build a dev environment’ would be plentiful.

    Thanks. Anxiously awaiting further replies.

    #25891

    Seth Lyubich
    Keymaster

    Hi suaroman,

    I think you can try installing Eclipse on the spare machine and adding that machine to the cluster as a client (gateway) node. Once you compile your code on that machine, you should be able to submit jobs to the cluster.
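    As a rough sketch of that compile-and-submit workflow from a client node (assuming the Hadoop client tools are installed there; `WordCount.java`, the jar name, and the HDFS paths are illustrative, not HDP-specific):

    ```shell
    # Compile against the client's Hadoop jars; `hadoop classpath`
    # prints the jars the Hadoop client tools themselves use.
    mkdir -p classes
    javac -classpath "$(hadoop classpath)" -d classes WordCount.java
    jar cf wc.jar -C classes .

    # Submit the job to the cluster; input/ and output/ are HDFS paths
    # (assumed to exist / not exist respectively).
    hadoop jar wc.jar WordCount input/ output/

    # Inspect the result
    hadoop fs -cat output/part-r-00000 | head
    ```

    Once that works from the command line, wiring Eclipse up is mostly a matter of putting the same Hadoop jars on the project build path and exporting your project as a jar.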

    Hope this helps.

    Thanks,
    Seth
