Installing on CentOS 6.2 under VMware

This topic contains 1 reply, has 2 voices, and was last updated by  Jeff Sposetti 2 years, 3 months ago.

  • Creator
    Topic
  • #11266

    Rob Styles
    Member

Just finished getting an HDP 2.0 Alpha deployed as a single node on CentOS 6.2 under VMware and wanted to give some feedback.

All the packages required were found in the standard and EPEL repos, all at suitable versions AFAICT.

Up to the point of getting the HMC service started, everything went smoothly :)

    When deploying services to the node there were two issues I came across:

HMC seems to select raw devices as the starting point for data directories, so accepting the defaults leads to a failure. On my VM it selected /dev/mapper/vg_hd01-lv_root and /dev/mapper/vg_hd01-lv_home and attempted to create hadoop/… directories under them. Since those are device nodes, this fails. Unticking the default devices and entering a base directory in the text box allows the installation to continue.
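For anyone hitting the same thing, a quick sanity check is to make sure each proposed path is a writable directory rather than a device node before accepting it. A minimal sketch (the `check_path` helper and the paths are mine, purely for illustration):

```shell
#!/bin/sh
# Report whether a proposed Hadoop storage path can actually hold
# directories, or is a device node like the /dev/mapper entries HMC picked.
check_path() {
  p="$1"
  if [ -b "$p" ] || [ -c "$p" ]; then
    # Block or character device: mkdir under it will fail.
    echo "device: $p (cannot hold directories)"
  elif mkdir -p "$p/hadoop" 2>/dev/null; then
    echo "ok: $p/hadoop"
  else
    echo "unwritable: $p"
  fi
}

check_path /dev/null        # a device node, so it is rejected
check_path /tmp/hdp-base    # an ordinary path, so hadoop/ is created under it
```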

Next up, the default Java heap size for the map tasks under mapred is set to 256 MB. When deploying on a single box this causes the mapred test to fail with a Java heap space exception. Changing it to 2048 MB allowed the test to pass on my VM.
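In case it helps anyone reproduce the workaround by hand, the classic place to raise the per-task heap is mapred-site.xml. I'm assuming the MRv1-era property name here; the exact name may differ in HDP 2.0 Alpha's YARN-based setup (where the equivalents are mapreduce.map.java.opts / mapreduce.reduce.java.opts), so treat this as a sketch:

```xml
<!-- mapred-site.xml: raise the per-task JVM heap from the 256 MB default.
     Property name below is the MRv1 one; under YARN use
     mapreduce.map.java.opts / mapreduce.reduce.java.opts instead. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx2048m</value>
</property>
```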

Both of these failures were repeatable on my VM (tested using snapshots), so I can provide logs if anyone at Hortonworks HQ needs them.

    Now it’s deployed, time to start playing :)

    thanks

    rob


Viewing 1 reply (of 1 total)


  • Author
    Replies
  • #11272

    Jeff Sposetti
    Moderator

    Hi Rob,

Thank you for the feedback. Glad to hear you are up and running. As for the two issues you came across, we have been able to reproduce them and will work to address them.

    Please continue to let us know how your evaluation of HDP 2.0 Alpha progresses.

    Cheers,
    Jeff
