HDP on Linux – Installation Forum

Install the required version of Hadoop and other components with Ambari

  • #32902

Hi everyone~
I want to use Ambari 1.2.0 to build a Hadoop cluster, but the versions of Hadoop, HBase, Hive, and ZooKeeper that Ambari provides are not the ones I need. How can I install the versions of the components I need? If I need to install an RPM package, what prerequisites are required?
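As background for the RPM question: installing HDP component RPMs by hand generally requires the HDP yum repository to be configured first, so that yum can resolve package dependencies. A minimal sketch of such a repo file follows; the baseurl and file name here are purely illustrative assumptions, not the actual Hortonworks repository layout.

```ini
; Illustrative sketch of /etc/yum.repos.d/hdp.repo -- the baseurl shown
; is a placeholder, not a real Hortonworks repository URL.
[HDP]
name=Hortonworks Data Platform
baseurl=http://repo.example.com/HDP/centos6/1.x/
enabled=1
gpgcheck=0
```

With a repo like this in place, `yum install <component>` can pull in prerequisite packages automatically, whereas a bare `rpm -ivh` will fail on any unmet dependency.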



  • Author
  • #32950

    Hi Member,

HDP 1.3.2 is distributed as a tested stack. You can find the documentation at docs.hortonworks.com; choose the Ambari (Automated) install manual.




    Hi Dave,

I used Ambari to install HDP 1.2.0, whose component versions were hadoop1.1.2.2, hbase-, hive-, and zookeeper-. But I want to use my own tar.gz installation packages of Hadoop, HBase, Hive, and ZooKeeper at different versions. What do I need to change in the installation steps? And if I need to build my own RPM packages for these components, how do I build them?




I don’t really understand your issue; this forum is for HDP installation issues, and unfortunately I do not know how you would go about building your own RPM package.
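Since the reply above could not answer the RPM question: building an RPM from a tar.gz generally means writing a .spec file and running `rpmbuild`. The sketch below is purely illustrative; the package name, version, and install path are assumptions, not the actual Hortonworks packaging.

```spec
# Illustrative only: repackages a prebuilt Hadoop tarball under /opt.
# Name, version, and paths are assumptions, not Hortonworks' spec.
Name:           hadoop-custom
Version:        1.1.2
Release:        1%{?dist}
Summary:        Custom Hadoop build packaged from a tar.gz
License:        Apache-2.0
Source0:        hadoop-%{version}.tar.gz
AutoReqProv:    no

%description
Repackages a prebuilt Hadoop tarball for manual installation.

%prep
# Assumes the tarball unpacks to a hadoop-<version> directory
%setup -q -n hadoop-%{version}

%install
mkdir -p %{buildroot}/opt/hadoop-%{version}
cp -r . %{buildroot}/opt/hadoop-%{version}

%files
/opt/hadoop-%{version}
```

With the tarball placed in `~/rpmbuild/SOURCES`, `rpmbuild -ba hadoop-custom.spec` would produce the RPM. Note that swapping in components packaged this way bypasses the tested-stack guarantees of the HDP distribution mentioned earlier in the thread.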



