
The legacy Hortonworks Forum is now closed. You can view a read-only version of the former site by clicking here. The site will be taken offline on January 31, 2016.

HDP on Linux – Installation Forum

Change Ambari bundled JDK on existing install

  • #48145
    Chris Bennight

    I was recently attempting to install some software on top of an HDP cluster, and some of the code required Java 1.7 – but the cluster had been set up with 1.6.

    After a few hours of Google searches and poking around, I wasn’t easily able to determine *how* to change the bundled Java version. (Admittedly, some of those hours were wasted discovering that Java was bundled, that my JAVA_HOME settings were ignored, and that puppet was overwriting the monkeypatch changes I attempted to make directly.)

    I tried updating the Ambari server properties to point to a Java home that was installed on all boxes, restarted everything multiple times, etc. – but I was never able to get everything to swap over to Java 1.7.

    Would I have needed to re-install the entire platform to push out those changes? I assume there *has* to be some simple way to change/update the Java version – but for the life of me I couldn’t find it.

  • Author
  • #48657
    James Poole

    I would like to see the answer to this as well. Is it possible to change the Java version to 1.7 after the initial install used 1.6?

    Shlomi Hassan

    I am looking for an answer to this problem as well. Is there a solution?

    Chris Bennight

    Anyone here?
    If I’m missing something obvious, or looking at this the wrong way, could someone give me a pointer in the right direction? (If it’s an RTFM, just point me to the chapter – I have tried to do my due diligence, and other than a complete re-install I don’t see a supported method.)

    I can’t believe this is that crazy of a question. With all the Java exploits constantly cropping up, surely keeping the runtime environment up to date should be a first-class operation?

    Igor Skokov

    Did you try this?
    ambari-server setup -j /usr/java/default

    More info:
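Editor's note: for context, that command rewrites the JDK path stored in the Ambari server's configuration. On a default install the relevant entry looks roughly like the excerpt below – the file location and value here are assumptions for illustration, and the change only affects processes started after it is made:

```properties
# /etc/ambari-server/conf/ambari.properties (excerpt; assumed default location)
java.home=/usr/java/default
```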

    Tim Ellis

    The closest we’ve come to being able to do this is to run this shell script, which does some real heavy hitting on your cluster.

    for i in `cat hostList | xargs` ; do
      echo "$i ==============" ;
      ssh root@$i "cd /var/lib/ambari-agent ;
      find . -type f -name '*.erb' | fgrep 'puppet' | xargs sed -i -e 's-<%=scope.function_hdp_java_home()%>-/usr/java/default-'" ;
    done

    Formatting might break. That’s backticks on the “cat hostList | xargs” portion. The hostList should be a list of all your Ambari-controlled host nodes.

    After you do that, you should restart all Ambari services. As they restart, files like “” should get the updated JAVA_HOME.
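Editor's note: to see what that sed expression does in isolation, here is a minimal, self-contained sketch that can be tried safely on a scratch file. The template file name and contents are made up for illustration; only the sed expression matches the loop above:

```shell
# Scratch copy of the kind of puppet .erb template the loop rewrites.
# The file name and contents are illustrative, not from a real Ambari install.
tmpdir=$(mktemp -d)
printf 'export JAVA_HOME=<%%=scope.function_hdp_java_home()%%>\n' > "$tmpdir/hadoop-env.sh.erb"

# Same substitution as the cluster-wide loop: '-' is used as the sed
# delimiter because the replacement path contains slashes.
sed -i -e 's-<%=scope.function_hdp_java_home()%>-/usr/java/default-' "$tmpdir/hadoop-env.sh.erb"

cat "$tmpdir/hadoop-env.sh.erb"   # prints: export JAVA_HOME=/usr/java/default
```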

    Tim Ellis

    A note: Igor’s advice, which was also given to us by Hortonworks, does nothing (i.e. it does not work).

    In general, the safest thing to do is just tear down your whole cluster and rebuild it using the command Igor gives, so that it uses JDK 1.7 from the start. If you’ve accidentally built your cluster on 1.6, the script I pasted might save you, but at this point we’re not even sure it’ll work beyond fixing the various “*” files.

    Chris Bennight

    Igor – yep, I tried that – same experience as Tim mentioned – on an existing cluster it did absolutely nothing. On a new cluster I expect it should work fine.

    Tim – thanks – that cluster no longer exists, but that makes sense. I had just been avoiding digging into the HDP/Ambari puppet files (since if I was going to do that, I would just roll it myself via puppet – which is what ended up happening).

    Thanks for the sanity check though – good to know I wasn’t missing something blatantly obvious.

The forum ‘HDP on Linux – Installation’ is closed to new topics and replies.
