HDP on Linux – Installation Forum

Local repo mirrors not being used

  • #24529


    I’ve synced the HDP, HDP-EPEL, HDP-UTILS, Ambari, and Ambari-Updates repos to a machine on the same network as the HDP cluster. I’m having trouble getting the Ambari install to actually use some/all of the mirrors.

    I’ve configured the local mirror URLs in repoInfo.xml and copied over ambari.repo and HDP.repo, but when the installer runs, HDP.repo is overwritten on all hosts with the public mirror.

    Any suggestions are appreciated.
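For reference, a minimal local-mirror ambari.repo might look like this (the baseurl host and repo id are illustrative, not taken from this thread; gpgcheck is disabled here only because the mirror is internal):

```
[ambari-1.x]
name=Ambari 1.x (local mirror)
baseurl=http://mirror.example.internal/ambari/centos6/1.x/updates
gpgcheck=0
enabled=1
```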



  • Author
  • #24845
    Chen Xie

    Yeah, same problem here.
    I’m trying to deploy from a local repo, but the “hdp.repo” never seems to get used.
    Every time I run the installer, a new HDP.repo and HDP-EPEL.repo are generated, overwriting the files with the same names.

    One possible solution I saw online is to disable these files (HDP.repo, HDP-EPEL.repo). Working on this now.
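    The disabling workaround above can be sketched as follows. The paths here are illustrative and the commands are demonstrated in a temporary directory; on a real cluster host the files live in /etc/yum.repos.d/ and you would need root:

```shell
# Sketch: rename the Ambari-generated repo files so yum ignores them
# (yum only reads files whose names end in .repo).
# Demonstrated in a temp dir instead of /etc/yum.repos.d/.
repodir=$(mktemp -d)
touch "$repodir/HDP.repo" "$repodir/HDP-EPEL.repo"

for f in "$repodir"/HDP*.repo; do
    mv "$f" "$f.disabled"
done

ls "$repodir"
```

    Note that Ambari may regenerate these files on the next install run, so disabling them is a stopgap rather than a fix.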

    Chen Xie

    Hi Karl,

    It seems to be an Ambari bug: it keeps “distributing repo files during installation”.

    Please check:




    I was able to get it working if I set repoinfo.xml before starting the ambari-server for the first time. It might also work if you restart the ambari-server after making changes to repoinfo.xml.

    This only seemed to work on a clean cluster, though. There is probably a bug where the repo files don’t get copied after the first install on a cluster (even with a reset).
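    For anyone trying the same fix, a repoinfo.xml entry pointed at a local mirror might look roughly like this (the baseurl, repoid, and reponame values are assumptions for illustration; check the actual file under your Ambari server’s stack definitions before editing):

```xml
<!-- Illustrative fragment only; values below are assumptions,
     not copied from a real installation. -->
<reposinfo>
  <os type="centos6">
    <repo>
      <baseurl>http://mirror.example.internal/hdp/centos6/updates</baseurl>
      <repoid>HDP-1.x</repoid>
      <reponame>HDP</reponame>
    </repo>
  </os>
</reposinfo>
```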

    Seth Lyubich

    Hi Karl,

    Thank you for letting us know how you were able to correct the issue. You can also watch https://issues.apache.org/jira/browse/AMBARI-2006 for the ability to avoid distributing repo files during installation.


    Jeff Sposetti

    Also watch this JIRA, which sounds very much like the problem you hit.

