HDP on Linux – Installation Forum

Local repository won't work (The requested URL returned error: 403 Forbidden)

  • #55568


    I am trying to configure Ambari without access to the internet, and I have successfully installed and configured an FTP server to work as a repository mirror.
    I've looked around, and after a lot of configuration and re-configuration I'm still not able to get past the "403" error.
    The following steps were followed:
    1. installed vsftpd
    2. created a repo in /var/ftp/pub/hdp/HDP-UTILS-
    3. ran createrepo in ../HDP-UTILS-
    4. ran chmod -R 775 on the repo directory
    5. added a new repo to /etc/yum.repos.d/hdp.conf (triple-checked the baseurl; it is correct, and copy-pasting it into a "curl baseurl" call works; a sketch of the stanza follows this list)
    6. disabled iptables and SELinux
    7. able to curl, wget, and navigate in a browser to ftp://fqdn-hostname/pub/hdp/HDP-UTILS-
    8. ran yum clean all
    9. yum list gives this error: ftp://fqdn-hostname/pub/hdp/HDP-UTILS- [Errno 14] PYCURL ERROR 22 - "The requested URL returned error: 403 Forbidden"
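    For reference, a minimal sketch of such a repo stanza (the [HDP-UTILS] section name, the name= value, and gpgcheck=0 are illustrative assumptions; the truncated baseurl is the placeholder quoted above):

        [HDP-UTILS]
        name=HDP-UTILS local FTP mirror
        baseurl=ftp://fqdn-hostname/pub/hdp/HDP-UTILS-
        enabled=1
        gpgcheck=0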

    I'm trying to find out why I still get the 403 when running the yum install command. Do you have any suggestions?

    PS. I actually went into Python and, using the pycurl module, called the URL, and it got an answer from the baseurl used in yum.repos.d.
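    A minimal version of that pycurl check might look like the following (the truncated baseurl placeholder is the one from the thread):

        import pycurl
        from io import BytesIO

        buf = BytesIO()
        c = pycurl.Curl()
        # Same (truncated) baseurl placeholder as used in hdp.conf
        c.setopt(pycurl.URL, "ftp://fqdn-hostname/pub/hdp/HDP-UTILS-")
        c.setopt(pycurl.WRITEDATA, buf)
        c.perform()
        # For an FTP URL this is the FTP status code (e.g. 226 on success)
        print(c.getinfo(pycurl.RESPONSE_CODE))
        c.close()

    Note that yum does not fetch the baseurl itself; it first requests repodata/repomd.xml beneath it, so a successful check of the bare baseurl does not rule out a failure on the repodata path.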



  • #55613
    Jeff Sposetti

    Can you post the contents of your /etc/yum.repos.d/ambari.repo file?

    I see you are using ftp://fqdn-hostname/pub/hdp/HDP-UTILS- for the base url. What about using this instead?



    I eventually found the issue:

    yum.conf was configured with an HTTP proxy. Yum was also going through this proxy to reach the local repository mirror, which was wrong in the case of hdp.conf.

    This was fixed by adding the following line to each repo inside /etc/yum.repos.d/hdp.conf:
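    The exact line is not shown above; yum's documented per-repo directive for bypassing a global proxy, which matches this description, is:

        proxy=_none_

    With that set in a repo section, yum ignores the proxy from yum.conf for that repository only, so requests to the local FTP mirror go out directly.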

