HDP on Linux – Installation Forum

Ambari Server Proxy Authentication


    Jeff Sposetti

    I believe you can add these args as well:

    “-Dhttp.proxyUser=someUserName -Dhttp.proxyPassword=somePassword”
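
    For reference, a minimal sketch of appending those properties to AMBARI_JVM_ARGS and restarting — the ambari-env.sh path and the proxy host/port below are assumptions, so adjust them to your install:

    # /var/lib/ambari-server/ambari-env.sh (path assumed; adjust to your install)
    # Append the proxy settings to the existing JVM args; all values are placeholders.
    export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttp.proxyUser=someUserName -Dhttp.proxyPassword=somePassword"

    # Restart so the new args take effect
    ambari-server restart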

    Andreas Mahling

    Jeff, thank you for your support, but, sorry, proxyUser and proxyPassword did not work for me.
    I will proceed now with setting up local repositories. I am still wondering why HDP needs access to HTTP URLs while all nodes have a fully operable yum setup, Hortonworks repos included. I tried skipping URL checking in the initial cluster configuration, but this beast came up later again…

    Best Regards, Andreas

    Jeff Sposetti

    You might want to confirm that yum can use the proxy too.


    Check the “skip URL checking” option in the Ambari wizard, confirm yum works through the proxy, and see how that goes.
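
    As a hedged sketch, yum’s proxy can be configured globally in /etc/yum.conf — the host, port, and credentials below are placeholders for an authenticating proxy:

    # /etc/yum.conf — proxy settings applied to all repos (placeholder values)
    proxy=http://proxy.example.com:8080
    proxy_username=someUserName
    proxy_password=somePassword

    # Quick check that yum can reach the HDP repos through the proxy
    yum repolist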

    Peter Bulman


    Does anyone have a solution or workaround for this?

    I have set AMBARI_JVM_ARGS with -Dhttp.proxyHost=<host> -Dhttp.proxyPort=<port> -Dhttp.proxyUser=<user> -Dhttp.proxyPassword=<password>.
    The credentials I’m using are valid, and yum is working with the same values.

    Ambari logs are showing: Server returned HTTP response code: 407 for URL
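
    Since a 407 means the proxy itself is rejecting the request, a quick sanity check outside of Ambari is to fetch the same repo URL through the proxy with explicit credentials — the proxy host, port, and credentials below are placeholders:

    # Ask only for headers; a 200 here means the proxy accepts these credentials
    curl -x http://proxy.example.com:8080 -U someUserName:somePassword \
         -I http://public-repo-1.hortonworks.com/HDP/hdp_urlinfo.json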


    Jeff Sposetti

    For now, you’ll want to click “Skip validation” when you set the Base URLs.


    You can also follow this JIRA: AMBARI-5960


    Peter Bulman

    Thanks for the reply. I have set the properties as per the JIRA AMBARI-5960:
    [root@reuxeuls517 etc]# echo $AMBARI_JVM_ARGS
    -Xms512m -Xmx2048m -Djava.security.auth.login.config=/etc/ambari-server/conf/krb5JAASLogin.conf -Djava.security.krb5.conf=/etc/krb5.conf -Djavax.security.auth.useSubjectCredsOnly=false -Dhttp.proxyHost= -Dhttp.proxyPort=8080 -Dhttp.proxyUser=userabcd -Dhttp.proxyPassword=password

    I restarted ambari-server after setting the above args.

    However, I am still getting HTTP response code: 407 for URL: http://public-repo-1.hortonworks.com/HDP/hdp_urlinfo.json

    I have verified access to this page with Links using the same proxy configuration.
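
    One additional hedged check is to confirm the restarted server process actually picked up the extra JVM properties — the grep pattern below assumes a typical install where the args appear on the ambari-server java command line:

    # List the proxy-related properties on the running Ambari server process
    ps -ef | grep '[A]mbariServer' | grep -o -e '-Dhttp\.proxy[^ ]*'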

