Hive / HCatalog Forum

Hive R Studio Issue

  • #41710
    Anupam Gupta

    Hi All,
    we have an HDP 1.3 cluster running on EC2 with 2 instances: the Hive metastore and HiveServer run on the slave node, and RStudio is installed on the master node.
    We successfully configured HDFS access through RStudio (using the rhdfs package). Now we want to connect Hive and RStudio using RHive, but we get the following error…

    [1] "there is no slaves file of HADOOP. so you should pass hosts argument when you call rhive.connect()."
    Error in .jnew("org/apache/hadoop/conf/Configuration") :
    In addition: Warning message:
    In file(file, "rt") :
    cannot open file '/hadoop/conf/slaves': No such file or directory
    Error: Unable to establish connection with R session
    Error: Unable to establish connection with R session

    We would also like to know the value of the HIVE_HOME environment variable in HDP.
    Kindly help.

    Thanks in Advance,
    Sandeep Upreti


  • #41763
    Yi Zhang

    Hi Sandeep,

    HIVE_HOME in HDP is usually set to /usr/lib/hive.
    It looks like the Configuration object is not picking up the correct configuration directory, which should be /etc/hadoop/conf, unless your environment defines it as /hadoop/conf?
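
    Something along these lines should work once the paths are set. This is only a minimal sketch: the paths below are the usual HDP defaults, the hostname is a placeholder for your Hive server node, and hosts is the argument your error message asks for.

        # Point RHive at the HDP default locations before initializing it.
        # HADOOP_HOME and HIVE_HOME are the documented requirements;
        # whether HADOOP_CONF_DIR is honored depends on the RHive version.
        Sys.setenv(HADOOP_HOME = "/usr/lib/hadoop")
        Sys.setenv(HADOOP_CONF_DIR = "/etc/hadoop/conf")
        Sys.setenv(HIVE_HOME = "/usr/lib/hive")

        library(RHive)
        rhive.init()

        # There is no slaves file on the node, so pass the Hive server
        # host explicitly, as the error message suggests. The hostname
        # here is a placeholder.
        rhive.connect(hosts = "your-hive-server-hostname")

    If rhive.connect() still fails, confirm the variables with Sys.getenv() before loading RHive, so that the Configuration object is looking at /etc/hadoop/conf rather than /hadoop/conf.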


