Hive R Studio Issue


This topic contains 1 reply, has 2 voices, and was last updated by  Yi Zhang 1 year, 9 months ago.

  • Creator
  • #41710

    Anupam Gupta

    Hi All,
We have an HDP 1.3 cluster running on EC2 with 2 instances: the Hive metastore and Hive server run on the slave node, and RStudio is installed on the master node.
    We successfully configured HDFS access from RStudio (using the rhdfs package); now we want to connect Hive and RStudio using RHive, but we are getting the following error…

[1] "there is no slaves file of HADOOP. so you should pass hosts argument when you call rhive.connect()."
    Error in .jnew("org/apache/hadoop/conf/Configuration") :
    In addition: Warning message:
    In file(file, "rt") :
    cannot open file '/hadoop/conf/slaves': No such file or directory
    Error: Unable to establish connection with R session
    Error: Unable to establish connection with R session

We would also like to know the value of the HIVE_HOME environment variable in HDP.
    Kindly help.

    Thanks in Advance,
    Sandeep Upreti

Viewing 1 replies (of 1 total)


  • Author
  • #41763

    Yi Zhang

    Hi Sandeep,

HIVE_HOME in HDP is usually set to /usr/lib/hive.
    It looks like the Configuration object is not picking up the correct configuration directory, which should be /etc/hadoop/conf. Is it perhaps defined in your environment as /hadoop/conf?
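Something along these lines may work as a starting point. This is a minimal, unverified sketch: the exact rhive.connect() arguments vary across RHive versions, and the hostname below is a placeholder you must replace with your slave node's address.

    ```r
    # Point RHive at the standard HDP locations before loading it,
    # so it stops looking under /hadoop/conf.
    Sys.setenv(HADOOP_HOME = "/usr/lib/hadoop")
    Sys.setenv(HADOOP_CONF_DIR = "/etc/hadoop/conf")
    Sys.setenv(HIVE_HOME = "/usr/lib/hive")

    library(RHive)
    rhive.init()

    # Since the cluster has no conf/slaves file, pass the host
    # explicitly, as the error message suggests.
    rhive.connect(host = "your-slave-node-hostname")
    ```

    If rhive.connect() still fails, double-check that HiveServer is actually listening on the slave node and that the EC2 security group allows the Hive port from the master node.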

