Hive R Studio Issue

This topic contains 1 reply, has 2 voices, and was last updated by Yi Zhang 5 months, 3 weeks ago.

  • Creator
    Topic
  • #41710

    Anupam Gupta
    Participant

    Hi All,
    We have an HDP 1.3 cluster running on EC2 with two instances: the Hive metastore and HiveServer run on the slave node, and RStudio is installed on the master node.
    We successfully configured HDFS access from RStudio (using the rhdfs package; a rough sketch of that setup is included after the error output below). Now we want to connect RStudio to Hive using RHive, but we are getting the following error…

    [1] "there is no slaves file of HADOOP. so you should pass hosts argument when you call rhive.connect()."
    Error in .jnew("org/apache/hadoop/conf/Configuration") :
    java.lang.ClassNotFoundException
    In addition: Warning message:
    In file(file, "rt") :
    cannot open file '/hadoop/conf/slaves': No such file or directory
    Error: Unable to establish connection with R session
    Error: Unable to establish connection with R session
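
    For reference, the rhdfs setup that works for us looks roughly like this (the paths are HDP 1.3 defaults and are shown only for illustration):

    Sys.setenv(HADOOP_CMD = "/usr/bin/hadoop")  # hadoop launcher used by rhdfs
    library(rhdfs)
    hdfs.init()   # starts the JVM and connects to HDFS
    hdfs.ls("/")  # listing the HDFS root works without problems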

    We would also like to know the value of the HIVE_HOME environment variable in HDP.
    Kindly help.

    Thanks in Advance,
    Sandeep Upreti


  • Author
    Replies
  • #41763

    Yi Zhang
    Moderator

    Hi Sandeep,

    HIVE_HOME in HDP is usually set to /usr/lib/hive.
    It looks like the Configuration object is not picking up the correct configuration directory (which should be /etc/hadoop/conf), unless it is defined in your environment as /hadoop/conf?
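
    Something along these lines typically works on HDP 1.3; the host name is a placeholder for your HiveServer node, and rhive.init() should pick up the Hadoop and Hive jars from these variables, which is usually what the ClassNotFoundException points to:

    Sys.setenv(HADOOP_HOME = "/usr/lib/hadoop")       # HDP 1.3 default
    Sys.setenv(HIVE_HOME = "/usr/lib/hive")           # HDP 1.3 default
    Sys.setenv(HADOOP_CONF_DIR = "/etc/hadoop/conf")  # config dir, not /hadoop/conf

    library(RHive)
    rhive.init()                            # builds the classpath from the env vars above
    rhive.connect("your-hive-server-host")  # placeholder host; HiveServer listens on 10000 by default

    If the slaves file warning still shows up, you can also pass the hosts argument that the warning mentions when calling rhive.connect().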

    Thanks,
    Yi
