
The legacy Hortonworks Forum is now closed. A read-only version of the former site remains available. The site will be taken offline on January 31, 2016.

Hive / HCatalog Forum

Hive Embedded Client

  • #8346
    saad khawaja
    Member

Hi,

    Has anyone used the Hive embedded client? Please respond if you were successful. I am attempting to do the same and facing a very persistent problem.

Problem:
    I am using the Hive embedded client. It picks up the Hive config from /conf, but its Hadoop config is not getting initialized: instead of using HDFS, the embedded client ends up using the local file:/// file system.

    I have tried several things, such as adding hive/conf and hadoop/conf to the classpath of my Java embedded-client process, and adding all the hadoop/lib and hive/lib jars to the classpath. I have also exported HIVE_HOME and HADOOP_HOME.
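The symptom above matches how Hadoop resolves the default filesystem: if no core-site.xml is found on the classpath, the `fs.default.name` property falls back to its built-in default, `file:///`. The following is a minimal stdlib sketch of that fallback behavior, not Hadoop's actual code; `fs.default.name` and `file:///` are the real Hadoop 1.x key and default, while the class and method names here are purely illustrative:

```java
import java.util.Properties;

// Illustrative sketch of Hadoop 1.x default-filesystem resolution.
// "fs.default.name" is the real config key; this class is hypothetical.
public class DefaultFsSketch {
    static final String FS_KEY = "fs.default.name";
    static final String BUILTIN_DEFAULT = "file:///"; // Hadoop's built-in default

    // Mimics Configuration.get(key, default): values loaded from
    // core-site.xml win; otherwise the built-in default applies.
    static String resolveDefaultFs(Properties siteConf) {
        return siteConf.getProperty(FS_KEY, BUILTIN_DEFAULT);
    }

    public static void main(String[] args) {
        Properties noCoreSite = new Properties(); // core-site.xml not on classpath
        System.out.println(resolveDefaultFs(noCoreSite)); // falls back to file:///

        Properties withCoreSite = new Properties();
        withCoreSite.setProperty(FS_KEY, "hdfs://namenode:8020");
        System.out.println(resolveDefaultFs(withCoreSite)); // uses HDFS
    }
}
```

This is why the DEBUG line `Creating filesystem for file:///` appears later in the log: the client never saw a core-site.xml that overrides the default.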

Here are the exception stack traces; there are two of them.
    Version: Hortonworks Hive 0.90
    OS: Linux (CentOS)

    2012-08-15 10:46:56,178 DEBUG DataNucleus.Transaction: Committing [DataNucleus Transaction, ID=Xid=, enlisted resources=[org.datanucleus.store.rdbms.ConnectionFactoryImpl$EmulatedXAResource@2a717ef5]]
    2012-08-15 10:46:56,181 DEBUG DataNucleus.Connection: Managed connection org.datanucleus.store.rdbms.ConnectionFactoryImpl$EmulatedXAResource@2a717ef5 is committing for transaction Xid= with onePhase=true
    2012-08-15 10:46:56,183 DEBUG DataNucleus.Connection: Managed connection org.datanucleus.store.rdbms.ConnectionFactoryImpl$EmulatedXAResource@2a717ef5 committed connection for transaction Xid= with onePhase=true
2012-08-15 10:46:56,183 DEBUG DataNucleus.Connection: Connection "jdbc:mysql://localhost/hive, UserName=root@localhost, MySQL-AB JDBC Driver" closed
    2012-08-15 10:46:56,183 DEBUG DataNucleus.Connection: Connection removed from the pool : [org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl@21ff3fcf, NULL] for key=org.datanucleus.MultithreadedObjectManager@4ab3a5d1 in factory=ConnectionFactory:tx[org.datanucleus.store.rdbms.ConnectionFactoryImpl@26e56328]
2012-08-15 10:46:56,184 DEBUG DataNucleus.Persistence: Detaching object from persistence : "org.apache.hadoop.hive.metastore.model.MDatabase@5a30cefd" (depth=0)
    2012-08-15 10:46:56,186 DEBUG DataNucleus.Persistence: Object "org.apache.hadoop.hive.metastore.model.MDatabase@5a30cefd" (id="1[OID]org.apache.hadoop.hive.metastore.model.MDatabase") is having the SCO wrapper in field "parameters" replaced by the unwrapped value
    2012-08-15 10:46:56,187 DEBUG DataNucleus.Lifecycle: Object "org.apache.hadoop.hive.metastore.model.MDatabase@5a30cefd" (id="1[OID]org.apache.hadoop.hive.metastore.model.MDatabase") has a lifecycle change : "P_CLEAN"->"DETACHED_CLEAN"
    2012-08-15 10:46:56,187 DEBUG DataNucleus.Transaction: Object "org.apache.hadoop.hive.metastore.model.MDatabase@5a30cefd" (id="1[OID]org.apache.hadoop.hive.metastore.model.MDatabase") being evicted from transactional cache
    2012-08-15 10:46:56,187 DEBUG DataNucleus.Persistence: Disconnecting org.apache.hadoop.hive.metastore.model.MDatabase@5a30cefd from StateManager[pc=org.apache.hadoop.hive.metastore.model.MDatabase@5a30cefd, lifecycle=DETACHED_CLEAN]
    2012-08-15 10:46:56,188 DEBUG DataNucleus.Cache: Object with id="1[OID]org.apache.hadoop.hive.metastore.model.MDatabase" being removed from Level 1 cache [current cache size = 1]
    2012-08-15 10:46:56,190 DEBUG DataNucleus.Transaction: Transaction committed in 17 ms
    2012-08-15 10:46:56,543 INFO exec.HiveHistory: Hive history file=/tmp/root/hive_job_log_root_201208151046_337230891.txt
    2012-08-15 10:46:56,579 DEBUG conf.Configuration: java.io.IOException: config()
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:227)
    at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:214)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:467)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:453)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure.getUGIForConf(HadoopShimsSecure.java:489)
    at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:51)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:221)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:285)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:121)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:51)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:45)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool$HiveConnectionPoolLazyInitializer.&lt;init&gt;(HiveConnectionPool.java:137)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getInstance(HiveConnectionPool.java:133)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:114)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:56,596 DEBUG security.Groups: Creating new Groups object
    2012-08-15 10:46:56,603 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000
    2012-08-15 10:46:56,647 DEBUG security.UserGroupInformation: hadoop login
    2012-08-15 10:46:56,647 DEBUG security.UserGroupInformation: hadoop login commit
    2012-08-15 10:46:56,652 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: root
    2012-08-15 10:46:56,655 DEBUG security.UserGroupInformation: UGI loginUser:root
2012-08-15 10:46:56,679 DEBUG security.Groups: Returning fetched groups for 'root'
    2012-08-15 10:46:56,679 DEBUG security.Groups: Returning cached groups for 'root'
    2012-08-15 10:46:56,684 DEBUG conf.Configuration: java.io.IOException: config(config)
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:260)
    at org.apache.hadoop.hive.conf.HiveConf.&lt;init&gt;(HiveConf.java:802)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.init(HiveAuthorizationProviderBase.java:46)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:39)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:195)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:287)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:121)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:51)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:45)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool$HiveConnectionPoolLazyInitializer.&lt;init&gt;(HiveConnectionPool.java:137)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getInstance(HiveConnectionPool.java:133)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:114)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:56,685 DEBUG conf.Configuration: java.io.IOException: config()
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:227)
    at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:214)
    at org.apache.hadoop.mapred.JobConf.&lt;init&gt;(JobConf.java:330)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:831)
    at org.apache.hadoop.hive.conf.HiveConf.&lt;init&gt;(HiveConf.java:803)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.init(HiveAuthorizationProviderBase.java:46)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:39)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:195)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:287)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:121)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:51)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:45)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool$HiveConnectionPoolLazyInitializer.&lt;init&gt;(HiveConnectionPool.java:137)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getInstance(HiveConnectionPool.java:133)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:114)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:56,918 INFO service.HiveServer: Putting temp output to file /tmp/root/root_2012081510461970716513230989280.pipeout
    2012-08-15 10:46:56,922 INFO root: 001 | com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool::initUdfToHive | Line: 60 | – Loading UserDefinedFunctions …
2012-08-15 10:46:56,933 INFO service.HiveServer: Running the query: create temporary function UDFEquivalentAdamIdCalculator as 'com.apple.ist.gbi.edw.etl.fwrk.udf.UDFEquivalentAdamIdCalculator'
    2012-08-15 10:46:56,953 INFO ql.Driver:
    2012-08-15 10:46:56,953 INFO ql.Driver:
2012-08-15 10:46:57,000 DEBUG parse.VariableSubstitution: Substitution is on: create temporary function UDFEquivalentAdamIdCalculator as 'com.apple.ist.gbi.edw.etl.fwrk.udf.UDFEquivalentAdamIdCalculator'
    2012-08-15 10:46:57,011 INFO parse.ParseDriver: Parsing command: create temporary function UDFEquivalentAdamIdCalculator as 'com.apple.ist.gbi.edw.etl.fwrk.udf.UDFEquivalentAdamIdCalculator'
    2012-08-15 10:46:57,233 INFO parse.ParseDriver: Parse Completed
    2012-08-15 10:46:57,268 INFO parse.FunctionSemanticAnalyzer: analyze done
    2012-08-15 10:46:57,268

    2012-08-15 10:46:57,443 INFO service.HiveServer: Running the query: add jar /ngs/app/etlfwrkt/software/hiveioformat
    2012-08-15 10:46:57,445 INFO service.HiveServer: Putting temp output to file /tmp/root/root_2012081510461970716513230989280.pipeout
    2012-08-15 10:46:57,446 DEBUG parse.VariableSubstitution: Substitution is on: jar /ngs/app/etlfwrkt/software/hiveioformat
    2012-08-15 10:46:57,463 DEBUG fs.FileSystem: Creating filesystem for file:///
    2012-08-15 10:46:57,471 ERROR SessionState: /ngs/app/etlfwrkt/software/hiveioformat does not exist
    2012-08-15 10:46:57,480 FATAL root: 001 | com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool:: | Line: 55 | – Failed to Load UDF or HiveIOFormat Jars
    HiveServerException(message:Query returned non-zero code: 1, cause: /ngs/app/etlfwrkt/software/hiveioformat does not exist., errorCode:1, SQLState:null)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:212)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.addJarToHive(HiveConnectionPool.java:73)
at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:53)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.&lt;init&gt;(HiveConnectionPool.java:45)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool$HiveConnectionPoolLazyInitializer.&lt;init&gt;(HiveConnectionPool.java:137)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getInstance(HiveConnectionPool.java:133)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:114)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:57,487 DEBUG conf.Configuration: java.io.IOException: config()
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:227)
    at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:214)
    at org.apache.hadoop.hive.conf.HiveConf.&lt;init&gt;(HiveConf.java:797)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:121)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:115)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:57,487 DEBUG conf.Configuration: java.io.IOException: config()
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:227)
    at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:214)
    at org.apache.hadoop.mapred.JobConf.&lt;init&gt;(JobConf.java:330)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:831)
    at org.apache.hadoop.hive.conf.HiveConf.&lt;init&gt;(HiveConf.java:798)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:121)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:115)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:57,543 INFO exec.HiveHistory: Hive history file=/tmp/root/hive_job_log_root_201208151046_1466269778.txt
2012-08-15 10:46:57,583 DEBUG security.Groups: Returning cached groups for 'root'
    2012-08-15 10:46:57,584 DEBUG security.Groups: Returning cached groups for 'root'
    2012-08-15 10:46:57,584 DEBUG conf.Configuration: java.io.IOException: config(config)
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:260)
    at org.apache.hadoop.hive.conf.HiveConf.&lt;init&gt;(HiveConf.java:802)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.init(HiveAuthorizationProviderBase.java:46)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:39)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:195)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:287)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:121)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:115)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:57,584 DEBUG conf.Configuration: java.io.IOException: config()
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:227)
    at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:214)
    at org.apache.hadoop.mapred.JobConf.&lt;init&gt;(JobConf.java:330)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:831)
    at org.apache.hadoop.hive.conf.HiveConf.&lt;init&gt;(HiveConf.java:803)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.init(HiveAuthorizationProviderBase.java:46)
    at org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProviderBase.setConf(HiveAuthorizationProviderBase.java:39)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:195)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:287)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:136)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.&lt;init&gt;(HiveServer.java:121)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveConnectionPool.getHiveClient(HiveConnectionPool.java:115)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:60)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:57,634 INFO service.HiveServer: Putting temp output to file /tmp/root/root_2012081510462917386560809082546.pipeout
    2012-08-15 10:46:57,634 INFO metastore.HiveMetaStore: 0: get_table : db=ITUNES_TRANSACTION_CORE tbl=ITS_DOWNLOAD
    2012-08-15 10:46:57,639 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
    2012-08-15 10:46:57,639 DEBUG conf.Configuration: java.io.IOException: config(config)
at org.apache.hadoop.conf.Configuration.&lt;init&gt;(Configuration.java:260)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getConf(HiveMetaStore.java:316)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:344)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:333)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:974)
    at com.apple.ist.gbi.edw.etl.fwrk.hive.HiveClient.getTable(HiveClient.java:231)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:61)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.HiveTableBuilder.buildTable(HiveTableBuilder.java:40)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTable(EDWTableFactory.java:77)
    at com.apple.ist.gbi.edw.etl.fwrk.metadata.EDWTableFactory.getEDWTableFromTargetTable(EDWTableFactory.java:57)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.prepareBatchTable(FileToCoreExecutor.java:600)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:195)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.execute(FileToCoreExecutor.java:169)
    at com.apple.ist.gbi.edw.etl.fwrk.file2core.FileToCoreExecutor.main(FileToCoreExecutor.java:132)

    2012-08-15 10:46:57,640 DEBUG metastore.ObjectStore: Overriding datanucleus.cache.level2.type value null from jpox.properties with none
    2012-08-15 10:46:57,640 DEBUG metastore.ObjectStore: Overriding datanucleus.plugin.pluginRegistryBundleCheck value null from jpox.properties with LOG
    2012-08-15 10:46:57,640 DEBUG metastore.ObjectStore: Overriding javax.jdo.option.ConnectionURL value null from jpox.properties with jdbc:mysql://localhost/hive
    2012-08-15 10:46:57,640 DEBUG metastore.ObjectStore: Overriding datanucleus.autoStartMechanismMode value null from jpox.properties with checked
    2012-08-15 10:46:57,641 DEBUG metastore.ObjectStore: Overriding datanucleus.validateConstraints value null from jpox.properties with false
    2012-08-15 10:46:57,641 DEBUG metastore.ObjectStore: Overriding datanucleus.autoCreateSchema value null from jpox.properties with true
    2012-08-15 10:46:57,641 DEBUG metastore.ObjectStore: Overriding datanucleus.cache.level2 value null from jpox.properties with false
    2012-08-15 10:46:57,641 DEBUG metastore.ObjectStore: Overriding javax.jdo.option.ConnectionUserName value null from jpox.properties with root
    2012-08-15 10:46:57,641 DEB

  • #8499
    saad khawaja
    Member

Thanks for the response.

    I got this working by adding the Hadoop, Hive, and HCatalog jars to the classpath, in addition to hive/conf and hadoop/conf. The Hive client metastore is backed by MySQL, and the process is running. I am using the Hive embedded client and having it talk directly to the MySQL metastore.
    I do not start the Hive server because I don't want to use it; I run my process local to where Hive is installed.
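The classpath assembly described in the reply (conf directories first, then every jar in the lib directories) can be sketched as below. This is an illustrative helper, not the poster's actual code, and all directory paths are hypothetical placeholders for a real installation:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Sketch of building the embedded-client classpath: conf directories go
// first (so core-site.xml and hive-site.xml are found before anything
// else), followed by every jar in the given lib directories.
public class ClasspathSketch {
    static String buildClasspath(List<String> confDirs, List<String> jarDirs) {
        List<String> entries = new ArrayList<>(confDirs); // conf dirs first
        for (String dir : jarDirs) {
            File[] jars = new File(dir).listFiles((d, name) -> name.endsWith(".jar"));
            if (jars == null) continue; // directory missing or unreadable: skip
            for (File jar : jars) entries.add(jar.getPath());
        }
        return String.join(File.pathSeparator, entries);
    }

    public static void main(String[] args) {
        // Placeholder paths; substitute your actual HADOOP_HOME / HIVE_HOME layout.
        String cp = buildClasspath(
            List.of("/etc/hadoop/conf", "/etc/hive/conf"),
            List.of("/usr/lib/hadoop/lib", "/usr/lib/hive/lib"));
        System.out.println(cp);
    }
}
```

Putting the conf directories ahead of the jars matters here: Hadoop's Configuration loads core-site.xml from the classpath, which is what makes the embedded client use HDFS instead of falling back to the local file system.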

The topic 'Hive Embedded Client' is closed to new replies.
