HBaseTestingUtility based tests fail in HDP 2.0 sandbox


This topic contains 3 replies, has 2 voices, and was last updated by  Badri Narayanan 1 year, 10 months ago.

  • Creator
  • #36772

    We are currently evaluating Hortonworks HDP 2.0 and testing our existing project in the sandbox environment. However, all unit tests in our project that use HBaseTestingUtility fail. The test logs are not useful, but looking through the test cluster's logs, we found the following exception:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.v2.app.MRAppMaster
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)…
    Could not find the main class: org.apache.hadoop.mapreduce.v2.app.MRAppMaster. Program will exit.

    The usual suggestion for this error is to add environment variables to the user's profile. I tried that by editing both yarn-env.sh and /etc/profile, but the same error persists.
    I also followed some of the patches being discussed on the Apache mailing list and tried them in our test suite. The main ones were:
    * Setting "yarn.is.minicluster" to "true"
    * Setting "yarn.application.classpath" to System.getProperty("java.class.path")
    * Adding the following call: TableMapReduceUtil.addDependencyJars(job);
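    For reference, a minimal sketch of how those three workarounds might be combined in a test setup. This assumes the HDP 2.0 / HBase 0.96-era APIs; the class and job names are illustrative, not from the thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseTestingUtility;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class MiniClusterWorkarounds {
    // Applies the three mailing-list workarounds to a test job's configuration.
    public static Job configureTestJob(HBaseTestingUtility util) throws Exception {
        Configuration conf = util.getConfiguration();
        // Workaround 1: tell YARN it is running inside a mini cluster.
        conf.setBoolean("yarn.is.minicluster", true);
        // Workaround 2: propagate the test JVM's classpath to YARN containers.
        conf.set("yarn.application.classpath", System.getProperty("java.class.path"));

        Job job = Job.getInstance(conf, "hbase-mini-cluster-test"); // name is illustrative
        // Workaround 3: ship the HBase dependency jars with the job.
        TableMapReduceUtil.addDependencyJars(job);
        return job;
    }
}
```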

    I tried each of these separately and in combination. In every case the error message changed to a different YARN exception, but the test continued to fail. The underlying exception that causes the test job to fail:

    ERROR [main] org.apache.hadoop.yarn.YarnUncaughtExceptionHandler: Thread Thread[main,5,main] threw an Exception.
    java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:55)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:182)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:234)
    at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:248)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:80)
    Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
    … 5 more
    Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
    at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
    … 10 more
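    An UnsatisfiedLinkError on anchorNative() typically means a native libhadoop built for a different Hadoop version is being loaded alongside the Hadoop 2 jars. This workaround is not from the thread, but one common way to take JNI group mapping out of the picture in a test is to force Hadoop's pure-Java shell-based implementation (hadoop.security.group.mapping and ShellBasedUnixGroupsMapping are real Hadoop configuration values; the wrapper class is illustrative):

```java
import org.apache.hadoop.conf.Configuration;

public class GroupMappingWorkaround {
    // Force the pure-Java group mapping so group lookups never touch
    // the (possibly mismatched) native libhadoop library.
    static Configuration withoutJniGroups(Configuration conf) {
        conf.set("hadoop.security.group.mapping",
                 "org.apache.hadoop.security.ShellBasedUnixGroupsMapping");
        return conf;
    }
}
```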

    Any pointers to fixing the error are greatly appreciated.

    java version "1.6.0_31"
    Java(TM) SE Runtime Environment (build 1.6.0_31-b04)
    Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)



  • Author
  • #37182

    Thanks Cheryle. Yes, we are running the HDP 2.0 Community Preview. Is this a known bug in that version? Please let me know. In any case, I will download and try the HDP 2.0 Beta Sandbox.


    Cheryle Custer


    Are you running the HDP 2.0 Community Preview? If so, we've just released the HDP 2.0 Beta Sandbox. You can download it here: http://hortonworks.com/products/hdp-2/?c#install


    FWIW, this error happens only when using startMiniMapReduceCluster(), not startMiniDFSCluster() or startMiniZKCluster().
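    For context, a typical HBaseTestingUtility startup sequence in a test looks roughly like the sketch below (method names are from the HBase 0.96-era API). Per the observation above, only the startMiniMapReduceCluster() step fails:

```java
import org.apache.hadoop.hbase.HBaseTestingUtility;

public class MiniClusterLifecycle {
    public static void main(String[] args) throws Exception {
        HBaseTestingUtility util = new HBaseTestingUtility();
        util.startMiniZKCluster();        // works, per the thread
        util.startMiniDFSCluster(1);      // works (1 datanode)
        util.startMiniMapReduceCluster(); // the call that fails in the sandbox
        try {
            // ... run MapReduce-over-HBase tests here ...
        } finally {
            util.shutdownMiniMapReduceCluster();
            util.shutdownMiniDFSCluster();
            util.shutdownMiniZKCluster();
        }
    }
}
```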
