Home Forums MapReduce Sqoop Import From SQL Error

This topic contains 1 reply, has 1 voice, and was last updated by  Jeffrey 8 months, 2 weeks ago.

  • Creator
    Topic
  • #46388

    Jeffrey
    Participant

    We are trying to create a simple import routine in Java that imports a single table from SQL Server to Hadoop/Hive using Sqoop. We are getting the following error:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
    at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:303)
    at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:348)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.<init>(JobContextImpl.java:72)
    at org.apache.hadoop.mapreduce.Job.<init>(Job.java:133)
    at org.apache.hadoop.mapreduce.Job.<init>(Job.java:123)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:230)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:600)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
    at SqlImport.importSQLToHDFS(SqlImport.java:38)
    at SqlImport.main(SqlImport.java:9)
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.PlatformName
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

    We have found that the PlatformName class is no longer supplied in the hadoop-common JAR file.
    Is there an incompatibility between the JAR files supplied in the HortonWorks Sandbox 2.0?

    Here is a copy of our test Java import routine:

    import org.apache.sqoop.SqoopOptions;
    import org.apache.sqoop.SqoopOptions.FileLayout;
    import org.apache.sqoop.tool.ImportTool;

    // Register the SQL Server JDBC driver.
    String driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver";
    Class.forName(driver).newInstance();

    SqoopOptions options = new SqoopOptions();
    options.setConnManagerClassName("org.apache.sqoop.manager.GenericJdbcManager");
    options.setDriverClassName(driver);
    options.setHadoopMapRedHome("/usr/lib/hadoop-mapreduce");
    options.setConnectString("jdbc:sqlserver://server;databaseName=db;user=user;password=pwd");
    options.setTableName("Parameters");
    options.setUsername("user");
    options.setPassword("pwd");
    options.setOverwriteHiveTable(true);
    options.setDirectMode(true);
    options.setNumMappers(1);
    options.setJobName("Test Import");
    options.setFileLayout(FileLayout.TextFile);
    options.setHiveImport(true);
    options.setHiveDatabaseName("default");
    options.setHiveTableName("Parameters");
    options.setHiveHome("/usr/lib/hive");

    ImportTool tool = new ImportTool();
    tool.run(options);
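    For what it's worth, a quick way to check whether the class from the stack trace is actually visible to the JVM before kicking off the full import is a small probe like this (a minimal sketch; only the class name comes from the error above):

    ```java
    // Minimal classpath probe (sketch): checks whether a class can be resolved
    // by the current class loader before launching the Sqoop import.
    public class ClasspathProbe {

        // Returns true if the named class is visible on the current classpath.
        static boolean isOnClasspath(String className) {
            try {
                Class.forName(className);
                return true;
            } catch (ClassNotFoundException e) {
                return false;
            }
        }

        public static void main(String[] args) {
            String cls = "org.apache.hadoop.util.PlatformName";
            System.out.println(cls + (isOnClasspath(cls) ? " found" : " MISSING"));
        }
    }
    ```

    If it prints MISSING, the problem is the classpath of the launching JVM rather than the Sqoop options.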

    If anyone has a solution for this, please let me know.

    Thanks,

    Jeffrey Taylor

Viewing 1 reply (of 1 total)


  • Author
    Replies
  • #46869

    Jeffrey
    Participant

    Finally got past this error.
    You must include hadoop-auth-2.2.0.2.0.6.0-76.jar in your classpath.
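    For anyone hitting the same thing: in practice that means putting the jar on the classpath of the JVM that runs the import, along these lines (the jar path is assumed from the Sandbox 2.0 layout; adjust to your install):

    ```shell
    # Sketch: prepend the hadoop-auth jar (path assumed) to the classpath
    # before launching the import program.
    HADOOP_AUTH_JAR=/usr/lib/hadoop/hadoop-auth-2.2.0.2.0.6.0-76.jar
    CLASSPATH="$HADOOP_AUTH_JAR${CLASSPATH:+:$CLASSPATH}"
    echo "$CLASSPATH"
    # java -cp "$CLASSPATH" SqlImport
    ```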
