HDP on Windows – Installation › libhdfs classpath

This topic contains 2 replies, has 2 voices, and was last updated by  Stephen Bovy 11 months, 3 weeks ago.

  • Creator
    Topic
  • #44433

    Stephen Bovy
    Member

    The libhdfs JNI interface only uses the "-D" option for setting up the CLASSPATH and does NOT support the new classpath syntax.

    Therefore I have updated libhdfs to use a NEW environment variable, LIBHDFS_CLASSPATH.

    And I have updated "hadoop-config.cmd" and "hadoop.cmd" as follows:

    hadoop-config.cmd::
    rem requires delayed expansion (setlocal enabledelayedexpansion) for the !LIBHDFS_CLASSPATH! references below
    set LIBHDFS_CLASSPATH=%CLASSPATH%

    for %%i in (%HADOOP_CORE_HOME%\lib\*.jar) do (
    set LIBHDFS_CLASSPATH=!LIBHDFS_CLASSPATH!;%%i
    )
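    To make the intent concrete: a libhdfs-style bootstrap simply prefixes this variable with -Djava.class.path= and hands the result to the embedded JVM verbatim, so every jar has to appear explicitly. A minimal sketch (the jar names below are made up for illustration):

    ```shell
    # Sketch only: a hypothetical value of LIBHDFS_CLASSPATH and the JVM
    # option string a libhdfs-style bootstrap would build from it.
    # No wildcard expansion happens at this level -- the value is passed
    # through to JNI_CreateJavaVM as-is.
    LIBHDFS_CLASSPATH='C:\hadoop\lib\hadoop-core.jar;C:\hadoop\lib\commons-logging.jar'
    JVM_OPTION="-Djava.class.path=$LIBHDFS_CLASSPATH"
    printf '%s\n' "$JVM_OPTION"
    ```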

    hadoop.cmd::

    :print_usage
    @echo Usage: hadoop [--config confdir] COMMAND
    @echo where COMMAND is one of:
    @echo namenode -format format the DFS filesystem
    @echo secondarynamenode run the DFS secondary namenode
    @echo namenode run the DFS namenode
    @echo datanode run a DFS datanode
    @echo dfsadmin run a DFS admin client
    @echo mradmin run a Map-Reduce admin client
    @echo fsck run a DFS filesystem checking utility
    @echo fs run a generic filesystem user client
    @echo balancer run a cluster balancing utility
    @echo snapshotDiff diff two snapshots of a directory or diff the
    @echo current directory contents with a snapshot
    @echo lsSnapshottableDir list all snapshottable dirs owned by the current user
    @echo oiv apply the offline fsimage viewer to an fsimage
    @echo fetchdt fetch a delegation token from the NameNode
    @echo jobtracker run the MapReduce job Tracker node
    @echo pipes run a Pipes job
    @echo tasktracker run a MapReduce task Tracker node
    @echo historyserver run job history servers as a standalone daemon
    @echo job manipulate MapReduce jobs
    @echo queue get information regarding JobQueues
    @echo version print the version
    @echo jar ^<jar^> run a jar file
    @echo.
    @echo distcp ^<srcurl^> ^<desturl^> copy file or directories recursively
    @echo distcp2 ^<srcurl^> ^<desturl^> DistCp version 2
    @echo archive -archiveName NAME ^<src^>* ^<dest^> create a hadoop archive
    @echo daemonlog get/set the log level for each daemon
    @echo or
    @echo CLASSNAME run the class named CLASSNAME
    @echo Most commands print help when invoked w/o parameters.

    rem export variables for libhdfs: "endlocal" discards any environment
    rem changes made since "setlocal", so stash the value in a temp file
    rem first and read it back after endlocal

    @echo %LIBHDFS_CLASSPATH%> myclass

    endlocal

    rem delims= keeps the whole line even if the classpath contains spaces
    for /F "usebackq delims=" %%i IN (`type myclass`) do set LIBHDFS_CLASSPATH=%%i
    @del myclass

Viewing 2 replies - 1 through 2 (of 2 total)


  • Author
    Replies
  • #44842

    Stephen Bovy
    Member

    FYI: libhdfs issues

    There are no scripts provided that set up the environment for using libhdfs (this is true for both Windows and Linux).

    I have updated some of your Windows scripts to set things up for testing libhdfs. I hope to use the "sandbox" for regression testing on Linux.

    But I will need to create a set-up script before I can begin testing.
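    Such a Linux set-up script might look roughly like the sketch below, mirroring the hadoop-config.cmd change above. Everything here is an assumption: the `HADOOP_CORE_HOME` location, the `lib/*.jar` layout, and the function name. Note that Linux uses `:` rather than `;` as the classpath separator.

    ```shell
    #!/bin/sh
    # Sketch of a Linux environment script for libhdfs testing (hypothetical).
    # Mirrors the hadoop-config.cmd change: start from CLASSPATH and append
    # every jar under $HADOOP_CORE_HOME/lib, using ':' as the separator.

    build_libhdfs_classpath() {
        # $1 = Hadoop install dir (assumed layout: $1/lib/*.jar)
        cp=$CLASSPATH
        for jar in "$1"/lib/*.jar; do
            [ -e "$jar" ] || continue    # glob matched nothing
            cp="$cp:$jar"
        done
        printf '%s\n' "$cp"
    }

    LIBHDFS_CLASSPATH=$(build_libhdfs_classpath "${HADOOP_CORE_HOME:-/usr/lib/hadoop}")
    export LIBHDFS_CLASSPATH
    ```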

    #44838

    Robert Molina
    Moderator

    Hi Stephen,
    Thanks for the share. Will forward along the information to the product team.

    Regards,
    Robert
