HDP on Windows – Installation Forum

libhdfs classpath

  • #44433
    Stephen Bovy

    The libhdfs JNI interface ONLY uses the “-D” option (-Djava.class.path) for setting up the CLASSPATH and does NOT support the new wildcard classpath syntax

    Therefore I have updated libhdfs to use a NEW env variable LIBHDFS_CLASSPATH
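
    For context, the distinction is roughly the following; the paths and jar names below are illustrative placeholders, not lines from the actual scripts:

    rem Wildcard entries understood by the "hadoop" launcher are NOT expanded
    rem by the JVM that libhdfs creates through JNI:
    set CLASSPATH=%HADOOP_CORE_HOME%\lib\*

    rem So the new variable has to carry every jar explicitly
    rem (jar names here are placeholders for illustration only):
    set LIBHDFS_CLASSPATH=%HADOOP_CORE_HOME%\hadoop-core.jar;%HADOOP_CORE_HOME%\lib\commons-logging.jar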

    And I have updated “hadoop-config.cmd” and “hadoop.cmd” as follows


    for %%i in (%HADOOP_CORE_HOME%\lib\*.jar) do (
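
    The body of that loop was not included above; a plausible completion, assuming delayed expansion is enabled in “hadoop-config.cmd” and seeding the variable with the conf directory (my assumption), might look like this:

    setlocal enabledelayedexpansion
    rem Seed with the conf dir, then append every jar under lib (assumption).
    set LIBHDFS_CLASSPATH=%HADOOP_CONF_DIR%
    for %%i in (%HADOOP_CORE_HOME%\lib\*.jar) do (
      set LIBHDFS_CLASSPATH=!LIBHDFS_CLASSPATH!;%%i
    )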


    @echo Usage: hadoop [--config confdir] COMMAND
    @echo where COMMAND is one of:
    @echo namenode -format format the DFS filesystem
    @echo secondarynamenode run the DFS secondary namenode
    @echo namenode run the DFS namenode
    @echo datanode run a DFS datanode
    @echo dfsadmin run a DFS admin client
    @echo mradmin run a Map-Reduce admin client
    @echo fsck run a DFS filesystem checking utility
    @echo fs run a generic filesystem user client
    @echo balancer run a cluster balancing utility
    @echo snapshotDiff diff two snapshots of a directory or diff the
    @echo current directory contents with a snapshot
    @echo lsSnapshottableDir list all snapshottable dirs owned by the current user
    @echo oiv apply the offline fsimage viewer to an fsimage
    @echo fetchdt fetch a delegation token from the NameNode
    @echo jobtracker run the MapReduce job Tracker node
    @echo pipes run a Pipes job
    @echo tasktracker run a MapReduce task Tracker node
    @echo historyserver run job history servers as a standalone daemon
    @echo job manipulate MapReduce jobs
    @echo queue get information regarding JobQueues
    @echo version print the version
    @echo jar ^<jar^> run a jar file
    @echo distcp ^<srcurl^> ^<desturl^> copy file or directories recursively
    @echo distcp2 ^<srcurl^> ^<desturl^> DistCp version 2
    @echo archive -archiveName NAME ^<src^>* ^<dest^> create a hadoop archive
    @echo daemonlog get/set the log level for each daemon
    @echo or
    @echo CLASSNAME run the class named CLASSNAME
    @echo Most commands print help when invoked w/o parameters.

    rem export variables for libhdfs

    @echo %LIBHDFS_CLASSPATH% > myclass

    rem export variables for libhdfs

    for /F "usebackq" %%i IN (`type myclass`) do set LIBHDFS_CLASSPATH=%%i
    @del myclass
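
    The write-to-a-file-and-read-back step is presumably there because the variable is assembled inside a setlocal / delayed-expansion scope, so anything set there would be discarded at the matching endlocal; persisting the value to a temporary file lets the outer script recover it. A rough sketch of how a libhdfs test might then pick the variable up (the launcher path and “hdfs_test.exe” are assumed names, not from the scripts above):

    rem Populate LIBHDFS_CLASSPATH via the modified launcher script, then run a
    rem libhdfs client program that reads the new variable (name is hypothetical).
    call %HADOOP_HOME%\bin\hadoop-config.cmd
    hdfs_test.exe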


  • #44838
    Robert Molina

    Hi Stephen,
    Thanks for sharing. I will forward the information along to the product team.


    Stephen Bovy

    FYI: LIBHDFS issues

    There are no scripts provided that set up the environment for using libhdfs (this is true for both Windows and Linux).

    I have updated some of your Windows scripts to set things up for testing libhdfs. I hope to use the “sandbox” for regression testing on Linux.

    But I will need to create a set-up script before I can begin testing
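
    For reference, a minimal Windows set-up script for libhdfs testing would probably need to cover the JVM location as well as the classpath; the following is only a sketch, and every path in it is an assumption:

    rem Sketch of a libhdfs test set-up script (all paths are placeholders).
    set JAVA_HOME=C:\java\jdk1.6.0_31
    rem jvm.dll must be on PATH so libhdfs can load the JVM at run time.
    set PATH=%JAVA_HOME%\jre\bin\server;%PATH%
    rem Build the explicit jar list through the modified hadoop-config.cmd.
    call %HADOOP_HOME%\bin\hadoop-config.cmd
    rem Show what was exported.
    set LIBHDFS_CLASSPATH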

