HDP on Windows – Installation Forum

libhdfs port for Windows

  • #44429
    Stephen Bovy

    I have ported libhdfs to Windows. Would anyone be interested in collaborating or sharing in my efforts?

  • #44434
    Chris Nauroth

    Hi Stephen,

    Thanks for the post. You and I have been conversing on Apache issue HDFS-5541. I’d be happy to help move this work forward in Apache with code reviews and eventually committing it. Thank you for contributing!



    Stephen Bovy

    Thanks Chris !

    Oh, so you are my friendly advisor. Thanks!

    Pertinent facts (for the benefit of our audience):

    I derived my port from the current 2.x source tree.
    The 2.x version makes reference to some fs methods that do not exist in 1.1.3 (I had to bypass or comment out those sections).

    I have managed to test most of the HDFS features on Windows with the 1.1.3 HDP Windows package from Hortonworks.
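
    For anyone following along, the round trip that testing exercises looks roughly like the sketch below. It is not the actual test program from the port; the host ("default" picks up fs.default.name from the configuration), the path, and the message are placeholders, and it assumes a JVM plus the Hadoop jars are reachable from the process.

    /* Minimal libhdfs round trip: connect, write a file, read it back. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include "hdfs.h"

    int main(void)
    {
        hdfsFS fs = hdfsConnect("default", 0);  /* use fs.default.name */
        if (!fs) { fprintf(stderr, "hdfsConnect failed\n"); return 1; }

        const char *path = "/tmp/libhdfs_win_test.txt";   /* placeholder */
        const char *msg  = "hello from libhdfs on windows\n";

        hdfsFile out = hdfsOpenFile(fs, path, O_WRONLY, 0, 0, 0);
        if (!out) { fprintf(stderr, "open for write failed\n"); return 1; }
        hdfsWrite(fs, out, msg, (tSize)strlen(msg));
        hdfsFlush(fs, out);
        hdfsCloseFile(fs, out);

        char buf[128] = {0};
        hdfsFile in = hdfsOpenFile(fs, path, O_RDONLY, 0, 0, 0);
        if (!in) { fprintf(stderr, "open for read failed\n"); return 1; }
        tSize n = hdfsRead(fs, in, buf, sizeof(buf) - 1);
        printf("read %d bytes: %s", (int)n, buf);
        hdfsCloseFile(fs, in);

        hdfsDisconnect(fs);
        return 0;
    }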

    But I am getting the following error:

    Open file in append mode:/tmp/appends

    hdfsOpenFile(/tmp/appends): FileSystem#append((Lorg/apache/hadoop/fs/Path;)Lorg/apache/hadoop/fs/FSDataOutputStream;) error:
    org.apache.hadoop.ipc.RemoteException: java.io.IOException: Append is not supported. Please see the dfs.support.append configuration parameter

    Is this a limitation or a known issue? (If not, how do I fix it?)

    java -version output:

    java version "1.6.0_31"
    Java(TM) SE Runtime Environment (build 1.6.0_31-b05)
    Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)

    I have successfully run the SmokeStack

    Additional JDK issues:

    The file open function also has some code to automatically enable a new feature called direct read (FSDataInputStream#read(ByteBuffer)).


    This code generated the following JDK error:

    could not find method read from class org/apache/hadoop/fs/FSDataInputStream with signature (Ljava/nio/ByteBuffer;)I
    readDirect: FSDataInputStream#read error:
    Begin Method Invokation:org/apache/commons/lang/exception/ExceptionUtils ## getStackTrace
    java.lang.NoSuchMethodError: read

    hdfsOpenFile(/tmp/testfile.txt): WARN: Unexpected error 255 when testing for direct read compatibility

    I do not know if the above error is a JDK classpath issue or just a version incompatibility issue.

    I have also temporarily bypassed and commented out that code section.
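
    For context, the failing lookup is for FSDataInputStream#read(ByteBuffer), which only exists on a 2.x classpath; against the 1.1.3 jars the JNI method lookup raises NoSuchMethodError. A defensive version of that probe might look like the rough sketch below (this is not the exact libhdfs code, and the function name is illustrative):

    #include <jni.h>

    /* Return 1 if the classpath's FSDataInputStream supports direct
     * (ByteBuffer) reads, 0 otherwise.  A missing method is treated as
     * "not supported" rather than left as a pending exception. */
    static int supportsDirectRead(JNIEnv *env)
    {
        jclass cls = (*env)->FindClass(env,
            "org/apache/hadoop/fs/FSDataInputStream");
        if (cls == NULL) {
            (*env)->ExceptionClear(env);
            return 0;
        }
        jmethodID mid = (*env)->GetMethodID(env, cls, "read",
            "(Ljava/nio/ByteBuffer;)I");
        if (mid == NULL) {
            /* On a Hadoop 1.x classpath this lookup fails; clear the
             * pending NoSuchMethodError and fall back to byte[] reads. */
            (*env)->ExceptionClear(env);
            (*env)->DeleteLocalRef(env, cls);
            return 0;
        }
        (*env)->DeleteLocalRef(env, cls);
        return 1;
    }

    With a check like this, a missing read(ByteBuffer) simply means "no direct read" and the open can continue on the ordinary read path instead of reporting the error 255 warning shown above.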

    Stephen Bovy

    Note: test update

    I discovered how to enable the “broken” Hadoop 1.x file append support.
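
    Presumably the switch involved is the dfs.support.append property named in the RemoteException earlier in the thread: set it to true in hdfs-site.xml and restart HDFS, after which an append open from libhdfs should go through. A minimal sketch (the helper name is illustrative, not part of the libhdfs API):

    #include <fcntl.h>
    #include <string.h>
    #include "hdfs.h"

    /* Append one line to an existing file.  This fails with the
     * "Append is not supported" RemoteException unless the cluster has
     * dfs.support.append enabled. */
    int appendLine(hdfsFS fs, const char *path, const char *line)
    {
        hdfsFile f = hdfsOpenFile(fs, path, O_WRONLY | O_APPEND, 0, 0, 0);
        if (!f) {
            return -1;
        }
        tSize written = hdfsWrite(fs, f, line, (tSize)strlen(line));
        hdfsFlush(fs, f);
        hdfsCloseFile(fs, f);
        return (written < 0) ? -1 : 0;
    }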

    I have successfully executed the libhdfs operational test program on Windows.

    Liu Sheng

    Where can I find libhdfs.dll, or how can I build it? Thanks in advance.

