libhdfs port for Windows



This topic contains 4 replies, has 3 voices, and was last updated by Liu Sheng 1 year, 5 months ago.

  • Creator
  • #44429

    Stephen Bovy

I have ported libhdfs to Windows. Would anyone be interested in collaborating or sharing in my efforts?

Viewing 4 replies - 1 through 4 (of 4 total)


  • Author
  • #48806

    Liu Sheng

Where can I find libhdfs.dll, or how can I build it? Thanks in advance.


    Stephen Bovy

    >> Note >> Test Update >>

I discovered how to enable the “broken” Hadoop 1.x file append support.

I have successfully executed the libhdfs operational test program on Windows.
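The post doesn't name the switch, but in Hadoop 1.x the append gate is ordinarily the dfs.support.append property — that is an assumption on my part; the thread never confirms which parameter was changed. Enabling it in hdfs-site.xml would look like:

```xml
<!-- hdfs-site.xml: assumed 1.x switch for append support (not confirmed in this thread) -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
```

The NameNode and DataNodes need a restart for the change to take effect.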


    Stephen Bovy

Thanks, Chris!

Oh, so you are my friendly advisor. Thanks!

Pertinent facts (for the benefit of our audience):

I derived my port from the current 2.x source tree. The 2.x version references some fs methods that do not exist in 1.1.3, so I had to bypass or comment out those sections.

I have managed to test most of the HDFS features on Windows with the 1.1.3 HDP Windows package from Hortonworks.

But I am getting the following error:

    Open file in append mode:/tmp/appends

    hdfsOpenFile(/tmp/appends): FileSystem#append((Lorg/apache/hadoop/fs/Path;)Lorg/apache/hadoop/fs/FSDataOutputStream;) error:
    org.apache.hadoop.ipc.RemoteException: Append is not supported. Please see the configuration parameter

Is this a limitation or known issue? (If not, how do I fix it?)

Output of java -version:

java version "1.6.0_31"
Java(TM) SE Runtime Environment (build 1.6.0_31-b05)
Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)

    I have successfully run the SmokeStack

Additional JDK issues:

The file open function also has some code to automatically enable a new feature called direct reads.


This code generated the following JDK error:

could not find method read from class org/apache/hadoop/fs/FSDataInputStream with signature (Ljava/nio/ByteBuffer;)I
readDirect: FSDataInputStream#read error:
Begin Method Invokation: org/apache/commons/lang/exception/ExceptionUtils ## getStackTrace
java.lang.NoSuchMethodError: read

    hdfsOpenFile(/tmp/testfile.txt): WARN: Unexpected error 255 when testing for direct read compatibility

I do not know if the above error is a JDK classpath issue or just a version-incompatibility issue.

I have also temporarily bypassed and commented out that code section.
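The probe the native code performs can be mimicked in plain Java: look the method up reflectively and treat "not found" as "feature unavailable" rather than a hard failure. A minimal sketch, using JDK classes as stand-ins since no Hadoop jars are assumed here (FSDataInputStream itself is not on the classpath):

```java
import java.lang.reflect.Method;
import java.nio.ByteBuffer;

public class DirectReadProbe {
    // Check whether a class exposes int read(ByteBuffer) -- the same
    // signature (Ljava/nio/ByteBuffer;)I that the native code failed to
    // find on the 1.1.3 FSDataInputStream. Absence should mean "fall back
    // to the ordinary byte[]-based read path", not an error.
    static boolean hasDirectRead(Class<?> cls) {
        try {
            Method m = cls.getMethod("read", ByteBuffer.class);
            return m.getReturnType() == int.class;
        } catch (NoSuchMethodException e) {
            return false; // older API: direct read unavailable
        }
    }

    public static void main(String[] args) {
        // ReadableByteChannel declares int read(ByteBuffer); InputStream does not.
        System.out.println(hasDirectRead(java.nio.channels.ReadableByteChannel.class)); // true
        System.out.println(hasDirectRead(java.io.InputStream.class));                   // false
    }
}
```

This suggests the warning above is a version issue rather than a classpath one: 1.1.3 simply predates the ByteBuffer read API, so bypassing that section on 1.x is the right call.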


    Chris Nauroth

    Hi Stephen,

    Thanks for the post. You and I have been conversing on Apache issue HDFS-5541. I’d be happy to help move this work forward in Apache with code reviews and eventually committing it. Thank you for contributing!

