HDP on Windows – Installation: libhdfs port for windows

This topic contains 4 replies, has 3 voices, and was last updated by Liu Sheng 5 months, 1 week ago.

  • Creator
    Topic
  • #44429

    Stephen Bovy
    Member

I have ported libhdfs to Windows. Would anyone be interested in collaborating or sharing in my efforts?

Viewing 4 replies - 1 through 4 (of 4 total)


  • Author
    Replies
  • #48806

    Liu Sheng
    Participant

Where can I find libhdfs.dll, or how can I build it? Thanks in advance.

    #44472

    Stephen Bovy
    Member

>> Note: Test Update

I discovered how to enable the “broken” Hadoop 1.x file append support.

I have successfully executed the libhdfs operational test program on Windows.
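The post does not show the exact change, but in Hadoop 1.x the append path is gated by the dfs.support.append property named in the RemoteException quoted later in this thread. A hedged sketch of the hdfs-site.xml entry (the cluster daemons need a restart after changing it):

```xml
<!-- hdfs-site.xml: re-enable the 1.x append code path (sketch) -->
<property>
  <name>dfs.support.append</name>
  <value>true</value>
</property>
```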

    #44456

    Stephen Bovy
    Member

Thanks, Chris!

Oh, so you are my friendly advisor. Thanks!

Pertinent facts (for the benefit of our audience):

I derived my port from the current 2.x source tree.
The 2.x version references some fs methods that do not exist in 1.1.3 (I had to bypass or comment out those sections).

I have managed to test most of the HDFS features on Windows with the 1.1.3 HDP Windows package from Hortonworks.

But I am getting the following error:

    Open file in append mode:/tmp/appends

    hdfsOpenFile(/tmp/appends): FileSystem#append((Lorg/apache/hadoop/fs/Path;)Lorg/apache/hadoop/fs/FSDataOutputStream;) error:
    org.apache.hadoop.ipc.RemoteException: java.io.IOException: Append is not supported. Please see the dfs.support.append configuration parameter

>> Is this a limitation or a known issue? (If not, how do I fix it?)

Output of java -version:

java version "1.6.0_31"
    Java(TM) SE Runtime Environment (build 1.6.0_31-b05)
    Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)

I have successfully run the SmokeStack.

Additional JDK issues:

The file-open function also has code to automatically enable a new feature called HDFS_FILE_SUPPORTS_DIRECT_READ.

This code generated the following JDK error:

    could not find method read from class org/apache/hadoop/fs/FSDataInputStream with signature (Ljava/nio/ByteBuffer;)IreadDirect: FSDataInputStream#read error:
    Begin Method Invokation:org/apache/commons/lang/exception/ExceptionUtils ## getStackTrace
    java.lang.NoSuchMethodError: read

    hdfsOpenFile(/tmp/testfile.txt): WARN: Unexpected error 255 when testing for direct read compatibility

I do not know whether the above error is a JDK classpath issue or just a version-incompatibility issue.

I have also temporarily bypassed and commented out that code section.
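The NoSuchMethodError above is what JNI reports when the loaded FSDataInputStream class has no read(ByteBuffer) overload, which is the case in Hadoop 1.1.3 (the direct-read overload arrived in later releases), so this reads more like a version mismatch than a classpath problem. A minimal sketch of the same kind of capability probe in plain Java, using java.io.InputStream as a stand-in for FSDataInputStream (DirectReadProbe and hasMethod are hypothetical names for illustration, not part of libhdfs):

```java
import java.nio.ByteBuffer;

public class DirectReadProbe {
    // Check via reflection whether a class exposes a given public method,
    // mirroring the lookup libhdfs does before enabling direct reads.
    static boolean hasMethod(Class<?> cls, String name, Class<?>... params) {
        try {
            cls.getMethod(name, params);
            return true;
        } catch (NoSuchMethodException e) {
            // Same condition that surfaces as NoSuchMethodError through JNI.
            return false;
        }
    }

    public static void main(String[] args) {
        // InputStream has read(byte[]) but no read(ByteBuffer),
        // mirroring FSDataInputStream as shipped in Hadoop 1.1.3.
        System.out.println(hasMethod(java.io.InputStream.class, "read", ByteBuffer.class)); // false
        System.out.println(hasMethod(java.io.InputStream.class, "read", byte[].class));     // true
    }
}
```

When the probe comes back false, the safe behavior is exactly what the port does: skip HDFS_FILE_SUPPORTS_DIRECT_READ and fall back to the ordinary read path.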

    #44434

    Chris Nauroth
    Participant

    Hi Stephen,

    Thanks for the post. You and I have been conversing on Apache issue HDFS-5541. I’d be happy to help move this work forward in Apache with code reviews and eventually committing it. Thank you for contributing!

    https://issues.apache.org/jira/browse/HDFS-5541

    –Chris
