NFS Gateway error


This topic contains 2 replies, has 3 voices, and was last updated by  Cen Yuhai 1 year, 10 months ago.

  • Creator
  • #29622

    Arnaud Gaillard


    I just upgraded a small virtual cluster that I use for testing from 1.2 to 1.3.

    Everything went smoothly, but I'm unable to make the NFS gateway work.

    I can launch the gateway and mount it on another node. If I run df -k, the reported data usage is correct; however, if I go into the mounted directory (as any user) and run ls, it hangs forever. There is no specific error in the logs on the node hosting the gateway, but on the namenode I can see plenty of entries like this:

    2013-07-17 17:20:49,930 INFO org.apache.hadoop.ipc.Server: IPC Server handler 97 on 8020, call getExtendedFileInfo(/.reserved/.inodes/16389) from error: File for given inode path does not exist: /.reserved/.inodes/16389 File for given inode path does not exist: /.reserved/.inodes/16389

    If I run fsck on HDFS, the result is OK.

    If someone can point me to where to look to solve this problem, that would be great!
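    For reference, the setup steps look roughly like this. The hostnames (gw-host) and mount point are placeholders, and the exact gateway start commands vary by release (the hdfs portmap / hdfs nfs3 subcommands shown here are from later Hadoop releases, so adjust for your version):

```shell
# On the gateway node: start the portmap and NFS3 gateway daemons
# (run as root or the hdfs service user, depending on the release).
hdfs portmap &
hdfs nfs3 &

# On the client node: mount the exported HDFS root over NFSv3.
mkdir -p /mnt/hdfs
mount -t nfs -o vers=3,proto=tcp,nolock gw-host:/ /mnt/hdfs

# The symptom: df reports usage correctly, but ls hangs.
df -k /mnt/hdfs
ls /mnt/hdfs        # hangs forever on the affected cluster
```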


Viewing 2 replies - 1 through 2 (of 2 total)


  • Author
  • #35898

    Cen Yuhai

    I also installed 1.3.0 and the NFS gateway. Do you know the password of the hdfs account?


    Sasha J

    It looks like something was changed in HDFS after the NFS export was performed.
    The gateway cannot see this file: File for given inode path does not exist: /.reserved/.inodes/16389

    Try to check whether that file exists or not.
    Also, try restarting the NFS gateway and see if that helps.

    Thank you!
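    A sketch of those two checks, assuming the mount point and hostname from above (the /.reserved/.inodes/<id> path is the synthetic per-inode path the NFS gateway uses for lookups, as seen in the namenode log):

```shell
# Check whether the inode from the namenode log still resolves in HDFS.
hadoop fs -ls /.reserved/.inodes/16389

# If it no longer exists, the gateway may be serving stale file handles.
# Detach the stale mount on the client, then restart the gateway daemons
# on the gateway node and remount.
umount -l /mnt/hdfs

# On the gateway node (after stopping the old nfs3 process):
hdfs nfs3 &

# Back on the client: remount and retry the listing.
mount -t nfs -o vers=3,proto=tcp,nolock gw-host:/ /mnt/hdfs
ls /mnt/hdfs
```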
