HDFS Forum

New bug in HDFS "listFiles" method

  • #36781
    Arnaud LINZ
    Participant

    Hello,

    We migrated from HDP 2.0 alpha to beta and found a regression:

    
    // FileSystem filesys is the HDFS FileSystem
    // Path path is a relative path such as new Path("test")
    // List<LocatedFileStatus> retVal collects the results
    final RemoteIterator<LocatedFileStatus> iter = filesys.listFiles(path, true);
    while (iter.hasNext()) {
        retVal.add(iter.next());
    }
    

    fails with a NullPointerException when path is relative and contains a subdirectory.
    The stack trace shows that, when trying to make an absolute path for the subdirectory, it calls

    
    final public LocatedFileStatus makeQualifiedLocated(URI defaultUri,
          Path path) {
        return new LocatedFileStatus(getLen(), isDir(), getReplication(),
            getBlockSize(), getModificationTime(),
            getAccessTime(),
            getPermission(), getOwner(), getGroup(),
            isSymlink() ? new Path(getSymlink()) : null,
            (getFullPath(path)).makeQualified(
                defaultUri, null), // fully-qualify path
            DFSUtil.locatedBlocks2Locations(getBlockLocations()));
      }
    

    but the getFullPath code:

    
    /**
       * Get the full path
       * @param parent the parent path
       * @return the full path
       */
      final public Path getFullPath(final Path parent) {
        if (isEmptyLocalName()) {
          return parent;
        }
        return new Path(parent, getLocalName());
      }
    

    when called with a relative path as the parent, still returns a relative path, which makes
    getFullPath(path).makeQualified(defaultUri, null)
    fail because of the null working-directory argument that is passed.
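
    Note that the failure can be reproduced with Path alone; here is a minimal sketch, assuming only the Path.makeQualified(URI, Path) overload that makeQualifiedLocated uses above (the namenode URI is just an example):

    
    import java.net.URI;
    import org.apache.hadoop.fs.Path;
    
    public class QualifyRepro {
        public static void main(String[] args) {
            final URI defaultUri = URI.create("hdfs://namenode:8020");
    
            // Absolute path: the working-directory argument is never consulted,
            // so passing null as the second argument is harmless.
            System.out.println(new Path("/user/test/sub").makeQualified(defaultUri, null));
    
            // Relative path: makeQualified resolves it against the working
            // directory, effectively new Path(null, this), and throws
            // a NullPointerException.
            System.out.println(new Path("test/sub").makeQualified(defaultUri, null));
        }
    }
    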

    A workaround is to call a method like

    
    public static Path fixRelativePart(Path path) throws IOException {
        final Path retVal;
        if (path.isUriPathAbsolute()) {
            retVal = path;
        } else {
            retVal = new Path(getFileSystem().getWorkingDirectory(), path);
        }
        return retVal;
    }
    

    on the path before calling listFiles.
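
    Concretely, a sketch of the listing code with the workaround applied (filesys and retVal are the same illustrative names as in the first snippet):

    
    // Qualify the relative path first, then list recursively as before.
    final Path fixedPath = fixRelativePart(new Path("test"));
    final RemoteIterator<LocatedFileStatus> iter = filesys.listFiles(fixedPath, true);
    while (iter.hasNext()) {
        retVal.add(iter.next());
    }
    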


  • #36783
    Arnaud LINZ
    Participant

    I forgot the stack trace:

    java.lang.NullPointerException
    at org.apache.hadoop.fs.Path.<init>(Path.java:105)
    at org.apache.hadoop.fs.Path.makeQualified(Path.java:421)
    at org.apache.hadoop.hdfs.protocol.HdfsLocatedFileStatus.makeQualifiedLocated(HdfsLocatedFileStatus.java:72)
    at org.apache.hadoop.hdfs.DistributedFileSystem$15.hasNext(DistributedFileSystem.java:749)
    at org.apache.hadoop.fs.FileSystem$5.hasNext(FileSystem.java:1845)
    at com.bouygtel.ganesh.hdfs.HdfsTools.listDirectory(HdfsTools.java:152)
    (…)

    #36926
    Nicholas Sze
    Moderator

    Hi Arnaud, thanks for posting your discovery. I have just verified that the bug does exist; it is also in the current Apache trunk. I will file a JIRA to fix it. Thanks again!

