
HDP on Linux – Installation Forum

Snappy Library

  • #44169

    Hello,

I am attempting to install HDP 1.3.2 on Red Hat Enterprise Linux 5.9. I am just trying it out, so I am using a single node.

I have managed to run all of the smoke tests for the core installation successfully except for:
    /usr/lib/hadoop/bin/hadoop jar /usr/lib/hadoop/hadoop-examples.jar terasort /test/10gsort/input /test/10gsort/output

    Any ideas would be greatly appreciated.

I get the warning messages below, and once the job gets further along it fails because the native Snappy library is not available.
    HDFS@SERVERNAME:/var/log/hadoop> /usr/lib/hadoop/bin/hadoop jar /usr/lib/hadoop/hadoop-examples.jar terasort /test/10gsort/input /test/10gsort/output
    13/11/21 11:16:08 INFO terasort.TeraSort: starting
    13/11/21 11:16:09 INFO mapred.FileInputFormat: Total input paths to process : 2
    13/11/21 11:16:09 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
    13/11/21 11:16:09 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3]
    13/11/21 11:16:09 WARN snappy.LoadSnappy: Snappy native library is available
    13/11/21 11:16:09 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
    13/11/21 11:16:09 WARN snappy.LoadSnappy: Snappy native library not loaded
    13/11/21 11:16:10 INFO compress.CodecPool: Got brand-new compressor
    Making 1 from 100000 records
    Step size is 100000.0
    13/11/21 11:16:10 INFO mapred.FileInputFormat: Total input paths to process : 2
    13/11/21 11:16:11 INFO mapred.JobClient: Running job: job_201311211109_0002
    13/11/21 11:16:12 INFO mapred.JobClient: map 0% reduce 0%
    13/11/21 11:16:39 INFO mapred.JobClient: map 1% reduce 0%
    13/11/21 11:16:44 INFO mapred.JobClient: Task Id : attempt_201311211109_0002_m_000000_0, Status : FAILED
    java.io.IOException: Spill failed

    Caused by: java.lang.RuntimeException: native snappy library not available

    (NOTE: I have added … where I deleted non-essential lines)

    HDFS@SERVERNAME:/var/log/hadoop> echo $PATH
    …/usr/java/default/bin
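
    The warnings above show two related problems: the native-hadoop library is not loading, and as a consequence the Snappy codec is not loaded either (the log even says "Snappy native library is available", so the missing piece appears to be libhadoop.so rather than libsnappy itself). A minimal diagnostic sketch is below; it is a suggestion rather than output from this system, and the /usr/lib/hadoop and /etc/hadoop/conf paths are assumptions based on a default HDP 1.x layout.

    # Confirm the native-hadoop library exists in the 64-bit platform directory
    # (path is an assumption for a default HDP 1.x install):
    ls -l /usr/lib/hadoop/lib/native/Linux-amd64-64/libhadoop*

    # If it is there, check that its shared-library dependencies resolve
    # for the user that runs the job:
    ldd /usr/lib/hadoop/lib/native/Linux-amd64-64/libhadoop.so

    # Check what extra JVM options (including any -Djava.library.path) the
    # child task JVMs are given (config path is an assumption):
    grep -A 1 mapred.child.java.opts /etc/hadoop/conf/mapred-site.xml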

  • #44170

    HDFS@SERVERNAME:/var/log/hadoop> ls -al /usr/java/default/bin
    total 2392
    drwxrwxr-x 2 HDFS HADOOP 4096 Jan 20 2012 .
    drwxrwxr-x 10 HDFS HADOOP 4096 Nov 20 11:59 ..
    -rwxrwxr-x 1 HDFS HADOOP 52315 Jan 20 2012 appletviewer
    -rwxrwxr-x 1 HDFS HADOOP 52200 Jan 20 2012 apt
    lrwxrwxrwx 1 HDFS HADOOP 10 Nov 7 15:00 ControlPanel -> ./jcontrol
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 extcheck
    -rwxrwxr-x 1 HDFS HADOOP 953 Jan 20 2012 HtmlConverter
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 idlj
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 jar
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 jarsigner
    -rwxrwxr-x 1 HDFS HADOOP 50794 Jan 20 2012 java
    -rwxrwxr-x 1 HDFS HADOOP 52264 Jan 20 2012 javac
    -rwxrwxr-x 1 HDFS HADOOP 52200 Jan 20 2012 javadoc
    -rwxrwxr-x 1 HDFS HADOOP 52264 Jan 20 2012 javah
    -rwxrwxr-x 1 HDFS HADOOP 52296 Jan 20 2012 javap
    -rwxrwxr-x 1 HDFS uxdw0had 105148 Jan 20 2012 javaws
    -rwxrwxr-x 1 HDFS HADOOP 51987 Jan 20 2012 jconsole
    -rwxrwxr-x 1 HDFS HADOOP 6407 Jan 20 2012 jcontrol
    -rwxrwxr-x 1 HDFS HADOOP 52019 Jan 20 2012 jdb
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 jhat
    -rwxrwxr-x 1 HDFS HADOOP 52131 Jan 20 2012 jinfo
    -rwxrwxr-x 1 HDFS HADOOP 52131 Jan 20 2012 jmap
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 jps
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 jrunscript
    -rwxrwxr-x 1 HDFS HADOOP 52019 Jan 20 2012 jsadebugd
    -rwxrwxr-x 1 HDFS HADOOP 52131 Jan 20 2012 jstack
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 jstat
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 jstatd
    -rwxrwxr-x 1 HDFS HADOOP 2673 Mar 25 2011 jvisualvm
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 keytool
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 native2ascii
    -rwxrwxr-x 1 HDFS HADOOP 52235 Jan 20 2012 orbd
    -rwxrwxr-x 1 HDFS HADOOP 52059 Jan 20 2012 pack200
    -rwxrwxr-x 1 HDFS HADOOP 52347 Jan 20 2012 policytool
    -rwxrwxr-x 1 HDFS HADOOP 52296 Jan 20 2012 rmic
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 rmid
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 rmiregistry
    -rwxrwxr-x 1 HDFS HADOOP 52011 Jan 20 2012 schemagen
    -rwxrwxr-x 1 HDFS HADOOP 52296 Jan 20 2012 serialver
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 servertool
    -rwxrwxr-x 1 HDFS HADOOP 52235 Jan 20 2012 tnameserv
    -rwxrwxr-x 1 HDFS uxdw0had 186301 Jan 20 2012 unpack200
    -rwxrwxr-x 1 HDFS HADOOP 51979 Jan 20 2012 wsgen
    -rwxrwxr-x 1 HDFS HADOOP 52011 Jan 20 2012 wsimport
    -rwxrwxr-x 1 HDFS HADOOP 52011 Jan 20 2012 xjc

    HDFS@SERVERNAME:/var/log/hadoop> echo $JAVA_HOME
    /usr/java/default

    HDFS@SERVERNAME:/var/log/hadoop> ls -al /usr/java/default
    lrwxrwxrwx 1 HDFS uxdw0had 28 Nov 13 16:49 /usr/java/default -> /usr/jdk1.6.0_31/jdk1.6.0_31

    Snappy
    HDFS@SERVERNAME:/var/log/hadoop> ls -al /usr/lib64/libsnappy.so
    lrwxrwxrwx 1 root root 18 Nov 19 15:51 /usr/lib64/libsnappy.so -> libsnappy.so.1.1.3
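
    Two quick follow-up checks may be worth running here (suggestions only, not output captured from this system): confirm that the file the symlink points at is a 64-bit build, and that the dynamic loader can actually find it.

    # Is the symlink target really a 64-bit ELF shared object?
    file /usr/lib64/libsnappy.so.1.1.3

    # Is libsnappy registered with the runtime loader?
    ldconfig -p | grep -i snappy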

  • #44171

    HDFS@SERVERNAME:/var/log/hadoop> ls -al /usr/lib/hadoop/lib/native/Linux-amd64-64/
    total 52
    drwxrwxr-x 2 HDFS HADOOP 4096 Nov 20 15:13 .
    drwxrwxr-x 4 HDFS HADOOP 4096 Nov 13 17:01 ..
    -rwxrwxr-x 1 HDFS uxdw0had 22542 Aug 13 2012 libgplcompression.a
    -rwxrwxr-x 1 HDFS HADOOP 1248 Aug 13 2012 libgplcompression.la
    lrwxrwxrwx 1 HDFS HADOOP 26 Nov 13 17:01 libgplcompression.so -> libgplcompression.so.0.0.0
    lrwxrwxrwx 1 HDFS HADOOP 26 Nov 13 17:01 libgplcompression.so.0 -> libgplcompression.so.0.0.0
    -rwxrwxr-x 1 HDFS uxdw0had 15688 Aug 13 2012 libgplcompression.so.0.0.0
    lrwxrwxrwx 1 HDFS HADOOP 23 Nov 20 15:13 libsnappy.so -> /usr/lib64/libsnappy.so
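
    Note that this listing shows the GPL/LZO compression libraries and the libsnappy.so symlink, but no libhadoop.so, which would line up with the "Unable to load native-hadoop library" warning in the job output. The sketch below is a suggestion only; the package query is generic, and the hadoop-env.sh path and the use of JAVA_LIBRARY_PATH are assumptions about a default HDP 1.x install, not steps confirmed in this thread.

    # Is the native-hadoop library installed anywhere under the Hadoop lib tree?
    find /usr/lib/hadoop -name 'libhadoop*' 2>/dev/null

    # Which Hadoop RPMs are installed? A native/compression package may be missing
    # (exact package names vary by HDP version):
    rpm -qa | grep -i hadoop

    # If libhadoop.so exists but is still not picked up, exporting JAVA_LIBRARY_PATH
    # in hadoop-env.sh is one way to have the hadoop scripts add that directory to
    # -Djava.library.path; restart the MapReduce daemons afterwards.
    echo 'export JAVA_LIBRARY_PATH=/usr/lib/hadoop/lib/native/Linux-amd64-64' >> /etc/hadoop/conf/hadoop-env.sh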

