HDP on Linux – Installation Forum

How do I build Hadoop RPMs from a source RPM?

  • #44091

    I downloaded the Hadoop source RPM: hadoop-
    What should I do if I want to build the Hadoop RPM from it?

    I tried but got the following errors:

    Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.PRSJfl
    + umask 022
    + cd /root/rpmbuild/BUILD
    + LANG=C
    + export LANG
    + unset DISPLAY
    + cd /root/rpmbuild/BUILD
    + rm -rf hadoop-
    + /bin/tar -xf -
    + /usr/bin/gzip -dc /root/rpmbuild/SOURCES/hadoop-
    + STATUS=0
    + '[' 0 -ne 0 ']'
    + cd hadoop-
    + /bin/chmod -Rf a+rX,u+w,g-w,o-w .
    + exit 0
    Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.5qbfJQ
    + umask 022
    + cd /root/rpmbuild/BUILD
    + cd hadoop-
    + LANG=C
    + export LANG
    + unset DISPLAY
    + env HADOOP_VERSION= 'hadoop_jar_version=%{hadoop_jar_version}' HADOOP_ARCH=Linux-amd64-64 bash /root/rpmbuild/SOURCES/do-component-build
    + echo '-------------- Hadoop-do-component-build-started -----------'
    -------------- Hadoop-do-component-build-started -----------
    ++ pwd
    + base_path=/root/rpmbuild/BUILD/hadoop-
    ++ echo /root/rpm
    + source_file_path=/root/rpm
    + tar_output_folder=/root/rpm../hdp-output
    + source_file_path=/root/rpm../HDP_COMPONENT_VARIABLES.sh
    + source /root/rpm../HDP_COMPONENT_VARIABLES.sh
    /root/rpmbuild/SOURCES/do-component-build: line 23: /root/rpm../HDP_COMPONENT_VARIABLES.sh: No such file or directory
    error: Bad exit status from /var/tmp/rpm-tmp.5qbfJQ (%build)

    RPM build errors:
    Bad exit status from /var/tmp/rpm-tmp.5qbfJQ (%build)
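Reading the trace, the immediate failure is that `do-component-build` tries to source `/root/rpm../HDP_COMPONENT_VARIABLES.sh`, a path that does not exist. One plausible reading (an assumption; the actual script is not shown in the post) is that the script concatenates a base path and `..` without a separating `/`, so the shell produces a literal directory component named `rpm..`. A minimal sketch of that failure mode:

```shell
#!/usr/bin/env bash
# Sketch of the suspected path-concatenation bug (assumed; the real
# do-component-build script is not shown in the post).
source_file_path=/root/rpm

# Missing "/" between the base path and "..": the result contains a
# literal directory component named "rpm..", which does not exist.
broken="${source_file_path}../HDP_COMPONENT_VARIABLES.sh"

# With the separator, ".." resolves to the parent directory as intended.
fixed="${source_file_path}/../HDP_COMPONENT_VARIABLES.sh"

echo "$broken"   # /root/rpm../HDP_COMPONENT_VARIABLES.sh  (matches the error in the log)
echo "$fixed"    # /root/rpm/../HDP_COMPONENT_VARIABLES.sh
```

If that is the cause, the fix would live in the build script (or in whatever sets its path variables), not in the spec file itself.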
