HDP on Linux – Installation: Mapreduce service start issue

This topic contains 19 replies, has 2 voices, and was last updated by Dave 1 year, 1 month ago.

  • Creator
    Topic
  • #37981

    Vikas Madaan
    Member

    Hi, I am trying to start MapReduce services in HDP 2.0 and I am facing some issues with it. Below is the log file.

    I see this line in the log: /user/ambari-qa/mapredsmokeinput]/returns: put: `/user/ambari-qa/mapredsmokeinput': No such file or directory

    I get a similar error while starting the Oozie service as well. Can someone please help?

    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully
    notice: /Stage[2]/Hdp-yarn::Mapred2::Service_check/Hdp-hadoop::Exec-hadoop[mapred::service_check::cleanup_before]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -rm -r -f /user/ambari-qa/mapredsmokeoutput /user/ambari-qa/mapredsmokeinput]/Exec[hadoop --config /etc/hadoop/conf fs -rm -r -f /user/ambari-qa/mapredsmokeoutput /user/ambari-qa/mapredsmokeinput]/returns: executed successfully
    notice: /Stage[2]/Hdp-yarn::Mapred2::Service_check/Hdp-hadoop::Exec-hadoop[mapred::service_check::create_file]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput]/Exec[hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput]/returns: put: `/user/ambari-qa/mapredsmokeinput': No such file or directory
    err: /Stage[2]/Hdp-yarn::Mapred2::Service_check/Hdp-hadoop::Exec-hadoop[mapred::service_check::create_file]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput]/Exec[hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput]/returns: change from notrun to 0 failed: hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput returned 1 instead of one of [0] at /var/lib/ambari-agent/puppet/modules/hdp/manifests/init.pp:479
    notice: /Stage[2]/Hdp-yarn::Mapred2::Service_check/Hdp-hadoop::Exec-hadoop[mapred::service_check::create_file]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput::end]: Dependency Exec[hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput] has failures: true
    warning: /Stage[2]/Hdp-yarn::Mapred2::Service_check/Hdp-hadoop::Exec-hadoop[mapred::service_check::create_file]/Hdp::Exec[hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput]/Anchor[hdp::exec::hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput::end]: Skipping because of failed dependencies


The topic 'Mapreduce service start issue' is closed to new replies.

  • Author
    Replies
  • #38638

    Dave
    Moderator

    Issues were:

    Incorrect hostname (localhost.localdomain)
    Hosts file set incorrectly
    CentOS Image was not using static IP

    After these things were corrected and the correct 2.0 repository was installed, the system installed and configured correctly.
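    For reference, a minimal sketch of that host setup on CentOS 6 (the hostname and IP below are placeholders, not values from this thread):

    # /etc/sysconfig/network -- use a real FQDN instead of localhost.localdomain
    HOSTNAME=hdp-node1.example.com

    # /etc/sysconfig/network-scripts/ifcfg-eth0 -- static address instead of DHCP
    BOOTPROTO=static
    IPADDR=192.168.56.101

    # /etc/hosts -- map the static IP to the FQDN and short name
    192.168.56.101   hdp-node1.example.com   hdp-node1

    # apply and verify what Ambari will resolve
    hostname hdp-node1.example.com
    hostname -f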

    #38238

    #38234

    Dave
    Moderator

    Vikas,

    Great, send me your email and we can get started.

    Thanks

    Dave

    #38229

    Vikas Madaan
    Member

    Yes, I have a fresh machine I can set up as a VM on my laptop.

    #38228

    Dave
    Moderator

    Hi Vikas,

    Do you have a fresh machine on CentOS 6? We don't recommend using localhost.localdomain, as it causes problems.
    I can then guide you through an installation and get you up and running.

    Thanks

    Dave

    #38223

    Vikas Madaan
    Member

    Yes, I installed HDP via Ambari.
    I have installed CentOS 6.2 on a VM on my laptop, and I installed HDP against localhost.localdomain, which points to the VM.
    When I start the services using Ambari, it fails on NameNode Start with a "Puppet is killed by time out" error. When I start the services again, everything starts except the Oozie server. At present all the services are running except Oozie Server, Ganglia Monitor and HBase RegionServer, and the Action button is disabled against them.

    Can we have a WebEx so that you can see my environment?

    Regards
    Vikas Madaan

    #38221

    Dave
    Moderator

    Hi Vikas,

    I would look more into why these don’t start through Ambari.
    Did you install HDP against the hostname of the machine?
    When you start HDFS through Ambari, does it start correctly? If you start ALL services, do they start?

    Thanks

    Dave

    #38210

    Vikas Madaan
    Member

    Can you please send me an example .profile for the root and hdfs users?

    #38208

    Dave
    Moderator

    Hi Vikas,

    The smoke test will run as the HDFS user.
    It is environmental; it is recommended that you add these variables to the bash profile for the users (hdfs, oozie, hive) so that those users can pick up and run the commands.
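    As a rough illustration only (the paths below are typical HDP 2.0 locations and the JDK path is an assumption; adjust to your install), the per-user ~/.bash_profile entries could look like:

    # ~/.bash_profile additions for the hdfs, oozie and hive users
    export JAVA_HOME=/usr/jdk64/jdk1.6.0_31   # assumed Ambari-deployed JDK path
    export HADOOP_HOME=/usr/lib/hadoop
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    export PATH=$PATH:$HADOOP_HOME/bin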

    Oozie is not showing me an error here, but I believe that MapReduce is running correctly now that you have set up the environment for root (although you would need to do this for hdfs too).

    As we need to keep forum threads on topic and pretty short, I would advise a new post for Oozie.

    Thanks

    Dave

    #38203

    Vikas Madaan
    Member

    These errors are coming from the Ambari console only. I have never tried to start the services from the command line.

    #38202

    Dave
    Moderator

    Hi Vikas,

    To smoke test MapReduce you must be the HDFS user; use: su -l hdfs -c "command"
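    For example, the smoke-test steps from the Puppet log above could be re-run by hand roughly like this (illustrative only; run from a root shell):

    # clean up any previous smoke-test input/output, then recreate the input file as hdfs
    su -l hdfs -c "hadoop --config /etc/hadoop/conf fs -rm -r -f /user/ambari-qa/mapredsmokeoutput /user/ambari-qa/mapredsmokeinput"
    su -l hdfs -c "hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput"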

    Can you not start / stop these services through Ambari?

    Thanks

    Dave

    #38198

    Vikas Madaan
    Member

    This is the exact error I found when I tried to start Oozie.

    err: /Stage[2]/Hdp-oozie::Service/Hdp::Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/returns: change from notrun to 0 failed: su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share' returned 1 instead of one of [0] at /var/lib/ambari-agent/puppet/modules/hdp/manifests/init.pp:479

    #38196

    Vikas Madaan
    Member

    Hi Dave, I set HADOOP_HOME in hadoop.env and ran the command again; this time it gives the output below:
    [root@localhost ~]# hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput
    put: Permission denied: user=root, access=EXECUTE, inode="/user/ambari-qa":ambari-qa:hdfs:drwxrwx---
    [root@localhost ~]#
    When I start the services again, it fails at NameNode start saying "Puppet has been killed due to time out", but when I restart again everything goes fine except the Oozie server.

    notice: /Stage[1]/Hdp::Snappy::Package/Hdp::Snappy::Package::Ln[32]/Hdp::Exec[hdp::snappy::package::ln 32]/Exec[hdp::snappy::package::ln 32]/returns: executed successfully
    notice: /Stage[2]/Hdp-oozie/Configgenerator::Configfile[oozie-site]/File[/etc/oozie/conf/oozie-site.xml]/content: content changed '{md5}b24a7c46c332462d2fe6c2d2baff3658' to '{md5}550e95cabfb8bec9783652e148ed485a'
    notice: /Stage[2]/Hdp-oozie::Service/Hdp::Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/returns: 0
    notice: /Stage[2]/Hdp-oozie::Service/Hdp::Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/returns: DEPRECATED: Use of this script to execute hdfs command is deprecated.
    notice: /Stage[2]/Hdp-oozie::Service/Hdp::Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/returns: Instead use the hdfs command for it.
    notice: /Stage[2]/Hdp-oozie::Service/Hdp::Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/returns:
    notice: /Stage[2]/Hdp-oozie::Service/Hdp::Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/returns: put: Permission denied: user=oozie, access=WRITE, inode="/user/oozie":hdfs:hdfs:drwxr-xr-x
    notice: /Stage[2]/Hdp-oozie::Service/Hdp::Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/Exec[exec su - oozie -c 'echo 0; hadoop dfs -put /usr/lib/oozie/share /user/oozie ; hadoop dfs -chmod -R 755 /user/oozie/share']/returns: DEPRECATED: Use of this script to exec
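    The two permission errors above point in different directions: the "Permission denied" on /user/ambari-qa happens because the put is being run as root rather than as hdfs or ambari-qa, while the Oozie failure is because /user/oozie is owned by hdfs and not writable by the oozie user. One possible remedy for the latter, sketched here as a common fix rather than a confirmed resolution from this thread, is to hand the directory over to oozie as the hdfs superuser:

    # run as root on a node with the hadoop client configured
    su -l hdfs -c "hadoop fs -chown -R oozie:hdfs /user/oozie"
    # confirm the ownership before retrying the Oozie start
    su -l hdfs -c "hadoop fs -ls /user"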

    #38186

    Dave
    Moderator

    Hi Vikas,

    You installed this using Ambari?
    It looks like you have not set the environment variables like HADOOP_HOME correctly.
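    A quick, purely illustrative way to sanity-check that is to see what the shell actually resolves for the user running the commands:

    # run as the same user that launches the hadoop commands
    echo $HADOOP_HOME
    echo $HADOOP_CONF_DIR
    which hadoop
    hadoop version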

    Thanks

    Dave

    #38005

    Vikas Madaan
    Member

    This is what I am getting:

    [root@localhost ~]# hadoop –config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput
    Exception in thread "main" java.lang.NoClassDefFoundError: –config
    Caused by: java.lang.ClassNotFoundException: –config
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    Could not find the main class: –config. Program will exit.
    [root@localhost ~]#

    #38002

    Dave
    Moderator

    Hi Vikas,

    Did the ls on hdfs give no output?

    It looks like hdfs may be having some issues.
    What do your hdfs logs look like, do you see any issues there?

    What happens when you run (as hdfs):

    hadoop --config /etc/hadoop/conf fs -put /etc/passwd /user/ambari-qa/mapredsmokeinput

    Thanks

    Dave

    #38001

    Vikas Madaan
    Member

    I was able to start all the services previously and did not face the error; I just kept restarting everything every time it failed. In one attempt all the services started, but there was some number mentioned against the WebHCat service (6).

    Then I was configuring Hue, which requires stopping and starting all the services. Everything was going fine, but when it was trying to start the NameNode it hit the "Puppet killed due to time out" error again.

    I am facing this Puppet error more often than not.

    #37996

    Vikas Madaan
    Member

    Hi Dave,

    1. Yes, I am starting the services from within Ambari.
    2. [root@localhost ~]# su - hdfs
    [hdfs@localhost ~]$ hadoop dfs -ls /user/ambari-qa
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    3. [root@localhost ~]# hadoop-version
    bash: hadoop-version: command not found
    [root@localhost ~]# hadoop version
    Hadoop 2.1.0.2.0.5.0-67
    Subversion git@github.com:hortonworks/hadoop.git -r 1c22de9e115666ace01266dc8eef890298fa5e69
    Compiled by jenkins on 2013-08-30T18:13Z
    Compiled with protoc 2.5.0
    From source with checksum 16222eceaf2ceaadcd517eada7280bc
    This command was run using /usr/lib/hadoop/hadoop-common-2.1.0.2.0.5.0-67.jar

    Also, I am getting "Puppet is killed due to time out" very often.

    Regards
    Vikas Madaan

    #37991

    Dave
    Moderator

    Hi Vikas,

    Can you run (as hdfs):

    hadoop dfs -ls /user/ambari-qa

    Are you starting the services from within Ambari?
    Also provide the output of:
    hadoop version

    Thanks

    Dave
