HDP on Linux – Installation: Single node installation RHEL – Errors


  • Creator
    Topic #33600

    I am installing HDP 1.3 on a single-node RHEL machine, following the installation steps on your website. When I try to copy a file to HDFS, I get this error:

    [root@vc2c09rtp3992 conf]# /usr/lib/hadoop/bin/hadoop dfs -copyFromLocal /etc/passwd passwd-test
    13/08/28 18:54:11 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/root/passwd-test could only be replicated to 0 nodes, instead of 1
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1991)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:799)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1444)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1440)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1438)

    at org.apache.hadoop.ipc.Client.call(Client.java:1118)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at $Proxy1.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
    at $Proxy1.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3929)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3789)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2985)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:3230)

    13/08/28 18:54:11 WARN hdfs.DFSClient: Error Recovery for null bad datanode[0] nodes == null
    13/08/28 18:54:11 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/root/passwd-test" - Aborting...
    copyFromLocal: java.io.IOException: File /user/root/passwd-test could only be replicated to 0 nodes, instead of 1
    13/08/28 18:54:11 ERROR hdfs.DFSClient: Failed to close file /user/root/passwd-test
    org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/root/passwd-test could only be replicated to 0 nodes, instead of 1
    at org.apache.had



  • Replies
  • #33613

    Dave
    Moderator

    Hi Shanthan,

    It looks like your datanode is trying to start in secure (Kerberos) mode and you don't have that set up – is that correct?
    Can you check that:

    hadoop.security.authorization is set to false in your core-site.xml
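
    For reference, the property block in core-site.xml would look something like this (just a sketch; leave the rest of your file as it is):

    <property>
      <name>hadoop.security.authorization</name>
      <value>false</value>
    </property>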

    And also, in hadoop-env.sh, comment out the following line:
    #export HADOOP_SECURE_DN_USER=hdfs
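
    If you would rather do that with a one-liner, something like this works (the conf path assumes the stock HDP layout under /etc/hadoop/conf, so adjust if yours differs):

    # Comment out the secure-datanode user setting in place (keeps a .bak copy)
    sed -i.bak 's/^export HADOOP_SECURE_DN_USER=hdfs/#&/' /etc/hadoop/conf/hadoop-env.sh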

    Then restart your datanode process.

    Then try the upload again, or check that the error in the datanode log has been rectified.
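
    Concretely, something like this (a sketch; the script path assumes HDP 1.3's /usr/lib/hadoop layout, with the usual conf directory):

    # Restart the datanode as the hdfs user
    su - hdfs -c "/usr/lib/hadoop/bin/hadoop-daemon.sh --config /etc/hadoop/conf stop datanode"
    su - hdfs -c "/usr/lib/hadoop/bin/hadoop-daemon.sh --config /etc/hadoop/conf start datanode"

    # Retry the original upload
    /usr/lib/hadoop/bin/hadoop dfs -copyFromLocal /etc/passwd passwd-test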

    Thanks

    Dave

  • #33606

    Shanthan

    Here is the output:

    jsvc.err

    28/08/2013 18:40:43 32299 jsvc.exec error: Cannot load daemon
    28/08/2013 18:40:43 32258 jsvc.exec error: Service exit with a return value of 3
    Initializing secure datanode resources
    java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.commons.daemon.support.DaemonLoader.load(DaemonLoader.java:156)
    Caused by: java.lang.RuntimeException: Cannot start secure datanode in unsecure cluster
    at org.apache.hadoop.hdfs.server.datanode.SecureDataNodeStarter.init(SecureDataNodeStarter.java:63)
    … 5 more
    28/08/2013 19:06:41 2675 jsvc.exec error: Cannot load daemon
    28/08/2013 19:06:41 2634 jsvc.exec error: Service exit with a return value of 3

    This is from the datanode log:

    ulimit -a for secure datanode user hdfs
    core file size (blocks, -c) 0
    data seg size (kbytes, -d) unlimited
    scheduling priority (-e) 0
    file size (blocks, -f) unlimited
    pending signals (-i) 63373
    max locked memory (kbytes, -l) 64
    max memory size (kbytes, -m) unlimited
    open files (-n) 32768
    pipe size (512 bytes, -p) 8
    POSIX message queues (bytes, -q) 819200
    real-time priority (-r) 0
    stack size (kbytes, -s) 10240
    cpu time (seconds, -t) unlimited
    max user processes (-u) 65536
    virtual memory (kbytes, -v) unlimited
    file locks (-x) unlimited

  • #33604

    Dave
    Moderator

    Hi Shanthan,

    What does your datanode log look like? You can find it under /var/log/hadoop/hdfs/.

    It looks like the datanode may be having some issues.
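
    Something like this will show the recent entries (the exact filename includes your hostname, hence the wildcard):

    # Show the last lines of the datanode log
    tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-datanode-*.log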

    Thanks

    Dave
