I deployed Hadoop and its components using Ambari. While starting the NameNode, I see the following error in the logs:

Fail: Execution of 'ulimit -c unlimited; export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode' returned 1. -bash: line 0: ulimit: core file size: cannot modify limit: Operation not permitted
starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-euca-192-168-217-80.eucalyptus.internal.out

Also, ownership of my /hadoop/hdfs/namenode/in_use.lock file is already set to hdfs.

Any help will be appreciated.
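The "cannot modify limit: Operation not permitted" message above usually means the account running the start command cannot raise its hard core-file-size limit, so `ulimit -c unlimited` fails. One common remedy (an assumption for this setup, not something confirmed in the post) is to raise the limit for the hdfs service user in /etc/security/limits.conf and log in again:

```
# /etc/security/limits.conf — assumed entries for the hdfs service user
# (domain  type  item  value)
hdfs  soft  core  unlimited
hdfs  hard  core  unlimited
```

Note that this ulimit warning may not be the actual cause of the start failure; the .out and .log files under /var/log/hadoop/hdfs (such as the one named in the log line above) usually contain the real stack trace.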