Can you post your /var/log/hadoop/hdfs/hadoop-hdfs-namenode-euca-192-168-217-80.eucalyptus.internal.out ?
I deployed Hadoop and its components using Ambari. When I start the NameNode, I see the following error logs. Any help will be appreciated.
Fail: Execution of 'ulimit -c unlimited; export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode' returned 1. -bash: line 0: ulimit: core file size: cannot modify limit: Operation not permitted
starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-euca-192-168-217-80.eucalyptus.internal.out
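The "cannot modify limit: Operation not permitted" message usually means the hard core-file-size limit for the account running the command is lower than `unlimited`, so the soft limit cannot be raised. A minimal sketch for inspecting the limits (the `hdfs` user and the `/etc/security/limits.conf` path are assumptions typical of an Ambari deployment on RHEL/CentOS — adjust for your setup):

```shell
# Show the soft and hard core-file-size limits for the current shell's user.
# 'ulimit -c unlimited' fails whenever the hard limit is below 'unlimited'.
ulimit -S -c   # soft limit
ulimit -H -c   # hard limit

# If the hard limit is restricted, one common fix (assumed, not confirmed
# by this thread) is to raise it in /etc/security/limits.conf for the
# service account that starts the NameNode, e.g.:
#   hdfs  soft  core  unlimited
#   hdfs  hard  core  unlimited
# then log the user out and back in so pam_limits re-applies the values.
```

Run the two `ulimit` commands as the same user Ambari uses to start the NameNode, since limits are per-user.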