Home Forums Hive / HCatalog HCatalog: Error while importing files


This topic contains 0 replies, has 1 voice, and was last updated by RAVI RANJAN 9 months ago.

  • #49189

    RAVI RANJAN
    Participant

    Hi,

    I am working on a 4-node cluster: 1 NameNode, 1 Secondary NameNode, and 2 DataNodes. I was following the tutorials on Hortonworks. I uploaded the file and created the table 'nyse_stocks'. However, when I try to load data into the table, I get this error:

    IOException: Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try. (Nodes: current=[10.45.153.156:50010, 10.180.193.223:50010], original=[10.180.193.223:50010, 10.45.153.156:50010]). The current failed datanode replacement policy is DEFAULT, and a client may configure this via 'dfs.client.block.write.replace-datanode-on-failure.policy' in its configuration.

    When I checked in Ambari, I did not see any bad DataNodes; all DataNodes are reported as healthy. Can anyone please suggest a fix for this issue?
    Thank you
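
    For reference, the property named in the error message is a client-side HDFS setting. Below is a sketch of what an override in the client's hdfs-site.xml would look like; the value shown is an assumption about a suitable workaround for a 2-DataNode cluster, not a confirmed fix for this specific setup:

    ```xml
    <!-- Client-side hdfs-site.xml (illustrative, assumed values).
         With only 2 DataNodes, the write pipeline has no spare node to
         substitute when one is suspected bad, so the DEFAULT replacement
         policy can abort the write with the IOException shown above. -->
    <property>
      <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
      <!-- NEVER: continue writing on the remaining DataNodes rather than
           trying to replace the failed one. Reasonable on very small
           clusters; on larger clusters it can reduce durability. -->
      <value>NEVER</value>
    </property>
    ```

    Another option on a cluster this small is to keep the default policy but set dfs.replication to 2 so the replication factor does not exceed the number of available DataNodes.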
