Hortonworks Sandbox Forum

Error when trying to write from Talend to HDP 2.0

  • #48500
    David Goverman
    Participant

    Registered: 2014-02-10
    Posts: 1

    Issue writing to HDP 2.0 from Talend 5.4.1
    Tags: [bug, development, error, HDP 2.0, Hortonworks]

    Downloaded the most recent Hortonworks HDP Sandbox (v2.0) and the latest Talend Open Studio for Big Data (v5.4.1) on 2/10/14. I am able to interact with and upload data to the HDP VM through the Hue interface. However, when trying to upload data via Talend per the tutorial available online at http://hortonworks.com/kb/how-to-connectwrite-a-file-to-hortonworks-sandbox-from-talend-studio/ I am receiving the following error (excerpt due to length):

    [ERROR]: org.apache.hadoop.hdfs.DFSClient – Failed to close file /user/root/testfilez
    org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/root/testfilez could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.


    The file is created in HDFS, but it is 0 bytes, so no content is actually written.

    Based on research online, I have verified that the datanode is running and is not full (see attached cluster summary image). I have seen similar issues with other versions of HDP/Talend in online forums, none of which provided a solution. I am trying to have a tRowGenerator in Talend generate 100 rows and then output them into Hadoop via a tHDFSOutput. Every time I run the job, I receive the error. I have the following configured in Talend:

    tHDFSConnection:

    Hadoop Version: Hortonworks Data Platform V2.0.0 (BigWheel)
    NameNode URI: “hdfs://127.0.0.1:8020/”
    User name: “root” (have also tried “sandbox” and “hue”)

    tHDFSOutput:

    File Name: “/user/root/testfile” (have tried “/” and “/user/hue/testfile”)

    tRowGenerator:

    Configured to generate 100 rows, each with two string columns and one int column.

    The tHDFSConnection is connected to the tRowGenerator via an OnComponentOk trigger, and the tRowGenerator is connected to the tHDFSOutput via a Row > Main link.
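
    For reference, to rule Talend itself out, the equivalent write boils down to a few lines against the HDFS Java API. This is a rough sketch only (untested here; it uses the same NameNode URI and user name as the tHDFSConnection above, and assumes hadoop-client jars matching HDP 2.0 on the classpath):

        import java.net.URI;
        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FSDataOutputStream;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class HdfsWriteTest {
            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                // Same NameNode URI and user name as in the tHDFSConnection above.
                FileSystem fs = FileSystem.get(
                        new URI("hdfs://127.0.0.1:8020/"), conf, "root");
                // Write one short line; if this also fails with the same
                // minReplication error, the problem is not Talend-specific.
                FSDataOutputStream out = fs.create(new Path("/user/root/testfile"));
                out.writeBytes("hello from the host machine\n");
                out.close();
                fs.close();
            }
        }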

    Can anyone please suggest a solution to this connectivity issue? Thanks in advance for your advice!
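
    P.S. For anyone wanting to double-check the datanode claim above: the check was done inside the VM with the stock tooling, along the lines of:

        hdfs dfsadmin -report

    which showed one live datanode with ample free capacity, consistent with the attached cluster summary.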


  • #48576
    iandr413
    Moderator

    Hi David,
    Have you tried the initial setup doc located at http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.9.0/bk_dataintegration/content/ch_talend-instruct.html#ch_talend-instruct-job? It may give you a working baseline for the job and highlight any missing setup; that might be a helpful place to start.

    Ian

    #48577
    David Goverman
    Participant

    Hi Ian,

    Yes, I have tried the instructions in the document you referenced and received the exact same error. It seems that any action that writes to HDFS from Talend generates that error. Any further suggestions? Thanks in advance!

    David

    #48603
    iandr413
    Moderator

    I just did a quick setup myself and see the same issue. I have come to the conclusion it is due to the setup of the sandbox and the various IP addresses at work inside it. I believe the file entry is created on the NameNode, but when the client tries to write the data to the datanode it cannot, because the NameNode hands back the VM-internal address, which the host machine cannot reach. You could try putting that address in your local hosts file so the client knows where it needs to go, and see if it makes a difference.
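
    For example, an entry along these lines in the host machine's hosts file (the hostname here is my guess at the sandbox default; check the VM's actual hostname by running hostname inside it):

        127.0.0.1    sandbox.hortonworks.com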

    Have you tried doing a simple one-node install in a VirtualBox instance and connecting there? When I have some free time this week, I am going to try this on a VirtualBox instance I have to see if it makes a difference.

    Ian

    #48609
    iandr413
    Moderator

    I just tested this against my single-node cluster running on VirtualBox and it worked. FYI.

    Ian

    #49122
    David Goverman
    Participant

    Ian,

    I was able to resolve this issue as well by following the advice given in this Talend forum post:

    http://www.talendforge.org/forum/viewtopic.php?id=34330
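
    In case that link ever goes stale: as I understood it (paraphrasing from memory, so do verify against the post itself), the key change was telling the HDFS client to address the datanode by hostname rather than by the VM-internal IP the NameNode reports, via this property in the tHDFSConnection's Hadoop properties, together with a hosts-file entry for the sandbox hostname as Ian described:

        dfs.client.use.datanode.hostname=true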

    Hopefully this will help others trying to stand up the sandbox on their machines.

    #61285
    Zack Riesland
    Participant

    I am experiencing (what I think is) the same basic issue, with different details.

    I’m running the 2.1 sandbox using the VirtualBox image from Hortonworks.

    I have Pentaho 5.1 set up with the HDP21 ‘shim’ to integrate with Hadoop 2.1, running on the (Windows 8) host machine.

    I am able to transfer the file to HDFS on the sandbox VM, but the new file always ends up empty.

    The error in the log is:

    Caused by: File /user/pdi/weblogs/raw/weblogs_rebuild.txt could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

    I am able to upload the same weblogs_rebuild.txt file via the File Uploader tool in the ‘Hue’ interface, so I don’t think this is a storage issue.

    Some of the forums I have found suggest that I need to change the network config of the VM image from “NAT” to “Bridged Adapter” (along the lines of the IP Address mapping issues mentioned in this thread).

    However, when I do this, I am unable to access the machine at all (via Hue, Pentaho, etc.).

    I’m guessing (hoping) that there’s a set of changes I can apply to the network settings to overcome this issue.

    Is there any documentation or guidance on the recommended network settings for interfacing with the sandbox from the host machine?
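
    For what it’s worth, the fallback I’m considering is staying on NAT and forwarding the relevant ports from the host instead, something like the following with the VM powered off (the VM name and the exact port list are my guesses; 8020 is the NameNode RPC port and 50010 the datanode transfer port on HDP 2.x):

        VBoxManage modifyvm "Hortonworks Sandbox 2.1" --natpf1 "namenode,tcp,,8020,,8020"
        VBoxManage modifyvm "Hortonworks Sandbox 2.1" --natpf1 "datanode,tcp,,50010,,50010"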

    Thanks!

