Most used data transfer tool


This topic contains 3 replies, has 3 voices, and was last updated by  Larry Liu 2 years, 2 months ago.

  • Creator
  • #23863

    What is the most used data transfer tool/technology for HDFS? Is it Flume, Sqoop, or other scripting languages?

Viewing 3 replies - 1 through 3 (of 3 total)


  • Author
  • #24196

    Larry Liu

    Hi Manoj,

    You are right. There are different tools for different use cases.
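
    Flume and Sqoop do target different ingest patterns: Flume streams event data (such as application logs) into HDFS continuously, while Sqoop bulk-transfers records between relational databases and HDFS. As a minimal sketch of the Flume side, assuming a hypothetical agent named a1, a log path, and a NameNode host:

    ```properties
    # Hypothetical Flume agent "a1": tail an application log into HDFS.
    # Agent name, log path, and HDFS host are placeholders.
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    # Source: follow a local log file
    a1.sources.r1.type = exec
    a1.sources.r1.command = tail -F /var/log/app.log
    a1.sources.r1.channels = c1

    # Channel: buffer events in memory
    a1.channels.c1.type = memory

    # Sink: write events into an HDFS directory
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events
    a1.sinks.k1.channel = c1
    ```

    An agent like this runs with `flume-ng agent --name a1 --conf-file <file>`; the memory channel trades durability for speed, so a file channel is the safer choice when events must not be lost.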



    I guess it also depends on what the most common data source for HDFS is :-). Since Sqoop transfers data between structured (relational) databases and HDFS, it seems like a no-brainer… but this is just an opinion, and hence I wanted to validate it.
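
    For the relational-database case this post describes, a Sqoop import is a single command. A hedged sketch, where the JDBC URL, username, table, and target directory are all placeholders, not values from this thread:

    ```shell
    # Hypothetical example: import one MySQL table into HDFS with Sqoop.
    # Connection string, credentials, table, and paths are placeholders.
    sqoop import \
      --connect jdbc:mysql://dbhost/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /data/sales/orders \
      --num-mappers 4
    ```

    Sqoop turns this into a MapReduce job, so `--num-mappers` controls how many parallel connections hit the source database; keep it low enough that the import does not overload the RDBMS.
    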



    Hi Manoj,
    I guess it all depends on what the use cases are. Maybe the community can take a survey?

    +1 for sqoop

