Integrate legacy systems/data with Apache Hadoop
Data integration is a key step in a Hadoop solution architecture. Hortonworks Data Platform (HDP) provides a set of tools from Talend that abstract away integration complexity. Talend Open Studio for Big Data offers a graphical environment where you drag and drop pre-built components onto a canvas and configure them; the underlying Java code is then generated for you.
Hortonworks Data Platform's integration with Talend goes further than other distributions, extending the offering past HDFS, Hive, Pig and HBase into HCatalog (the metadata service) and Oozie (the workflow and job scheduler). HDP also ships Apache Sqoop, a tool for bulk-transferring data from relational stores into HDFS.
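As a sketch of what such a Sqoop transfer looks like, the command below imports a relational table into HDFS. The JDBC URL, credentials, table name, and target directory are all hypothetical; substitute your own environment's values.

```shell
# Hypothetical example: import an "orders" table from a MySQL
# database into HDFS. All connection details are illustrative.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /user/etl/orders \
  --num-mappers 4
```

Sqoop parallelizes the transfer across the number of map tasks given by `--num-mappers`, splitting the table on its primary key by default.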
HCatalog components provide an interface to the metadata services in HDP. These components let you easily create, drop and modify tables and databases, check for their existence, and so on. When storing data from any of the components, you can also choose HCatalog as the storage option so that the data is aligned with a single schema throughout Hadoop.
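Under the hood, these table operations map to HCatalog DDL, which can also be issued directly from the `hcat` command-line tool. A minimal sketch, with illustrative table and column names:

```shell
# Hypothetical example: define a table through HCatalog so that
# Pig, Hive and MapReduce all see the same schema.
hcat -e "CREATE TABLE web_logs (
  ip  STRING,
  ts  STRING,
  url STRING
) PARTITIONED BY (dt STRING)
STORED AS ORC;"

# Check that the table exists, and drop it when no longer needed.
hcat -e "SHOW TABLES LIKE 'web_logs';"
hcat -e "DROP TABLE IF EXISTS web_logs;"
```

Because the schema lives in HCatalog rather than in any one tool, downstream jobs can load the table by name instead of hard-coding file paths and formats.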
Finally, a set of components delivers Apache Pig scripts without your writing a line of code. Components for join, aggregate, filter, cross and more are all included. Again, you drop a component onto the canvas, connect the schema, configure the function, and the underlying code is generated for you, shortening your time to delivery.
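For a sense of what those components generate, the Pig Latin below sketches a filter, join and aggregate over two HCatalog-managed tables. The table names, field names and date literal are hypothetical, used only to illustrate the shape of the script.

```pig
-- Hypothetical example: revenue by region from two
-- HCatalog-managed tables. All names are illustrative.
orders    = LOAD 'sales.orders'
            USING org.apache.hive.hcatalog.pig.HCatLoader();
customers = LOAD 'sales.customers'
            USING org.apache.hive.hcatalog.pig.HCatLoader();

-- Keep only recent orders, join to customers, sum per region.
recent  = FILTER orders BY dt >= '2014-01-01';
joined  = JOIN recent BY customer_id, customers BY id;
grouped = GROUP joined BY customers::region;
totals  = FOREACH grouped GENERATE
            group AS region,
            SUM(joined.recent::amount) AS revenue;

STORE totals INTO 'sales.revenue_by_region'
  USING org.apache.hive.hcatalog.pig.HCatStorer();
```

Because the load and store steps go through HCatalog, the script reads and writes by table name and inherits the shared schema, rather than managing HDFS paths and serializers itself.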