This blog focuses on moving streaming analytics outside the confines of the traditional data center. Moving streaming analytics closer to where data originates can be accomplished by pairing an enterprise-grade data movement application with an extremely lightweight streaming engine. Forward-looking organizations are already using this combination to solve use cases in a number of areas.
IoAT (Internet of AnyThing) data includes any new data source generated by sensors and machines, server logs, clickstream data from web application servers, social media, and even files and email. By collecting IoAT data at its origin and reacting to it rapidly with a high degree of analytic intelligence, companies can gain fast insights that outpace their competition. Whether it's personalizing a "next best offer" or responding immediately to an early-warning alert, IoAT data is only beginning to impact bottom-line revenue. The time to act is now, but what components do we need to add to our modern data applications?
Hortonworks DataFlow (HDF), powered by Apache NiFi, is a data application platform designed to solve data acquisition and delivery challenges, both inside and outside the data center. It provides a fast, easy, and secure way to move data from anywhere it originates to anywhere it needs to go. HDF offers a simple GUI for command and control of "data workflows," so there is no need to write custom data movement scripts, and it provides Simple Event Processing (SEP) out of the box to curate data payloads as they move from one place to another.
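To make the SEP idea concrete, here is a minimal Python sketch of what a stateless curate-and-route step does conceptually. The field names, threshold, and function names are illustrative assumptions, not the NiFi API; in HDF you would wire up processors such as UpdateAttribute and RouteOnAttribute in the GUI rather than writing this code:

```python
# Conceptual Simple Event Processing: each event is handled on its own,
# with no state carried from one event to the next. The field names,
# the threshold, and the site tag below are illustrative assumptions.

def enrich_event(event: dict) -> dict:
    """Curate the payload in flight, e.g. tag where it originated."""
    return {**event, "site": "edge-gateway-01"}

def route_event(event: dict) -> str:
    """Route a single event based only on its own contents."""
    if "temp_c" not in event:
        return "failure"                      # malformed payload
    return "hot" if event["temp_c"] > 80.0 else "normal"

events = [
    {"sensor_id": "s1", "temp_c": 72.5},
    {"sensor_id": "s2", "temp_c": 91.0},
]
for e in events:
    curated = enrich_event(e)
    print(route_event(curated), curated)
```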
SAS Event Stream Processing (ESP) analyzes and understands streaming data as it is being generated, detects patterns of interest as they occur, and issues the instructions needed to take the correct actions: which alerts to raise, and which portions of the data should be retained for further investigation.
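The defining difference from SEP is state: a CEP engine remembers past events so it can recognize patterns that no single event reveals. The sketch below illustrates the idea in plain Python with a hypothetical rising-trend detector; it is a conceptual stand-in, not the SAS ESP API, and the window size and readings are invented for illustration:

```python
from collections import deque

# Conceptual Complex Event Processing: unlike SEP, state is kept across
# events so patterns over time can be detected. Window size, threshold
# semantics, and the sample readings are illustrative assumptions.

class RisingTrendDetector:
    """Alert when the last `window` readings are strictly increasing."""

    def __init__(self, window: int = 3):
        self.window = window
        self.history: deque = deque(maxlen=window)

    def on_event(self, reading: float) -> bool:
        self.history.append(reading)
        if len(self.history) < self.window:
            return False
        readings = list(self.history)
        return all(a < b for a, b in zip(readings, readings[1:]))

detector = RisingTrendDetector(window=3)
for reading in [70.1, 70.4, 71.0, 69.8, 72.3]:
    if detector.on_event(reading):
        print(f"ALERT: rising trend ending at {reading} degC - retain data")
```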
By incorporating SAS ESP into our edge-originating, HDF-based data workflows, we can inject Complex Event Processing (CEP) into any IoAT application. It's important to note that both HDF and SAS ESP are ideal candidates to run on edge node gateways, which typically have very small memory footprints and run lightweight Linux kernels. Since HDF and SAS ESP are architected to run on everything from the smallest of devices to large clustered configurations, they are a perfect complement both inside and outside the data center.
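Putting the two together, the following sketch shows why this pairing suits a constrained edge gateway: malformed payloads are dropped locally (SEP), a stateful check watches for a pattern (CEP), and only alerts cross the network. Everything here is illustrative; forward_to_datacenter() is a hypothetical stand-in for an HDF/NiFi transfer back to the data center:

```python
# Conceptual edge pipeline: lightweight SEP and CEP run on the gateway,
# and only alerts leave for the data center, keeping network traffic and
# the gateway's footprint small. All names and thresholds are invented.

def forward_to_datacenter(alert: dict) -> None:
    print("forwarding:", alert)        # placeholder for real data movement

def run_edge_pipeline(raw_events) -> None:
    last = None
    for event in raw_events:
        if "temp_c" not in event:      # SEP: drop malformed payloads locally
            continue
        temp = event["temp_c"]
        if last is not None and temp - last > 5.0:   # CEP-style stateful check
            forward_to_datacenter({"alert": "temp_spike", **event})
        last = temp

run_edge_pipeline([
    {"sensor_id": "s1", "temp_c": 70.2},
    {"sensor_id": "s1", "temp_c": 70.5},
    {"sensor_id": "s1"},               # malformed: filtered at the edge
    {"sensor_id": "s1", "temp_c": 78.9},
])
```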
In summary, HDF, powered by Apache NiFi, is a secure, reliable enterprise data movement platform that provides Simple Event Processing (SEP) processors out of the box. HDF is architected to run from the data center to the edge of an IoAT framework and back. SAS ESP is an extremely fast streaming analytics engine, likewise architected to run in the data center and out to an edge node device and back. SAS ESP and HDF are seamlessly integrated via SAS ESP NiFi processors. Together, they can deliver immediate, actionable intelligence on streaming data for an improved customer experience and lower-cost operations.