HDF makes streaming analytics faster and easier by enabling accelerated data collection, curation, analysis, and delivery in real time, on premises or in the cloud, through an integrated solution with Apache NiFi, Kafka, and Storm.
HDF integrates collection from dynamic, disparate, and distributed sources of differing formats, schemas, protocols, speeds, and sizes, such as machines, geolocation devices, clickstreams, files, social feeds, log files, and video.
Real-time evaluation of perishable insights at the edge determines what is pertinent and what is not, and the consequent decisions to send, drop, or locally store data are executed through a visual user interface with real-time operational visibility and control. Operational effectiveness improves because the dependence and delays inherent in a coding and custom-scripting approach are eliminated.
Secure end-to-end routing from source to destination, with discrete user authorization and detailed, real-time visual chain of custody and metadata (data provenance). Because HDF supports security and encryption equally on small-scale, JVM-capable data sources at the edge of the Internet of Things and on large-scale enterprise clusters, analytical outcomes and their underlying data can be trusted.
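The chain-of-custody idea can be sketched as hash-linked custody records. This is an illustration only, not NiFi's provenance implementation; the function and field names here are assumptions.

```python
# Illustrative sketch (not HDF/NiFi code) of a tamper-evident chain of
# custody of the kind data provenance captures per piece of data.
import hashlib
import json

def provenance_event(prev_hash, actor, action, data):
    """Append one custody event, hash-linked to the previous event."""
    event = {
        "actor": actor,          # who touched the data
        "action": action,        # e.g. RECEIVE, ROUTE, SEND
        "content_sha256": hashlib.sha256(data).hexdigest(),
        "prev": prev_hash,       # link that makes the chain verifiable
    }
    event_hash = hashlib.sha256(
        json.dumps(event, sort_keys=True).encode()).hexdigest()
    return event, event_hash

payload = b"sensor reading 42"
e1, h1 = provenance_event("GENESIS", "edge-device", "RECEIVE", payload)
e2, h2 = provenance_event(h1, "nifi-node-1", "SEND", payload)
print(e2["prev"] == h1)  # → True: custody is verifiable end to end
```

Because each record embeds the hash of its predecessor, altering any earlier step invalidates every hash after it, which is what makes the custody trail trustworthy.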
HDF supports stream processing to aggregate and analyze event data, dynamically recognizing data patterns and detecting outliers. Real-time, high-volume event processing for immediate action and response to streaming data is delivered through an integrated enterprise offering of Apache NiFi, MiNiFi, Kafka, and Storm.
Apache NiFi and MiNiFi provide dynamic, configurable data pipelines through which all sources, systems, and destinations communicate. Kafka adapts to differing rates of data creation and delivery, while real-time streaming analytics with Storm yields immediate insights at massive scale.
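The way Kafka absorbs differing rates between producers and consumers can be sketched as a bounded buffer between a bursty source and a steady sink. This is a stand-in illustration in plain Python, not Kafka's API; all names here are assumptions.

```python
# Minimal sketch (not HDF code): a bounded buffer decoupling a bursty
# producer from a steady consumer, the role Kafka plays between NiFi
# dataflows and Storm topologies.
import queue
import threading

buffer = queue.Queue(maxsize=100)  # stands in for a Kafka topic
SENTINEL = object()

def producer(events):
    """Bursty source: pushes events as fast as it can."""
    for e in events:
        buffer.put(e)        # back-pressure: put() blocks when full
    buffer.put(SENTINEL)

def consumer(results):
    """Steady sink: drains at its own pace, like a Storm bolt."""
    while True:
        e = buffer.get()
        if e is SENTINEL:
            break
        results.append(e.upper())  # stand-in for streaming analytics

results = []
t1 = threading.Thread(target=producer, args=(["click", "login", "error"],))
t2 = threading.Thread(target=consumer, args=(results,))
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # → ['CLICK', 'LOGIN', 'ERROR']
```

The bounded queue is the key design point: the producer never overruns the consumer, because writes block once the buffer is full, which is the decoupling behavior the paragraph above describes.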
HDF provides secure access and control of data to enable informed business decisions. By sharing specific pieces of information rather than granting role-based blanket access to an entire silo of data at once, HDF allows enterprises to dynamically and securely share select pieces of pertinent data and gain new business insights.
Real-time edge analytics eases integration with log analytics systems such as Splunk, Sumo Logic, Graylog, and Logstash for easy, secure, and comprehensive ingest of log files. By cost-effectively increasing the volume of data collected with content-based routing, enterprises can accelerate troubleshooting and improve anomaly detection with a more comprehensive view across all available machine data.
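Content-based routing can be sketched as a function that inspects each log line and decides where it goes. In HDF this is configured visually in NiFi rather than coded; the function and route names below are illustrative assumptions, not HDF APIs.

```python
# Illustrative sketch of content-based routing of log lines at the edge.
def route(line):
    """Send only pertinent lines onward; drop routine noise locally."""
    if "ERROR" in line or "FATAL" in line:
        return "to_log_analytics"   # e.g. forwarded to Splunk or Graylog
    if "WARN" in line:
        return "to_local_store"     # kept at the edge for inspection
    return "drop"                   # routine noise never leaves the edge

lines = [
    "2016-05-01 INFO service started",
    "2016-05-01 WARN disk at 85%",
    "2016-05-01 ERROR connection refused",
]
print([route(l) for l in lines])
# → ['drop', 'to_local_store', 'to_log_analytics']
```

Routing on content rather than forwarding everything is what lets collection volume grow cost-effectively: only the lines worth analyzing consume bandwidth and log-analytics licensing.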
Equally well suited to the small-scale data sources that make up the Internet of Things and to large-scale clusters in today's enterprise data centers, HDF is designed to support the Internet of Anything. HDF securely moves data from wherever it is to wherever it needs to go, regardless of size, shape, or speed, dynamically adapting to the needs of the source, the connection, and the destination.
Use HDF's visual user interface to drag and drop dataflows that encrypt streaming data, route it to Kafka, and configure buffers and congestion management, so that data can be dynamically prioritized and securely sent from source to destination with immediate responsiveness to the fluctuating conditions common at the edge.
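Dynamic prioritization under congestion can be sketched as a fixed-capacity priority buffer that evicts the least urgent queued item when a more urgent one arrives. In HDF this policy is configured visually; the class below is an illustrative sketch, not NiFi internals.

```python
# Illustrative sketch: a bounded priority buffer for congestion
# management at the edge. Lower priority value = more urgent.
import heapq

class PriorityBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self._heap = []   # min-heap of (priority, seq, item)
        self._seq = 0     # tie-breaker preserves FIFO within a priority

    def offer(self, priority, item):
        """Queue an item; under congestion, keep only the most urgent."""
        if len(self._heap) >= self.capacity:
            worst = max(self._heap)     # least urgent queued entry
            if priority >= worst[0]:
                return False            # new item is no more urgent: drop
            self._heap.remove(worst)    # evict to make room
            heapq.heapify(self._heap)
        heapq.heappush(self._heap, (priority, self._seq, item))
        self._seq += 1
        return True

    def drain(self):
        """Deliver most-urgent-first, e.g. onward to a Kafka topic."""
        out = []
        while self._heap:
            out.append(heapq.heappop(self._heap)[2])
        return out

buf = PriorityBuffer(capacity=2)
buf.offer(5, "telemetry")             # routine reading
buf.offer(1, "alert")                 # urgent
accepted = buf.offer(9, "heartbeat")  # buffer full, least urgent: dropped
delivered = buf.drain()
print(accepted, delivered)  # → False ['alert', 'telemetry']
```

The design choice worth noting is that congestion is handled by policy (drop the least important data) rather than by blocking the source, which is what "dynamically prioritized" delivery under fluctuating edge conditions implies.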
Get HDF release notes, getting-started guides, and guides for users and developers.
The industry's best support for Apache NiFi, Kafka, and Storm in the enterprise. Connect with our team of experts to help guide your journey.
Real-world training from the Big Data experts, available in person or on demand, whenever you need us.