Powered by Apache NiFi, Kafka and Storm, HDF collects, curates, analyzes and delivers real-time data from the Internet of Anything (IoAT) to data stores both on-premises and in the cloud.
Integrated collection from dynamic, disparate and distributed sources with differing formats, schemas, protocols, speeds and sizes, such as machines, geolocation devices, clickstreams, files, social feeds, log files and videos.
Real-time evaluation of perishable insights at the edge to determine what is pertinent, and execution of the consequent decision to send, drop or locally store data as needed, all through a visual user interface with real-time operational visibility and control. Unprecedented operational effectiveness is achieved by eliminating the dependencies and delays inherent in custom coding and scripting.
Secure end-to-end routing from source to destination, with discrete user authorization and a detailed, real-time visual chain of custody and metadata (data provenance). The ability to equally support security and encryption on small-scale, JVM-capable data sources at the edge of the Internet of Things, and on large-scale enterprise clusters in support of the Internet of Anything, ensures high trust in analytical outcomes and their underlying data.
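The chain-of-custody idea above can be sketched as a simple event log that records every component that touches a piece of data. This is a minimal, illustrative model only; the class and field names here are hypothetical and do not reflect HDF's actual provenance schema or API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class ProvenanceEvent:
    """One step in a piece of data's chain of custody (hypothetical schema)."""
    event_type: str   # e.g. "RECEIVE", "ROUTE", "SEND"
    component: str    # the processing step that touched the data
    timestamp: str    # UTC time the event was recorded

@dataclass
class FlowProvenance:
    """Accumulates the full lineage of a single piece of data."""
    flow_id: str
    events: List[ProvenanceEvent] = field(default_factory=list)

    def record(self, event_type: str, component: str) -> None:
        self.events.append(ProvenanceEvent(
            event_type, component,
            datetime.now(timezone.utc).isoformat()))

    def lineage(self) -> List[str]:
        # Human-readable chain of custody, oldest event first.
        return [f"{e.event_type}@{e.component}" for e in self.events]

# Hypothetical lineage for one record moving edge -> broker.
prov = FlowProvenance("flow-001")
prov.record("RECEIVE", "EdgeCollector")
prov.record("ROUTE", "ContentRouter")
prov.record("SEND", "KafkaPublisher")
print(prov.lineage())  # → ['RECEIVE@EdgeCollector', 'ROUTE@ContentRouter', 'SEND@KafkaPublisher']
```

Because every hop appends an event, an auditor can replay exactly where the data came from and which components handled it, which is the property the provenance feature provides.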
HDF provides secure access to and control of data to enable informed business decisions. By enabling the sharing of specific pieces of information, rather than role-based access that grants blanket access to an entire silo of data at once, HDF allows enterprises to dynamically and securely share select pieces of pertinent data and gain new business insights.
Real-time edge analytics eases integration with log analytics systems such as Splunk, SumoLogic, Graylog and Logstash for easy, secure and comprehensive ingest of log files. By cost-effectively increasing the volume of data collected with content-based routing, enterprises can accelerate troubleshooting and improve anomaly detection with a more comprehensive view across all available machine data.
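Content-based routing means the decision of where a record goes is made by inspecting the record itself. The sketch below illustrates the concept with hypothetical routing rules and destination names; in HDF these rules are defined visually in the dataflow interface, not in code:

```python
def route_log_line(line: str) -> str:
    """Pick a destination for a log line based on its content.
    The rules and destination names here are illustrative assumptions."""
    if "ERROR" in line or "FATAL" in line:
        return "alerts"    # high-value events go to the log analytics system
    if "DEBUG" in line:
        return "archive"   # low-value noise is stored cheaply
    return "metrics"       # everything else feeds dashboards

lines = [
    "12:00:01 INFO service started",
    "12:00:02 DEBUG cache miss",
    "12:00:03 ERROR disk full",
]
print([route_log_line(l) for l in lines])  # → ['metrics', 'archive', 'alerts']
```

Routing on content rather than on source lets high-value events reach the analytics system immediately while bulk data is diverted to cheaper storage, which is how collection volume can grow without a matching growth in ingest cost.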
Designed to run equally well on the small-scale data sources that make up the Internet of Things and on large-scale clusters in today's enterprise data centers, HDF supports the Internet of Anything. HDF securely moves data from wherever it is to wherever it needs to go, regardless of size, shape or speed, dynamically adapting to the needs of the source, the connection and the destination.
Use HDF's visual user interface to drag and drop dataflows that encrypt streaming data, route it to Kafka, and configure buffers and congestion management, so that data can be dynamically prioritized and securely sent from source to destination, with immediate responsiveness to the fluctuating conditions that commonly occur at the edge.
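The buffering and prioritization behavior described above can be sketched as a bounded buffer that, under congestion, keeps the highest-priority items and drops the rest. This is a conceptual stand-in only, assuming a simple integer priority per item; it is not HDF's actual buffering implementation:

```python
import heapq

class PriorityBuffer:
    """Bounded buffer that evicts the lowest-priority item when full
    (a conceptual sketch of prioritized buffering under congestion)."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._heap = []   # min-heap: lowest-priority entry sits at the root
        self._seq = 0     # tie-breaker preserving arrival order

    def offer(self, priority: int, item) -> bool:
        """Try to enqueue an item; returns False if it was dropped."""
        entry = (priority, self._seq, item)
        self._seq += 1
        if len(self._heap) < self.capacity:
            heapq.heappush(self._heap, entry)
            return True
        if self._heap[0][0] < priority:
            heapq.heapreplace(self._heap, entry)  # evict lowest priority
            return True
        return False  # new item is the lowest priority: drop it

    def drain(self):
        """Deliver buffered items, highest priority first."""
        ordered = sorted(self._heap, key=lambda e: (-e[0], e[1]))
        self._heap = []
        return [item for _, _, item in ordered]

buf = PriorityBuffer(2)
buf.offer(1, "telemetry")
buf.offer(5, "alarm")
buf.offer(3, "status")   # buffer is full: "telemetry" (priority 1) is evicted
print(buf.drain())       # → ['alarm', 'status']
```

The design choice illustrated is that when a slow destination causes the buffer to fill, the flow degrades gracefully by shedding the least important data first instead of blocking or dropping indiscriminately.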
Get HDF release notes, plus getting-started, user and developer guides.
The industry's best support for Apache NiFi, Kafka and Storm in the enterprise. Connect with our team of experts to help guide your journey.
Real-world training from the Big Data experts. Available in person or on-demand whenever you need us.