Hadoop Ecosystem

Industry news, partner stories, buzz and happenings

Geoff Flood is president of T4G Limited and vice-chairman of the Province of New Brunswick's Research & Innovation Council. In this guest blog, Geoff elaborates on why “partnering with Hortonworks was simply a no-brainer for us. It’s a decision that will deliver prized and measurable value to our customers.”

Big data is more than just buzz; it’s a big deal. It’s changing everything in our lives and all around us. As president of a successful technology services firm in Canada, I knew we had to change, too, when it comes to designing, developing and implementing solutions for our customers across North America.…

I can’t believe it’s been six months since we first announced our expanded strategic alliance with Red Hat. For those who have been following this partnership, you know our goal is simple — to help organizations adopt enterprise Apache Hadoop as part of their modern data architecture. Our expanded relationship with Red Hat is closely aligned around a strategy of innovating in the open and applying enterprise rigor to open source software, thereby de-risking it for the enterprise and allowing faster adoption of Enterprise Apache Hadoop.…

The open source community, including Hortonworks, has invested heavily in building enterprise-grade security for Apache Hadoop. These efforts include Apache Knox for perimeter security, Kerberos for strong authentication and the recently announced Apache Argus incubator, which brings a central administration framework for authorization and auditing.
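As a rough illustration of how the perimeter piece fits in, a client outside the cluster might reach WebHDFS through a Knox gateway with a request along the lines of the sketch below; the gateway host, topology name and credentials are placeholders, not values from any of the posts here.

```python
# Minimal sketch: listing an HDFS directory through an Apache Knox gateway.
# Host, topology ("default"), credentials and CA path are illustrative.
import requests

KNOX_URL = "https://knox.example.com:8443/gateway/default/webhdfs/v1"

resp = requests.get(
    f"{KNOX_URL}/tmp",
    params={"op": "LISTSTATUS"},          # standard WebHDFS operation
    auth=("analyst", "password"),         # Knox authenticates at the perimeter
    verify="/etc/security/knox-ca.pem",   # gateway TLS certificate
)
resp.raise_for_status()

for entry in resp.json()["FileStatuses"]["FileStatus"]:
    print(entry["pathSuffix"], entry["type"])
```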

Join Hortonworks and Voltage Security in a webinar on August 27 to learn more.

In multi-platform environments with data coming from many different sources, personally identifiable information, credit card numbers, and intellectual property can land in the Hadoop cluster.…
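As a generic sketch of the idea only (not Voltage’s format-preserving encryption or Zettaset’s product), an ingest pipeline can protect a sensitive field before the record is ever written to the cluster:

```python
# Generic illustration: encrypt a credit card number before the record lands
# in Hadoop. Key handling is simplified; a real deployment would fetch keys
# from a key management service rather than generate them in the ingest script.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, retrieved from a KMS
cipher = Fernet(key)

record = {"customer": "C-1001", "card_number": "4111111111111111"}
record["card_number"] = cipher.encrypt(record["card_number"].encode()).decode()

# The protected record can now be written to HDFS or Hive without exposing PII.
print(record)
```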

Zettaset is a Hortonworks partner. In this guest blog, John Armstrong, VP of Marketing at Zettaset Inc., shares Zettaset’s security features and explains why data encryption is vital for data in the Hadoop infrastructure.

Comprehensive Security Across the Hadoop Infrastructure

As big data technologies like Hadoop become widely deployed in production environments, the expectation is that they will meet enterprise requirements for data governance, operations and security while integrating with existing data center infrastructure. …

The key to monetizing Big Data is not only the ability to capture and process information quickly but also to analyze the data and derive meaningful insights. Big Data can be complex, and the expertise of a programmer is often needed to create focused and targeted queries.

0xdata, a provider of open source machine learning and predictive analytics for Big Data, facilitates the manipulation and extraction of data with its H2O prediction engine for statisticians. …
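To give a flavor of what that looks like in practice, the sketch below drives H2O from its Python client; the file path, column names and model choice are illustrative only, not taken from the post above.

```python
# Minimal sketch of fitting a model with the H2O Python client.
import h2o
from h2o.estimators import H2OGeneralizedLinearEstimator

h2o.init()  # connect to (or start) an H2O instance

# Load data that already lives in HDFS (or on local disk) into an H2OFrame.
churn = h2o.import_file("hdfs:///data/telco/churn.csv")
churn["churned"] = churn["churned"].asfactor()

# Fit a simple logistic regression without writing any MapReduce code.
model = H2OGeneralizedLinearEstimator(family="binomial")
model.train(x=["calls", "minutes", "complaints"], y="churned", training_frame=churn)

print(model.auc(train=True))
```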

With the release of Apache Hadoop YARN in October of last year, more and more solution providers are moving from single-application Hadoop clusters to a versatile, integrated Hadoop 2 data platform. This allows them to host multiple applications — eliminating silos, maximizing resources and bringing true multi-workload capabilities to Hadoop. 

That is why we’re extremely excited to have Paul Kent, Vice President of Big Data at SAS, share his insights on the value of Apache Hadoop YARN and the benefits it brings to SAS and its users. …
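One concrete way to see that multi-workload picture is to ask the YARN ResourceManager, via its REST API, which applications are currently sharing the cluster. A minimal sketch follows; the host and port are placeholders.

```python
# Minimal sketch: list the applications sharing a YARN cluster.
import requests

RM = "http://resourcemanager.example.com:8088"

resp = requests.get(f"{RM}/ws/v1/cluster/apps", params={"states": "RUNNING"})
resp.raise_for_status()

apps = (resp.json().get("apps") or {}).get("app", [])
for app in apps:
    # Different engines (MapReduce, Tez, Spark, ...) show up side by side.
    print(app["applicationType"], app["queue"], app["name"])
```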

Few industries depend as heavily on data as financial services. Insurance companies, retail and investment banks aggregate, price and distribute capital with the aim of increasing their return on assets with an acceptable level of risk.

To do that, financial decision-makers need data. Apache Hadoop helps them store new data sources, then process the larger combined dataset for batch, interactive and real-time analysis. More data and better analysis improve bottom-line results.…

The world’s top telecommunications firms adopt Hadoop to gain competitive advantage and to respond to technology-driven changes like increases in both network traffic and the telemetry data captured by network sensors.

The majority of North America’s and Europe’s telcos have chosen Hortonworks Data Platform (HDP) to meet these challenges. Read the new Hortonworks white paper for a detailed discussion of twenty-one common telco and cable company use cases.

Download the White Paper

With their Modern Data Architectures based on HDP, these firms improve efficiency and capture opportunities in some of these ways:

  • Analyze call detail records (CDRs), as sketched below.
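For instance, once CDRs sit in Hive tables on HDP, an analyst can query them directly. The sketch below uses the PyHive client; the host, table and column names are illustrative, not drawn from the white paper.

```python
# Minimal sketch of a CDR analysis query against HiveServer2 on HDP.
from pyhive import hive

conn = hive.connect(host="hiveserver.example.com", port=10000, username="analyst")
cursor = conn.cursor()

# Which subscribers experienced the most dropped calls last month?
cursor.execute("""
    SELECT caller_id, COUNT(*) AS dropped_calls
    FROM cdr
    WHERE call_month = '2014-07' AND dropped = TRUE
    GROUP BY caller_id
    ORDER BY dropped_calls DESC
    LIMIT 20
""")

for caller_id, dropped_calls in cursor.fetchall():
    print(caller_id, dropped_calls)
```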

ScaleOut Software joined the Hortonworks Technology Partner Program and recently achieved Hortonworks Certified status for ScaleOut hServer. ScaleOut Software is a pioneer in in-memory data grid software, and ScaleOut hServer installs directly on Hadoop nodes and runs in memory. In this guest blog, William Bain, Founder and CEO, discusses the certification and a use case.

Recently, ScaleOut Software announced technical certification of its ScaleOut hServer® product on Hortonworks Data Platform 2.1.…

This is a guest blog from Protegrity, a Hortonworks certified partner.

As Hadoop takes on a more mission-critical role within the data center, the top IT imperatives of process innovation, operational efficiency and data security naturally follow. One imperative in particular now tops the requirement list for Hadoop consideration within the enterprise: a well-developed framework to secure data.

The open source community has responded. Work is underway to build out a comprehensive and coordinated security framework for Hadoop that can work well with existing IT security investments.…

This is a guest blog from Voltage Security, a Hortonworks partner.

Data Security for Hadoop is a critical requirement for adoption within the enterprise. Organizations must protect sensitive customer, partner and internal information and adhere to an ever-increasing set of compliance requirements. The security challenges these organizations are facing are diverse and the technology is evolving rapidly to keep pace. 

An Open Community For Platform Security

The open source community, including Hortonworks, has invested heavily in building enterprise-grade security for Apache Hadoop. …

This is a guest post from Hortonworks partner Dataguise. Dataguise is an HDP 2.1 certified technology partner providing sensitive data discovery, protection and reporting in Hadoop.

According to a 2013 Global Data Breach study by the Ponemon Institute, the average cost of data loss exceeds $5.4 million per breach, and the average cost of lost data approaches $200 per record in the United States. No industry is spared from this threat, and all of our data systems, including Hadoop, need to address this security concern.…

HP and Hortonworks recently announced a strategic partnership that included a $50 million equity investment by HP. While the investment is important, there is an equally important joint commitment to help accelerate the adoption of Enterprise Apache Hadoop by deeply integrating the Hortonworks Data Platform (HDP) with the HP HAVEn big data platform.

Below are some thoughts on our joint work from the HP OMi Team…

The first area of joint engineering strategy between our companies will be to integrate Apache Ambari, which provides tools and APIs to provision, manage and monitor Hadoop clusters, with HP Operations Manager i (OMi). …
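To give a sense of the data that integration builds on, the sketch below reads service health from Ambari’s REST API, the kind of status an external monitor such as OMi could consume. The host, cluster name and credentials are placeholders.

```python
# Minimal sketch: read Hadoop service state from the Ambari REST API.
import requests

AMBARI = "http://ambari.example.com:8080/api/v1"
AUTH = ("admin", "admin")

resp = requests.get(
    f"{AMBARI}/clusters/mycluster/services",
    auth=AUTH,
    params={"fields": "ServiceInfo/state"},   # partial response: just the state
)
resp.raise_for_status()

for item in resp.json()["items"]:
    info = item["ServiceInfo"]
    print(info["service_name"], info["state"])   # e.g. HDFS STARTED
```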

Tresata, a Hortonworks Certified Technology Partner, is a next-generation predictive analytics software company that helps enterprises monetize the big data™ they have moved to Hadoop. In this blog, Tresata’s Director of Marketing, Katie Levans (@katie_levans), describes the value of HDP 2.1 certification and the benefits of their solution.

Last month Tresata announced the release of the third generation of its hugely successful software application, TREE 3.3, and its subsequent certification on HDP 2.1.…

Today we are delighted to announce a formal partnership between Accenture and Hortonworks, the continuing evolution of a collaboration between the two companies that began in 2012. With this formal agreement, Accenture and Hortonworks will work to make large structured and unstructured datasets – including operational, video and sensor data – more accessible to organizations for insight-driven decision-making. Together, the two companies will continue to collaborate on joint horizontal and vertical solutions to speed the adoption of Apache Hadoop.…
