Is your University taking advantage of Big Data to improve student performance and raise professor effectiveness, while reducing administrative workloads?
Student performance data is increasingly being captured as part of software-based and online classroom exercises and testing. This data can be augmented with behavioral data captured from sources such as social media, student-professor meeting notes, blogs, student surveys, and so forth to discover new insights to improve student learning. The results transcend traditional IT departments to focus on issues like retention, research, and the delivery of content and courses through new modalities.
Hortonworks is partnering with Microsoft to show you how the Hortonworks Data Platform (HDP) running on the Microsoft stack enables you to develop a “single view of a student”.
In addition, we will review a use case of how HDP is used in Genomic Research to analyze data and develop actionable insights from the analysis.
Mike Christensen and Brian Hagan from Hortonworks will share their insights with you. Learn how you can:
Part five in a five-part series, this webcast will be a demonstration of the integration of Apache Zeppelin and Pivotal HDB. Apache Zeppelin is a web-based notebook that enables interactive data analytics. You can make beautiful data-driven, interactive and collaborative documents with SQL, Scala and more. This webinar will demonstrate the configuration of the psql interpreter and the basic operations of Apache Zeppelin when used in conjunction with Hortonworks HDB.
Chief Data Officers in financial services have unique challenges: they need to establish an effective data ecosystem under strict governance and regulatory requirements. They need to build the data-driven applications that enable risk and compliance initiatives to run efficiently. In this webinar, we will discuss the case of a global banking leader and the anti-money laundering solution they built on the data lake. With a single platform to aggregate structured and unstructured information essential to determine and document AML case disposition, they reduced mean time for case resolution by 75%. They have a roadmap for building over 150 data-driven applications on the same search-based data discovery platform so they can mitigate risks and seize opportunities, at the speed of business.
Vamsi Chemitiganti, GM – Financial Services at Hortonworks and Lee Phillips, Product Marketing at Attivio.
Customers are preparing to analyze and manage an increasing quantity of structured and unstructured data. Business leaders introduce new analytical workloads faster than IT departments can handle them. Legacy IT infrastructure needs to evolve to deliver operational improvements and cost containment, while increasing flexibility to meet future requirements. By providing HDP on IBM Power Systems, Hortonworks and IBM are giving customers more choice in selecting the architectural platform that is right for them. In this webinar, we’ll discuss some of the challenges of deploying big data platforms, and how solutions built with HDP on IBM Power Systems can offer tangible benefits and the flexibility to accommodate changing needs.
Many organisations are now looking to stream operational data in real-time from their transactional RDBMS systems into Hadoop big data platforms, in order to support new analytics use cases. Such data can then be combined with other data stored on the Hadoop cluster, and critical decisions made on up-to-date information, with more reliable results. Real-time data streaming into Hadoop also enables analytics while the data is ‘in-motion’, a powerful mechanism for reacting in real-time to customer actions and requests.
Hortonworks and Oracle can provide comprehensive solutions that allow organisations to respond rapidly to data events.
During this webinar we will cover:
Attend this webinar to learn how Hortonworks and Oracle can help you with your real-time big data analytics and streaming initiatives!
Part four in a five-part series, this webcast will be a demonstration of the installation of Apache MADlib (incubating) into Hortonworks HDB. MADlib is an open-source library for scalable in-database analytics, providing data-parallel implementations of mathematical, statistical and machine learning methods for structured and unstructured data. This webinar will demonstrate the installation procedure, as well as some basic machine learning algorithms to verify the install.
Big Data is transforming the way companies manage and use their data. They now have more data than ever before, at rest and in motion, coming from more sources across the organisation.
They cannot afford to miss opportunities for deeper insight because of the time it takes to prepare data and make it available. They also need enterprise-grade data loading solutions that go well beyond tools like Sqoop, which is better suited to test and development environments.
Join us in this webinar to understand how Attunity and Hortonworks solutions address these challenges. You will see:
You will also have the opportunity to ask the experts questions about your own Big Data challenges.
You know that your data warehouse is necessary for analytics initiatives that help guide management decisions and serve your customers better. But do you know how offloading data from your data warehouse to Hadoop can help you save money, improve performance and rebalance workloads?
Join subject matter experts from HPE, Hortonworks and Attunity for a one-hour webinar where you’ll learn how to:
• determine which data is cold, warm and hot
• rebalance your data warehouse to identify which workloads are right for Hadoop
• integrate your current data warehouse with a modern data architecture
• get a 360-degree view of your Hadoop data lake
• utilize data assets to reduce costs while increasing value
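The cold/warm/hot split above can be sketched in a few lines of code. This is a minimal illustration, not part of any vendor product; the access-recency thresholds (30 and 180 days) are hypothetical numbers chosen for the example.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds: data touched within 30 days is "hot",
# within 180 days "warm", anything older "cold" (an offload candidate).
HOT_DAYS, WARM_DAYS = 30, 180

def classify_temperature(last_accessed, now=None):
    """Return 'hot', 'warm' or 'cold' for a dataset's last-access time."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    if age <= timedelta(days=HOT_DAYS):
        return "hot"
    if age <= timedelta(days=WARM_DAYS):
        return "warm"
    return "cold"

now = datetime(2016, 6, 1)
print(classify_temperature(datetime(2016, 5, 20), now))  # hot
print(classify_temperature(datetime(2016, 3, 1), now))   # warm
print(classify_temperature(datetime(2015, 1, 1), now))   # cold
```

In practice the classification would draw on query logs and storage metadata rather than a single timestamp, but the tiering decision itself is this simple: cold data becomes the candidate set for offloading to Hadoop.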
Speakers: John Sanson at HPE, Cindy Maike at Hortonworks and Kevin Petrie at Attunity.
Featured Speaker: Ibrahim Itani, Leader of Big Data Architecture and Technology, Verizon.
With increasing data volumes and data sources, enterprises are outgrowing their traditional BI solutions and struggling to make use of the data collected on their new data platforms. Frequently, data engineers will resort to old habits of shifting data sets between repositories so that data can be analyzed using older methods. Benefits of attending:
Join Ibrahim Itani, Leader of Big Data Architecture and Technology at Verizon, who will talk about Verizon’s big data journey and how they use new technologies to solve problems with data at scale without data movement.
Ibrahim will be joined by Ajay Anand, VP of Product Management at Kyvos Insights, and Sanjay Kumar, General Manager, Telecom at Hortonworks, who will share additional use cases that leverage big data architectures and interactive BI to reach their business goals.
Streaming analytics is the new normal. Customers are exploring use cases that have quickly transitioned from batch to near real time. Hortonworks DataFlow / Apache NiFi and Isilon provide a robust, scalable architecture for real-time streaming. Explore our use cases and a demo showing how Hortonworks DataFlow and Isilon can empower your business for real-time success.
Johnson Controls delivers best-in-class building technologies and energy storage. In their quest to continually improve operations, they implemented a modern data architecture based on Hadoop. They started with a small successful proof of concept and recognized the need to make more of their data accessible to more teams. Johnson Controls was able to successfully integrate Big Data technology into more of their operations that ultimately supports their goals to create better products, reduce waste, and increase profitability.
Join this webinar where our guest speaker from Johnson Controls shares their journey from a small Hadoop POC to a global production-ready implementation that includes security and governance.
You will hear:
• What steps JCI took throughout the process
• What framework and technologies they used
• How they engaged executives for full corporate buy-in
• Issues they encountered and solved
• How JCI was able to provide a secure reliable platform to consolidate global data
• Their next steps to complete the global integration.
Speakers: Tim Derrico, Manager of Architecture and Strategy – Big Data, Johnson Controls
Thomas Clarke, Managing Principal, RCG Global Services
Eric Thorsen, VP Retail / Industry Solutions, Hortonworks
Co-hosted by Hortonworks and RCG Global Services
Hadoop and the Internet of Things have enabled data-driven companies to leverage new data sources and apply new analytical techniques in creative ways that provide competitive advantage. Beyond clickstream data, companies are finding transformational insights in machine data and telemetry that are radically improving operational efficiencies and yielding new actionable customer insights.
During this webinar we will:
With the advent of Big Data platforms, Banking & Financial Services companies are building applications that create massive business value. However, the datasets being used often contain significant amounts of confidential, proprietary and highly sensitive data and so the potential benefits are held back by privacy concerns.
Part three in a five-part series, this webcast will be a demonstration of the integration of Hortonworks HDB and Apache Hadoop YARN. YARN provides the global resource management for HDB for cluster-level hardware efficiency, while the in-database resource queues and operators provide the database and query-level resource management for workload prioritization and query optimization. This webinar will focus on demonstrating the installation process as well as discuss the various YARN and HDB parameters and best practice settings.
Fueled by ever-changing customer behaviors and an increasing number of industry disruptions, the modern enterprise requires analytics to stay ahead of the game. Today’s data warehouse needs continuous enhancements to address new requirements for advanced analytics, real-time streaming data, Big Data, and unstructured data. The focus should be on developing a forward-looking, future-proof view and holistically addressing the combination of forces impacting the existing operational model. Join Hortonworks’ Eric Thorsen and Saama’s Karim Damji on Tuesday, October 18th at 10 am PT to learn about:
As a thank you, all webinar registrants will receive the Forrester Research report “The Four Things Data Scientists Wish You Knew”.
The global credit card industry is changing rapidly, and its participants increasingly face new challenges: exploding volumes, regulatory pressures and new entrants competing for market share. The industry has responded by looking for ways to cut costs, increase efficiency and provide better, safer products and services to attract new customers and retain existing ones. To help our customers address this challenge, Hortonworks and Capgemini are collaborating on a suite of Credit Card Analytics solutions designed to enhance decision making by leveraging all of the data available, including customer data, transactions, third-party data, open data, government data, location data, social data, and more. The first solution in the suite is focused on credit card fraud.
Fraudulent behaviours evolve, and so must the solutions used to detect them. Traditional rules-based anti-fraud systems are no longer equipped to handle the large volumes of data required to adapt to and detect evolving fraud patterns. Identifying fraudulent behaviours is becoming ever more complicated, and validating every transaction with traditional technologies presents a scaling challenge.
Join Hortonworks and Capgemini as they discuss:
– How the joint anti-fraud solution can support your business throughout the entire credit card transaction life cycle
– How we can help increase fraud detection accuracy using Predictive analytics and run it real time
– Why leading organizations are choosing Hortonworks Data Platform as the platform of choice for fraud detection
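To make the predictive-analytics idea above concrete, here is a toy fraud check that flags a transaction whose amount deviates sharply from a cardholder’s history. The z-score rule, threshold and data are illustrative stand-ins; production systems combine many such signals in far richer models.

```python
from statistics import mean, stdev

def flag_suspicious(history, amount, z_threshold=3.0):
    """Flag a transaction whose amount deviates strongly from the
    cardholder's historical spending (a stand-in for richer models)."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > z_threshold

history = [25.0, 40.0, 31.0, 28.0, 35.0, 30.0]
print(flag_suspicious(history, 32.0))    # typical purchase -> False
print(flag_suspicious(history, 2500.0))  # extreme outlier -> True
```

Running a check like this against every incoming transaction, across millions of cardholders, is exactly the scale problem that motivates moving fraud detection onto a Hadoop-based platform.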
Who’s winning the deep forensic analysis ‘arms race’ for compliance? Real-time trade surveillance in global financial markets has created a data tsunami, and with greater volumes of data comes greater compliance risk. CNBC reports U.S. banks have been fined over $200B since the financial crisis. How are compliance teams fighting back to make the most of the data and stay out of regulatory hot water? Rapid response to suspect trades means compliance teams need to access and visualize trade patterns, in real-time and historic data, to navigate the data in depth and flag possible violations. Join Hortonworks and Arcadia for this live webinar: we’ll cover the use case at a top-50 global bank that now has deep forensic analysis of trade activity. The result: interactive, ad hoc data visualization and access across multiple platforms – without limits on historic data – to detect irregularities as they happen. In-depth expert presentations by:
Q&A session follows the presentation.
Part two in a five-part series, this webcast will be a demonstration of Pivotal Extension Framework (PXF), an extensible framework that allows Hortonworks HDB to query external system data. This is useful both for loading data and for avoiding loads entirely when data doesn’t need to reside within the database instance. PXF includes built-in connectors for accessing data in HDFS files, Hive tables via the catalog, and HBase tables.
Hadoop didn’t disrupt the data center. The exploding amounts of data did. But, let’s face it, if you can’t move your data to Hadoop, then you can’t use it in Hadoop.
Join the experts from Hortonworks, the #1 leader in Hadoop development, and Attunity, a leading data management software provider, for a webinar where you’ll learn:
We will discuss how Attunity Replicate and Hortonworks Data Flow (HDF) work together to move data into Hadoop. And, there will be a live question & answer session.
Gartner predicts there will be 250 million connected vehicles by 2020. While automotive manufacturers are on track to drive connected vehicles implementation, are they poised to leverage the trillion-dollar opportunity from the gold-mine that is “sensor data”?
Research from Morgan Stanley suggests that automotive manufacturers can save $488 billion by using predictive maintenance. By assessing in advance which equipment needs maintenance, automotive manufacturers can better plan maintenance work, smoothly convert abrupt “unplanned outages” into shorter and fewer “planned outages”, and spend less time on damage control, since equipment issues are detected before they occur. The result? Lower operational costs, increased machine lifetime and better asset performance.
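The core of "assessing in advance which equipment needs maintenance" can be sketched as a trend check on sensor telemetry. This is a deliberately minimal illustration: the sensor name, limit and window size are hypothetical, and real predictive-maintenance models use far richer features.

```python
def needs_maintenance(vibration_readings, limit=0.8, window=3):
    """Schedule maintenance when the rolling mean of a sensor reading
    trends above a limit, before an outright failure occurs.
    (Sensor choice, limit and window size are illustrative.)"""
    if len(vibration_readings) < window:
        return False
    recent = vibration_readings[-window:]
    return sum(recent) / window > limit

healthy = [0.2, 0.3, 0.25, 0.3, 0.28]
degrading = [0.3, 0.5, 0.7, 0.85, 0.95]
print(needs_maintenance(healthy))    # False
print(needs_maintenance(degrading))  # True
```

The point of the sketch: catching the degrading trend turns a would-be unplanned outage into a short, scheduled one.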
Join this webinar to learn how automotive manufacturers can:
This webcast is the first in a five-part series on Hortonworks HDB, demonstrating the procedure for installing Hortonworks HDB on Hortonworks HDP. HDB’s integration with Apache Ambari allows you to install and manage your high-performance SQL database alongside other Hadoop services. Starting with an existing HDP cluster, the webcast will cover any required prerequisites and then leverage Apache Ambari to complete the install.
Companies of all sizes are challenged to keep up with emerging technologies that deliver a competitive advantage. Big data holds the key to greater customer insight and stronger customer relationships. But the risk of sensitive data exposure and compliance violations keeps many organisations from pursuing big data initiatives and reaping the rewards of business-driven analytics. Join this webinar and find out:
· How our technology can help you pseudonymise sensitive data
· How to share your data freely, internally and externally, without breaching compliance regulations
· How to make an informed decision and leverage big data to gain competitive advantage
Like all consumer packaged goods (CPG) companies, PepsiCo relies on huge volumes of data to accurately replenish its retailers with the appropriate amount and type of product. Across the CPG industry, most analysts exclusively rely on Excel and Access for data wrangling, but as PepsiCo’s data surpassed the capabilities of those tools, they knew they needed a better way. In this webinar, supply chain leaders from PepsiCo will discuss:
PepsiCo has been able to automate the bulk of their regular data preparation processes and improve the speed and accuracy of their reporting processes with solutions from Hortonworks, Tableau and Trifacta.
Success in the insurance industry depends on your company’s ability to quickly interact with customers at every point in the insurance life cycle, and then to make timely use of the new data to guide business decisions. Many customers and companies agree that Customer 360 is an important initiative, but many don’t know how to get there.
Cindy Maike, GM Insurance Industry at Hortonworks, will be a guest panelist for an informational webinar with Amrita Dhar, Senior Solutions Manager for Insurance at Saama. The webinar will cover what’s important to a successful implementation of Customer 360 in Insurance initiatives, how to utilize industry standard models, data science algorithms, and predictive models for customer analytics and how to base it on open and connected data platforms from Hortonworks.
Already strategic partners, Pivotal Software and Hortonworks deepened their relationship in Spring 2016 with the goal of providing enterprises the most complete modern data platform for advanced analytics and machine learning. As part of the expanded relationship, Hortonworks has introduced Hortonworks HDB, the market’s leading Hadoop Native SQL database and big data SQL machine learning engine based on Apache HAWQ and Apache MADlib (incubating).
Join Pivotal’s Jeff Kelly and Hortonworks’ Ajay Singh in this webinar to learn more about:
As organizations strive to identify and realize the value in Big Data, many now seek more agile and capable analytic systems. While many have piloted Hadoop as a data repository for simple workloads, far more value can be created by interacting with the data in the platform to uncover new business insights.
To understand how to be agile and data driven, please join us for a webinar with some of our Big Data experts. At this session we will be covering:
• Best-practice methods for extracting value from Hadoop investments, analysing the opportunities and obstacles many organizations face today.
• How to identify, validate and prioritise IT use cases based on value.
• Why consumption-based platforms are, in the current business world, the best choice for mitigating the risk of technology obsolescence.
• A strategic guide to working with customer senior management and lead architects on their architectural direction and its alignment with business priorities.
• The competitive information you need to craft a winning analytics strategy and derive real-time business insights from new and existing data sources.
Payment card fraud has mushroomed into a massive challenge for consumers, financial institutions, regulators, and law enforcement. As the accessibility and usage of credit and debit cards grows and transaction volumes increase, banks are losing tens of billions of dollars annually to fraudsters. The Nilson Report estimated that about five cents of every transaction dollar is lost to fraud—a massive drain on the overall system. Another insidious side effect of payment card fraud is identity theft. To guard against these threats, financial institutions are increasingly turning to Hadoop and software designed to detect and protect sensitive data in real time.
Join Dataguise, a leader in data masking, and Hortonworks for this live webinar to learn how some of the biggest financial institutions are thwarting payment card fraud while maintaining personal information security and compliance.
This webinar introduces the combination of Hortonworks’ Connected Data Platforms and EMC Isilon Storage, the foundation required for modern data applications. By combining Hortonworks Data Platform, Hortonworks DataFlow and EMC Isilon Storage, you gain flexibility, lower cost, data protection and security. We will also discuss how easily you can start analytics projects and deliver results.
Existing data architectures are siloed within financial institutions’ IT departments, creating or replicating data marts or warehouses to feed internal lines of business. These data marts are then accessed by custom reporting applications, replicating and copying data many times over, which leads to massive data management and governance challenges.
Please join us for this webinar to learn more on how HPE and Hortonworks deliver a modern data architecture that can both consolidate, and address data security and compliance issues.
Big data is transforming the way that organisations use and manage data. They now have more data in motion and at rest than ever before in higher velocities and from more sources across the organisation. Businesses can’t afford to miss opportunities for deeper insight due to time spent “data wrangling”. They are also looking for enterprise-class data loading solutions that go beyond simple tools such as Sqoop, which is more suitable for test and dev environments.
Join this webinar to learn about how Attunity and Hortonworks solutions alleviate those challenges. You will hear:
You’ll also have the chance to ask live audience questions to the experts about your big data challenges.
Rapid data growth from traditional and new data sources is putting a strain on existing Enterprise Data Warehouse (EDW) resources and related IT budgets. Learn how to reduce the cost of an EDW by augmenting it with an EMC Data Lake and Hortonworks Data Platform (HDP). Today, enterprises simply can’t afford to keep all data and often have to discard or aggregate it before storing. Increasingly, EDW resources are also being used to handle data wrangling and cleansing jobs instead of performing higher value-add analytics and Business Intelligence workloads. These challenges have driven many enterprises to look for ways to optimize their EDW by on-boarding lower-value data storage and processing functions to a more affordable data lake platform that extends the value of their EDW and enhances their business intelligence activities.
Not only will you get more value out of EDW investments, but by introducing EMC Isilon in combination with HDP, you have extended your datacenter to capture more data for longer, giving you more advanced analytics capabilities. Join us for this webinar to hear from EMC and Hortonworks on how customers transformed and modernized their Data Warehouse.
Join this session as Hortonworks and eCube demonstrate how to drive actionable intelligence in real time with Hortonworks DataFlow (HDF) and an array of Hadoop ecosystem tools. This session is a must-attend for organisations challenged with optimising IoT data collection, analysing perishable insights and ultimately enriching the data lake with new data.
We will cover:
Ingesting data in real time from Twitter
Filtering tweets about Hadoop and Big Data using HDF
Storing data inside Apache Kafka
Filtering, aggregating and enriching data from Twitter with HDF
Indexing data with Apache Solr
Viewing real time dashboards with Banana and Kiwi
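Outside of HDF, the filter-and-enrich step of the demo can be approximated in plain Python to show what the flow does with each record. The field names, keywords and tagging scheme here are assumptions for illustration, not the demo’s actual configuration.

```python
KEYWORDS = ("hadoop", "big data")  # assumed filter terms from the demo

def filter_and_enrich(tweets):
    """Keep tweets mentioning the keywords and tag each with the
    matched topics, mimicking HDF's route-and-enrich processors."""
    out = []
    for tweet in tweets:
        text = tweet["text"].lower()
        matched = [kw for kw in KEYWORDS if kw in text]
        if matched:
            out.append({**tweet, "topics": matched})
    return out

tweets = [
    {"user": "a", "text": "Loving Hadoop streaming!"},
    {"user": "b", "text": "Nothing to see here"},
    {"user": "c", "text": "Big Data meets Hadoop"},
]
result = filter_and_enrich(tweets)
print([t["user"] for t in result])  # ['a', 'c']
```

In the actual flow, HDF processors perform this routing declaratively, with the enriched records landing in Kafka and Solr as the listed steps describe.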
We’re fully into the “age of data”. New developments in connected and non-connected devices are multiplying the rate at which data is created, challenging organizations to think differently about their data architecture. Today’s enterprises may not be equipped to collect, curate and analyse this data in real time, even as they face constant pressure to drive competitive advantage.
With the launch of Hortonworks DataFlow (HDF), companies can now collect data at the edge, process and secure it in motion, and deposit it into Hadoop at the same time. Join Hortonworks and WANdisco as they present a resilient joint solution that gives you control over the location of data and protection against failure, enabling you to build a fast, secure data engine across multiple Hadoop clusters.
In this webinar you’ll understand how Hortonworks Connected Data Platforms enables a modern big data solution to run on the EMC Isilon infrastructure. We will also share how the combined solution delivers unmatched flexibility, lower costs and more robust security. Register now to initiate analytics projects quickly and get results in minutes.
In the latest Forrester Wave report for Big Data Hadoop Cloud Solutions, Microsoft Azure came out on top, beating some very esteemed vendors. Learn how to complete your big data solution: join Microsoft and Hortonworks as we showcase Hortonworks DataFlow and how it complements Azure HDInsight, enabling users to easily move their data to the cloud for production, disaster recovery or development uses.
Powered by Apache NiFi, Kafka and Storm, HDF collects, curates, audits and delivers real-time data through a simple end-user interface, freeing customers to analyse their big data in Microsoft Azure.
During this webinar we will show:
• How to ingest a variety of data sources from a selection of source systems using HDF
• How to feed the data to HDInsight cluster on Azure
• How to easily display and interrogate those datasources once they have been ingested
Rapid data growth from traditional and new data sources is putting a strain on existing Enterprise Data Warehouse (EDW) resources and related IT budgets. Learn how to reduce the cost of an EDW by augmenting it with an EMC Data Lake and Hortonworks Data Platform (HDP). Today, enterprises simply can’t afford to keep all data and often have to discard or aggregate it before storing. Increasingly, EDW resources are also being used to handle data wrangling and cleansing jobs instead of performing higher value-add analytics and Business Intelligence workloads. These challenges have driven many enterprises to look for ways to optimise their EDW by on-boarding lower-value data storage and processing functions to a more affordable data lake platform that extends the value of their EDW and enhances their business intelligence activities.
Not only will you get more value out of EDW investments, but by introducing EMC Isilon in combination with HDP, you will extend your datacenter to capture more data for longer, giving you more advanced analytics capabilities. Join us for this webinar to hear from EMC and Hortonworks on how customers can transform and modernise their Data Warehouse whilst also reducing cost.
Warranty claims have a direct financial impact on manufacturers and add substantial indirect costs through degradation of a company’s brand image, reduced customer loyalty, and potential legal liability. When manufacturers analyze their “after the fact” warranty data, it’s too late to identify issues proactively and respond to reduce risk.
Enter the era of the Internet of Things (IoT). The IoT has revolutionized how automotive, aerospace, and industrial equipment manufacturers design, manufacture and support their products. It can also revolutionize warranty analytics. As products become smart and more connected, manufacturers have access to billions of streaming data points. With new tools and methods, companies can look deeply into detailed warranty data, anticipate product failures, and resolve them proactively in the design process to avoid costly repairs or negative impacts to their brand.
In this one-hour webinar, provided through the strategic business relationship between Hortonworks and EY, we will discuss how manufacturers can:
The explosion of new types of data in recent years has put tremendous pressure on the financial services data center, both technically and financially, and an architectural shift is underway in which multiple Lines of Business (LOBs) can consolidate their data into a unified data lake. This approach helps financial institutions address risk management and compliance requirements in a cost effective manner.
Existing data architectures are siloed within financial institutions’ IT departments, creating or replicating data marts or warehouses to feed internal lines of business. These data marts are then accessed by custom reporting applications, replicating and copying data many times over, which leads to massive data management and governance challenges.
Please join us for this webinar to learn more on how HPE and Hortonworks deliver a modern data architecture for the financial services market.
Microsoft HDInsight has been chosen by Forrester to be the leader in their Hadoop Cloud Wave report.
Join Microsoft and Hortonworks on June 6, 2016 at 10:00 AM PST to discuss how Hortonworks powers the Microsoft HDInsight platform and examples of how world leading corporations choose Microsoft HDInsight to run a variety of mission critical work loads.
Companies are diving headlong into the world of Big Data. With Microsoft Azure and Hortonworks, you can stand up a Hadoop cluster as easily as buying a baguette.
Join us for a webinar in which we will show you how to get started with HDP on Microsoft Azure.
We will cover the following topics:
– The Microsoft Azure Marketplace
– Hortonworks’ Big Data products: HDP and HDF
– The advantages of a cloud architecture for Hadoop
– Demo: creating and using a Hortonworks cluster on Azure through the Marketplace
Join us on May 26, 2016, from 12:00 to 1:00 pm!
You may be up all night wondering how enterprise organizations deal with large data volumes and data varieties without significantly increasing costs. And perhaps your existing data architectures are not equipped to handle today’s data? Join this webinar to learn how to optimize your data architecture and gain significant insights into cost savings with Hadoop on Azure.
We will cover:
– Big data challenges and how cloud architecture can help you address them
– Advantages of HDP on Azure (storage, ETL, refining of new data types)
Credit and payment card fraud has mushroomed into a massive challenge for consumers, financial institutions, regulators and law enforcement. As the accessibility and usage of credit cards increase, banks are losing billions to fraudulent transactions. A recent Nilson Report estimate suggests five cents of every dollar is lost to fraud. Banks are increasingly turning to Hadoop and predictive analytics to counter identity fraud in real time.
Join Dataguise and Hortonworks for this live webinar to learn how some of the biggest financial institutions are thwarting fraud while maintaining personal information security and compliance.
Join Hortonworks and Cisco at the upcoming webinar and hear from industry-leading experts on the latest trends and drivers for a modern data architecture and how you can benefit from it.
We will cover:
– How to become a data-driven organisation with Hortonworks Data Platform
– How to build a super-scaling Hadoop cluster while keeping the consistency within the cluster
– Live Demo on Hadoop cluster rapid deployment
You may be up all night wondering how enterprise organizations deal with large data volumes and data varieties without significantly increasing costs. And perhaps your existing data architectures are not equipped to handle today’s data challenges? Join this webinar to learn how to optimize your data architecture and gain significant cost savings with Hadoop.
We will cover:
– Overview of the traditional service-oriented Data Architecture and its associated challenges.
– Modern Data Architecture and how it can help you address the complexity associated with SOA.
– Overview of Hortonworks Connected Data Platforms (HDF & HDP) and how they can help your business increase efficiency and reduce cost.
Optimizing manufacturing processes ultimately revolves around increasing output at reduced cost and improved quality. Manufacturers try to minimize inventory levels by scheduling just-in-time delivery of raw materials, but even the smallest miscalculation can cause stock-outs that lead to production delays. Sensors and RFID tags can capture supply chain data, but this creates a large, ongoing flow of data. Hadoop can cost-effectively store this unstructured data, providing manufacturers with greater visibility into their supply chain history, and greater insight into longer term supply chain patterns. This gives manufacturers more lead time to adjust to supply chain disruptions, as well as helps reduce costs and improve margins on finished products.
Hewlett Packard Enterprise and Hortonworks have a strategic partnership to help manufacturers realize their modern data architecture. Join us for this webinar on Wednesday, May 4 at 10:00 AM PST and learn how Hortonworks’ leading enterprise-ready open data platform, combined with HPE’s leadership position in the worldwide x86 server market, provides manufacturing organizations with proven solutions to help transform manufacturing processes.
The emergence of Big Data has driven the need for a new data platform within the enterprise. Apache Hadoop has emerged as the core of that platform and is driving transformative outcomes across every industry. Join this webinar for an overview of the technology, how it fits within the enterprise, and gain insight into some of the key initial use cases that are driving these transformations.
Today’s criminals and terrorist organizations are outpacing the performance of anti-money laundering (AML) programs by using new and unconventional ways to hide illicit transactions. While financial services firms have taken measures to improve programs, such as fine-tuning alert systems to reduce false positives, and investing in human capital to manage the growing number of investigations, they must look to Big Data to take their AML programs to the next level.
In this one-hour webinar, we’ll discuss how Big Data can be used today to bring AML programs into the new frontier, including how to:
Joe Gillespie, Anti-Money Laundering Leader, Booz Allen Hamilton
Vamsi Chemitiganti, General Manager-Financial Services, Hortonworks
Today, telecommunications providers need relevant data in reasonable time and format to transform their business by acting on the insights generated through the data. Sprint has turned to Hadoop for scalable data storage and analytics in order to harness the data flowing from all possible directions at high speed and in various formats.
Join Sprint, Diyotta and Hortonworks for a real-life use-case discussion covering lessons learned during the project implementation and how Sprint addressed challenges such as the skills gap, cost reduction, shortened time-to-value, integration with the existing ecosystem, and telecom fraud prevention.
Retailers are laser-focused on understanding buyer sentiment and driving personalized engagement. It’s much easier to predict sales, revenue, and stock availability when you have a comprehensive understanding of customer buying behavior and path to purchase. Manthan’s Customer Analytics, running on Hortonworks Hadoop, leverages both structured data like sales history and unstructured data like social media and clickstream to truly create a 360-degree view of your customer. Leadership from Hortonworks and Manthan will provide an overview of this solution, and deliver a live product demonstration.
Join Hortonworks and Manthan for a webinar focused on uncovering revenue and profit opportunities from your customer data.
You might be living under a bridge if you think Hadoop implementation comes without its challenges. How can such projects graduate from a POC into full production? Join Hortonworks and Cisco as we share the top five common Hadoop implementation fails and ways to avoid them. You will learn industry best practices and hear the real-life trials and tribulations that plague implementation phases.
Every day, healthcare professionals must make critical decisions, oftentimes without sufficiently accurate and transparent data. The healthcare industry is undergoing a revolution, driven by an irreversible surge in the quantity and availability of data.
This data includes traditional clinical and transactional data such as claims, electronic medical records (EMR), lab results, and radiological images. However, this surge also includes newer sources of data generated from wearable sensors and devices, mobile phones, websites, and social media interactions. Traditional data platforms cannot store this multifaceted data without costly extract-transform-load (ETL) processing. In the cases where unstructured data is available, it is retained only for weeks or months; keeping it longer than that can be too expensive.
Join this webinar on Wednesday, March 9 to learn how Hortonworks, together with HPE, can help resolve these challenges by making data less expensive, better organized, and more readily available.
The banking sector continues to be a driving force of any economy and leading banks are adapting to consumer and technological advances that are presenting a multitude of business opportunities.
Banks can now process huge amounts of data from both traditional and non-traditional sources in Hadoop, giving them better insight into both their risks and opportunities. Deeper analysis and insight can improve operational margins and protect against one-time events that might cause catastrophic losses. Join Rob Toguri from EY and Vamsi Chemitiganti from Hortonworks as they discuss how banks are becoming more data-driven and predictive:
· How data is disrupting the banking sector
· Where business value is generated across the banking ecosystem
· What the roles of a banking Chief Data Officer are
· What the requirements are for data governance and compliance
· How leading banks are benefiting from this opportunity
Insurance companies of all sizes are challenged to keep up with emerging technologies that deliver a competitive advantage. Big data holds the key to greater customer insight and stronger customer relationships. But risk of sensitive data exposure — and compliance violations — keeps many insurers from pursuing big data initiatives and reaping the rewards of business-driven analytics.
Join Dataguise and Hortonworks for this live webinar to learn how you can free your organization from traditional information security constraints and unlock the power of your most valuable business assets.
Consumer Packaged Goods (CPG) companies such as PepsiCo rely on seamless communication across a large interconnected network. To be successful, this network must include suppliers, production facilities, logistics partners and retailers. With a heavy reliance on coordination, each member of this network generates information in a wide variety of volumes and formats, making it difficult for CPG organizations to leverage that information in time to forecast sales, update production schedules, and define logistics efforts. This large volume of data poses a challenge for CPG organizations when executing a variety of business processes.
Join Hortonworks & Trifacta for a live webinar discussing how PepsiCo’s Collaborative Planning Forecasting & Replenishment (CPFR) team is leveraging the joint solution from Hortonworks & Trifacta to transform the process of making raw retail sales and inventory ready for analysis. During this webinar you will learn:
• The top data and analytics challenges facing today’s CPG organizations
• Why Excel and Access are unable to handle the growing volume and variety of data that makes up CPG analytics
• How PepsiCo is leveraging this joint solution from Hortonworks and Trifacta to reduce analytic build time by 90%
NRF 16 attracted over 33,000 attendees and focused on retail technology, trends, and areas of focus for the coming fiscal year. Hortonworks and Microsoft had a strong booth presence, hosting large numbers of visitors. Together, the partnership is uniquely qualified to accelerate retail data collection and management.
Join Hortonworks VP of Industry Solutions, Eric Thorsen, and Microsoft’s Retail Industry Solutions Director, Shish Shridhar, to learn how Data-Centric Retail Trends can impact your business and how Hadoop can help you gain actionable insight from your data. Understand how your peers are using data to drive new levels of customer centricity, real-time inventory predictions, and personalized marketing. Learn about how consumer goods companies are using Hadoop to gain insight from demand signals and analyze data flow. Register now to learn more!
The emergence of Big Data has driven the need for a new data platform within the enterprise. Apache Hadoop has emerged as the core of that platform and is driving transformative outcomes across every industry. Join this webinar for an overview of the technology, how it fits within the enterprise, and gain insight into some of the key initial use cases that are driving these transformations.
Register for the on-demand recording here.
Together, Hortonworks and WANdisco eliminate downtime and data loss to meet the most demanding Service Level Agreements (SLAs). This joint solution helps move customers into full production and expand their deployment footprint through an active-active replication architecture that achieves 100% continuous availability.
Hear from experts with deep experience in analytics, big data, and data integration:
A Forrester Research analyst educates you on four areas of innovation: real-time Hadoop, machine-learning-accelerated solutions, simplification and automation, and security. Hadoop pioneer Hortonworks presents concrete steps to getting started with Hadoop for data discovery, single view, and predictive analytics. Data virtualization leader Denodo demonstrates how you can virtually integrate your big data with other enterprise data at a fraction of the time and cost of physical ETL.
Expect to walk away with actionable insights in 60 minutes:
In this webinar you will learn how the Hortonworks Data Platform offers an Open Enterprise Hadoop solution with EMC’s Elastic Cloud Storage (ECS) Platform. You will learn how the deep integration between EMC and Hortonworks delivers a Hadoop solution with unmatched scalability, storage efficiency & availability. Discover the benefits of robust data protection and geo-scale Big Data Analytics. Learn how to easily configure and deploy Hadoop with EMC Elastic Cloud Storage in minutes using Ambari. Reduce Hadoop complexity and accelerate your ‘time-to-insights’.
Businesses are stepping squarely into the data-driven world. With Microsoft Azure and Hortonworks, you can spin up a big data cluster as easily as going out to buy a baguette. Join us for a webinar on how to get started with your HDP cluster on Microsoft Azure. We will cover:
the advantages of a hybrid architecture
the Azure sandbox
the Cloudbreak sandbox
Recent innovations in the Internet-enabled connected cars that we drive today have spawned a whole new set of opportunities and challenges for automakers. The opportunities come from the ability to capture detailed, current data on how drivers operate their vehicles and how those vehicles respond to that use. Join this webinar to learn how this data can become critical in uses such as preventative maintenance, product development, manufacturing optimization, infotainment & paid content, as well as recall avoidance.
When evaluating Apache Hadoop, organizations often identify dozens of use cases, but how do you get started? With hundreds of successful customer implementations behind us, we have found that successful approaches start with a reasonable scope and objective. Join us to review these proven deployment and implementation patterns, which can serve as guides on your path to data warehouse optimization or to new analytics approaches with Hadoop.
By leveraging Big Data, you already know that your data comes from a variety of sources, including CRM systems, files, spreadsheets, video, social media, payment information and more. As data is collected from these many sources, sensitive information must be protected. The question is: how do you comprehensively secure your Big Data environment amid the almost daily reports of yet another devastating security breach at another organization?
Join this webinar to gain practical information on how best to leverage Hortonworks and Vormetric to comprehensively secure your Big Data environment, address compliance issues and minimize the impact on your systems.
With the emergence of Big Data has come the need for a new data platform in the enterprise. Apache Hadoop has established itself as a central element of that platform and is leading a data revolution within businesses. Join us for a webinar on this technology on Microsoft Azure, how it fits within the enterprise, and a detailed look at the most important use cases driving these transformations.
Your data is trying to tell you more about your business, are you listening? Join this webinar to hear how Quick Serve Restaurants and Retailers are using big data and open source Hadoop to listen and learn from their data. Our partner Blue Granite is uniquely qualified to drive quick time-to-value in Hadoop projects and will explain how to quickly tune into your data to improve customer loyalty and impact business performance.
Join Eric Thorsen from Hortonworks and Scott Faculak from Blue Granite in this webinar on October 27, 2015 at 11:00 EST to learn more about how retailers are using big data and open source Hadoop to drive market share, wallet share, and consumer loyalty.
GM, Consumer Products
Blue Granite, Inc.
The advent of connected manufacturing has ushered in an era where low-cost machine sensors take thousands of measurements per second at many points across the manufacturing process, enabling manufacturers to quickly detect anomalies and solve issues before they impact yield and quality. With Big Data insights, manufacturers can capitalize on this opportunity by following an approach that combines the power of Teradata with Hortonworks Data Platform’s storage and compute efficiencies at extreme scale.
Join Grant Bodley, Hortonworks GM of Industry Solutions and Dale Glover, Teradata VP of Industry Consulting, for a webinar on October 21, 2015 at 12:00pm PST to learn more about how manufacturing companies are utilizing Hadoop to:
While big data can reap big rewards, it also poses significant risks, including misleading data and unexpected costs that could impact the business. It is critical for enterprises to put a data governance strategy in place to ensure that information remains accurate, consistent and trusted. With the ever-evolving landscape of disparate data sources now available for analytics, enforcing a data assurance strategy is more important than ever: nearly 40% of all company data is found to be inaccurate, 66% of organizations believe they are negatively affected by inaccurate data, and the amount of data is growing exponentially. In this webinar we will discuss how Oracle’s Data Integration solutions can help organizations.
The world’s leading firms have recognized that data, along with human capital, is the most valuable asset they have today. The need for IT to digitize the business and provide actionable data to forecast market movements, improve customer experience, make flash offers, and respond to network errors is paramount to sustainability. However, existing systems were not designed to address the data needs of today’s enterprises. The conclusion: organisations are turning to Apache Hadoop to become data-driven. Join this webinar to learn:
It is increasingly evident that organizations can realize the full potential value of their data assets by combining the structured transactional data with semi-structured and unstructured data. Businesses also notice that to be agile and react to situations in real time, access to transactional data with low latency is essential. Low-latency transactional data brings additional value especially for dynamically changing operations that day-old data, structured or unstructured, cannot deliver. Streaming transactional data into big data solutions in real time, without degrading the performance of the source production systems will lay the foundation for more efficient operations and improved customer experience.
In this webinar you will learn how Oracle GoldenGate 12c empowers organizations to capture, route, and deliver transactional data from Oracle and non-Oracle databases. Oracle GoldenGate for Big Data provides optimized, high-performance delivery to Hadoop targets such as Flume, HDFS, Hive, HBase, NoSQL, Kafka, Spark and others to support customers with their real-time big data analytics initiatives.
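As a rough illustration of the delivery pattern described above (not GoldenGate’s actual API; all class and sink names here are invented), a change-data-capture pipeline serializes each transactional event and fans it out to every configured Hadoop target:

```python
import json

def make_record(table, op, row):
    """Serialize one change event as JSON, a common wire format for
    CDC delivery into targets such as Kafka topics or HDFS files."""
    return json.dumps({"table": table, "op": op, "data": row})

class ChangeRouter:
    """Fan captured change records out to every configured sink."""

    def __init__(self, target_names):
        # In a real pipeline these would be Kafka/HDFS/Hive writers;
        # here each sink is just an in-memory list for illustration.
        self.targets = {name: [] for name in target_names}

    def deliver(self, record):
        # Active replication delivers the same record to all sinks.
        for sink in self.targets.values():
            sink.append(record)

router = ChangeRouter(["kafka", "hdfs", "hive"])
router.deliver(make_record("orders", "INSERT", {"id": 1, "amount": 99.5}))
print(len(router.targets["kafka"]))  # 1
```

The key property, delivering without touching the source production system, comes from reading the database’s change log rather than querying tables, which this sketch deliberately leaves out.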
In case you missed the webinar, you can access the slides here.
The Personalized Medicine Initiative (PMI), part of the Life Sciences Institute at the University of BC, has deployed HDP and the PHEMI Central Big Data Warehouse to collect, store and manage genomic and clinical data for Molecular You (MyCo). Molecular You is a groundbreaking program that will give 20,000 participants a molecular-level understanding of themselves over the next 20 years, and provide scientists with new information necessary to detect disease risk and initiate early intervention.
Molecular You automatically provides differing views to their various users and applications – from researchers permitted access to de-identified data, to experts consulting on specific tests, to clinicians who need to view personal health information for consultations – all while protecting privacy and enforcing data sharing agreements.
Join Rob Fraser of PMI, Roy Wilds of PHEMI, and Richard Proctor of Hortonworks on October 8 at 10 am PST for this use case webinar to learn more about this Healthcare case study.
As more data is imported into Hadoop Data Lakes, how can we best secure sensitive data? What security options are available and what kind of best practices should be implemented? Join Vincent Lam of Protegrity and Syed Mahmood of Hortonworks as they jointly discuss securing HDP data lakes to leverage security in Hadoop without sacrificing usability.
You’ll learn about:
Hadoop is no longer just an option. Companies of all sizes find themselves at very different stages of their Big Data journey, whether just beginning to explore the platform or already running several clusters in production. Everyone faces the same challenge: building internal expertise.
Hadoop specialists are hard to find, and hand-coding is too error-prone when it comes to storing, integrating or analyzing your data. There is, however, an easier way.
Join Talend and Hortonworks for this webinar, in which we will show you how to unify your data in Hadoop, without specialized big data skills.
In a technical demo, you will see how:
· Hadoop enables entirely new analytics applications
· You can bridge the skills gap with our big data solutions
VHA (Voluntary Hospitals of America) is the largest member-owned health care company in the US, delivering industry-leading supply chain management services and clinical improvement services to its members. At VHA, product, supplier, and member information is siloed across multiple sources. VHA sees value in consolidating the disparate data into a Data Lake, supported by the Hortonworks Data Platform, to enable business users to discover related data and provide services to their members. Because of its previous success with data virtualization, powered by Denodo, VHA decided to use data virtualization to let business users discover data using familiar SQL, abstracting their access directly to Hadoop.
During this webinar, you will learn:
In this webinar you’ll understand how the Hortonworks Data Platform delivers an Open Enterprise Hadoop solution that runs on EMC Isilon infrastructure. You will learn how EMC Isilon storage solutions combined with the Hortonworks Data Platform deliver unmatched flexibility, lower cost, and robust data protection and security. You’ll also learn how you can initiate analytics projects quickly and get results in minutes.
Wow! When have you ever sat in on a Big Data analytics discussion by three of the most influential CTOs in the industry? What do they talk about among themselves?
Join Teradata’s Stephen Brobst, Informatica’s Sanjay Krishnamurthi, and Hortonworks’ Scott Gnau as they provide a framework and best practices for maximizing value for data assets deployed within a Big Data & Analytics Architecture.
Don’t forget to download the white paper
Over the last few years, the insurance industry appears to have fared reasonably well; however, as of year-end 2014 Returns on Equity (ROEs) have begun to fall due to a combination of capital accumulation, competitive pricing, weak investment returns and rising loss expense (Source: 2015 EY US Property/Casualty insurance outlook). Whether the firm is a health, property and casualty or life insurance company with accident and disability product offerings, big data analytics using machine learning on Hadoop provides an opportunity to streamline and systematize procedures, optimize pricing and reduce fraud and overall risk.
Join the Editor-in-Chief of Claims Magazine along with experts from Hortonworks and Skytree to learn about the potential and impact of big data analytics on the insurance industry. This webinar will help you understand how machine learning on big data is helping institutions understand customer sentiment and detect fraud, and will cover several other use cases that can benefit from big data analytics.
Enhancing the customer experience has become essential for communication service providers to effectively manage customer churn and build strong, long-lasting relationships with their customers. This has become increasingly challenging as customer interactions occur across multiple channels. Understanding customer behavior and how it applies across channels is the key to ensuring each customer achieves the best level of experience.
In this webinar, Hortonworks and Apigee discuss how service providers can capture and visualize customer behavior across interaction points such as call center events (IVR and chat) and combine it with network data to predict customer calls and patterns of digital channel abandonment, using Hadoop together with predictive analysis and visualization tools.
We will identify ways to develop a 360-degree view across a customer’s household through an HDP Data Lake, visualize customer interaction patterns, and predict expected behavior using Apigee Insights to identify and initiate the Next-Best-Action that ensures a superior customer experience.
Who should attend: Developers, Data Engineers, Data Scientists, Managers and Directors of Development teams.
By leveraging Big Data, you already know your data now comes from a variety of sources including CRM systems, files, spreadsheets, video, social media, payment data and more. As data is being collected from these diverse sources, sensitive information must be protected. The question is: how do you comprehensively secure your Big Data environment amid the almost daily reports of yet another colossal breach affecting another organization?
Join this webinar and gain practical information on how best to leverage Hortonworks and Vormetric to comprehensively secure your Big Data environment, address compliance issues and minimize the impact on your systems.
We will discuss how you can:
Join this webinar to explore Hadoop security challenges and trends, learn how to simplify the connection of your Hortonworks Data Platform to your existing Active Directory infrastructure, and hear real-world examples of organizations that are achieving the following benefits:
Hadoop and the Internet of Things have enabled data-driven companies to leverage new data sources and apply new analytical techniques in creative ways that provide competitive advantage. Beyond clickstream data, companies are finding transformational insights stemming from machine data and telemetry that are radically improving operational efficiencies and yielding new actionable customer insights.
We will discuss real world case studies from the field that describe the strategies, architectures, and results from forward thinking Fortune 500 organizations across a variety of verticals, including insurance, healthcare, media & entertainment, communications, and manufacturing.
Chad Meley, Vice President of Product & Services, Teradata
John Kreisa, Vice President of Marketing Strategy, Hortonworks
Hadoop has become unavoidable. Companies of all sizes are at different stages of their Big Data thinking. Whether you are just beginning to explore the platform or already have several clusters in place, everyone faces the same challenge: developing internal expertise. Hadoop experts are hard to find, and hand-coding is not reliable enough when it comes to storing, integrating or analyzing your data. Forget these obstacles.
Big Data specialists Talend and Hortonworks present a webinar during which you will discover how to unify all your data in Hadoop, without Big Data-specific skills.
In order to make data-driven decisions about risks and threats to facilities, assets and employees, Oil and Gas companies need a solution that can acquire, manage, integrate, analyze and explore data more efficiently across a diverse set of sources. During this webinar, you will learn how Hortonworks HDP and Novetta Entity Analytics can help Oil and Gas companies construct complete, integrated, and clear global profiles of suspicious individuals, terrorists and criminal threats. Learn how to provide security analysts with rich, contextual reporting of the risks posed to their global operations.
In this webinar you’ll learn how Pivotal HAWQ, one of the world’s most advanced enterprise SQL-on-Hadoop technologies, coupled with the Hortonworks Data Platform, the only 100% open source Apache Hadoop data platform, can turbocharge your Data Science efforts.
Pivotal HAWQ allows you to leverage advanced analytics for your data in Hadoop using massively-parallel processing, based on SQL. HAWQ provides strong support for low-latency analytic SQL queries and machine learning capabilities. This enables discovery-based analysis of large data sets and rapid, iterative development of data analytics applications that apply deep machine learning – significantly shortening data-driven innovation cycles for the enterprise.
The Hortonworks Data Platform offers linearly scalable storage and compute across a wide range of access methods, from batch and interactive to real-time, search and streaming. It includes a comprehensive set of capabilities across governance, integration, security and operations, and allows for data discovery from new, voluminous data types such as machine and sensor data, geolocation data, clickstream data and sentiment data, which are especially valuable when correlated with other data sets in a shared enterprise “Data Lake.”
Together, Pivotal HAWQ and the Hortonworks Data Platform provide businesses with a Modern Data Architecture for IT transformation.
Hadoop provides a powerful platform for data science and analytics, where data engineers and data scientists can leverage myriad data from external and internal sources to uncover new insight. Such power also presents new challenges: on the one hand, the business wants more and more self-service; on the other, IT is trying to keep up with the demand for data while maintaining architecture and data governance standards.
In this webinar, Andrew Ahn, Data Governance Initiative Product Manager at Hortonworks, will address the gaps and offer best practices in providing end-to-end data governance in HDP. Andrew Ahn will be followed by Oliver Claude of Waterline Data, who will share a case study of how Waterline Data Inventory works with HDP in the Modern Data Architecture to automate the discovery of business and compliance metadata, data lineage, as well as data quality metrics.
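As a toy illustration of automated compliance-metadata discovery (the patterns and names below are simplified inventions, not Waterline Data’s implementation), sampled column values can be matched against regular expressions to tag likely sensitive fields:

```python
import re

# Simplified patterns; a real catalog would use many more signals
# (column names, lineage, value distributions) than two regexes.
PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
}

def tag_column(sample_values):
    """Return compliance tags whose pattern matches every sampled value."""
    return sorted(tag for tag, pat in PATTERNS.items()
                  if sample_values and all(pat.match(v) for v in sample_values))

print(tag_column(["alice@example.com", "bob@example.org"]))  # ['email']
```

Tags like these become the compliance metadata that governance policies can then act on, for example masking any column tagged `us_ssn` before it reaches analysts.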
Companies in every industry look for ways to explore new data types and large data sets that were previously too big to capture, store and process. They need to unlock insights from data such as clickstream, geo-location, sensor, server log, social, text and video data. However, becoming a data-first enterprise comes with many challenges.
Join this webinar organized by three leaders in their respective fields and learn from our experts how you can accelerate the implementation of a scalable, cost-efficient and robust Big Data solution. Cisco, Hortonworks and Red Hat will explore how new data sets can enrich existing analytic applications with new perspectives and insights and how they can help you drive the creation of innovative new apps that provide new value to your business.
Join this webinar with Hortonworks and Skytree and learn how Communications Service Providers can enhance their customers’ experience by:
– Creating a Data Lake for a 360 degree customer view.
– Building dynamic customer profiles.
– Leveraging a next-best-action streaming engine.
You will learn more about how the Hortonworks Data Platform and Skytree’s machine learning solution can help you do so.
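A deliberately simplified sketch of the next-best-action idea (the event names and rules are invented for illustration, not Skytree’s engine): recent interaction events from the data lake drive a rule that picks the follow-up most likely to prevent churn.

```python
def next_best_action(recent_events):
    """Pick a follow-up action from a customer's recent interactions."""
    events = set(recent_events)
    if {"ivr_abandoned", "chat_failed"} <= events:
        # Two failed self-service attempts: escalate to a human.
        return "priority_callback"
    if "ivr_abandoned" in events:
        return "offer_proactive_chat"
    return "no_action"

print(next_best_action(["ivr_abandoned", "chat_failed"]))  # priority_callback
```

In practice the hand-written rules would be replaced by a model scoring hundreds of features from the 360-degree profile, but the input (event stream) and output (one chosen action) keep this shape.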
Speakers: Dr. Alexander Gray, CTO at Skytree, and Sanjay Kumar, General Manager, Hortonworks
Today’s enterprises are challenged with capturing large amounts of data from a number of sources in a variety of formats, and then storing it in a cost-effective, timely manner. With your current data warehouse, this may seem overwhelming. It doesn’t have to be. With a Hadoop-based modern data warehouse, you can overcome these challenges and get meaningful insights from real-time data.
Want to learn how? Join experts from Attunity, Hortonworks, and RCG Global Services for a live webinar – where we will be discussing enterprise data warehouse optimization. You will learn how to:
• Rebalance your data warehouse by identifying unused data and resource-intensive workloads that can be moved to Hadoop.
• Seamlessly integrate your current enterprise data warehouse with a Modern Data Architecture.
• Better utilize data assets to reduce costs while realizing more value from your data.
• Develop a roadmap for implementing the Hadoop-based Modern Data Architecture and Data Lake.
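The first step above, identifying rarely used data to offload, can be sketched as a simple scan of warehouse query logs. This is a toy version under assumed inputs (a flat list of table references); real tooling would also weigh workload cost and data size.

```python
from collections import Counter

def offload_candidates(query_log, warehouse_tables, min_queries=5):
    """Return tables referenced fewer than `min_queries` times in the
    log window: cold data that could move to the Hadoop tier."""
    hits = Counter(query_log)
    return sorted(t for t in warehouse_tables if hits[t] < min_queries)

# 'archive_2010' is queried once and 'staging' never, so both are
# candidates to migrate out of the expensive warehouse tier.
log = ["sales"] * 40 + ["inventory"] * 12 + ["archive_2010"]
print(offload_candidates(log, {"sales", "inventory", "archive_2010", "staging"}))
# ['archive_2010', 'staging']
```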
In today’s data-rich world, insight overlooked translates into opportunity missed. In this webinar, Charles Boicey of UC Irvine Medical Center shows how Hadoop helps reduce patient re-admittance rates and improves patient care with real-time monitoring.
Watch this webinar to learn how to enable one unified data platform to serve multiple constituents and much more.
Your Big Data strategy is only as good as the quality of your data. Today, deriving business value from data depends on how well your company can capture, cleanse, integrate and manage data. During this webinar, we will discuss how to eliminate the challenges to Big Data management inside Hadoop.
Securing Hadoop data is a hot topic for good reason – no matter where you are in your Hadoop implementation plans, it’s best to define your data security approach now, not later. Hortonworks and Voltage Security are focused on deeply integrating Hadoop with your existing data center technologies and team capabilities. Attend this discussion to learn about a central policy administration framework across security requirements for authentication, authorization, auditing and data protection.
Whether you are an insurer, reinsurer, broker or insurance service provider, everything you do is based on analytics. From underwriting to claims to agency and marketing, the smartest and most streamlined business operations at insurance companies are driven by advanced and intelligent analytics. But is your data ready? Are you an “Analytics Ready” insurer? Great analytics starts with great data management. Join us as industry experts from Informatica and Hortonworks share industry trends and best practices to show you how to become an “Analytics Ready” insurer.
Many organizations are leveraging social media to understand consumer sentiment and opinions about brands and products. Analytics in this area, however, is in its infancy and does not always provide a compelling result for effective business impact. Learn how consumer organizations can benefit by integrating social data with enterprise data to drive more profitable consumer relationships. This webinar is presented by Hortonworks and Clarity Solution Group, and will focus on the evolution of Hadoop, the clear advantage of Hortonworks distribution, and business challenges solved by “Consumer720.”
Learn how a successful Hadoop project moves from use case discovery to successful implementation of analytic insights to the ability to deliver predictive analysis. Hortonworks and CSC combine forces on this webinar to help answer the question, “How can I see results quickly and reliably in my big data project?” Whether your goal is to advance your career with a successful project, or exploit data insights for competitive intelligence, this discussion is for you. The first in a two-part series, this webinar will leave you with a list of resources that will help you advance your project quickly, no matter where you stand today. It is intended for both a business and IT audience with at least a fundamental understanding of Hadoop as a concept.
Join Cloudian, Hortonworks and 451 Research for a panel-style Q&A discussion about the latest trends and technology innovations in Big Data and Analytics. Matt Aslett, Data Platforms and Analytics Research Director at 451 Research, John Kreisa, Vice President of Strategic Marketing at Hortonworks, and Paul Turner, Chief Marketing Officer at Cloudian, will answer your toughest questions about data storage, data analytics, log data, sensor data and the Internet of Things. Bring your questions or just come and listen!
Big Data Analytics is transforming how banks and financial institutions unlock insights, make more meaningful decisions, and manage risk. Join this webinar to see how you can gain a clear understanding of the customer journey by leveraging Platfora to interactively analyze the mass of raw data that is stored in your Hortonworks Data Platform. Our experts will highlight use cases, including customer analytics and security analytics.
Speakers: Mark Lochbihler, Partner Solutions Engineer at Hortonworks, and Bob Welshmer, Technical Director at Platfora
Join this webinar and hear about key trends for Hadoop in 2015. You will learn:
Mike Gualtieri, Principal Analyst at Forrester, and John Kreisa, Vice President Strategic Marketing at Hortonworks, will discuss these trends and go over real world examples of strategies, architectures and results from leading edge companies.
Many enterprises are turning to Apache Hadoop to enable Big Data Analytics and reduce the costs of traditional data warehousing. Yet, it is hard to succeed when 80% of the time is spent on moving data and only 20% on using it. It’s time to swap the 80/20! The Big Data experts at Attunity and Hortonworks have a solution for accelerating data movement into and out of Hadoop that enables faster time-to-value for Big Data projects and a more complete and trusted view of your business. Join us to learn how this solution can work for you.
Hadoop is no longer optional. Companies of all sizes are in various phases of their own Big Data journey. Whether you are just starting to explore the platform or have multiple clusters up and running, everyone is presented with a similar challenge – developing their internal skillset. Hadoop specialists are hard to find. Hand coding is too prone to error when it comes to storing, integrating or analyzing your data. However, it doesn’t need to be this difficult.
Please join Talend and Hortonworks on this webinar where we will help you learn how to unify all your data in Hadoop, with no specialized Big Data skills.
You will learn:
Developers increasingly are building dynamic, interactive real-time applications on fast streaming data to extract maximum value from data in the moment. To do so requires a data pipeline, the ability to make transactional decisions against state, and an export functionality that pushes data at high speeds to long-term Hadoop analytics stores like Hortonworks Data Platform (HDP). This enables data to arrive in your analytic store sooner, and allows these analytics to be leveraged with radically lower latency. But successfully writing fast data applications that manage, process, and export streams of data generated from mobile, smart devices, sensors and social interactions is a big challenge.
Join Hortonworks and VoltDB, an in-memory scale-out relational database that simplifies fast data application development, to learn how you can ingest large volumes of fast-moving, streaming data and process it in real time. We will also cover how developing fast data applications is simplified, faster – and delivers more value when built on a fast in-memory, scale-out SQL database.
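The ingest-decide-export pattern described above can be sketched in a few lines. The following is a minimal, self-contained Python sketch, not the VoltDB or HDP API; every name here (`FastDataPipeline`, `EXPORT_BATCH_SIZE`, the sensor events) is illustrative. Events stream in, each one is checked against in-memory state for a transactional decision, and accepted events are batched for export to a downstream analytics store:

```python
from collections import defaultdict

EXPORT_BATCH_SIZE = 3  # flush to the analytics store every N accepted events


class FastDataPipeline:
    """Illustrative ingest -> decide-against-state -> batch-export loop."""

    def __init__(self, limit_per_device):
        self.limit = limit_per_device
        self.state = defaultdict(int)   # in-memory per-device counters
        self.batch = []                 # accepted events awaiting export
        self.exported = []              # stand-in for the HDP-bound sink

    def ingest(self, event):
        device = event["device"]
        # Transactional decision against current state: throttle noisy devices.
        if self.state[device] >= self.limit:
            return False
        self.state[device] += 1
        self.batch.append(event)
        if len(self.batch) >= EXPORT_BATCH_SIZE:
            self.flush()
        return True

    def flush(self):
        # A real deployment would push to HDFS/Hive here; we just record it.
        self.exported.extend(self.batch)
        self.batch = []


pipeline = FastDataPipeline(limit_per_device=2)
events = [{"device": "sensor-1", "value": v} for v in range(4)]
accepted = [pipeline.ingest(e) for e in events]
pipeline.flush()
# accepted -> [True, True, False, False]; two events reach the export sink
```

The point of keeping the decision state in memory, as the fast-data architecture above suggests, is that each event can be accepted or rejected in the same operation that updates the state, without a round trip to the long-term store.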
As Big Data analytics and the Apache Hadoop ecosystem have matured and gained increasing traction in established industries, with faster adoption in the insurance market than originally anticipated, it is clear that the potential benefits for data management and business intelligence are staggering. At the same time, many big data programs have stalled or failed to deliver on their aspirational value proposition, resulting in a substantial gap between the expectations of analytics consumers and the ability of big data analytics programs to deliver.
Join Hortonworks and Clarity as we review the common needs of Property and Casualty (P&C) Insurers and how to unlock the true value of big data analytics:
We will discuss a modern data architecture that constitutes a mature, enterprise-strength Hadoop framework for P&C Insurers and answers the need for governance processes across the enterprise stack. We will cover how a modern data architecture allows organizations to collect, store, analyze and manipulate massive quantities of data on their own terms, regardless of the source of that data, accelerating the real lifetime value of big data and Hadoop analytics for claims, customer sentiment and telematics.
The Smart Content Hub solution from HP and Hortonworks enables a shared content infrastructure that transparently synchronizes information with existing systems and offers an open standards-based platform for deep analysis and data monetization. Join this webinar and learn how you can: 1/ Leverage 100% of your data: text, images, audio, video, and many more data types can be automatically consumed and enriched using HP Haven and Hortonworks Data Platform. 2/ Democratize and enable multi-dimensional content analysis, and empower your analysts, business users, and data scientists to search and analyze Hadoop data with ease. 3/ Extend the enterprise data warehouse to synchronize and manage content from content management systems, and crack open files in whatever format they happen to be in. 4/ Dramatically reduce complexity with an enterprise-ready SQL engine. Watch the demo.
Join us to learn how Hortonworks Data Platform and Nimble Storage provide an enterprise-ready data platform for multi-workload data processing. HDP supports an array of processing methods — from batch through interactive to real-time, with key capabilities required of an enterprise data platform — spanning Governance, Security and Operations. Nimble Storage provides the performance, capacity, and availability for HDP and allows you to take advantage of Hadoop with minimal changes to existing data architectures and skillsets.
No matter if you are new to Hadoop or have a mature cluster in production, scale will be a critical factor of your success with Hadoop. Are you ready to take the next big step as you scale out your data architecture?
Please join Talend and Hortonworks on this webinar where we will help you learn how to implement an effective big data and Hadoop strategy across your IT infrastructure. You will learn:
How can you simplify the management and monitoring of your Hadoop environment, and ensure IT can focus on the right business priorities supported by Hadoop? Join Hortonworks and HP in this webinar to learn how.
Almost every week, news of a proprietary or customer data breach makes headlines. While attackers have increased the level of sophistication in their tactics, so too have organizations advanced in their ability to build a robust, data-driven defense. Join Hortonworks and Sqrrl to learn how a Modern Data Architecture with Hortonworks Data Platform (HDP) and Sqrrl Enterprise enables intuitive exploration, discovery, and pattern recognition over your big cybersecurity data.
In this webinar you will learn:
Many organizations have become aware of the importance of big data technologies, such as Apache Hadoop but are struggling to determine the right architecture to integrate it with their existing analytics and data processing infrastructure. As companies are implementing Hadoop, they need to learn new skills and languages, which can impact developer productivity. Often times they resort to hand-coded solutions which can be brittle, impact the productivity of the developer and the efficiency of the Hadoop cluster.
To truly tap into the business benefits of the big data solutions, it’s necessary to ensure that the business and IT have simple tools-based methods to get data in, change and transform it, and keep it continuously updated with their data warehouse.
In this webinar you’ll learn how the Oracle and Hortonworks solution can:
In this webinar we’ll discuss how technologies from both Oracle and Hortonworks can deploy the big data reservoir or data lake, an efficient cost-effective way to handle petabyte-scale data staging, transformations, and aged data requirements while reclaiming compute power and storage from your existing data warehouse.
Jeff Pollock, Vice President, Product Management, Oracle; and Tim Hall, Vice President, Product Management, Hortonworks
What if you could assemble all your data in one system and run your critical analytic applications in parallel, regardless of the format, age or location of the data? Today, thanks to the economics of Apache Hadoop-based data platforms, in particular YARN, this is possible. Listen to this replay and hear directly from our experts how SAS HPA and LASR have been integrated with Hadoop YARN.
Big Data is moving to the next level of maturity and it’s all about the applications. Dhruv Kumar, one of the minds behind Cascading, the most widely used and deployed development framework for building Big Data applications, will discuss how Cascading can enable developers to accelerate the time to market for their data applications, from development to production. In this session, Dhruv will introduce how to easily and reliably develop, test, and scale your data applications and then deploy them on Hadoop and Hortonworks Data Platform. He will also explain the growth behind Cascading and talk about Cascading’s future with Tez.
Financial services companies can reap tremendous benefits from ‘Big Data’ and they have moved quickly to deploy it. But these companies also place heavy demands on ‘Big Data’ infrastructure for flexibility, reliability and performance. In this webinar, Hortonworks joins WANDisco to look at three examples of using ‘Big Data’ to get a more comprehensive view of customer behavior and activity in the banking and insurance industries. Then we’ll pull out the common threads from these examples, and see how a flexible next-generation Hadoop architecture lets you get a step up on improving your business performance. Join us to learn:
What if your organization could study months and years worth of historical data from disparate sources, without sampling, to pinpoint risks for your business and compliance reporting?
These risks come from uncertainty in financial markets, threats from project failures, legal liabilities, credit risk, accidents and natural disasters as well as deliberate attack from an adversary.
Join this webinar and learn more about the Splunk and Hortonworks offering for risk management. You will be able to watch a live demo of Hunk + HDP for risk management and ask your questions.
Don’t miss this opportunity to see first-hand how Hunk and Hortonworks Hadoop provide a unique approach to your risk management needs!
Massive new data volumes are forcing a transformation in the data center and driving a new modern data architecture that includes Apache Hadoop. As organizations are developing new analytic applications to drive their business forward, many of these new applications are being deployed with Hadoop and HP hardware to meet the growing demands of their data.
Join Hortonworks and HP as we discuss trends and drivers for a modern data architecture and what that means in the data center. Our experts will walk you through some key design considerations when deploying a Hadoop cluster in production. We’ll also share practical best practices around HP and Hortonworks Data Platform to get you started on building your modern data architecture.
At the center of many data-driven businesses is Scalding. Scalding is a Scala library based on the Cascading framework and is designed to simplify application development on Hadoop and YARN.
Please join us as Jonathan Coveney, Sr. Software Engineer at Twitter, teaches us about Scalding, and how Twitter uses it to perform a variety of tasks such as traffic quality measurement, ad targeting, market insight, and more.
Register at the right for this webinar, or join us for the YARN Ready series.
Red Hat JBoss Data Virtualization and HDP: Enabling the Data Lake (demo and deep dive)
Data is exponentially increasing in both types and volumes, creating opportunities for businesses. To fully realize the potential of this new data, analysts recommend the shift from a single platform to a data ecosystem. Multiple systems are needed to exploit the variety and volume of data sources, and a flexible data repository such as a data lake is needed to store the data. Technologically speaking, Apache Hadoop 2 enables true data lake architectures. The introduction of YARN in particular added a pluggable framework that enabled new data access patterns in addition to MapReduce. An intelligent data management layer is needed to manage metadata and usage patterns as well as track consumption across these data platforms.
Join us in this webinar as our panel of experts discusses how Hadoop can be used alongside the Enterprise Data Warehouse and with Data Integration tools to enable the optimization of data processing workloads for more efficient use of resources.
Data is exponentially increasing in both types and volumes, creating opportunities for businesses. Watch this video and learn from three Big Data experts: John Kreisa, VP Strategic Marketing at Hortonworks, Imad Birouty, Director of Technical Marketing at Teradata, and John Haddad, Senior Director of Product Marketing at Informatica.
Join Ofer Mendelevitch, Director of Data Science at Hortonworks, and Michael Zeller, Founder and CEO of Zementis, as they present key lessons about what drives successful implementations of big data analytics projects. Their knowledge comes from working with dozens of companies, from small cloud-based start-ups to some of the largest companies in the world. To learn more and to register, click the register button at the right.
Red Hat JBoss Data Virtualization and HDP: Evolving your data into a strategic asset (demo/deep dive)
Learn how organizations combine HP Vertica Analytics Platform and Hortonworks to quickly explore and analyze a broad variety of data types. Learn how they transform unstructured data into actionable information to better understand customers, both offline and online.
Red Hat and Hortonworks: Delivering the Open Modern Data Architecture
Now that you are moving your Hadoop POC into production, you know that sensitive customer and corporate data (credit card numbers, intellectual property, customer files, and more) need protection. Now the question becomes: How do you keep all this sensitive data secure, as it moves into Hadoop, as it is stored and as it moves beyond Hadoop? And, most importantly, how do you secure the data and still make it available for analytics? After all, analytics is why it’s in Hadoop in the first place.
Join Hortonworks and partner Voltage Security to learn about security options with authentication, authorization, monitoring and data-level security for Apache Hadoop. Get insight to the use cases and architectural decisions that enable the business benefits you need to deliver, while avoiding risks straight from today’s headlines, including cyber attacks and leaking of sensitive customer data. Use cases we will cover include:
Learn how to protect your sensitive data, enable analytics without security risk, and neutralize breaches through new data-centric technologies that are easy to integrate with Hive, Sqoop, MapReduce and many other interfaces.
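Data-centric protection of this kind can be illustrated with a simple field-level tokenization sketch in Python. This is illustrative only and not the Voltage product API; the key, field names, and records are all made up. Sensitive fields are replaced with deterministic keyed tokens before records land in Hadoop, so joins and group-bys still work on the tokens while the raw values never enter the cluster:

```python
import hashlib
import hmac

SECRET_KEY = b"demo-key"  # illustrative; real deployments use managed keys
SENSITIVE_FIELDS = {"card_number", "ssn"}


def tokenize(value):
    """Deterministic keyed token: equal inputs map to equal tokens,
    so aggregations over the tokenized field remain possible."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]


def protect_record(record):
    """Replace sensitive fields with tokens before loading into Hadoop."""
    return {k: tokenize(v) if k in SENSITIVE_FIELDS else v
            for k, v in record.items()}


records = [
    {"customer": "alice", "card_number": "4111111111111111", "amount": 42},
    {"customer": "bob",   "card_number": "4111111111111111", "amount": 7},
]
protected = [protect_record(r) for r in records]
```

Because the same card number always produces the same token, per-card analytics (counts, sums, fraud patterns) still run on the protected data, which is the essence of securing the data while keeping it available for analytics.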
Join Hortonworks and Cisco as we discuss trends and drivers for a modern data architecture. Our experts will walk you through some key design considerations when deploying a Hadoop cluster in production. We’ll also share practical best practices around Cisco-based big data architectures and Hortonworks Data Platform to get you started on building your modern data architecture.
Join Hortonworks and CSC, in this interactive webinar to:
Join us in this interactive webinar as we walk through use cases on how you can use SAS In-Memory Statistics for Hadoop and SAS Visual Statistics with the Hortonworks Data Platform (HDP) to reveal insights in your big data and redefine how your organization solves complex problems. Hortonworks and SAS together offer unprecedented speed and flexibility – critical requirements for finding actionable knowledge in massive amounts of data stored in Hadoop.
As more applications are created using Apache Hadoop that derive value from the new types of data from sensors/machines, server logs, click-streams, and other sources, the enterprise “Data Lake” forms with Hadoop acting as a shared service. While these Data Lakes are important, a broader life-cycle needs to be considered that spans development, test, production, and archival and that is deployed across a hybrid cloud architecture.
If you have already deployed Hadoop on-premise, this session will also provide an overview of the key scenarios and benefits of joining your on-premise Hadoop implementation with the cloud, by doing backup/archive, dev/test or bursting. Learn how you can get the benefits of an on-premise Hadoop that can seamlessly scale with the power of the cloud.
Join Revolution Analytics and Hortonworks in this interactive webinar to discuss how customers are using Hadoop and R in the real world for Data Mining and Predictive Analytics. We’ll show an end-to-end customer churn analytics demonstration (leveraging Revolution Analytics, Hortonworks and Tableau) serving three user personas: a website visitor, a data scientist and a business analyst.
Most successful Apache Hadoop implementations start small in scope and scale with a single analytic application but can quickly grow. Mature deployments can have many applications running off a single shared data lake. An improved application lifecycle will accelerate the creation of new apps to meet new business needs. As the Hadoop environment grows and becomes increasingly more integrated with existing data center technologies and skills, it becomes more critical to efficiently manage this composite landscape to help organizations realize the benefits of Big Data.
Join this webinar to hear from BMC and Hortonworks on how their combined solutions help customers unlock the value of Big Data by implementing a modern data architecture that is managed with proven, enterprise-grade solutions.
Are your business users able to quickly access and report on the massive amount of data flowing into Hadoop? Learn how leading companies are already accelerating the speed of innovation using the combination of Hadoop and the Actian Analytics Platform. In this webinar, Hortonworks and Actian will describe how you can:
Register now to accelerate Big Data 2 with Hortonworks and Actian!
Difficult challenges and choices face today’s healthcare and pharmaceutical industry. Listen to this replay and hear from industry leaders in pharmaceutical, healthcare, and Big Data technologies on how they’re unleashing Big Data to drive real business impact.
In this webinar, Gerard will present how Merck is leveraging Big Data, along with best practices you can use to implement a similar outcome in your organization.
Join Hortonworks and Concurrent to learn how to accelerate your big data application development with the popular Cascading framework and Hortonworks Data Platform. In this webinar, we will:
In this webinar, Charles Boicey will present how UC Irvine Health turned to Hadoop and Hortonworks Data Platform to improve clinical operations in the hospital and its scientific research at the medical school. Their team is building a quantified medical practice that reduces re-admissions, speeds new research projects, and tracks patient vital stats on a minute-by-minute basis.
Retailers need the complete picture – a 360-degree view of the customer, the pulse on brand sentiment, personalized promotions, and an optimal shopping experience. When Hadoop is integrated with modern retail operations, it dramatically reduces the cost of capturing, ingesting and storing data. By implementing self-service analytics capabilities on top of Hadoop, you can drill down into customer data, instantly, to learn more about your customers.
Join Hortonworks and Platfora to discuss:
What if your organization could obtain a 360 degree view of the customer across offline, online and social and mobile channels?
Attend this webinar with Splunk and Hortonworks and see examples of how marketing, business and operations analysts can reach across disparate data sets in Hadoop to spot new opportunities for up-sell and cross-sell. We’ll also cover examples of how to measure buyer sentiment and changes in buyer behavior, along with best practices on how to use data in Hadoop with Splunk to assign customer influence scores that online, call-center, and retail branches can use to customize more compelling products and promotions.
Join Hortonworks and Actian, as we address the challenges faced by companies trying to implement their Big Data Strategy. In this webinar, we will identify some of the top challenges around analytics with big data and highlight how existing skills can be used to solve these challenges. Additionally, we will also provide real-world use cases on the integration between Hortonworks Data Platform and Actian Analytics Platform, which aims to simplify the delivery of data services for entire ecosystems of users, and significantly lower the total cost of ownership for the modern data platform.
Is Hadoop ready for high-concurrency complex BI and Advanced Analytics? Roaring performance and fast, low-latency execution is possible when an in-memory analytical platform is paired with the Apache Hadoop framework. Join Hortonworks and Kognitio for an informative Web Briefing on putting Hadoop at the center of your modern data architecture—with zero disruption to business users.
In this webinar, we’ll:
Apache, Hadoop, Falcon, Atlas, Tez, Sqoop, Flume, Kafka, Pig, Hive, HBase, Accumulo, Storm, Solr, Spark, Ranger, Knox, Ambari, ZooKeeper, Oozie, Metron and the Hadoop elephant and Apache project logos are either registered trademarks or trademarks of the Apache Software Foundation in the United States or other countries.