6:30 pm – 6:50 pm Networking & Mingling
6:50 pm – 7:00 pm Introductions…
7:00 pm – 7:40 pm Speaker 1 (Madhan Neethiraj)
7:40 pm – 7:55 pm Q & A
7:55 pm – 8:00 pm Break
8:00 pm – 8:40 pm Speakers 2 & 3 (Pavan Venkatesh & Venkat Vempati)
8:40 pm – 8:55 pm Q & A
Title: Enhanced Hadoop Security with Apache Ranger and Protegrity
Enterprises are adopting Hadoop as the center of the modern data-driven architecture, and customers expect advanced data security and governance to be embedded within Hadoop and related projects. Apache Ranger provides centralized security administration for the Hadoop ecosystem. With the recent 0.5 release, Apache Ranger now provides centralized access control and auditing for many Hadoop-related projects, including Hadoop, Hive, HBase, Storm, Kafka, Solr, YARN, and Knox. Ranger has also introduced a service-based stack that partners can easily extend, bringing the centralized security framework to many more applications.
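To make the centralized access-control model concrete, the sketch below builds a Ranger-style policy payload of the kind that can be submitted to Ranger's public REST API. The service name, database, table, and group here are hypothetical, and the JSON shape is an illustrative simplification of Ranger's policy schema, not an exhaustive one:

```python
import json

def make_hive_policy(service, database, table, group, accesses):
    """Build a Ranger-style Hive access policy payload.

    Illustrative schema only: real Ranger policies carry additional
    fields (isEnabled, isAuditEnabled, resource flags, etc.).
    """
    return {
        "service": service,  # hypothetical Ranger service instance name
        "name": f"{database}.{table}-policy",
        "resources": {
            "database": {"values": [database]},
            "table": {"values": [table]},
            "column": {"values": ["*"]},
        },
        "policyItems": [
            {
                "groups": [group],
                "accesses": [{"type": a, "isAllowed": True} for a in accesses],
            }
        ],
    }

# Grant the "analysts" group SELECT on sales.orders (names are made up).
policy = make_hive_policy("dev-hive", "sales", "orders", "analysts", ["select"])
print(json.dumps(policy, indent=2))
```

In a live deployment, a payload like this would typically be POSTed to the Ranger Admin server, which then distributes the policy to the Hive plugin for enforcement and auditing.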
Protegrity provides enterprise-grade encryption for Hadoop and many traditional databases, adding value in protecting sensitive data. With Apache Ranger and Protegrity, enterprise users can secure their data across Hadoop, protect sensitive data, and ensure compliance with both internally and externally driven mandates.
Madhan Neethiraj is an Apache committer and currently works at Hortonworks as Engineering Lead on the Enterprise Security team. Prior to this, Madhan was at Oracle, working on the development of its security access management suite, governance, and real-time fraud detection/prevention products. Before Oracle, he worked at Bharosa Inc., where he was responsible for the development of a real-time fraud detection solution for financial institutions, healthcare, and eCommerce. Madhan is the main author of and contributor to the Apache Ranger audit framework and was responsible for making it scalable enough to meet the high demands of the Hadoop platform.
Title: Security Best Practices for Hortonworks using Protegrity
Abstract: “Exponential Data Growth” is driving organizations to leverage Hadoop for storing vast amounts of data. What’s the best way to address sensitive data that is transferred and processed inside Hadoop? How can we secure the Hadoop platform and other elements in its ecosystem? This talk briefly explains how Protegrity can help Hortonworks customers have a highly secure environment (in and around Hadoop), with technical details on how we protect data, followed by a sandbox demo.
Pavan Venkatesh has broad experience with open-source databases, NoSQL, and Big Data environments. He is currently Head of Products at Protegrity. Previously he held various Product Management positions at DataStax (Cassandra) and Schooner. Prior to this, Pavan was a Solutions Engineer for MySQL (Oracle) & Basho (Riak), where he helped many clients successfully implement database & HA solutions. Pavan holds an M.S. in Computer Science from Syracuse University and a B.E. in Electrical and Electronics from the National Institute of Engineering, India.
Venkat Vempati is a Senior Systems Engineer at Protegrity. He has several years of experience working with Big Data environments, enterprise-level data systems, and the Hadoop ecosystem. Prior to Protegrity, Venkat worked as a Solution Architect at Equifax in Atlanta, where he participated in building their Big Data team from inception. Before that, Venkat was a Senior Hadoop Engineer at Lockheed Martin Corporation. Venkat holds an M.S. in Computer Engineering from the University of Toledo, Ohio, and a B.S. in Electronics and Communication Engineering from Osmania University, India.