How to Use Open Enterprise Hadoop for Big Data Success

Recorded on August 25th, 2015

The emergence of Big Data has driven the need for a new data platform within the enterprise. Apache Hadoop has emerged as the core of that platform and is driving transformative outcomes across every industry. Join this webinar for an overview of the technology and how it fits within the enterprise, and to gain insight into some of the key initial use cases that are driving these transformations.

Comments

  • Dear Sir,
    I am looking for the help file for HDP 2.3. Please upload it so that I can set up the Hadoop environment on my standalone PC.

    Waiting for your response

    • Hi Ram, please click through to the following link, which will give you access to the latest Hortonworks Sandbox. This is a fully self-contained version of HDP in a virtual machine that can be run on your laptop (VMware, Hyper-V or VirtualBox): https://hortonworks.com/hdp/downloads/

      The install guides are provided there; for more information you can also find all documentation for HDP at http://docs.hortonworks.com

      From the same page you will also see a number of tutorials to get you started. Once the Sandbox is up, the sketch below shows one way to confirm that HDFS inside the VM is reachable.
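
      A minimal smoke test, assuming the default Sandbox hostname and NameNode port (sandbox.hortonworks.com:8020; adjust these to match your VM), is to connect with the Hadoop Java FileSystem API and list the root directory:

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.fs.FileStatus;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;
      import java.net.URI;

      public class SandboxSmokeTest {
          public static void main(String[] args) throws Exception {
              // Assumed Sandbox address; replace with your VM's hostname/IP and NameNode port
              URI hdfsUri = URI.create("hdfs://sandbox.hortonworks.com:8020");

              // Connect to HDFS inside the Sandbox and list the top-level directories
              try (FileSystem fs = FileSystem.get(hdfsUri, new Configuration())) {
                  for (FileStatus status : fs.listStatus(new Path("/"))) {
                      System.out.println(status.getPath());
                  }
              }
          }
      }

      Compile it against the hadoop-client libraries; if the listing comes back, the Sandbox's HDFS is running and reachable from your machine.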

  • Question: I've also seen many demos where the user connects with Excel and downloads data from the server; is this a normal procedure? It would require a lot of bandwidth… is this the way to proceed?

  • Very interesting session – you mentioned that any sort of data can be stored in Hadoop… Could you please explain where the data sits physically?

    • Hi Francesco,

      All data within HDP gets stored in HDFS, the distributed filesystem that runs on commodity hardware. The DataNodes in the cluster have locally attached disks, and that is where the data is physically stored. The sketch below shows one way to see this from a client.
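
      To make "where does it sit physically" concrete, here is a minimal sketch that asks HDFS which DataNodes hold the blocks of a file, using the FileSystem.getFileBlockLocations API; the path /tmp/example.txt is just an illustrative placeholder for any file already in HDFS:

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.fs.BlockLocation;
      import org.apache.hadoop.fs.FileStatus;
      import org.apache.hadoop.fs.FileSystem;
      import org.apache.hadoop.fs.Path;

      public class ShowBlockLocations {
          public static void main(String[] args) throws Exception {
              // Placeholder path; point this at any file that already exists in HDFS
              Path file = new Path(args.length > 0 ? args[0] : "/tmp/example.txt");

              try (FileSystem fs = FileSystem.get(new Configuration())) {
                  FileStatus status = fs.getFileStatus(file);

                  // Each BlockLocation names the DataNodes holding a replica of that block
                  for (BlockLocation block : fs.getFileBlockLocations(status, 0, status.getLen())) {
                      System.out.printf("offset=%d length=%d hosts=%s%n",
                              block.getOffset(), block.getLength(),
                              String.join(", ", block.getHosts()));
                  }
              }
          }
      }

      The hosts printed are the machines whose locally attached disks hold the block replicas; that is where the data physically resides.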

  • What is the optimal size for the VM to be able to have a flawless Hadoop experience? I have downloaded the VMs, though I have not yet had meaningful interactions with the system. Is there any additional software required to set up a use case if I have some raw data, some of it made up of tables or connections to SQL data sources, and some of it in the form of GIS data?

  • I have installed Hadoop on my PC. After installation I tried to start the virtual machine, but I got a message saying my system has only 2 CPUs while the virtual machine requires 4 CPUs. What do I have to do about this?
