
The legacy Hortonworks Forum is now closed. You can view a read-only version of the former site by clicking here. The site will be taken offline on January 31, 2016.

Hortonworks Sandbox Forum

Suggestion: new tutorial

  • #27709

    I have a suggestion for another tutorial.

    I did all the tutorials except for the last one (for Mac), and what I learned is that there is still a lot of work to do to make them user-friendly. The logs are not really readable for anyone who is used to working with just an interface, and the interface is limited in functionality for the moment.

    So it will mostly be people with some kind of IT background who follow these tutorials, and to go further the interface is just too limited at the moment. So my suggestion is to make a tutorial about command-line commands.
    Make a tutorial where files are uploaded using the hadoop command, and explain what you can do with this command.

    Or you could make a mix of interface and command line.

    I had trouble uploading a file to the virtual machine: I had no shared drives, USB did not work, …. So what I did was upload the file through the tutorial interface. Next I logged in to the virtual machine as root and started searching for those files, but had no luck finding them. That is when I learned about the hadoop fs command.

    Would have been nice if that was explained in a tutorial.

    Best regards,


  • Author
  • #27747

    Hi Wim,

    Thanks for the suggestion, I will pass it along.

    The ‘hadoop fs’ commands, as well as a lot more, are explained quite well in ‘Hadoop: The Definitive Guide’.
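    For reference, the basic file operations look something like the sketch below. The paths are made up for illustration, and the script bails out quietly if the hadoop CLI is not on the PATH (e.g. when run outside the Sandbox):

```shell
#!/bin/sh
# Illustrative 'hadoop fs' file operations (hypothetical paths).
# Skip quietly when the hadoop CLI is not available, e.g. outside the Sandbox.
command -v hadoop >/dev/null 2>&1 || { echo "hadoop CLI not found; run inside the Sandbox"; exit 0; }

# Copy a local file into HDFS
hadoop fs -put /tmp/mydata.csv /user/hue/mydata.csv

# List the target directory to confirm the upload landed
hadoop fs -ls /user/hue

# Print the file back out of HDFS
hadoop fs -cat /user/hue/mydata.csv
```

    Note that files uploaded through the web interface go into HDFS, not into the local filesystem of the virtual machine, which is why searching the local disk as root turns up nothing.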


The forum ‘Hortonworks Sandbox’ is closed to new topics and replies.

Support from the Experts

An HDP Support Subscription connects you with experts who have deep experience running Apache Hadoop in production, at scale, on the most demanding workloads.

Enterprise Support »

Become HDP Certified

Real-world training designed by the core architects of Hadoop. Scenario-based training courses are available in-classroom or online from anywhere in the world.

Training »

Hortonworks Data Platform
The Hortonworks Data Platform is a 100% open source distribution of Apache Hadoop that is truly enterprise grade, having been built, tested and hardened with enterprise rigor.
Get started with Sandbox
Hortonworks Sandbox is a self-contained virtual machine with Apache Hadoop pre-configured alongside a set of hands-on, step-by-step Hadoop tutorials.
Modern Data Architecture
Tackle the challenges of big data. Hadoop integrates with existing EDW, RDBMS and MPP systems to deliver lower cost, higher capacity infrastructure.