Hortonworks Sandbox Forum

Suddenly sandbox can't connect

  • #46192
    Mary Dietess

    I have a sandbox 1.3 on Hyper-V.
    Usually when I run MapReduce over 1 million records in a single file, everything is OK, but this time I tried 60 million records split across several input files. The job started, got a few percent into the map phase, and then the sandbox showed me this repeating error:

    INFO ipc.Client: Retrying connect to server: sandbox/ Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)

    Since then I have tried restarting the Hadoop services and the whole sandbox, but it goes one of two ways: either it reacts to my commands very, very slowly, or it never completes the MapReduce job and produces no logging. Hadoop starts the job, begins mapping, and hangs at 0% with nothing in the logs or in the _temporary output folder.


    It just repeats this “I can’t connect to myself” error, even for dfs commands. When that happens I have to restart the NameNode.

    What could it be, and why? I was using the Hortonworks Hadoop Sandbox 1.3 out of the box, and working with small amounts of data was fine. All I changed was the amount of input data.

    Sorry for my English.
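    For context on why the job looks frozen rather than failed: the retry policy in the pasted error means the client silently retries for quite a while before giving up. A small sketch parsing the numbers out of that line (pure shell; nothing sandbox-specific is assumed, and the hostname after "sandbox/" is left truncated as in the original post):

```shell
# The error line from the post, stored for parsing:
log='INFO ipc.Client: Retrying connect to server: sandbox/ Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1 SECONDS)'

# Pull the retry count and per-retry sleep out of the policy string:
retries=$(printf '%s\n' "$log" | grep -o 'maxRetries=[0-9]*' | cut -d= -f2)
sleep_s=$(printf '%s\n' "$log" | grep -o 'sleepTime=[0-9]*' | cut -d= -f2)

# With a fixed sleep between attempts, the client can wait this long
# before the connect attempt actually fails:
echo "worst-case wait before the client gives up: ~$((retries * sleep_s)) seconds"
# prints: worst-case wait before the client gives up: ~50 seconds
```

    So a job that appears stuck at 0% may simply be cycling through these retries against a NameNode that is down or unresponsive.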


  • Author
  • #46193

    Hi Mary,

    Can you run a “df” on the sandbox and see if any of the filesystems are full?
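    For anyone following along: the suggestion is to check local disk usage on the VM, since a full local filesystem stalls HDFS writes and can hang jobs. A minimal sketch (the 90% threshold is an arbitrary example, not a Hadoop setting):

```shell
# Human-readable overview of every mounted filesystem:
df -h

# Print only filesystems above 90% used
# (POSIX -P keeps each entry on one line so awk sees consistent columns):
df -P | awk 'NR > 1 && int($5) > 90 {print $6 " is " $5 " full"}'
```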



    Mary Dietess

    Hi, Dave.

    Sorry for the slow reply – different timezone :)
    No, none of the filesystems are full. And one interesting detail: while MapReduce is running, the virtual machine uses only ~1–3% of the CPU.


    Hi Mary,

    What about memory constraints – can you check how much the Java process is using and compare it to its Xmx?
    The Sandbox is configured to use 2GB of memory (iirc), so the settings are usually fairly low.
    If it does turn out to be memory related, then I would suggest building your own single-node cluster on more powerful hardware – or you can reconfigure the settings on the sandbox. However, we do not recommend doing this and cannot advise what settings you should use.
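    The comparison Dave suggests can be done with standard tools on the sandbox's Linux. A sketch that prints each Java process's resident memory next to its configured -Xmx (the column choices and output format here are just one way to do it):

```shell
# For every running Java process, show resident memory (RSS, in KB)
# alongside its -Xmx flag so the two can be compared at a glance.
# The [j]ava trick keeps the grep process itself out of the matches.
ps -eo rss,args | grep '[j]ava' | while read -r rss args; do
  xmx=$(printf '%s\n' "$args" | grep -o 'Xmx[0-9]*[kKmMgG]' | head -n 1)
  echo "rss=${rss}KB xmx=${xmx:-unset}"
done
```

    A daemon whose RSS sits right at its Xmx ceiling is a good candidate for the memory pressure Dave describes.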



