YARN Forum

Hive Insert Overwrite

  • #57603
    Diwakar Dhanuskodi
    Participant

    I'm having an issue with the YARN ResourceManager while running a Hive INSERT OVERWRITE command on a partitioned table.
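    The statement I'm running has this general shape (the table, column, and partition names below are placeholders, not the actual ones from my job):

    -- Illustrative sketch only; placeholder table, column, and partition names.
    INSERT OVERWRITE TABLE sales_partitioned PARTITION (ds = '2014-07-22')
    SELECT order_id, customer_id, amount
    FROM sales_staging
    WHERE ds = '2014-07-22';

    Relevant log excerpts: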

    org.apache.hadoop.ipc.Server: Socket Reader #3 for port 54711: readAndProcess from client 100.73.124.3 threw exception [java.io.IOException: An existing connection was forcibly closed by the remote host]
    java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:197)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:379)
    at org.apache.hadoop.ipc.Server.channelRead(Server.java:2558)
    at org.apache.hadoop.ipc.Server.access$2800(Server.java:130)
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1459)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:750)
    at org.apache.hadoop.ipc.Server$Listener$Reader.doRunLoop(Server.java:624)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:595)
    2014-07-22 07:58:10,128 INFO [DelayedContainerManager] org.apache.tez.dag.app.rm.TaskScheduler: No taskRequests. Container's session delay expired or is new. Releasing container, containerId=container_1405689464469_0007_01_000022, containerExpiryTime=1406015890002, sessionDelay=10000, taskRequestsCount=0, heldContainers=3, delayedContainers=0, isNew=false
    2014-07-22 07:58:10,128 INFO [DelayedContainerManager] org.apache.tez.dag.app.rm.TaskScheduler: Releasing unused container: container_1405689464469_0007_01_000022
    2014-07-22 07:58:10,128 INFO [AsyncDispatcher event handler] org.apache.tez.dag.app.rm.container.AMContainerImpl: AMContainer container_1405689464469_0007_01_000022 transitioned from IDLE to STOP_REQUESTED via event C_STOP_REQUEST
    2014-07-22 07:58:10,128 INFO [ContainerLauncher #8] org.apache.tez.dag.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_STOP_REQUEST
    2014-07-22 07:58:10,128 INFO [ContainerLauncher #8] org.apache.tez.dag.app.launcher.ContainerLauncherImpl: Sending a stop request to the NM for ContainerId: container_1405689464469_0007_01_000022
    2014-07-22 07:58:10,128 INFO [AsyncDispatcher event handler] org.apache.tez.dag.app.rm.container.AMContainerImpl: AMContainer container_1405689464469_0007_01_000022 transitioned from STOP_REQUESTED to STOPPING via event C_NM_STOP_SENT
    2014-07-22 07:58:10,144 INFO [Socket Reader #5 for port 54711] org.apache.hadoop.ipc.Server: Socket Reader #5 for port 54711: readAndProcess from client 100.73.124.3 threw exception [java.io.IOException: An existing connection was forcibly closed by the remote host]
    java.io.IOException: An existing connection was forcibly closed by the remote host
