Windows 2012 STD R2 Server – Flume Errors

This topic contains 1 reply, has 1 voice, and was last updated by S R 4 months, 1 week ago.

  • Creator
    Topic
  • #51386

    S R
    Participant

    Hello:
    I am getting the following error message, and it repeats in a loop. I would appreciate any pointers if someone has already fixed this. I am using a spoolDir source for the agent. The syslog file is sitting in the Ingest folder, and I can also see a data.<Number>.seq file in HDFS.

    09 Apr 2014 14:57:51,469 ERROR [pool-9-thread-1] (org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run:173) - Uncaught exception in Runnable
    java.lang.IllegalStateException: Serializer has been closed
    at org.apache.flume.serialization.LineDeserializer.ensureOpen(LineDeserializer.java:124)
    at org.apache.flume.serialization.LineDeserializer.readEvents(LineDeserializer.java:88)
    at org.apache.flume.client.avro.ReliableSpoolingFileEventReader.readEvents(ReliableSpoolingFileEventReader.java:221)
    at org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run(SpoolDirectorySource.java:160)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:744)

    Software Versions
    HDP Version : 2.0
    OS : Windows Server 2012 R2 Standard

    Flume Configuration File:
    # Name the components on this agent
    agent.sources = WinHadoopC1Source
    agent.sinks = WinHadoopC1Sink1
    agent.channels = WinHadoopC1Channel1

    # Describe/configure the source
    agent.sources.WinHadoopC1Source.type=spooldir
    agent.sources.WinHadoopC1Source.spoolDir = C:/flume_spooldir
    agent.sources.WinHadoopC1Source.fileHeader = true

    # Describe the sink
    agent.sinks.WinHadoopC1Sink1.type=hdfs
    agent.sinks.WinHadoopC1Sink1.hdfs.path = hdfs://WIN-ATKSGSRL5DL/logspooldir
    agent.sinks.WinHadoopC1Sink1.hdfs.rollSize = 1024000
    agent.sinks.WinHadoopC1Sink1.hdfs.fileType = SequenceFile
    agent.sinks.WinHadoopC1Sink1.hdfs.filePrefix = data
    agent.sinks.WinHadoopC1Sink1.hdfs.fileSuffix = .seq
    agent.sinks.WinHadoopC1Sink1.hdfs.idleTimeout=60

    # Use a channel which buffers events in memory
    agent.channels.WinHadoopC1Channel1.type = memory
    agent.channels.WinHadoopC1Channel1.capacity = 100000
    agent.channels.WinHadoopC1Channel1.transactionCapacity = 10000

    # Bind the source and sink to the channel
    agent.sources.WinHadoopC1Source.channels = WinHadoopC1Channel1
    agent.sinks.WinHadoopC1Sink1.channel = WinHadoopC1Channel1
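
    The data.<Number>.seq files mentioned above do show up under the sink path. For reference, a check like the one below is how they can be listed (it uses the hdfs.path from the config above and assumes the hdfs client is on the PATH):

    REM List the sequence files the HDFS sink has written so far
    hdfs dfs -ls /logspooldir
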
    Thanks, Satya Raju

  • Author
    Replies
  • #51844

    S R
    Participant

    Hi,
    I was able to test this successfully on a single-node Windows installation. I gave the Hadoop user Read/Write permissions on the spool directory, and after that it started working. Interestingly, I then went back and removed the 'Hadoop' user object from that folder and it still works, so I am not sure of the root cause.
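
    For anyone hitting the same thing, the permission change can be reproduced from an elevated command prompt with something like the line below (a sketch only; 'hadoop' stands for whatever account the Flume agent and Hadoop services run as on your machine):

    REM Grant the service account modify (read/write) rights on the spool directory.
    REM (OI)(CI) makes the grant inherit to files and subfolders; the account name is an example.
    icacls "C:\flume_spooldir" /grant hadoop:(OI)(CI)M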
