Sqoop export from HDFS into SAP HANA

This topic contains 7 replies, has 3 voices, and was last updated by Koelli Mungee 7 months, 3 weeks ago.

  • Creator
    Topic
  • #45808

    Felix Bekcer
    Participant

    Hi everybody,

I've been working with Hadoop and SAP HANA for around 3 months now, and I'm thoroughly stuck.
    My Hadoop system imports and converts logfiles via a Pig script and generates a result, which I finally want to export to the SAP HANA appliance.

I wanted to use Sqoop for this task, but at the moment I'm puzzled by the error message and the behavior of the system.
    To get familiar with Sqoop, I generated a test file with some simple input like this:

    1
    3
    5

On the other side, I created a table in SAP HANA with a single text column.
    I used this Sqoop command to export the HDFS test file into SAP HANA:

sqoop export --connect jdbc:sap://saphana.XXX.XXX:30115/ --driver com.sap.db.jdbc.Driver --table XXX.XXX --username XXX --password XXX --export-dir /user/tom/test

And it worked: I saw 1, 3 and 5 on the SAP HANA side.
    Then I tried to export more data, still in the same simple format:

    1
    3
    5
    7
    9
    1

And then I got an error message which doesn't seem to have anything to do with the change to my test file:

    15:18:44 INFO – org.apache.hadoop.mapred.JobClient.monitorAndPrintJob(1422) | Task Id : attempt_201312180839_0052_m_000000_0, Status : FAILED
    java.io.IOException: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [257]: sql syntax error: incorrect syntax near “,”: line 1 col 35 (at pos 35)

Could someone help me with this nasty problem?

    Suerte

Viewing 7 replies - 1 through 7 (of 7 total)


  • Author
    Replies
  • #46473

    Koelli Mungee
    Moderator

    Hi Suerte,

We apologize for the delay. Assuming you are still seeing this problem, would it be possible to send in the full stack trace from the error? Is there a difference between the first two lines and the rest of the lines that are not being exported? Is there a timeout on the SAP HANA side that you could be hitting?

    thanks,
    Koelli

    #46250

    Vick R
    Participant

Thanks Felix, and have a Happy New Year!

    #46218

    Felix Bekcer
    Participant

    Hi Vick R,

Here you can find the client, studio and driver.

    Have fun ;)

    https://hanadeveditionsapicl.hana.ondemand.com/hanadevedition/

    #46212

    Vick R
    Participant

    Felix,
I was looking at this thread. Although I can't help right now, I have a question regarding which drivers to use to connect HDFS to HANA. I am familiar with SAP SDA, but when I searched the SAP Marketplace I was unable to find the correct drivers. From your message above, it looks like you are using JDBC drivers? Are there specific ones, and how do I get them?

    Vick

    #46206

    Felix Bekcer
    Participant

    [root@localhost attempt_201312270310_0023_m_000000_0]# cat syslog
    2013-12-28 15:33:18,150 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
    2013-12-28 15:33:18,703 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
    2013-12-28 15:33:18,711 INFO org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@2830ae41
    2013-12-28 15:33:18,979 INFO org.apache.hadoop.mapred.MapTask: Processing split: Paths:/user/tom/part2:0+10
    2013-12-28 15:33:19,706 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
    2013-12-28 15:33:19,710 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3]
    2013-12-28 15:33:19,717 WARN org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library is available
    2013-12-28 15:33:19,717 INFO org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library loaded
    2013-12-28 15:33:19,748 DEBUG org.apache.sqoop.mapreduce.AutoProgressMapper: Instructing auto-progress thread to quit.
    2013-12-28 15:33:19,748 DEBUG org.apache.sqoop.mapreduce.AutoProgressMapper: Waiting for progress thread shutdown…
    2013-12-28 15:33:19,751 INFO org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
    2013-12-28 15:33:19,753 DEBUG org.apache.sqoop.mapreduce.AutoProgressMapper: Progress thread shutdown detected.
    2013-12-28 15:33:19,898 INFO org.apache.hadoop.mapred.MapTask: Ignoring exception during close for org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector@525c7734
    java.io.IOException: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: Object is closed: com.sap.db.jdbc.ConnectionSapDBFinalize@78ff9053[ID 220714].
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:184)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:649)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1792)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:778)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:363)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
    Caused by: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: Object is closed: com.sap.db.jdbc.ConnectionSapDBFinalize@78ff9053[ID 220714].
    at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
    at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateSQLException(SQLExceptionSapDB.java:136)
    at com.sap.db.jdbc.ConnectionSapDB.assertOpen(ConnectionSapDB.java:228)
    at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(Co

    #46188

    Felix Bekcer
    Participant

Can nobody help me? I'm still stuck at this point.

    Please.

    #45891

    Felix Bekcer
    Participant

Curiously, Sqoop exported the first two lines to SAP HANA and then crashed.


    15:18:19 DEBUG – org.apache.sqoop.mapreduce.ExportInputFormat.getSplits(76) | Generated splits:
    15:18:19 DEBUG – org.apache.sqoop.mapreduce.ExportInputFormat.getSplits(78) | Paths:/user/tom/part2:0+3 Locations:localhost.localdomain:;
    15:18:19 DEBUG – org.apache.sqoop.mapreduce.ExportInputFormat.getSplits(78) | Paths:/user/tom/part2:3+3 Locations:localhost.localdomain:;
    15:18:19 DEBUG – org.apache.sqoop.mapreduce.ExportInputFormat.getSplits(78) | Paths:/user/tom/part2:6+3 Locations:localhost.localdomain:;
    15:18:19 DEBUG – org.apache.sqoop.mapreduce.ExportInputFormat.getSplits(78) | Paths:/user/tom/part2:9+2,/user/tom/part2:11+3 Locations:localhost.localdomain:;
    15:18:19 INFO – org.apache.hadoop.mapred.JobClient.monitorAndPrintJob(1380) | Running job: job_201312180839_0052
    15:18:20 INFO – org.apache.hadoop.mapred.JobClient.monitorAndPrintJob(1393) | map 0% reduce 0%
    15:18:42 INFO – org.apache.hadoop.mapred.JobClient.monitorAndPrintJob(1393) | map 25% reduce 0%
    15:18:43 INFO – org.apache.hadoop.mapred.JobClient.monitorAndPrintJob(1393) | map 75% reduce 0%
    15:18:44 INFO – org.apache.hadoop.mapred.JobClient.monitorAndPrintJob(1422) | Task Id : attempt_201312180839_0052_m_000000_0, Status : FAILED
    java.io.IOException: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: [257]: sql syntax error: incorrect syntax near “,”: line 1 col 35 (at pos 35)
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:184)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:649)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:363)

My first idea was the autocommit feature, but I debugged into it and checked: it is false.
    I tried several test file values, but the only thing that fixed the crash was to cut out all values except the first two lines.
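One possible explanation for "incorrect syntax error near ','" (a guess, not confirmed by the thread): Sqoop's generic JDBC export can batch several rows into a single multi-row INSERT, and SAP HANA's INSERT grammar does not accept that comma-separated form, so the statement fails right at the comma after the first value group. This little sketch (a hypothetical reconstruction, not Sqoop's actual code) shows the shape of the statement such batching would produce:

```python
def multi_row_insert(table: str, columns: int, rows: int) -> str:
    """Build the multi-row INSERT shape that a generic JDBC export
    might generate when several records are batched into one statement."""
    row = "(" + ", ".join(["?"] * columns) + ")"
    return f"INSERT INTO {table} VALUES " + ", ".join([row] * rows)

# One single-column table, four batched rows:
print(multi_row_insert("XXX.XXX", 1, 4))
# -> INSERT INTO XXX.XXX VALUES (?), (?), (?), (?)
# A database that only accepts single-row INSERTs would reject this
# at the comma after the first "(?)".
```

If that is indeed the culprit, forcing one record per statement might avoid the unsupported syntax, e.g. `sqoop export -Dsqoop.export.records.per.statement=1 --connect ...` (the `-D` generic argument goes before the tool options). I haven't verified this against HANA, but it would also explain why a tiny file that fits in one record per statement exported fine while a larger one crashed.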
