【Posted】: 2020-09-01 19:35:24
【Problem Description】:
I have a CSV file stored on HDFS:
000000131,2020-07-22,0.0,"","",1595332359218,khf987ksdfi34
000000112,2020-07-22,0.0,"","",1595442610265,khf987ksdfi34
000000150,2020-07-22,0.0,"","",1595442610438,khf987ksdfi34
I want to export this file to Oracle using Sqoop, like this:
sqoop export --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=oracledb)(port=1521))(connect_data=(service_name=stgdb)))" --table CORE_ETL.DEPOSIT_TURNOVER --username xxxx --password xxxx --export-dir /tmp/merged_deposit_turnover/ --input-fields-terminated-by "," --input-lines-terminated-by '\n' --input-optionally-enclosed-by '\"' --map-column-java DATE=java.sql.Date,INSERT_TS=java.sql.Timestamp
But the process fails with this error:
Caused by: java.lang.RuntimeException: Can't parse input data: '1595332359218'
    at CORE_ETL_DEPOSIT_TURNOVER.__loadFromFields(CORE_ETL_DEPOSIT_TURNOVER.java:546)
    at CORE_ETL_DEPOSIT_TURNOVER.parse(CORE_ETL_DEPOSIT_TURNOVER.java:431)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:88)
    ... 10 more
Caused by: java.lang.IllegalArgumentException
    at java.sql.Date.valueOf(Date.java:143)
    at CORE_ETL_DEPOSIT_TURNOVER.__loadFromFields(CORE_ETL_DEPOSIT_TURNOVER.java:529)
    ... 12 more
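A minimal sketch of why the parse fails, assuming (as the trace shows) that Sqoop's generated `__loadFromFields` passes the raw field string to `java.sql.Date.valueOf`, which only accepts the literal `yyyy-[m]m-[d]d` format. The class name `EpochParseDemo` is just for illustration:

```java
import java.sql.Date;
import java.sql.Timestamp;

public class EpochParseDemo {
    public static void main(String[] args) {
        String field = "1595332359218"; // the 6th CSV column: epoch milliseconds

        // java.sql.Date.valueOf expects "yyyy-[m]m-[d]d"; an epoch-millis
        // string has no dashes, so it throws IllegalArgumentException --
        // the same exception seen at java.sql.Date.valueOf(Date.java:143).
        try {
            Date.valueOf(field);
            System.out.println("parsed");
        } catch (IllegalArgumentException e) {
            System.out.println("IllegalArgumentException");
        }

        // The numeric value itself is a valid epoch timestamp: constructing
        // a java.sql.Timestamp from the parsed long works fine.
        Timestamp ts = new Timestamp(Long.parseLong(field));
        System.out.println(ts.getTime()); // prints 1595332359218
    }
}
```

So the value is not malformed data; it is simply a format that `java.sql.Date.valueOf` was never designed to accept.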
I would like to know whether there is a way to export this file to Oracle without changing the format of the data in HDFS.
Also, the Oracle schema:
【Discussion】:
Tags: oracle mapreduce export sqoop