【Posted】: 2020-09-03 13:40:48
【Problem description】:
I get this error when trying to load data from Data Fusion into Salesforce:
java.lang.RuntimeException: There was issue communicating with Salesforce
at io.cdap.plugin.salesforce.plugin.sink.batch.SalesforceOutputFormat.getRecordWriter(SalesforceOutputFormat.java:53) ~[1599122485492-0/:na]
at org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil.initWriter(SparkHadoopWriter.scala:350) ~[spark-core_2.11-2.3.4.jar:2.3.4]
at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:120) ~[spark-core_2.11-2.3.4.jar:2.3.4]
at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:83) ~[spark-core_2.11-2.3.4.jar:2.3.4]
at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:78) ~[spark-core_2.11-2.3.4.jar:2.3.4]
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) ~[spark-core_2.11-2.3.4.jar:2.3.4]
at org.apache.spark.scheduler.Task.run(Task.scala:109) ~[spark-core_2.11-2.3.4.jar:2.3.4]
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345) ~[spark-core_2.11-2.3.4.jar:2.3.4]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_252]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_252]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_252]
Caused by: com.sforce.async.AsyncApiException: InvalidJob : Invalid job id: null
at com.sforce.async.BulkConnection.parseAndThrowException(BulkConnection.java:182) ~[na:na]
at com.sforce.async.BulkConnection.doHttpGet(BulkConnection.java:753) ~[na:na]
at com.sforce.async.BulkConnection.getJobStatus(BulkConnection.java:769) ~[na:na]
at com.sforce.async.BulkConnection.getJobStatus(BulkConnection.java:760) ~[na:na]
at io.cdap.plugin.salesforce.plugin.sink.batch.SalesforceRecordWriter.<init>(SalesforceRecordWriter.java:69) ~[1599122485492-0/:na]
at io.cdap.plugin.salesforce.plugin.sink.batch.SalesforceOutputFormat.getRecordWriter(SalesforceOutputFormat.java:51) ~[1599122485492-0/:na]
... 10 common frames omitted
2020-09-03 08:41:28,595 - WARN [task-result-getter-0:o.a.s.ThrowableSerializationWrapper@192] - Task exception could not be deserialized
java.lang.ClassNotFoundException: Class not found in all delegated ClassLoaders: com.sforce.async.AsyncApiException
at io.cdap.cdap.common.lang.CombineClassLoader.findClass(CombineClassLoader.java:96) ~[na:na]
at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[na:1.8.0_252]
at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[na:1.8.0_252]
at io.cdap.cdap.common.lang.WeakReferenceDelegatorClassLoader.findClass(WeakReferenceDelegatorClassLoader.java:58) ~[na:na]
What does this error mean? I have already made sure the input fields match the SObject definition.
【Discussion】:
-
"Invalid job id" sounds like the Bulk API job failed. Browse the Bulk API concepts at trailhead.salesforce.com/en/content/learn/modules/api_basics/… Check Setup -> Bulk... if the job was submitted OK, there might be a better error message there. Were you able to log in to SF successfully?
-
The job does appear on the "Monitor Bulk Data Load Jobs" page, but its status stays "Open" and no error message is shown.
-
Then maybe there is a problem in the Java library? SF needs an instruction that the file upload is complete (since you may have decided to load multiple files, in chunks of up to 10K records, SF has no way of knowing on its own), and only then can it start processing. In the link I posted this is called "closing the job".
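For reference, the "closing the job" step mentioned above can be sketched without the Salesforce WSC library. This is a minimal illustration of the Bulk API 1.0 close-job request, assuming the standard XML job resource; the instance URL, API version, and job id below are placeholder values, not ones taken from this pipeline (in the plugin itself, `BulkConnection.closeJob` performs the equivalent call):

```java
// Sketch of the Bulk API 1.0 "close job" request (placeholders only).
public class CloseJobRequest {

    // Bulk API 1.0 closes a job by POSTing a state update to the job resource.
    static String closeJobUrl(String instanceUrl, String apiVersion, String jobId) {
        return instanceUrl + "/services/async/" + apiVersion + "/job/" + jobId;
    }

    // Setting the state to Closed tells Salesforce that no more batches
    // will be added, so it can start processing the uploaded data.
    static String closeJobBody() {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
             + "<jobInfo xmlns=\"http://www.force.com/2009/06/asyncapi/dataload\">"
             + "<state>Closed</state>"
             + "</jobInfo>";
    }

    public static void main(String[] args) {
        // Placeholder host and job id for illustration only.
        System.out.println(closeJobUrl("https://example.my.salesforce.com",
                                       "47.0", "7505e000004XXXXXXX"));
        System.out.println(closeJobBody());
    }
}
```

A job left in the "Open" state, as described in the comment above, is consistent with this close step never having been issued.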
-
Thanks for the link. I'm not using Workbench, though. I use a Data Fusion pipeline to load and transform the data, and then send it with the Salesforce sink.
Tags: salesforce google-cloud-data-fusion