【Posted at】:2019-02-07 00:04:24
【Problem description】:
Here is my code:
val ssc = new StreamingContext(sparkContext, Seconds(time))
val spark = SparkSession.builder.config(properties).getOrCreate()
val Dstream1: ReceiverInputDStream[Document] = ssc.receiverStream(properties) // Dstream1 has Id1 and other fields
val Rdd2 = spark.sql("select Id1, key from hdfs.table").rdd // RDD[Row]
Is there a way to join these two?
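One common approach is `DStream.transform`, which exposes each micro-batch as an RDD so a regular pair-RDD `join` against the static `Rdd2` can be applied. The sketch below assumes `Document` exposes an `id1` field and that the first two columns of `Rdd2`'s rows are `Id1` and `key` (both names are assumptions, not taken from the original code):

```scala
import org.apache.spark.streaming.dstream.DStream

// Key both sides by Id1 so they can be joined as (K, V) pairs.
// doc.id1 and the column positions below are hypothetical.
val keyedStream = Dstream1.map(doc => (doc.id1, doc))
val keyedRdd = Rdd2.map(row => (row.getString(0), row.getString(1)))

// transform runs once per micro-batch, joining that batch's RDD
// with the static lookup RDD.
val joined: DStream[(String, (Document, String))] =
  keyedStream.transform(batchRdd => batchRdd.join(keyedRdd))
```

Note that `keyedRdd` is evaluated once at definition time; if `hdfs.table` changes between batches, the `spark.sql(...)` lookup would need to be re-executed inside the `transform` closure instead.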
【Discussion】:
Tags: apache-spark spark-streaming rdd apache-spark-dataset dstream