【Question Title】: Spark-Redis connection issue when using SSL port (6380)
【Posted】: 2021-07-20 07:01:36
【Question】:

Following this documentation (https://github.com/RedisLabs/spark-redis/blob/master/doc/configuration.md), I am writing records to Azure Cache for Redis from a Spark job deployed on an HDInsight cluster. It works with the non-SSL port (6379), but when I switch to the SSL port (6380) I start seeing the errors below. Has anyone run into this before? Any help or suggestions are appreciated. Here is the configuration used:

spark.redis.host = hostname of the Redis cluster
spark.redis.port = 6379
spark.redis.ssl = true
spark.redis.auth = auth-password
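
For reference, these settings are typically passed to `spark-submit` as `--conf` flags; the sketch below assumes an Azure Cache for Redis hostname and uses placeholders for the cache name, access key, and jar:

```shell
# Sketch only: <your-cache>, <access-key>, and your-job.jar are placeholders.
spark-submit \
  --class workflow.Runner \
  --conf spark.redis.host=<your-cache>.redis.cache.windows.net \
  --conf spark.redis.port=6380 \
  --conf spark.redis.ssl=true \
  --conf spark.redis.auth=<access-key> \
  your-job.jar
```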

This is the error we are seeing:

EXCEPTION: Message: Error writing messages to Redis Cache, StackTrace: redis.clients.jedis.exceptions.JedisConnectionException: Could not get a resource from the pool
    at redis.clients.jedis.util.Pool.getResource(Pool.java:59)
    at redis.clients.jedis.JedisPool.getResource(JedisPool.java:234)
    at com.redislabs.provider.redis.ConnectionPool$.connect(ConnectionPool.scala:33)
    at com.redislabs.provider.redis.RedisEndpoint.connect(RedisConfig.scala:69)
    at com.redislabs.provider.redis.RedisConfig.clusterEnabled(RedisConfig.scala:193)
    at com.redislabs.provider.redis.RedisConfig.getNodes(RedisConfig.scala:317)
    at com.redislabs.provider.redis.RedisConfig.getHosts(RedisConfig.scala:233)
    at com.redislabs.provider.redis.RedisConfig.<init>(RedisConfig.scala:132)
    at org.apache.spark.sql.redis.RedisSourceRelation.<init>(RedisSourceRelation.scala:34)
    at org.apache.spark.sql.redis.DefaultSource.createRelation(DefaultSource.scala:21)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
    at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
    at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
    at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
    at workflow.EvaluatedCostSender.publishEvaluatedCost(EvaluatedCostSender.scala:53)
    at workflow.Runner$.workflow$Runner$$publishEvaluatedCostData(Runner.scala:215)
    at workflow.Runner$$anonfun$main$5.apply$mcVI$sp(Runner.scala:116)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at workflow.Runner$.main(Runner.scala:79)
    at workflow.Runner.main(Runner.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:678)
Caused by: redis.clients.jedis.exceptions.JedisConnectionException: java.net.SocketTimeoutException: Read timed out
    at redis.clients.jedis.util.RedisInputStream.ensureFill(RedisInputStream.java:205)
    at redis.clients.jedis.util.RedisInputStream.readByte(RedisInputStream.java:43)
    at redis.clients.jedis.Protocol.process(Protocol.java:155)
    at redis.clients.jedis.Protocol.read(Protocol.java:220)
    at redis.clients.jedis.Connection.readProtocolWithCheckingBroken(Connection.java:318)
    at redis.clients.jedis.Connection.getStatusCodeReply(Connection.java:236)
    at redis.clients.jedis.BinaryJedis.auth(BinaryJedis.java:2259)
    at redis.clients.jedis.JedisFactory.makeObject(JedisFactory.java:119)
    at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:836)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:434)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:361)
    at redis.clients.jedis.util.Pool.getResource(Pool.java:50)
    ... 40 more
Caused by: java.net.SocketTimeoutException: Read timed out
    at java.net.SocketInputStream.socketRead0(Native Method)
    at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
    at java.net.SocketInputStream.read(SocketInputStream.java:171)
    at java.net.SocketInputStream.read(SocketInputStream.java:141)
    at java.net.SocketInputStream.read(SocketInputStream.java:127)
    at redis.clients.jedis.util.RedisInputStream.ensureFill(RedisInputStream.java:199)

【Comments】:

  • This is specific to the Spark-Redis connector. Am I missing any required setting?

Tags: apache-spark redis


【Solution 1】:

The issue was related to older Jedis and Spark-Redis versions. Upgrading to more recent releases resolved it for us. Recommended versions: jedis-3.6.2 and spark-redis_2.11-2.6.0.
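
In sbt, that upgrade might look like the following; this is a sketch, and the group IDs are assumed from the usual Maven Central coordinates for these libraries:

```scala
// build.sbt -- sketch only; match the Scala suffix (_2.11) to your cluster's Scala version
libraryDependencies ++= Seq(
  "com.redislabs"  % "spark-redis_2.11" % "2.6.0",
  "redis.clients"  % "jedis"            % "3.6.2"
)
```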

The following spark-redis settings are also required:

"spark.redis.ssl": true,
"spark.redis.sslprotocols": "tls12",
"spark.redis.timeout": "120000"

【Discussion】:
