[Posted]: 2021-10-24 10:30:01
[Problem description]:
I am trying to connect Kafka with MongoDB on Ubuntu 20.04. It used to work fine, but now I get an error at runtime.
This is how I am connecting Kafka with MongoDB.
I created a separate connect-standalone_bare.properties file with the following contents:
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# These are defaults. This file just demonstrates how to override some settings.
bootstrap.servers=localhost:9092
# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=false
value.converter.schemas.enable=false
rest.port=8084
offset.storage.file.filename=/tmp/connect.offsets-1
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000
# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include
# any combination of:
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples:
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
plugin.path=/home/ms-batch18/Documents/kafka_2.13-2.8.0/libs/mongo-kafka-connect-1.2.0-all.jar
Here is my MongoDB sink connector (MongoSinkConnector.properties):
name=mongo-sink
topics=test
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
key.ignore=true
connection.uri=mongodb://localhost:27017
database=test_kafka
collection=transaction
max.num.retries=3
retries.defer.timeout=5000
type.name=kafka-connect
schemas.enable=false
When I run this command:
bin/connect-standalone.sh config/connect-standalone_bare.properties config/MongoSinkConnector.properties
I get this error:
(org.apache.kafka.connect.runtime.WorkerInfo:71)
[2021-08-24 12:09:14,826] ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone:126)
java.nio.file.NoSuchFileException: config/connect-standalone_bare.properties
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:116)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:219)
at java.base/java.nio.file.Files.newByteChannel(Files.java:371)
at java.base/java.nio.file.Files.newByteChannel(Files.java:422)
at java.base/java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:420)
at java.base/java.nio.file.Files.newInputStream(Files.java:156)
at org.apache.kafka.common.utils.Utils.loadProps(Utils.java:629)
at org.apache.kafka.common.utils.Utils.loadProps(Utils.java:616)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:75)
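The `NoSuchFileException` shows that Java could not open the relative path `config/connect-standalone_bare.properties`, which is resolved against the current working directory, not against the location of `connect-standalone.sh`. So the command above only finds the file when run from the Kafka installation root. A minimal sketch of that behavior (the temp directories below are hypothetical stand-ins for the Kafka install dir and for some other working directory):

```shell
# A relative path like "config/connect-standalone_bare.properties" is
# resolved against the *current working directory*, not against the
# directory containing the script being invoked.

workdir=$(mktemp -d)      # stands in for the Kafka install dir
otherdir=$(mktemp -d)     # stands in for any other directory

mkdir -p "$workdir/config"
: > "$workdir/config/connect-standalone_bare.properties"

# From an unrelated directory, the relative path does not resolve:
cd "$otherdir"
if [ -f config/connect-standalone_bare.properties ]; then
    from_other=found
else
    from_other=missing
fi

# From the install dir itself, it does:
cd "$workdir"
if [ -f config/connect-standalone_bare.properties ]; then
    from_install=found
else
    from_install=missing
fi

echo "from other dir: $from_other, from install dir: $from_install"
```

Passing absolute paths to both `.properties` files when invoking `connect-standalone.sh` removes the dependence on the working directory entirely.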
Tags: bash apache-kafka apache-kafka-connect