Questions tagged [apache-kafka-connect]
Apache Kafka Connect is a tool for scalable and reliable streaming data between Apache Kafka and other data systems.
3,859 questions
0 votes · 0 answers · 9 views
Using the Confluent BigQuery sink Kafka connector, why are my messages/records not being written to new BQ tables, but existing tables are?
I am running Kafka Connect locally in Docker and also in AWS MSK; both produce the same result.
My connector configuration is as follows:
{
"name": "connector-name",
&...
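One common cause (an assumption here, since the full config above is truncated) is that automatic table creation is disabled: the Confluent/WePay BigQuery sink connector only creates missing tables when `autoCreateTables` is enabled, e.g.:

```json
{
  "name": "connector-name",
  "config": {
    "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
    "autoCreateTables": "true"
  }
}
```

Without it, records destined for tables that do not yet exist fail rather than triggering table creation.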
0 votes · 0 answers · 18 views
S3 Sink Confluent Kafka Connector JsonParser feature
I have JSON data in Kafka and am getting the stack trace below:
at org.apache.kafka.connect.json.JsonConverter.toConnectData(JsonConverter.java:331)
... 18 more Caused by: com.fasterxml.jackson.core....
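The trace points at `JsonConverter.toConnectData` wrapping a Jackson parse error. Two settings worth checking (a hedged sketch, since the failing payload is truncated above): disable the schema envelope if the topic carries plain JSON, and optionally route unparseable records to a dead-letter queue instead of failing the task. The DLQ topic name below is illustrative:

```properties
value.converter=org.apache.kafka.connect.json.JsonConverter
# Plain JSON without a {"schema": ..., "payload": ...} envelope:
value.converter.schemas.enable=false
# Skip records the converter cannot parse instead of killing the task:
errors.tolerance=all
errors.deadletterqueue.topic.name=dlq-s3-sink
```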
0 votes · 1 answer · 23 views
Strimzi Kafka Connect Cluster - Resource planning
I'm planning the resources that will be dedicated to running Kafka Connect clusters in the Strimzi distribution.
My workloads are pretty small; there will probably be only one task per connector.
I would go ...
0 votes · 0 answers · 12 views
Can Debezium/Kafka Connect ignore or resize too-large messages?
We're using Debezium via AWS MSK Serverless and MSK Connect to monitor the binlog of RDS Aurora MySQL.
For 99% of our data this works fine, but very occasionally Debezium fails with:
[Worker-...
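The error above is truncated, but when the failure is a `RecordTooLargeException` on the producer side, one option (a sketch under that assumption) is to raise the Connect producer's request size for this connector, which requires the worker to permit client overrides:

```properties
# Worker config: allow connectors to override producer/consumer client settings
connector.client.config.override.policy=All

# Connector config: raise the producer request size (10 MB here, illustrative)
producer.override.max.request.size=10485760
```

The broker-side `message.max.bytes` (and the topic-level `max.message.bytes`) must also accommodate the larger size, or the broker will reject the batch regardless.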
-1 votes · 0 answers · 12 views
Custom Vault Kafka ConfigProvider for MongoDB sink connector
I implemented a custom Vault ConfigProvider with dynamic credential rotation using configData(Map&lt;String, String&gt; data, Long ttl) for a MongoDB sink Kafka connector.
The first version of plugin ...
0 votes · 1 answer · 27 views
Getting Timeout Kafka exception org.apache.kafka.common.errors.TimeoutException: Call(callName=deleteRecords(api=METADATA)
I am facing a Kafka timeout exception. The error is:
partition: B3-0 error: org.apache.kafka.common.errors.TimeoutException: Call(callName=deleteRecords(api=METADATA), deadlineMs=1721138713548, tries=721498,...
0 votes · 1 answer · 21 views
How many Kafka Connect clusters are optimal?
I use Kafka with the Strimzi operator and am wondering whether there is an optimal number of Kafka Connect clusters I should create. I have a medium-sized infrastructure with around 50 different business logic ...
0 votes · 0 answers · 21 views
Minimal MirrorSourceConnector configurations to replicate data to another Kafka instance
I am trying to replicate topics from one Kafka cluster to another using Mirror Maker.
I am using docker compose to create the following resources:
networks:
local_kafka:
name: local_kafka
...
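When running MirrorMaker 2 as a connector on an existing Connect cluster, a near-minimal `MirrorSourceConnector` config looks like the following sketch (cluster aliases, hostnames, and the topic pattern are illustrative, not taken from the question):

```json
{
  "name": "mirror-source",
  "config": {
    "connector.class": "org.apache.kafka.connect.mirror.MirrorSourceConnector",
    "source.cluster.alias": "source",
    "source.cluster.bootstrap.servers": "source-kafka:9092",
    "target.cluster.bootstrap.servers": "target-kafka:9092",
    "topics": ".*"
  }
}
```

Note that by default MirrorMaker 2 prefixes replicated topic names with the source cluster alias (e.g. `source.my-topic` on the target cluster).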
0 votes · 0 answers · 15 views
Add headers from Kafka records to a Mongo field using the sink connector
I'm using a Mongo sink connector to sink data from a Kafka topic to a Mongo collection. All I need is to add the headers from each Kafka record to some field in the Mongo record.
It's OK if I can map each ...
0 votes · 1 answer · 20 views
SQL Server connector to ksqlDB: how to do it in docker.yml?
I have been struggling to find up-to-date documentation on how to connect a SQL Server database to ksqlDB.
I would like to know the easiest way to get a connection to Azure SQL Server and ...
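If ksqlDB is configured with an embedded or attached Connect cluster, a JDBC source connector can be declared from the ksqlDB CLI itself. A hedged sketch (host, database, column, and credential placeholders are illustrative, and the SQL Server JDBC driver must be on the Connect plugin path):

```sql
CREATE SOURCE CONNECTOR sqlserver_source WITH (
  'connector.class'          = 'io.confluent.connect.jdbc.JdbcSourceConnector',
  'connection.url'           = 'jdbc:sqlserver://myhost:1433;databaseName=mydb',
  'connection.user'          = 'myuser',
  'connection.password'      = 'mypassword',
  'mode'                     = 'incrementing',
  'incrementing.column.name' = 'id',
  'topic.prefix'             = 'sqlserver-'
);
```

Each matched table is then streamed into a topic named with the given prefix, from which ksqlDB streams or tables can be created.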
0 votes · 0 answers · 33 views
Kafka Connect to Snowflake connection via JDBC error
I've been trying to send data from Kafka to Snowflake using the JDBC driver with Kafka Connect.
Some details about the environment:
Kafka is running in a Cloudera private cluster (Base 7.1.9).
The ...
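Worth noting: Snowflake ships a dedicated Kafka connector, which is generally preferred over pushing data through a generic JDBC sink. A hedged sketch of its config (account URL, credentials, and names are placeholders):

```json
{
  "name": "snowflake-sink",
  "config": {
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "snowflake.url.name": "https://myaccount.snowflakecomputing.com",
    "snowflake.user.name": "myuser",
    "snowflake.private.key": "<private-key>",
    "snowflake.database.name": "MYDB",
    "snowflake.schema.name": "PUBLIC",
    "topics": "my-topic"
  }
}
```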
0 votes · 0 answers · 9 views
How to prevent the process from stopping when an error occurs during the Kafka connection?
1) Kafka code
import { Injectable } from "@nestjs/common";
import { v4 as uuidv4 } from 'uuid';
import { KafKaQueueNames } from "../dto/enum/kakaQueueNames";
import { Kafka, ...
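A crash like this usually comes from an unhandled promise rejection when the client's connect call fails. One approach (a generic sketch, independent of kafkajs or NestJS — the helper name and parameters are illustrative) is to wrap the connect call so transient failures are retried instead of terminating the process:

```typescript
// Hypothetical retry helper: retries an async operation (e.g. a Kafka
// connect call) instead of letting one failure crash the process.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts: number,
  delayMs: number
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn(); // succeed on the first attempt that works
    } catch (err) {
      lastError = err; // remember the failure and wait before retrying
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError; // all attempts exhausted: surface the last error
}
```

Usage would be something like `await withRetry(() => kafka.producer().connect(), 5, 1000)`, with a surrounding try/catch deciding what to do once retries are exhausted.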
0 votes · 1 answer · 54 views
MSK connector failed to find class in MSK cluster with KRaft mode
I have a Kafka 3.7.x cluster using ZooKeeper mode, which works well. Now I am trying a new Kafka cluster in KRaft mode.
However, for the cluster in KRaft mode, the same connector fails to start.
Here ...
-1 votes · 1 answer · 22 views
Ansible EDA Kafka connection error: rulebook activation fails when creating a connection to Kafka
I have been trying to create a connection to kafka within a rulebook but when I run my rulebook activation I experience this error:
ansible_rulebook.rule_set_runner - INFO - Ruleset: Listen for events,...
0 votes · 0 answers · 17 views
Debezium stops after inserting or updating data in the table when using decoderbufs
The initial synchronization worked as expected, but when I insert or update data in the table, the connection with Postgres closes. The connector is still marked as active and running. This issue only ...
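A common workaround when decoderbufs misbehaves (an assumption here, since the root cause is not visible in the excerpt) is to switch the Debezium Postgres connector to the built-in `pgoutput` logical decoding plugin, which needs no server-side extension on PostgreSQL 10+:

```json
{
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "pgoutput"
  }
}
```

Note that changing the decoding plugin typically requires dropping and recreating the replication slot.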