Stack Overflow
0 votes · 0 answers · 22 views

How can I make Schema Registry connect successfully to a Strimzi Kafka cluster using PLAINTEXT? [closed]

Why is Schema Registry unable to connect to the Strimzi Kafka cluster on port 9092 (PLAINTEXT)? Is there something special about how Strimzi advertises its broker listeners that prevents Schema ...
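A common cause of this class of problem is that the Strimzi Kafka custom resource has no plain internal listener, or that Schema Registry points at a broker pod instead of the advertised bootstrap service. A minimal sketch of the listener side, assuming a hypothetical cluster named my-cluster:

```yaml
# Strimzi Kafka custom resource (fragment): an internal PLAINTEXT listener.
# Clients inside the cluster connect to the advertised bootstrap service,
# e.g. my-cluster-kafka-bootstrap:9092 (the cluster name is hypothetical).
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster
spec:
  kafka:
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false   # PLAINTEXT; Strimzi advertises <cluster>-kafka-bootstrap
```

Schema Registry would then point at the bootstrap service, e.g. SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS=PLAINTEXT://my-cluster-kafka-bootstrap:9092 (environment variable name per the Confluent container image convention).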
0 votes · 1 answer · 32 views

How to verify a Confluent Kafka JsonSchema-serialized message?

I’m trying to verify whether messages produced using Confluent’s KafkaJsonSchemaSerializer are correctly serialized in the expected Confluent wire format (i.e. a "magic byte" followed by a 4-byte ...
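The wire format mentioned here is small enough to check by hand: one zero magic byte, then a 4-byte big-endian schema ID, then the serialized body. A minimal sketch of such a check (the function name and sample frame are my own, not from any library):

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1 magic byte + 4-byte big-endian schema ID

def parse_confluent_frame(payload: bytes):
    """Split a Confluent-framed record value into (schema_id, body bytes).

    Raises ValueError if the value does not look like the wire format.
    """
    if len(payload) < 5 or payload[0] != MAGIC_BYTE:
        raise ValueError("not in Confluent wire format")
    (schema_id,) = struct.unpack(">I", payload[1:5])
    return schema_id, payload[5:]

# Example: a hand-built frame carrying schema ID 7 and a JSON body
frame = bytes([MAGIC_BYTE]) + struct.pack(">I", 7) + b'{"x":1}'
print(parse_confluent_frame(frame))  # (7, b'{"x":1}')
```

Feeding a raw consumer record value through this function tells you both whether the framing is present and which registered schema ID the producer used.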
0 votes · 1 answer · 31 views

Flink dynamic sink routing with Confluent Schema Registry

I have an Apache Flink app that is using a KafkaSink with a setTopicSelector: KafkaSink&lt;T&gt; sink = KafkaSink.&lt;T&gt;builder().setBootstrapServers(sink_brokers) ...
0 votes · 0 answers · 16 views

ksqlDB with multiple Schema Registries

I have: DC1 with Kafka1 and SR1; DC2 with Kafka2, SR2, and a clone of SR1. MirrorMaker copies data from DC1 to DC2, including the internal Schema Registry topic with the schemas. So in DC2 I have a full copy of SR1 to read ...
0 votes · 0 answers · 34 views

Handle env-specific namespaces with schema evolution using KafkaAvroDeserializer

I am working on a Java Spring Boot app that primarily consumes Kafka events from a non-compacted CDC topic, which a Debezium connector publishes to. I am using basic defaults in KafkaAvroDeserializer ...
0 votes · 1 answer · 91 views

Flink ConfluentRegistryAvroSerializationSchema not respecting registryConfigs

When I use the KafkaRecordSerializationSchema in Apache Flink with settings for Schema Registry serialization, the registryConfigs settings are not taken into account, e.g. settings like auto.register....
0 votes · 0 answers · 57 views

How to successfully implement FULL_TRANSITIVE evolution in Schema Registry with enums

I have a schema with a set of enums and a schema registry set to full transitive compatibility. I'm seeing that simply adding an enum WITH a default field in both the V1 and V2 versions is returning ...
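For context, Avro enums take a type-level "default" symbol (a fallback the reader uses when it meets an unknown symbol), which is distinct from a field-level default; under FULL_TRANSITIVE the checker compares every pair of registered versions, so the enum default generally needs to be present from V1 onward, not only in the version that adds the new symbol. A minimal sketch with hypothetical record and enum names:

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {
      "name": "status",
      "type": {
        "type": "enum",
        "name": "Status",
        "symbols": ["NEW", "SHIPPED"],
        "default": "NEW"
      }
    }
  ]
}
```

A V2 that appends a symbol (e.g. "CANCELLED") while keeping the same enum-level "default" is the shape that usually passes transitive checks.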
1 vote · 1 answer · 127 views

Confluent Schema Registry upgrade causing IAM authorization error with AWS MSK Kafka

We are running Kafka in AWS using MSK. We're also using Confluent's Schema Registry to manage Avro schemas used with Kafka. We run the Schema Registry in a container. We are trying to upgrade our base ...
0 votes · 0 answers · 11 views

Spark Schema Registry: Use additional header field

I use the code as described here: from pyspark.sql.functions import col, lit from pyspark.sql.avro.functions import from_avro, to_avro schema_registry_address = "https://confluent-schema-...
0 votes · 0 answers · 17 views

API call to get all subject aliases configured

As the title suggests. Is there a way to list every subject alias configured? What API calls am I supposed to use to get a list of "subjects" working effectively only as aliases to other ...
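As far as I know there is no single "list all aliases" endpoint; one approach is to enumerate subjects via GET /subjects and inspect each subject's config via GET /config/{subject} for an "alias" field. The two endpoints are standard Schema Registry REST API, but treating "alias" as a per-subject config field is an assumption about the Confluent Platform alias feature. A sketch (function names are my own):

```python
import json
from urllib.request import urlopen

def fetch_subject_configs(base_url):
    """Fetch per-subject config for every subject (network calls).

    Note: subjects with no explicit config may return 404 from
    GET /config/{subject}; real code should handle that case.
    """
    subjects = json.load(urlopen(f"{base_url}/subjects"))
    return {s: json.load(urlopen(f"{base_url}/config/{s}")) for s in subjects}

def filter_aliases(configs):
    """Keep only subjects whose config carries an 'alias' entry."""
    return {s: c["alias"] for s, c in configs.items() if "alias" in c}

# The filtering step is pure and testable without a live registry:
configs = {
    "orders-value": {"compatibilityLevel": "FULL_TRANSITIVE"},
    "orders-v2-value": {"alias": "orders-value"},
}
print(filter_aliases(configs))  # {'orders-v2-value': 'orders-value'}
```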
0 votes · 0 answers · 36 views

Unable to deserialize Avro data and unable to load it into a Delta table [duplicate]

I have produced some data into a Kafka topic using Schema Registry, so the data is in the serialized wire format. Now I want to stream that data into a Delta table using PySpark. My ...
0 votes · 0 answers · 100 views

Installing the Bitnami Schema Registry Helm chart on a Strimzi Kafka cluster on GKE with SSL

I need to deploy the Bitnami Schema Registry Helm chart to my existing Strimzi Kafka cluster. I'm encountering challenges configuring the following: enable SSL/TLS for HTTPS access (e.g., https:...
-1 votes · 1 answer · 38 views

JSON schema is not registered with Schema Registry

With the following Kafka Connect connector configuration: { "name": "Migrations_20250227172534930", "connector.class": "SqlServerCdcSourceV2", "kafka....
0 votes · 1 answer · 82 views

How to define timestamp in Flink SQL based on Kafka connector with Avro format

I have a Kafka topic that uses messages with value in Avro format with debezium types. It contains fields defined in Avro format in the following way: { "name": "updated", &...
0 votes · 1 answer · 105 views

Kafka Connect JDBC Source Connector unable to write NULL record (tombstone) via SMT

I'm struggling with a JDBC source connector using io.confluent.connect.avro.AvroConverter and a schema in the registry. I wrote a custom SMT which allows Connect to return a null value (tombstone) if ...
