Best practices
0 votes
0 replies
35 views

I am reading data from a Kafka topic using Kafka Streams, more specifically a GlobalKTable, to populate my store. In the case of corrupt data that cannot successfully be parsed, I wish to throw a ...
Andreas10001
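For context on the corrupt-record question above: Kafka Streams exposes a `default.deserialization.exception.handler` config for records that fail to parse, and `LogAndContinueExceptionHandler` (which skips the bad record) ships with the library. The sketch below only builds a plain `Properties` object with that key; the class name is illustrative, and whether log-and-continue fits depends on the asker's actual requirements.

```java
import java.util.Properties;

public class StreamsErrorHandlingConfig {
    public static Properties build() {
        Properties props = new Properties();
        // Skip records that fail to deserialize instead of crashing the app.
        // LogAndContinueExceptionHandler and LogAndFailExceptionHandler ship
        // with Kafka Streams; a custom DeserializationExceptionHandler
        // implementation can be plugged in here instead.
        props.put("default.deserialization.exception.handler",
                "org.apache.kafka.streams.errors.LogAndContinueExceptionHandler");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("default.deserialization.exception.handler"));
    }
}
```

These properties would be passed to the `KafkaStreams` constructor alongside the usual application id and bootstrap servers.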
Best practices
0 votes
7 replies
61 views

I am using Kafka Streams, and trying to figure out the best practice for handling multiple stores that are populated using the same data without having to read it twice. My Kafka Streams service is ...
Andreas10001
0 votes
0 answers
77 views

I’m running a Kafka Streams application (tested with versions 3.9.1 and 4.1.0) that uses a GlobalKTable backed by RocksDB. There are multiple instances of the application, each with 4 stream threads. ...
Sona Torosyan
2 votes
1 answer
72 views

I'm developing a Spring Cloud Stream app that accepts messages in Avro format. After setting up everything, I start my app, then get an Unknown magic byte error. I'm thinking this is because I'm using ...
manaclan • 1,044
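The "Unknown magic byte" error above usually means the Confluent Avro deserializer received bytes that were not written by the Confluent Avro serializer, whose wire format prefixes every payload with a 0x0 magic byte and a 4-byte schema-registry id. A stdlib-only sketch of that framing check (class and method names are illustrative):

```java
import java.nio.ByteBuffer;

public class ConfluentWireFormat {
    // Confluent's Avro serializer writes: [magic byte 0x0][4-byte schema id][avro data].
    // The Avro deserializer throws "Unknown magic byte!" when the first byte
    // is anything else, e.g. when the producer wrote plain JSON/String bytes.
    static boolean looksLikeConfluentAvro(byte[] payload) {
        return payload.length >= 5 && payload[0] == 0x0;
    }

    public static void main(String[] args) {
        byte[] avroFramed = ByteBuffer.allocate(6)
                .put((byte) 0)   // magic byte
                .putInt(42)      // schema-registry id (example value)
                .put((byte) 1)   // first byte of the Avro body
                .array();
        byte[] plainJson = "{\"a\":1}".getBytes();
        System.out.println(looksLikeConfluentAvro(avroFramed));
        System.out.println(looksLikeConfluentAvro(plainJson));
    }
}
```

In practice the fix is to make the producer and consumer agree: either both use the schema-registry serdes, or neither does.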
0 votes
1 answer
42 views

Here is a simple topology: stream1.merge(stream2).merge(stream3).to("topic1") stream2 = builder.stream("topic1") stream2.process(() -> new Processor<>() { process() { Read ...
user3092576
2 votes
1 answer
66 views

I'm currently developing a Kafka Streams application. Due to internal requirements, I've disabled auto-commits by setting commit.interval.ms to a very long duration (e.g. Long.MAX). Instead, I'm using ...
Junhyun Kim
0 votes
1 answer
48 views

I need to consume two topics, A with 40 partitions and B with 10 partitions, and keep some info in a shared persistent state store with social security number (SSN) of type String as its key and a custom ...
losingsleeep • 1,899
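On the 40-vs-10-partition question above: sharing a state store across two input topics requires the topics to be co-partitioned, i.e. the same key must land on the same partition number on both sides. A stdlib-only sketch (using `String.hashCode` as a stand-in for Kafka's actual murmur2-based default partitioner; the modulo step is the part that matters) shows why differing partition counts break this:

```java
public class CoPartitionCheck {
    // Stand-in for Kafka's default partitioner, which really hashes the
    // serialized key with murmur2 before taking the modulo.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        String ssn = "123-45-6789";
        int pA = partitionFor(ssn, 40); // topic A
        int pB = partitionFor(ssn, 10); // topic B
        // With different partition counts the same SSN generally lands on
        // different partition numbers, so no single StreamTask sees both
        // sides: the topics are not co-partitioned. Repartitioning one topic
        // (e.g. via repartition()) to a matching count fixes this.
        System.out.println("topic A partition=" + pA + ", topic B partition=" + pB);
    }
}
```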
1 vote
0 answers
29 views

The prior application is on kafka-streams (KStream) and has been rewritten to use Spring Cloud Stream. Both apps are on kafka-clients-3.1.2. When starting the upgraded application it gives: java.lang....
awgtek • 1,899
-2 votes
1 answer
39 views

I have a Kafka topic producing products for multiple companies: {"company": "c1", "productIds": ["p1", "p2", "p3"]} {"company": ...
user2890683
0 votes
1 answer
39 views

I'm building a conference system using Kafka Streams where users can join/leave rooms. I'm experiencing a race condition where multiple concurrent leave requests see the same stale room state, causing ...
lastpeony4
0 votes
1 answer
59 views

I have read the explanation written here that one StreamTask handles all messages from co-partitioned topics: How do co-partitioning ensure that partition from 2 different topics end up assigned ...
Fatemah Soliman
3 votes
0 answers
83 views

I’m comparing how Kafka Streams handles GlobalKTable checkpointing in version 2.5.0 versus 3.7.0, and I’ve identified a regression: in 3.7.0 the .checkpoint file isn’t created the first time you drain ...
paul • 13.6k
0 votes
1 answer
55 views

I'm working on a Change Data Capture (CDC) application using Kafka Streams, and I have the following setup: A KStream is converted to a KTable (let's call it kt1). I perform a left join between kt1 ...
Hari Krishnan U
0 votes
0 answers
113 views

I am experiencing very weird behaviour in my Kafka Streams application. The setup is the following: I create the state store "user-store" manually and connect it to a processor "filter-...
Tino • 3
0 votes
0 answers
27 views

I have written a sample Kafka producer application, without specifying a partition, which sends messages to a consumer application. My consumer application is written using Kafka Streams as it is using state ...
Bandita Pradhan
1 vote
2 answers
63 views

I have a topic with 3 GB of raw data in 60 million records. But when I consume those records and hold them in memory (not using RocksDB), the needed heap memory is around 10 GB. I cannot ...
PABLO GARCIA
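The 3 GB → ~10 GB blow-up described above is consistent with fixed per-object JVM overhead dominating small payloads (roughly 53 bytes per record here). A back-of-envelope sketch; the 84-byte overhead figure is an assumption typical of 64-bit HotSpot object layouts (headers, references, collection bookkeeping), not a measured value:

```java
public class HeapEstimate {
    // Rough model: every on-heap record costs its payload plus a fixed
    // per-object overhead (object headers, array headers, references,
    // map-entry bookkeeping). The overhead argument is an assumption.
    static long estimatedHeapBytes(long records, long rawBytes, long perRecordOverhead) {
        long perRecordPayload = rawBytes / records;
        return records * (perRecordPayload + perRecordOverhead);
    }

    public static void main(String[] args) {
        long records = 60_000_000L;
        long rawBytes = 3L * 1024 * 1024 * 1024; // ~3 GB of raw payload
        long overhead = 84;                      // assumed bytes/record of JVM overhead
        double gb = estimatedHeapBytes(records, rawBytes, overhead) / (1024.0 * 1024 * 1024);
        System.out.printf("estimated heap: %.1f GB%n", gb);
    }
}
```

With tiny payloads the fixed overhead is larger than the data itself, which is why off-heap stores such as RocksDB (or flattening records into primitive arrays) are the usual answer.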
4 votes
0 answers
169 views

I have a project that uses Kafka Streams to create one-minute price candles for stocks. My topology code is: List<String> inputTopics = new ArrayList<>(); inputTopics.add(tradeTopic); ...
mohammadjavadkh
0 votes
1 answer
158 views

We have a KStream application (Kafka 3.6) which consumes from 3 topics and does some repartitioning, selectKey and reduce operations to create KTables, and we use these 3 KTables to do a LeftJoin and then ...
Justin • 745
0 votes
1 answer
42 views

If I have a Kafka input topic with multiple partitions and then in Kafka Streams use kStream.map to change the key of each record and write that to an output topic, I face the problem that ...
selbstereg
3 votes
1 answer
49 views

I am using Kafka Streams to group and reduce a kafka topic. I want to generate an output for a key, if the key and the value are equal for all values against a given key; otherwise don't output ...
simonalexander2005
0 votes
1 answer
37 views

I have a stream of <K,V> messages. When emitting any satisfied sliding windows, I want to know the list that the window matched. I want to avoid taking on the accumulation job myself within my ....
redgiant • 606
0 votes
1 answer
31 views

In the code below: var processed = builder.stream(topics.getRateLimitProcessorTopic(), Consumed.with(SerdesFactory.bucketKeySerdes(), SerdesFactory.bucketOperationSerdes())) .process(() -> ...
Max • 1
1 vote
1 answer
128 views

I am designing a Kafka Streams app and want to know a few details in order to design my failover strategy. I tried reading the Kafka Streams docs and existing Stack Overflow posts, but was unable to find the ...
Aakash Gupta
0 votes
0 answers
197 views

I want to migrate my Kafka infrastructure to AWS MSK, and I've noticed there is a new broker type called Express Brokers. I really want to use it since it is supposed to be faster, supports "hands-...
Amitb • 612
2 votes
1 answer
156 views

We are using Kafka Streams and Karpenter with normal Deployment in order to manage the pods for a service that we have. After Karpenter decides to kill the pod, it brings a new Pod up, and we are ...
alext • 842
0 votes
1 answer
39 views

I have a tricky case where a lot of historical JSON data was written to Kafka with a custom partitioning strategy. We are currently developing a system to make use of this historical data and perform ...
filpa • 3,744
0 votes
1 answer
37 views

I use the below: windowedBy(TimeWindows.of(Duration.ofHours(6))) .aggregate(aggregator, aggregator, Materialized.as("my-agg")) The changelog was created with the below configs: cleanup....
Abe • 726
0 votes
0 answers
74 views

The error is: Error starting ApplicationContext. To display the condition evaluation report re-run your application with 'debug' enabled. 2025-02-21T16:09:37.761+05:30 ERROR 38852 --- [Ecommerce] [ ...
Aindrail Santra
0 votes
1 answer
61 views

What are the semantics of a Kafka Streams (3.7.1) KTable-KTable foreign-key join where the extracted foreign key has never matched a primary key in the right-side KTable? In this example ...
KarlP • 5,221
1 vote
1 answer
281 views

I'm working on a kafka streams application that consumes from a consumer group with three topics. One topic has 20 partitions, another 10, and the last one 5. So in total, this consumer group has 35 ...
rwachter
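Regarding the 20/10/5-partition question above: when input topics feed a single subtopology, Kafka Streams creates one task per partition of the largest input topic, not one per partition overall (35 here); partitions of the smaller topics are co-assigned to those tasks. A minimal sketch of that sizing rule, assuming all three topics do share one subtopology:

```java
import java.util.List;

public class TaskCount {
    // Kafka Streams sizes a subtopology by the maximum partition count among
    // its input topics; topologies with several independent subtopologies
    // would sum this value over the subtopologies instead.
    static int tasksForSubtopology(List<Integer> partitionCounts) {
        return partitionCounts.stream().mapToInt(Integer::intValue).max().orElse(0);
    }

    public static void main(String[] args) {
        int tasks = tasksForSubtopology(List.of(20, 10, 5));
        System.out.println("tasks=" + tasks);
    }
}
```

This also bounds useful parallelism: more stream threads (or instances) than tasks sit idle.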
0 votes
1 answer
38 views

I want to use Helidon SE 4.1.6 and produce data to a specific partition of Apache Kafka using a producer. Detail: I have gone through the https://helidon.io/docs/latest/se/reactive-messaging#...
MOHAMMAD SHADAB
0 votes
0 answers
96 views

I am new to Kafka streams and have come across a use case where there is a need to detect anomalies in an event stream. Here is the high level use case: Events There is a stream of incoming events in ...
Saikat • 568
1 vote
1 answer
69 views

We are using Kafka Streams in our application to process events. For a join operation we use both the repartition and selectKey methods of Kafka Streams, and this causes the creation of an internal ...
ceb • 39
0 votes
0 answers
85 views

We have a Kafka Streams service that performs a left join (KStreams) operation by message key. The message size is roughly 1 KB. The left topic has around two hundred thousand (200,000) ...
Martinus Elvin
1 vote
0 answers
85 views

Context and analysis Our Spring Boot application relies on information stored in Kafka to answer REST requests. It does so by retrieving information from a global state store via the ...
Cédric Schaller
2 votes
1 answer
113 views

I have two Kafka queues (queue1 & queue2), and multiple producers (clients) are posting messages to both queues. Each client sends different types of messages to the queues (e.g., client1 sends "...
deen • 21
2 votes
1 answer
45 views

We have a Java application which uses the Streams library to process data. The streams application does not join data from multiple topics and processes each message received independently. The ...
Andrey • 478
1 vote
1 answer
58 views

After migrating to the latest streams version 3.9.0 from 3.5.0, I notice a behaviour in the left foreign key join, that I am not able to understand. For a left foreign key join : KTable<String, ...
Sumit Baurai
0 votes
0 answers
22 views

I'm reading a Confluent blog about Windowing in Kafka Streams: https://www.confluent.io/blog/windowing-in-kafka-streams/ and I found this under Session Windows: If your inactivity window is too ...
gregof • 23
2 votes
2 answers
122 views

I'm developing a Quarkus microservice that utilizes Kafka Streams to process messages from multiple topics. Specifically, I'm attempting to join a KStream and a KTable derived from two of these topics....
Alex H • 31
1 vote
1 answer
85 views

I have a Kafka Streams app that takes input from a Kafka topic and aggregates it on three fields of the original value in 5-minute windows. On the output side of this, I need to translate the ...
John Ament • 11.8k
4 votes
0 answers
187 views

We have built real time data processing pipelines on kafka topics in the past with kafka streams technology. But we were always limited by the number of partitions on the kafka topic for concurrency ...
birinder tiwana
0 votes
0 answers
76 views

org.apache.kafka:kafka-streams:jar:3.9.0 is supposed to use org.apache.kafka:kafka-clients:jar:3.9.0, but when I run mvn dependency:tree, I get [INFO] +- org.apache.kafka:kafka-streams:jar:3.9.0:...
M. Bouzaien
0 votes
1 answer
78 views

I am working on a distributed system using Kafka Streams for communication between components. One of the components, (for simplicity BRAIN), manages a sequence of messages to other components (A, B, ...
Paul Marcelin Bejan
1 vote
1 answer
203 views

What's the easiest or best way to create multiple topologies in a Spring Boot Kafka Streams application? Is it possible to use the same default StreamsBuilder bean? Or if I need to create a new ...
losingsleeep • 1,899
0 votes
1 answer
49 views

I have a KStreams application where I am reading from an input topic, performing aggregation in a 15-minute window, suppressing, and then performing some operation on each record; the following is the ...
Ashutosh Singh
3 votes
1 answer
48 views

I'm experiencing two issues with Kafka Streams' processValues() and suppress operations: Getting NPE when using processValues(): @Bean public Function<KStream<String, String>, KStream<...
suno3 • 201
0 votes
1 answer
35 views

I'm trying to debug a problem in our production Kafka Streams app. The (simplified) topology looks something like this builder.stream("input").groupByKey().reduce( (agg, val) -> "...
Egor • 1,660
0 votes
1 answer
69 views

I want to send multiple messages downstream using a Transformer (Kafka Streams DSL): private ProcessorContext context; @Override public void init(ProcessorContext context) { this....
xmm_581 • 41
-1 votes
1 answer
139 views

We are building event-driven architecture for a live collaborative form. Our solution uses Kafka as the event broker, where event ordering and stateful stream processing are critical requirements. We ...
SB Praveen
