Can (and should) I have hundreds or a few thousand partitions per Kafka topic?

I’m new to the Kafka world and I’m trying to understand some things.
I know Kafka can support around 200K partitions per cluster, and with KRaft it can go up to 2 million.

I’m more interested in “per topic” limitations (theoretical or practical).
Let’s say I need to ingest and process a few thousand “New Order” messages in near-realtime (a few seconds up to a minute).
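To sanity-check my own requirement, here’s a rough back-of-envelope calculation. All the numbers (message rate, per-message processing time) are assumptions I made up for illustration, not measurements:

```python
# Rough sizing sketch for the "New Order" scenario.
# Every number here is an assumption, not a benchmark.

messages_per_minute = 5_000   # "a few thousand" orders per minute
processing_time_s = 0.05      # assume 50 ms to process one order
target_latency_s = 60         # "a few seconds up to a minute"

msgs_per_sec = messages_per_minute / 60        # incoming rate
per_consumer_rate = 1 / processing_time_s      # what one consumer can handle
consumers_needed = -(-msgs_per_sec // per_consumer_rate)  # ceiling division

print(f"incoming rate:     {msgs_per_sec:.0f} msg/s")
print(f"rate per consumer: {per_consumer_rate:.0f} msg/s")
print(f"consumers needed:  {consumers_needed:.0f}")
```

With assumptions like these, only a handful of consumers (and therefore partitions) would be needed; the math only calls for hundreds of partitions if per-message processing is much slower than 50 ms.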

Producing the messages into a topic is not a problem.
Can I consume all those messages within a small timeframe?
I know a partition is the unit of parallelism and if I move from 2 to 4 or to 8 or to 30 partitions, I do see the expected scalability improvements.

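To illustrate what I mean by the partition count capping parallelism, here’s a toy round-robin assignment (just the idea, not Kafka’s actual assignor code): within one consumer group, any consumers beyond the partition count get nothing to do.

```python
def assign_round_robin(num_partitions, consumers):
    """Toy round-robin partition assignment within one consumer group."""
    assignment = {c: [] for c in consumers}
    for p in range(num_partitions):
        assignment[consumers[p % len(consumers)]].append(p)
    return assignment

# 8 partitions, 12 consumers: only 8 consumers get work, 4 sit idle.
consumers = [f"consumer-{i}" for i in range(12)]
assignment = assign_round_robin(8, consumers)
idle = [c for c, parts in assignment.items() if not parts]
print(len(idle))  # prints 4
```

So scaling from 2 to 30 partitions helps exactly because it raises that cap, which is why the provider limits below worry me.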
What if I need to create a few hundred consumers, or 1K (on just one “New Orders” topic)?
The reason I’m asking is that many of the cloud providers I’m considering have limits of a few dozen partitions per topic, which won’t be enough for my use case.

I feel I’m missing something.

  • Is Kafka designed more for ingesting a huge number of messages than
    for consuming them?
  • Should I use a simpler message broker for this
    scenario, like Azure Service Bus or RabbitMQ?
  • Should I use another approach? More topics, or something like that?

Thanks