
[PubSub Redis] Support per-subscription concurrency #3424

Open
olitomlinson opened this issue May 23, 2024 · 5 comments
Labels
kind/enhancement New feature or request

Comments

@olitomlinson

Describe the feature

As it stands, the Redis Streams PubSub implementation provides a concurrency control setting via component metadata.

This concurrency control is not scoped to each individual subscription.

For example, suppose you have a Redis PubSub component with 2 topics:

  • Topic A
  • Topic B

and an App that subscribes to both of those Topics. The concurrency is then distributed across both subscriptions.

So if concurrency is set to 10, there are 10 workers in total, and they share the load across both subscriptions. The split could be 5:5, 9:1, or 7:3, and it can change at random, which isn't helpful when trying to control the concurrency of each subscription individually.
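For reference, this is roughly how the existing shared setting is configured on the component today. A minimal sketch; the component name, host, and values are illustrative:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: my-redis-pubsub          # illustrative name
spec:
  type: pubsub.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: concurrency            # one shared worker pool, split across ALL subscriptions
    value: "10"                  # (Topic A and Topic B together)
```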


I propose adding a per-subscription concurrency control, such that each subscription gets exactly that many workers. This would allow a consistent 10 workers for subscription A and a separate 10 workers for subscription B.

Ideally, the best solution would be the ability to specify a different level of concurrency per subscription, but that's possibly a step too far for now.
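To make the ask concrete, here is a hypothetical declarative subscription carrying a per-subscription concurrency value. The concurrency key under the subscription metadata does not exist today; it is exactly what this issue proposes, and all names are illustrative:

```yaml
# Hypothetical: the "concurrency" entry below is NOT an existing Dapr field.
apiVersion: dapr.io/v2alpha1
kind: Subscription
metadata:
  name: topic-a-subscription     # illustrative name
spec:
  pubsubname: my-redis-pubsub
  topic: topic-a
  routes:
    default: /topic-a-handler
  metadata:
    concurrency: "10"            # proposed: 10 workers dedicated to this subscription only
```

A second Subscription for Topic B with its own concurrency value would then get its own, independent set of workers.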

Release Note

RELEASE NOTE:

@olitomlinson olitomlinson added the kind/enhancement New feature or request label May 23, 2024
@berndverst
Copy link
Member

@olitomlinson I'd like to look at this from a user perspective before thinking about whether it can be done (or should be done):

Per-subscription concurrency could be provided via subscription metadata (not component metadata): in either a declarative pubsub subscription or a dynamic programmatic subscription, you would supply metadata indicating the desired concurrency (similar to the way a dead letter queue is defined).

Perhaps the existing concurrency setting could be redefined to mean per-topic concurrency, and serve as the default when the concurrency is not overridden in the subscription metadata.

This would probably be my preference.

If we retained the existing shared concurrency concept, we would instead have to enforce that the per-subscription concurrency values do not add up to more than the original shared concurrency. This is not ideal and could be confusing.

OR maybe something like the following:

  • concurrency (shared): unchanged behavior.
  • pertopicconcurrency: new; cannot be used alongside concurrency. If set, it takes precedence and gives every topic this many concurrent workers by default, unless overridden by the next level.
  • subscription metadata concurrency: if set, the given topic subscription uses the specified concurrency. If not set, it falls back to pertopicconcurrency, and if pertopicconcurrency is not set either, the existing shared global worker count is used. (See the sketch below.)
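A rough sketch of that precedence, reusing the hypothetical names from above; neither pertopicconcurrency nor a subscription-level concurrency key exists today:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: my-redis-pubsub          # illustrative name
spec:
  type: pubsub.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: pertopicconcurrency    # proposed: every topic gets its own pool of this size;
    value: "10"                  # mutually exclusive with the existing shared "concurrency"
# Precedence (highest first):
#   1. subscription metadata "concurrency"  (per subscription, see the earlier sketch)
#   2. component "pertopicconcurrency"      (per-topic default)
#   3. component "concurrency"              (today's shared worker pool)
```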

It might be overcomplicating things, though; the implementation of this could be rather complex.

So I'd like us to figure out how to simplify this.


github-actions bot commented Jul 5, 2024

This issue has been automatically marked as stale because it has not had activity in the last 30 days. It will be closed in the next 7 days unless it is tagged (pinned, good first issue, help wanted or triaged/resolved) or other activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label Jul 5, 2024

This issue has been automatically closed because it has not had activity in the last 37 days. If this issue is still valid, please ping a maintainer and ask them to label it as pinned, good first issue, help wanted or triaged/resolved. Thank you for your contributions.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Jul 12, 2024
@olitomlinson olitomlinson reopened this Oct 11, 2024
@github-actions github-actions bot removed the stale label Oct 11, 2024

@github-actions github-actions bot added the stale label Nov 10, 2024

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Nov 17, 2024
@olitomlinson olitomlinson reopened this Dec 16, 2024
@github-actions github-actions bot removed the stale label Dec 16, 2024