
Consider supporting Redis Streams as a true queue source #33

Open
oising opened this issue Sep 11, 2021 · 8 comments

@oising

oising commented Sep 11, 2021

From looking at the source, it seems this provider is based on simple pub/sub, and the queueing of data happens in memory, in the provider itself. I guess this is because the provider was born before Redis 5.0 (where streams were added). It would add resilience and more flexibility if we could use native Redis streams as the backing queue.

https://redis.io/topics/streams-intro

Have you looked at this already?
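For illustration, the resilience argument can be sketched with a toy, in-memory model (not Redis itself; all names here are made up): a stream is an append-only log, so a consumer that remembers the ID of the last entry it processed can resume after a restart, whereas a queue held in the provider's own memory is simply gone.

```python
# Toy model (NOT Redis) illustrating why a server-side stream is more
# resilient than queueing inside the provider process: entries persist in
# the log, so a consumer can resume from the last ID it checkpointed.

class ToyStream:
    """Minimal stand-in for a Redis stream: an append-only log with IDs."""

    def __init__(self):
        self.entries = []   # list of (id, payload)
        self.next_id = 1

    def xadd(self, payload):
        entry_id = self.next_id
        self.next_id += 1
        self.entries.append((entry_id, payload))
        return entry_id

    def xread(self, last_seen_id):
        """Return every entry newer than last_seen_id."""
        return [(i, p) for (i, p) in self.entries if i > last_seen_id]


stream = ToyStream()
for n in range(3):
    stream.xadd({"n": n})

# Consumer processes one entry, checkpoints its ID, then "crashes".
first = stream.xread(0)[0]
checkpoint = first[0]

# After restart, nothing is lost: it resumes from the checkpoint.
remaining = stream.xread(checkpoint)
print([p["n"] for (_, p) in remaining])  # → [1, 2]
```

With the in-memory approach, everything queued at crash time would have been dropped; here the unprocessed entries are still in the log.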

@berdon
Owner

berdon commented Sep 11, 2021

I’ll look into it. Yeah, this definitely predates redis streams.

@berdon
Owner

berdon commented Sep 11, 2021

No promises though on timeliness as I’m no longer using this at my day job. @TyBarthel maybe you guys could get this on a sprint?

@turowicz

That part is extremely important. Thanks for pointing this out @oising

@turowicz

It appears that, in order to avoid having to poll for new messages, this solution would have to be a channel/queue hybrid, as I don't see any notification feature for new messages being pushed in Redis. @oising @berdon
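Worth noting: XREAD accepts a BLOCK option, so a consumer can wait server-side for new entries rather than polling client-side. A minimal consume-loop sketch, assuming a redis-py-style client (the `consume` helper and the `client` object are illustrative, not part of this provider):

```python
# Sketch of a poll-free consume loop using XREAD's BLOCK option.
# `client` is assumed to expose a redis-py-style xread(streams, block=...)
# that returns [(stream_name, [(entry_id, fields), ...])], or [] on timeout.

def consume(client, stream, last_id="$", block_ms=5000):
    """Yield new entries as they arrive, blocking server-side instead of
    polling. Starting from "$" means "only entries added after we start"."""
    while True:
        reply = client.xread({stream: last_id}, block=block_ms)
        for _name, entries in reply:
            for entry_id, fields in entries:
                last_id = entry_id  # remember progress for the next XREAD
                yield entry_id, fields
```

So the "notification" is built into the stream read itself; no companion channel is strictly required just to avoid busy-looping.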

@berdon
Owner

berdon commented Sep 13, 2021

@turowicz I'd be more than willing to accept pull requests too. ;)

@turowicz

@berdon I've had a look at the code and perhaps will. Up for my team to decide if we are going to go with Redis or something else.

@oising
Author

oising commented Sep 13, 2021

@turowicz - I'm not super familiar with Redis stream behavior either, but perhaps a hybrid approach with pub/sub could be implemented: when a new value is pushed onto a LIST, a companion pub/sub event could be used to notify waiting consumers.
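That hybrid might look something like the following sketch (all names are hypothetical; `client` is assumed to expose redis-py-style `lpush`/`rpop`/`publish` methods). Since pub/sub messages can be dropped if a subscriber is disconnected, the consumer drains the list on each notification rather than assuming one message per event:

```python
# Sketch of the LIST + pub/sub hybrid described above (hypothetical names).

NOTIFY_CHANNEL = "queue:notify"  # companion channel; name is illustrative

def produce(client, queue, payload):
    """Push a value onto the list, then signal waiters via pub/sub."""
    client.lpush(queue, payload)
    client.publish(NOTIFY_CHANNEL, queue)  # wake anyone subscribed

def drain(client, queue):
    """On a notification, pop everything currently queued. Draining (rather
    than popping once) tolerates missed pub/sub notifications."""
    items = []
    while True:
        item = client.rpop(queue)
        if item is None:
            break
        items.append(item)
    return items
```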

Update: Actually, it seems there is more than enough functionality to have quite a robust implementation:

Listening for new items with XREAD
When we do not want to access items by a range in a stream, usually what we want instead is to subscribe to new items arriving to the stream. This concept may appear related to Redis Pub/Sub, where you subscribe to a channel, or to Redis blocking lists, where you wait for a key to get new elements to fetch, but there are fundamental differences in the way you consume a stream:

  • A stream can have multiple clients (consumers) waiting for data. Every new item, by default, will be delivered to every consumer that is waiting for data in a given stream. This behavior is different than blocking lists, where each consumer will get a different element. However, the ability to fan out to multiple consumers is similar to Pub/Sub.
  • While in Pub/Sub messages are fire and forget and are never stored anyway, and while when using blocking lists, when a message is received by the client it is popped (effectively removed) from the list, streams work in a fundamentally different way. All the messages are appended in the stream indefinitely (unless the user explicitly asks to delete entries): different consumers will know what is a new message from its point of view by remembering the ID of the last message received.
  • Streams Consumer Groups provide a level of control that Pub/Sub or blocking lists cannot achieve, with different groups for the same stream, explicit acknowledgment of processed items, ability to inspect the pending items, claiming of unprocessed messages, and coherent history visibility for each single client, that is only able to see its private past history of messages.
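The consumer-group features quoted above (per-group delivery, explicit acknowledgment, pending-entry inspection) are what make a stream behave like a true queue. A rough sketch of one processing pass, assuming a redis-py-style client (`xreadgroup`/`xack`) with hypothetical stream/group names:

```python
# Sketch of queue-style consumption via a consumer group, per the docs
# quoted above. `client` is assumed redis-py-style; names are illustrative.

def process_once(client, stream, group, consumer, handler, count=10):
    """Read up to `count` not-yet-delivered entries for this group (the ">"
    ID), handle each, and XACK only on success so that failed entries stay
    pending and can later be claimed by another consumer."""
    reply = client.xreadgroup(group, consumer, {stream: ">"}, count=count)
    handled = 0
    for _name, entries in reply:
        for entry_id, fields in entries:
            handler(fields)                    # may raise; then no ack
            client.xack(stream, group, entry_id)
            handled += 1
    return handled
```

Each consumer in the group gets different entries (queue semantics), while separate groups each see the full stream (fan-out semantics), which is exactly the flexibility a provider like this could expose.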

@turowicz

Yes, it still requires a channel in order to keep things async. Because of that complication, this solution may not be feasible for us.
