Hello.

As I can see in the push_stream_shared_memory_size description:
The size of the memory chunk this module will use to store published messages, channels, and other shared structures.
When this memory is full, any new request to publish a message or subscribe to a channel will receive a 500 Internal Server Error response.
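For reference, a minimal sketch of how this directive is typically set (the values are illustrative, not recommendations; the location layout follows the module's documented examples):

```nginx
http {
    # Shared memory pool for messages, channels, and other structures.
    # Once it is full, publish and subscribe requests start failing with 500.
    push_stream_shared_memory_size 32M;

    server {
        listen 80;

        location /pub {
            push_stream_publisher admin;
            push_stream_channels_path $arg_id;
        }
    }
}
```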
What is one supposed to do when publishing a message fails with a 500?
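Today the only option seems to be client-side retries. A sketch of what a publisher is forced to do (the endpoint URL is hypothetical; adjust it to your publisher location):

```python
import time
import urllib.error
import urllib.request

# Hypothetical publish endpoint; adjust to your push_stream_publisher location.
PUBLISH_URL = "http://localhost/pub?id=channel_1"

def publish(message: bytes, retries: int = 5, backoff: float = 0.5) -> None:
    """POST a message, retrying with exponential backoff on 500 responses."""
    for attempt in range(retries):
        try:
            req = urllib.request.Request(PUBLISH_URL, data=message, method="POST")
            with urllib.request.urlopen(req):
                return  # 2xx: published successfully
        except urllib.error.HTTPError as e:
            if e.code != 500:
                raise  # not the shared-memory-full case; don't retry blindly
            # Shared memory is presumably full; back off and hope that
            # push_stream_message_ttl expiry frees space before the next try.
            time.sleep(backoff * (2 ** attempt))
    raise RuntimeError("publish failed: server kept returning 500")

publish(b"hello")
```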
I would prefer to have another option for the module's behavior. I think it should (optionally) free memory when it needs more by evicting old messages. This way the logic of push_stream_message_ttl would become more like "try to keep messages for push_stream_message_ttl, but they can also be deleted earlier".
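To make the request concrete, a rough pseudocode sketch of the proposed allocation path (entirely hypothetical; the shm object and its methods do not exist in the module):

```python
def alloc_with_eviction(shm, size):
    """Allocate from shared memory, evicting the oldest messages on pressure."""
    while True:
        block = shm.try_alloc(size)
        if block is not None:
            return block
        oldest = shm.oldest_message()
        if oldest is None:
            # Nothing left to evict: only now fall back to the 500 behavior.
            raise MemoryError("shared memory exhausted")
        # Delete the oldest message even though its ttl has not expired yet.
        shm.delete_message(oldest)
```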
E.g. this policy is available in Redis ( http://oldblog.antirez.com/post/redis-as-LRU-cache.html ):

Another way to use Redis as a cache is the maxmemory directive, a feature that allows specifying a maximum amount of memory to use. When new data is added to the server and the memory limit has already been reached, the server will remove some old data by deleting a volatile key, that is, a key with an EXPIRE (a timeout) set, even if the key is still far from expiring automatically.
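In redis.conf that policy looks roughly like this (a sketch; the values are illustrative):

```
# Cap memory usage; when the limit is reached, evict instead of failing writes.
maxmemory 100mb
# Evict among keys that have an EXPIRE set (the "volatile" keys above),
# preferring the least recently used ones.
maxmemory-policy volatile-lru
```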
What do you think?