We are running a number of Discourse instances on Azure and use a shared Redis service for them. Across roughly 20 instances I see a peak of 350 connections to Redis, hovering around 330-340, which averages out to about 17 connections per instance. That hits the SKU's connection limit pretty hard, especially since the Redis instance is underutilized in both CPU and memory.
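For anyone wanting to reproduce the numbers, something like this redis-py sketch works against the cache endpoint (the hostname, port, and access key below are placeholders for an Azure Cache for Redis instance, not our actual setup):

```python
# Rough check of how many clients are holding connections on the shared Redis.
import redis

r = redis.Redis(
    host="my-cache.redis.cache.windows.net",  # placeholder endpoint
    port=6380,                                 # Azure's TLS port
    password="<access-key>",                   # placeholder key
    ssl=True,
)

# INFO clients reports the total connection count the service sees.
clients = r.info("clients")
print("connected_clients:", clients["connected_clients"])

# CLIENT LIST shows each connection's source address, which makes it easy
# to group connections by Discourse instance.
for c in r.client_list():
    print(c["addr"], c.get("name", ""))
```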
This is not recommended. I've been told more than once that there is some Redis feature in use that will leak between sites. I'm not sure how to find that topic, though.
I ran across a topic that said the opposite: when multisite is used, a single Redis instance is fine (although these are individual deployments, not multisite). Redis doesn't inherently "leak" - the application would have to be doing that.
Thanks. Yeah, I've generally made applications namespace their own keys by default to avoid exactly this kind of issue - seems that would've been the safer default here too, but it is what it is.
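For what it's worth, here's a minimal sketch of the kind of namespacing I mean, using redis-py. The `NamespacedRedis` wrapper and the `site_a`/`site_b` names are purely illustrative, not anything Discourse itself does:

```python
# Per-application key isolation on a shared Redis: every key gets a
# namespace prefix, so two apps sharing one instance can't collide.
import redis


class NamespacedRedis:
    def __init__(self, client: redis.Redis, namespace: str):
        self.client = client
        self.namespace = namespace

    def _key(self, key: str) -> str:
        # All keys for this app live under "<namespace>:<key>".
        return f"{self.namespace}:{key}"

    def set(self, key, value, **kwargs):
        return self.client.set(self._key(key), value, **kwargs)

    def get(self, key):
        return self.client.get(self._key(key))


shared = redis.Redis(host="localhost", port=6379)

site_a = NamespacedRedis(shared, "site_a")
site_b = NamespacedRedis(shared, "site_b")

# Same logical key, no collision, because the stored keys differ.
site_a.set("session:42", "alice")
site_b.set("session:42", "bob")

print(site_a.get("session:42"))  # b'alice'
print(site_b.get("session:42"))  # b'bob'
```

Separate logical databases (the `db=` parameter) are the other common approach, though Redis pub/sub channels aren't scoped to a database, so a prefix still helps there.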