Is secure storage for Redis necessary?

We are currently trying to install Discourse via Helm charts on Kubernetes in GCP, see here.

Since it will also cover our production environment, we must not lose any content stored in Discourse. Of course, we will use a hosted PostgreSQL database on GCP, but we were not sure whether we also need a hosted Redis.

If losing all data in Redis would not affect Discourse functionality, for instance because only cached data is lost, then we could just use the Redis running in Kubernetes. However, if losing Redis has heavier implications, we would need a hosted Redis.

Could you enlighten us here?

I covered it here: More details on how the Redis cache is utilized? - #2 by Falco

Redis is used for:

  • cache

  • background job queue

  • persistent connections backlog and pub/sub

I’d go with a hosted service on GCP if I were in your shoes, provided it ships a recent enough version of Redis and doesn’t lag far behind upstream. AWS does a great job of keeping up with new versions, while GCP and Azure can lag sometimes.
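To make the list above concrete, here is a minimal sketch of what a Discourse Redis instance typically holds. It assumes the redis-py client and the standard Sidekiq key layout (a `queues` set, `queue:<name>` lists, and `schedule`/`retry` sorted sets); the host name is hypothetical, and a real install may prefix these keys with a namespace.

```python
# Minimal sketch: inspect what a Discourse Redis instance is holding.
# Assumes redis-py and the standard Sidekiq key layout; the host is hypothetical
# and a real Discourse install may namespace these keys.
import redis

r = redis.Redis(host="redis.example.internal", port=6379, db=0)

# Background job queues: Sidekiq keeps a "queues" set and one list per queue.
for name in r.smembers("queues"):
    key = f"queue:{name.decode()}"
    print(key, "->", r.llen(key), "pending jobs")

# Jobs queued to run in the future live in the "schedule" and "retry" sorted sets.
print("scheduled jobs:", r.zcard("schedule"))
print("retry jobs:", r.zcard("retry"))

# Everything else (cache entries, pub/sub backlogs) can be rebuilt by the app,
# so a flush mostly costs whatever is sitting in the queues above.
print("total keys:", r.dbsize())
```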

@Falco Many thanks for your input here. So if Redis is lost, would that destroy all history, etc.?

Can you define “history” here?

Losing the Redis DB (the equivalent of redis-cli flushall) is something the Discourse app will recover from without major problems, but you will lose some things, like emails that were still sitting in the queue. So while it’s not catastrophic, if you can avoid it without much hassle, I’d recommend doing so.

Yeah, by “history” I meant all the topics and posts, like the conversation we are having right now.

It is critical for us not to lose those. :slight_smile:

Actual data like posts, topics, users, etc. is stored in PostgreSQL.
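As a quick sanity check, you can confirm the canonical content sits in the database rather than in Redis. This is a minimal sketch assuming psycopg2 and a hypothetical connection string; `posts`, `topics`, and `users` are the relevant Discourse tables.

```python
# Minimal sketch: confirm that canonical content lives in PostgreSQL, not Redis.
# Assumes psycopg2 and a hypothetical connection string; "posts", "topics", and
# "users" follow the Discourse schema.
import psycopg2

conn = psycopg2.connect("host=db.example.internal dbname=discourse user=discourse")
with conn, conn.cursor() as cur:
    for table in ("posts", "topics", "users"):
        cur.execute(f"SELECT count(*) FROM {table}")  # table names are trusted constants
        print(table, "rows:", cur.fetchone()[0])
conn.close()
```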

So Redis only holds less critical, more ephemeral data.

The biggest thing you’d lose is stuff that’s queued to happen in the future. It’s generally not a problem.