I have a small Django site which controls an astronomy dome and some home automation. On start-up the project loads 3 JSON files: relays, conditions and homeautomation. To avoid constant reading and writing to the Pi 4's SSD I load the JSON files into Redis (on start-up in apps.py, see below). I already have Redis running in a Docker container as the project uses Celery.
My problem is that within a few minutes of loading the JSON into Redis, the data is cleared out of the cache.
I load each JSON file in the form of a dictionary (dict) in apps.py:
cache.set("REDIS_ashtreeautomation_dict", dict, timeout=None)
and set
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
            "SERIALIZER": "django_redis.serializers.json.JSONSerializer",
            "TIMEOUT": None
        }
    }
}
I don't need the data to persist if the Docker containers go down, and I don't need DB functions. Caching these files is ideal, but I need them to 'stay alive' for the lifetime of the server.
Thank you.
You have TIMEOUT under OPTIONS, when it should be at the same level in the hierarchy. See the example configuration in the documentation. Note that None isn't a guarantee that your data won't ever get purged. Redis, for example, has its own set of (configurable) eviction policies that it will turn to when it starts to run out of memory.
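The corrected setting would look like this: TIMEOUT sits alongside BACKEND, LOCATION and OPTIONS rather than inside OPTIONS, so cache entries written without an explicit timeout never expire.

```python
# Corrected CACHES setting: TIMEOUT belongs at the top level of the
# "default" entry, not nested inside OPTIONS.
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379",
        "TIMEOUT": None,  # no per-entry expiry by default
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
            "SERIALIZER": "django_redis.serializers.json.JSONSerializer",
        },
    }
}
```

To guard against Redis-side eviction as well, you can set Redis's maxmemory-policy to noeviction (in redis.conf or via CONFIG SET), so Redis returns errors under memory pressure instead of silently dropping keys.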