Implementation of a distributed cache
One of the things we can't escape while developing a system is the use of a database to store data. However, we ought to keep in mind that the frequency of database access directly affects our system's performance.
This is where caching comes in. Caching not only reduces the number of calls made to the database, but also increases the speed at which we can retrieve the data we need. A cache is a storage pattern that allows fast retrieval of data we need frequently, and it should not hold data that changes constantly.
In the case of ASP.NET Core, there are 2 types of caching we can use: in-memory cache & distributed cache. An in-memory cache is accessible only within the application itself. If we are building a system with multiple apps spread across multiple servers, each app will have its own cache, which is redundant, since all of them use the same data. This is where distributed cache comes in.
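To make that concrete, here is a minimal sketch of wiring up ASP.NET Core's IDistributedCache with Redis as the backing store. It assumes the Microsoft.Extensions.Caching.StackExchangeRedis package; the endpoint, the "products" key and the LoadProductsFromDatabaseAsync helper are placeholders of mine for illustration, not part of any real project.

```csharp
// Program.cs -- minimal sketch of a Redis-backed distributed cache in ASP.NET Core.
using Microsoft.Extensions.Caching.Distributed;

var builder = WebApplication.CreateBuilder(args);

// Requires the Microsoft.Extensions.Caching.StackExchangeRedis package.
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // placeholder: your Redis/ElastiCache endpoint
    options.InstanceName = "MyApp:";
});

var app = builder.Build();

app.MapGet("/products", async (IDistributedCache cache) =>
{
    // Try the shared cache first; fall back to the database on a miss.
    var cached = await cache.GetStringAsync("products");
    if (cached is not null)
        return Results.Content(cached, "application/json");

    var json = await LoadProductsFromDatabaseAsync(); // hypothetical data-access call
    await cache.SetStringAsync("products", json, new DistributedCacheEntryOptions
    {
        AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
    });
    return Results.Content(json, "application/json");
});

app.Run();

// Stand-in for the real database query.
static Task<string> LoadProductsFromDatabaseAsync() =>
    Task.FromResult("[{\"id\":1,\"name\":\"Example\"}]");
```

With this in place, every instance of the app reads and writes the same Redis store instead of keeping its own private copy of the data.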
When I first started working on the distributed cache, one of the problems involved cache refreshing. If we have a number of apps accessing the cache, how can we ensure only one of them refreshes it? 2 solutions came up at that time:
1. Use a cache lock to ensure only 1 client does the refresh (a rough sketch follows below), or
2. Leave the refresh job to a separate process
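For reference, option 1 usually boils down to something like this sketch of mine, assuming StackExchange.Redis; the key names, the lock duration and the LoadProductsFromDatabaseAsync helper are placeholders. Whichever client takes a short-lived lock key does the refresh; everyone else backs off.

```csharp
using System;
using System.Threading.Tasks;
using StackExchange.Redis;

public class CacheRefresher
{
    private readonly IDatabase _redis;

    public CacheRefresher(IConnectionMultiplexer connection) =>
        _redis = connection.GetDatabase();

    // Returns true only for the single client that won the lock and did the refresh.
    public async Task<bool> TryRefreshAsync()
    {
        var lockKey = "products:refresh-lock";
        var lockOwner = Environment.MachineName;

        // LockTakeAsync sets the key only if it does not already exist,
        // so exactly one caller can hold the lock at a time.
        if (!await _redis.LockTakeAsync(lockKey, lockOwner, TimeSpan.FromSeconds(60)))
            return false; // another instance is already refreshing

        try
        {
            var json = await LoadProductsFromDatabaseAsync(); // hypothetical data-access call
            await _redis.StringSetAsync("products", json, TimeSpan.FromMinutes(10));
            return true;
        }
        finally
        {
            await _redis.LockReleaseAsync(lockKey, lockOwner);
        }
    }

    // Stand-in for the real database query.
    private static Task<string> LoadProductsFromDatabaseAsync() =>
        Task.FromResult("[{\"id\":1,\"name\":\"Example\"}]");
}
```

The lock expiry guards against a crashed client holding the lock forever.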
Since our system is already utilizing AWS services, it occurred to me that we could implement option 2 with them. 2 things are required to build a periodically self-triggered event (a sketch of the Lambda side follows the list):
1. A tool similar to the Windows Task Scheduler (CloudWatch Events)
2. A process to refresh the cache (Lambda)
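The Lambda side could look roughly like this. This is a sketch of mine, assuming a .NET Lambda using the Amazon.Lambda.Core, Amazon.Lambda.Serialization.SystemTextJson and StackExchange.Redis packages; the REDIS_ENDPOINT variable, the key names and LoadProductsFromDatabaseAsync are placeholders, not the actual implementation.

```csharp
using System;
using System.Threading.Tasks;
using Amazon.Lambda.Core;
using StackExchange.Redis;

[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace CacheRefresh;

public class Function
{
    // Reuse the Redis connection across warm invocations.
    private static readonly ConnectionMultiplexer Redis = ConnectionMultiplexer.Connect(
        Environment.GetEnvironmentVariable("REDIS_ENDPOINT") ?? "localhost:6379");

    // Triggered on a schedule by a CloudWatch Events rule, e.g. rate(10 minutes).
    public async Task FunctionHandler(object input, ILambdaContext context)
    {
        var db = Redis.GetDatabase();

        var json = await LoadProductsFromDatabaseAsync(); // hypothetical data-access call
        await db.StringSetAsync("products", json, TimeSpan.FromMinutes(15));

        context.Logger.LogLine("Cache refreshed.");
    }

    // Stand-in for the real database query.
    private static Task<string> LoadProductsFromDatabaseAsync() =>
        Task.FromResult("[{\"id\":1,\"name\":\"Example\"}]");
}
```

The schedule itself is just a CloudWatch Events rule with a rate or cron expression that has the function as its target.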
Well, if you are looking for a more budget-friendly solution, you could easily build your own cron job (e.g. a console app plus Task Scheduler on Windows). If not, $1.00 per million custom events won't exactly put a dent in your pocket either.
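If you go that route, the refresh job itself is just a tiny console app that Task Scheduler (or cron) runs every few minutes. A rough sketch, again assuming StackExchange.Redis and the same placeholder names as above:

```csharp
using System;
using System.Threading.Tasks;
using StackExchange.Redis;

// Bare-bones refresh job: point Windows Task Scheduler (or cron) at this executable
// and run it every few minutes.
var redis = await ConnectionMultiplexer.ConnectAsync(
    Environment.GetEnvironmentVariable("REDIS_ENDPOINT") ?? "localhost:6379");
var db = redis.GetDatabase();

var json = await LoadProductsFromDatabaseAsync(); // hypothetical data-access call
await db.StringSetAsync("products", json, TimeSpan.FromMinutes(15));

Console.WriteLine($"Cache refreshed at {DateTime.UtcNow:O}");

// Stand-in for the real database query.
static Task<string> LoadProductsFromDatabaseAsync() =>
    Task.FromResult("[{\"id\":1,\"name\":\"Example\"}]");
```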