Memory cache implementation with a static class inside a normal class
I'm trying to implement a cache available in a data provider (repository).
The data provider is a normal class, instantiated on every request made to the application. The reason I have a static class is that I want all requests to share the same _lock object.
The data being cached doesn't change very often and is quite small (a few deserialized JSON documents), but it is often requested by other logic in the application.
Is this a good design? I have stress-tested it and it seems to work fine. I think the locking in the Cache class does what it is supposed to do (making sure data is not changed by two concurrent requests, and that data being read is always up to date). I also lock when reading from the database, for the case where a request comes in and for some reason is slower than a second one (the first request would then populate the cache with possibly outdated data).
Are there any obvious flaws you can spot? Is the use of MemoryCache overkill in this case? I don't really need expiration management and the other options MemoryCache offers, so I'm torn about just using a Dictionary.
public class DataProvider
{
    static SemaphoreSlim dbLock = new SemaphoreSlim(1);

    //This class has a bunch of methods to access the DB directly, when needed.

    /// <summary>
    /// Get items from the MemoryCache
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <returns></returns>
    public async Task<List<T>> GetCacheItems<T>() where T : IDocumentModel
    {
        if (Cache.Items<T>() == null)
        {
            await dbLock.WaitAsync();
            try
            {
                var items = await GetItemsFromDatabase<T>();
                Cache.SetItems(items);
            }
            finally
            {
                dbLock.Release();
            }
        }
        return Cache.Items<T>();
    }

    /// <summary>
    /// Update a cached item when DataProvider updates one
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <param name="itemData"></param>
    /// <returns></returns>
    public async Task UpdateCacheItem<T>(string itemData) where T : IDocumentModel
    {
        var item = JsonConvert.DeserializeObject<T>(itemData);
        var cachedItems = await GetCacheItems<T>();

        if (cachedItems == null)
            throw new Exception($"Error retrieving cached items of type '{typeof(T).Name}'");

        var cachedItem = cachedItems.SingleOrDefault(x => x.Id == item.Id);
        if (cachedItem == null)
        {
            cachedItems.Add(item);
        }
        else
        {
            cachedItems.Remove(cachedItem);
            cachedItems.Add(item);
        }
        Cache.SetItems(cachedItems);
    }
}

public static class Cache
{
    private static object _lock = new object();
    private static DateTimeOffset offset = DateTimeOffset.UtcNow.AddYears(5);

    /// <summary>
    /// Retrieve items of specified type
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <returns></returns>
    public static List<T> Items<T>()
    {
        var itemKey = typeof(T).Name;
        lock (_lock)
        {
            return MemoryCache.Default.Get(itemKey) as List<T>;
        }
    }

    /// <summary>
    /// Load items in MemoryCache
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <param name="items"></param>
    public static void SetItems<T>(List<T> items)
    {
        var itemKey = typeof(T).Name;
        lock (_lock)
        {
            MemoryCache.Default.AddOrGetExisting(itemKey, items, offset);
        }
    }
}
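One pattern worth comparing against (a sketch only, not part of the original code, and assuming the surrounding DataProvider class with its dbLock, Cache, and GetItemsFromDatabase members): re-checking the cache after acquiring the semaphore, so that several concurrent cold-start requests trigger a single database read instead of one each.

```csharp
// Sketch: same shape as GetCacheItems<T>, but with a second cache check
// performed after the semaphore is acquired (double-checked loading).
// Late waiters find the cache already populated and skip the DB call.
public async Task<List<T>> GetCacheItemsOnce<T>() where T : IDocumentModel
{
    var cached = Cache.Items<T>();
    if (cached != null)
        return cached;

    await dbLock.WaitAsync();
    try
    {
        cached = Cache.Items<T>();          // second check, under the lock
        if (cached == null)
        {
            cached = await GetItemsFromDatabase<T>();
            Cache.SetItems(cached);
        }
        return cached;
    }
    finally
    {
        dbLock.Release();
    }
}
```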
c# .net cache
edited Apr 16 at 21:35
asked Apr 16 at 19:13
Francis Ducharme
1384
1 Answer
The MemoryCache is thread-safe, so all those locks and the semaphore are unnecessary.
Also, the MemoryCache itself does some serialization, so at every update you are deserializing the whole collection of the given type, adding the item, and re-serializing the whole collection.
If you can really use a Dictionary, use it; a ConcurrentDictionary<Type, List<T>> will solve your problem.
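As a rough illustration of that suggestion (a sketch only, not the answerer's code; DocumentCache and the loader delegate are hypothetical names): a single static field cannot literally be ConcurrentDictionary<Type, List<T>> for every T at once, so one way is to key by Type, store the lists as object, and cast back.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Hypothetical per-type cache built on ConcurrentDictionary.
// GetOrAdd is atomic per key, so no explicit lock or semaphore is
// needed around the dictionary itself (though under a race the
// loader delegate may run more than once; only one result is kept).
public static class DocumentCache
{
    private static readonly ConcurrentDictionary<Type, object> _items =
        new ConcurrentDictionary<Type, object>();

    public static List<T> GetOrLoad<T>(Func<List<T>> load)
    {
        return (List<T>)_items.GetOrAdd(typeof(T), _ => load());
    }

    public static void Set<T>(List<T> items)
    {
        _items[typeof(T)] = items;  // overwrite, unlike AddOrGetExisting
    }
}
```

Usage would be e.g. `var docs = DocumentCache.GetOrLoad(() => LoadFromDatabase<MyDoc>());`, with `Set` called after updates.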
edited Jun 24 at 21:46
answered Jun 24 at 20:57
TheQult
817