Memory cache implementation with a static class inside a normal class


I'm trying to implement a cache available in a data provider (repository).



The data provider is a normal class, instantiated on every request made to the application.



The reason I have a static class is that I want all requests to share the same _lock object.



The data being cached doesn't change very often and is quite small (a few deserialized JSON documents), but it is requested often by other logic in the application.



Is this good design? I have stress-tested it and it seems to work fine.



I think the locking in the Cache class does what it is supposed to do (making sure data is not changed by two concurrent requests, and that data being read is always up to date).



I also lock when reading from the database, for the case where one request comes in and is, for some reason, slower than a second one (the first request would then populate the cache with possibly outdated data).



Are there any obvious flaws you can spot? Is the use of MemoryCache overkill in this case? I don't really need expiration management or the other options MemoryCache offers, so I'm torn about just using a Dictionary.



public class DataProvider
{
    static SemaphoreSlim dbLock = new SemaphoreSlim(1);

    //This class has a bunch of methods to access the DB directly, when needed.

    /// <summary>
    /// Get items from the MemoryCache
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <returns></returns>
    public async Task<List<T>> GetCacheItems<T>() where T : IDocumentModel
    {
        if (Cache.Items<T>() == null)
        {
            await dbLock.WaitAsync();
            try
            {
                var items = await GetItemsFromDatabase<T>();
                Cache.SetItems(items);
            }
            finally
            {
                dbLock.Release();
            }
        }

        return Cache.Items<T>();
    }

    /// <summary>
    /// Update a cached item when DataProvider updates one
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <param name="itemData"></param>
    /// <returns></returns>
    public async Task UpdateCacheItem<T>(string itemData) where T : IDocumentModel
    {
        var item = JsonConvert.DeserializeObject<T>(itemData);
        var cachedItems = await GetCacheItems<T>();

        if (cachedItems == null)
            throw new Exception($"Error retrieving cached items of type '{typeof(T).Name}'");

        var cachedItem = cachedItems.SingleOrDefault(x => x.Id == item.Id);

        if (cachedItem == null)
        {
            cachedItems.Add(item);
        }
        else
        {
            cachedItems.Remove(cachedItem);
            cachedItems.Add(item);
        }

        Cache.SetItems(cachedItems);
    }
}

public static class Cache
{
    private static object _lock = new object();
    private static DateTimeOffset offset = DateTimeOffset.UtcNow.AddYears(5);

    /// <summary>
    /// Retrieve items of specified type
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <returns></returns>
    public static List<T> Items<T>()
    {
        var itemKey = typeof(T).Name;

        lock (_lock)
        {
            return MemoryCache.Default.Get(itemKey) as List<T>;
        }
    }

    /// <summary>
    /// Load items in MemoryCache
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <param name="items"></param>
    public static void SetItems<T>(List<T> items)
    {
        var itemKey = typeof(T).Name;

        lock (_lock)
        {
            MemoryCache.Default.AddOrGetExisting(itemKey, items, offset);
        }
    }
}








asked Apr 16 at 19:13 by Francis Ducharme, edited Apr 16 at 21:35

          1 Answer

MemoryCache is thread safe, so all those locks and the semaphore are unnecessary.
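
As a quick illustration of that point, here is a minimal sketch (my own, not code from the answer) of the static Cache helpers with the locks removed. It relies only on the thread safety of the individual Get/Set calls, not on any safety of mutating the cached List<T> afterwards, and it swaps AddOrGetExisting for Set so that updates overwrite the cached list:

public static class Cache
{
    private static readonly DateTimeOffset offset = DateTimeOffset.UtcNow.AddYears(5);

    public static List<T> Items<T>()
    {
        // Get is safe to call concurrently without an external lock.
        return MemoryCache.Default.Get(typeof(T).Name) as List<T>;
    }

    public static void SetItems<T>(List<T> items)
    {
        // Set adds or overwrites the entry; AddOrGetExisting (used in the question)
        // would instead keep the first value ever stored for the key.
        MemoryCache.Default.Set(typeof(T).Name, items, offset);
    }
}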



Also, MemoryCache does some serialization of its own, so on every update you are deserializing the whole collection of the given type, adding the item, and re-serializing the whole collection.



If you can really use a Dictionary, use it; a ConcurrentDictionary<Type, List<T>> will solve all of these problems.
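
For illustration, a minimal sketch of that suggestion (my own assumptions: the class name is invented, and the value is stored as object so that lists of different element types can share one dictionary):

// Requires System.Collections.Concurrent.
public static class SimpleCache
{
    private static readonly ConcurrentDictionary<Type, object> _items =
        new ConcurrentDictionary<Type, object>();

    public static List<T> Items<T>()
    {
        // Returns null when nothing has been cached for T yet.
        return _items.TryGetValue(typeof(T), out var value) ? (List<T>)value : null;
    }

    public static void SetItems<T>(List<T> items)
    {
        // The indexer adds or replaces the whole list for the given type atomically.
        _items[typeof(T)] = items;
    }
}

Replacing the stored list wholesale, rather than mutating it in place, is what keeps reads safe here without any extra locking.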






answered Jun 24 at 20:57 by TheQult, edited Jun 24 at 21:46