Fetching data through HTTP and caching JSON result in a file

The task is to fetch some JSON data from an API with a GET request. If the location is not available for any reason, fall back to reading a cache file; otherwise, write the fetched data to the cache file for future use.



The following function works, but it is clumsy because of:

  • The nested try/except, which is difficult to read
  • It is difficult to figure out what the error actually was (I am catching both HTTPError and JSONDecodeError to avoid further nesting)
  • I don't dare to use a context manager (with) for opening the file, because it would add another nesting level within the try/except clause

Have you got any ideas for improvement?



def fetch_list(location=MY_LOCATION, cache_file=CACHE_FILE):
    """Goes to the default location and returns
    a python list
    """
    http = urllib3.PoolManager()
    try:
        r = http.request("GET", location)
        raw_data = r.data.decode("utf-8")
        data = json.loads(raw_data)
    except (urllib3.exceptions.HTTPError, JSONDecodeError):
        logger.error("Cannot access Intranet List Location - fetching cache")
        try:
            data = json.loads(open(cache_file).readlines())
        except (IOError, JSONDecodeError):
            logger.error("Cache File not found or broken")
            raise
    else:
        with open(cache_file, "w") as f:
            f.write(raw_data)

    return data






asked Jul 17 at 16:38 by ProfHase85

2 Answers






If your logger object is derived from the logging module, use logger.exception, which will log the traceback information as well. The logging docs also specify this behaviour:




          Exception info is added to the logging message. This function should
          only be called from an exception handler.
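As a quick, self-contained illustration (my own sketch, not part of the original answer): logger.exception logs at ERROR level and appends the current traceback, whereas logger.error records only the message. The logger name and the failing statement are made up for the example.

import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("demo")  # hypothetical logger name for this sketch

try:
    int("not a number")  # any failing statement will do
except ValueError:
    # logger.error("parse failed") would record only the message
    logger.exception("parse failed")  # records the message plus the full traceback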




You can avoid the try/except/else structure by returning early and placing the write to the cache inside the first try block itself:



def fetch_list(location=MY_LOCATION, cache_file=CACHE_FILE):
    """Goes to the default location and returns a python list.
    """
    try:
        http = urllib3.PoolManager()
        request = http.request("GET", location)
        raw_data = request.data.decode("utf-8")
        data = json.loads(raw_data)
        with open(cache_file, "w") as cache:
            cache.write(raw_data)
        return data
    except (urllib3.exceptions.HTTPError, JSONDecodeError) as exc:
        logger.exception("Cannot access Intranet List Location - fetching cache")
        try:
            data = json.loads(open(cache_file).readlines())
            return data
        except (IOError, JSONDecodeError):
            logger.exception("Cache File not found or broken")
            raise





answered Jul 17 at 16:54 by hjpotter92 (accepted)

  • Isn't the pass superfluous? – 200_success, Jul 17 at 17:00
  • @200_success yes, with the logger statement. The suggestion for pass was a generic one. – hjpotter92, Jul 17 at 17:06
  • @hjpotter92 thanks, do you have any idea about the duplicate catching of the JSONDecodeError? – ProfHase85, Jul 17 at 17:09
  • @ProfHase85 You can skip decoding in the request section, and set raw_data = open(cache_file).readlines(). Do the json.loads once only. But that'd increase complexity imo. – hjpotter92, Jul 17 at 17:13
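To make that last suggestion concrete, here is a rough sketch of the parse-once idea (my own, not from the thread). It reads the cache with read() rather than readlines() so that json.loads receives a string, and it refreshes the cache only after a successful fetch. MY_LOCATION and CACHE_FILE are placeholders standing in for the question's constants; note that, unlike the original, this version does not fall back to the cache when freshly fetched text fails to parse, which is part of the extra complexity the comment mentions.

import json
import logging
import urllib3

logger = logging.getLogger(__name__)
MY_LOCATION = "http://example.com/list.json"  # placeholder endpoint for the sketch
CACHE_FILE = "cache.json"                     # placeholder cache path for the sketch


def fetch_list(location=MY_LOCATION, cache_file=CACHE_FILE):
    """Fetch the raw JSON text from the API or, failing that, from the cache,
    and parse it exactly once."""
    http = urllib3.PoolManager()
    try:
        raw_data = http.request("GET", location).data.decode("utf-8")
        from_cache = False
    except (urllib3.exceptions.HTTPError, UnicodeDecodeError):
        logger.exception("Cannot access Intranet List Location - fetching cache")
        with open(cache_file) as cache:       # fall back to the cached text
            raw_data = cache.read()
        from_cache = True
    data = json.loads(raw_data)               # single json.loads for both paths
    if not from_cache:
        with open(cache_file, "w") as cache:  # refresh the cache only on a fresh fetch
            cache.write(raw_data)
    return data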


















Handle exceptions selectively

Currently you put all exception-throwing code into one big try-block:



try:
    r = http.request("GET", location)
    raw_data = r.data.decode("utf-8")
    data = json.loads(raw_data)
except (urllib3.exceptions.HTTPError, JSONDecodeError):


At first glance it is not obvious that there might also be a UnicodeDecodeError, which you are not catching (intentionally?). It might increase readability to handle the possible exceptions one statement at a time.



Separate your concerns

The retrieval of the data from the Web-API and the caching of it seem to be two separate concerns to me, which should be handled by separate functions. A decorator might be a fitting solution here.



Handle an unavailable endpoint value as an error

A value that is not available from the Web-API can thus be signalled through a specific exception and handled by the decorator:



#! /usr/bin/env python3

from functools import wraps
from json import JSONDecodeError, dump, load, loads
from logging import getLogger
from urllib3 import PoolManager
from urllib3.exceptions import HTTPError


LOCATION = 'http://ip.jsontest.com/'
CACHE = '/home/neumann/cache.json'
LOGGER = getLogger('MyLogger')


class DataNotRetrievable(Exception):
    """Indicates that the required data was not retrievable
    from the endpoint and a cached value is required.
    """

    pass


def json_cache(filename):
    """Caches the return value of the wrapped function to the respective file."""

    def decorator(function):
        """Actual decorator."""

        @wraps(function)
        def wrapper(*args, **kwargs):
            """Wraps the decorated function."""
            try:
                json = function(*args, **kwargs)
            except DataNotRetrievable:
                LOGGER.exception('Could not retrieve data from website.')

                with open(filename, 'r') as cache:
                    return load(cache)

            with open(filename, 'w') as cache:
                dump(json, cache)

            return json

        return wrapper

    return decorator


@json_cache(CACHE)
def fetch_list(location=LOCATION):
    """Goes to the default location and returns a python list."""

    pmgr = PoolManager()

    try:
        response = pmgr.request("GET", location)
    except HTTPError:
        raise DataNotRetrievable()

    try:
        text = response.data.decode()
    except UnicodeDecodeError:
        raise DataNotRetrievable()

    try:
        return loads(text)
    except JSONDecodeError:
        raise DataNotRetrievable()


if __name__ == '__main__':
    print(fetch_list())





answered Jul 19 at 10:00 by Richard Neumann




















