Fetching data through HTTP and caching JSON result in a file
The task is to fetch some JSON data from an API with a GET request. If the location is not available for any reason, read a cache file instead. Otherwise, write the cache file for future use.

The following function works, but it is clumsy because of:

- the nested try/except, which is difficult to read;
- the difficulty of figuring out what the actual error is (I catch both HTTPError and JSONDecodeError to avoid further nesting);
- not daring to use a context manager (with) for opening the file, because that would add yet another nesting level inside a try/except clause.

Have you got any ideas for improvement?
def fetch_list(location=MY_LOCATION, cache_file=CACHE_FILE):
    """Goes to the default location and returns a python list."""
    http = urllib3.PoolManager()
    try:
        r = http.request("GET", location)
        raw_data = r.data.decode("utf-8")
        data = json.loads(raw_data)
    except (urllib3.exceptions.HTTPError, JSONDecodeError):
        logger.error("Cannot access Intranet List Location - fetching cache")
        try:
            data = json.loads(open(cache_file).readlines())
        except (IOError, JSONDecodeError):
            logger.error("Cache File not found or broken")
            raise
    else:
        with open(cache_file, "w") as f:
            f.write(raw_data)
    return data
Tags: python, json, error-handling, http, cache
asked Jul 17 at 16:38 by ProfHase85; edited Jul 17 at 18:46 by Daniel
2 Answers

Accepted answer, score 3, by hjpotter92 (answered Jul 17 at 16:54, edited Jul 17 at 17:06):
If your logger object comes from the standard logging module, use logger.exception, which logs the traceback information as well. The logging docs specify this behaviour:

"Exception info is added to the logging message. This function should only be called from an exception handler."
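A minimal sketch of the difference, assuming a stock logging setup (the basicConfig call and the logger name here are only for illustration):

import json
import logging
from json import JSONDecodeError

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

try:
    json.loads("not valid json")
except JSONDecodeError:
    # logger.error(...) would only record the message;
    # logger.exception(...) also records the traceback of the active exception.
    logger.exception("Cannot access Intranet List Location - fetching cache")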
You can avoid the nested try/except block by simply pass-ing in the first exception handler, and by placing the write to the cache inside the first try block itself:
def fetch_list(location=MY_LOCATION, cache_file=CACHE_FILE):
    """Goes to the default location and returns a python list."""
    try:
        http = urllib3.PoolManager()
        request = http.request("GET", location)
        raw_data = request.data.decode("utf-8")
        data = json.loads(raw_data)
        with open(cache_file, "w") as cache:
            cache.write(raw_data)
        return data
    except (urllib3.exceptions.HTTPError, JSONDecodeError):
        logger.exception("Cannot access Intranet List Location - fetching cache")
        try:
            # read() returns the whole file as one string, which json.loads expects
            data = json.loads(open(cache_file).read())
            return data
        except (IOError, JSONDecodeError):
            logger.exception("Cache File not found or broken")
            raise
Isn't the pass superfluous? – 200_success, Jul 17 at 17:00

@200_success yes, with the logger statement. The suggestion for pass was a generic one. – hjpotter92, Jul 17 at 17:06

@hjpotter92 thanks, do you have any idea about the duplicate catching of the JSONDecodeError? – ProfHase85, Jul 17 at 17:09

@ProfHase85 You can skip decoding in the request section, and set raw_data = open(cache_file).readlines(). Do the json.loads once only. But that'd increase complexity imo. – hjpotter92, Jul 17 at 17:13
Second answer, score 1, by Richard Neumann (answered Jul 19 at 10:00):
Handle exceptions selectively

Currently you put all exception-throwing code into one big try block:

try:
    r = http.request("GET", location)
    raw_data = r.data.decode("utf-8")
    data = json.loads(raw_data)
except (urllib3.exceptions.HTTPError, JSONDecodeError):

At first glance it is not obvious that there might also be a UnicodeDecodeError, which you are not catching (intentionally?). It might increase readability to handle the possible exceptions one statement at a time.

Separate your concerns

The retrieval of the data from the web API and the caching of it seem to be two separate concerns to me, which should be handled by separate functions. A decorator might be a fitting solution here.

Handle an unavailable endpoint as an error

An unavailable value from the web API can then be signalled through a specific exception, which the decorator handles:
#! /usr/bin/env python3
from functools import wraps
from json import JSONDecodeError, dump, load, loads
from logging import getLogger

from urllib3 import PoolManager
from urllib3.exceptions import HTTPError

LOCATION = 'http://ip.jsontest.com/'
CACHE = '/home/neumann/cache.json'
LOGGER = getLogger('MyLogger')


class DataNotRetrievable(Exception):
    """Indicates that the required data was not retrievable
    from the endpoint and a cached value is required.
    """
    pass


def json_cache(filename):
    """Caches the return value of the wrapped function to the respective file."""
    def decorator(function):
        """Actual decorator."""
        @wraps(function)
        def wrapper(*args, **kwargs):
            """Wraps the decorated function."""
            try:
                json = function(*args, **kwargs)
            except DataNotRetrievable:
                LOGGER.exception('Could not retrieve data from website.')
                with open(filename, 'r') as cache:
                    return load(cache)

            with open(filename, 'w') as cache:
                dump(json, cache)

            return json
        return wrapper
    return decorator


@json_cache(CACHE)
def fetch_list(location=LOCATION):
    """Goes to the default location and returns a python list."""
    pmgr = PoolManager()

    try:
        response = pmgr.request("GET", location)
    except HTTPError:
        raise DataNotRetrievable()

    try:
        text = response.data.decode()
    except UnicodeDecodeError:
        raise DataNotRetrievable()

    try:
        return loads(text)
    except JSONDecodeError:
        raise DataNotRetrievable()


if __name__ == '__main__':
    print(fetch_list())
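One nice side effect of the split is that json_cache can be reused for other fetchers. A small sketch (the cache path and endpoint below are made up for illustration):

@json_cache('/tmp/headers_cache.json')  # hypothetical cache file
def fetch_headers(location='http://headers.jsontest.com/'):  # hypothetical endpoint
    """Another fetcher reusing the same caching decorator."""
    try:
        response = PoolManager().request("GET", location)
        return loads(response.data.decode())
    except (HTTPError, UnicodeDecodeError, JSONDecodeError):
        raise DataNotRetrievable()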