Caching expensive API calls in DynamoDb

We have an API that is expensive to call: each call costs us money, and the data only needs to be refreshed roughly every two weeks, so we want some caching. We are on the AWS stack.
The API client is a basic Apache HttpClient-based class, TypicalApiClient.
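The client itself is not shown here; a minimal sketch of what such a class might look like with Apache HttpClient 4.x (the call method and the getUrl() accessor on BaseHttpRequest are assumptions):

import java.io.IOException;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class TypicalApiClient {

    private final CloseableHttpClient httpClient = HttpClients.createDefault();

    // the method that @DynamoDbCached(table = "typical_api_client") would be placed on
    public String call(BaseHttpRequest request) throws IOException {
        HttpGet get = new HttpGet(request.getUrl()); // getUrl() is an assumed accessor
        try (CloseableHttpResponse response = httpClient.execute(get)) {
            return EntityUtils.toString(response.getEntity());
        }
    }
}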
I came up with the approach of having an annotation:
@DynamoDbCached(table="typical_api_client") 
so that before the real method is called we go into the aspect implementation below.
The DynamoDB table has a resourceUrl attribute, the resource URL we call, for example:
"GET-http://someapi.com/product/123:55666778" 
which is unique (the key), plus an apiResponse attribute that contains the JSON response we received from the API.
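For reference, a minimal sketch of the annotation and of the mapped table item, assuming the DynamoDBMapper annotations from the AWS SDK v1 (com.amazonaws.services.dynamodbv2.datamodeling); the CachedApiResponse class name is an assumption:

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface DynamoDbCached {
    String table();
}

@DynamoDBTable(tableName = "typical_api_client")
public class CachedApiResponse {

    private String resourceUrl; // e.g. "GET-http://someapi.com/product/123:55666778"
    private String apiResponse; // raw JSON body returned by the API

    @DynamoDBHashKey(attributeName = "resourceUrl")
    public String getResourceUrl() { return resourceUrl; }
    public void setResourceUrl(String resourceUrl) { this.resourceUrl = resourceUrl; }

    @DynamoDBAttribute(attributeName = "apiResponse")
    public String getApiResponse() { return apiResponse; }
    public void setApiResponse(String apiResponse) { this.apiResponse = apiResponse; }
}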
class DynamoDbCachedAspect {

    public String handleCall(ProceedingJoinPoint point) throws Throwable {
        BaseHttpRequest request = (BaseHttpRequest) point.getArgs()[0];
        // resourceUrl is built from the request object, for example:
        String resourceUrl = "GET http://someapi.com/product/123:556667";

        // two caches: Ehcache first as the in-memory/disk tier
        String cachedApiResult = ehCache.get(resourceUrl);
        if (cachedApiResult != null) {
            return cachedApiResult; // we already have it in memory/on disk
        }

        cachedApiResult = dynamoDbCache.get(resourceUrl);
        if (cachedApiResult != null) {
            return cachedApiResult;
        }

        // no cache hit anywhere, so call the real method
        String apiResponse = (String) point.proceed();
        // save the result into both caches so we don't call the API again later
        ehCache.put(resourceUrl, apiResponse);
        dynamoDbCache.put(resourceUrl, apiResponse);
        return apiResponse;
    }
}
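The dynamoDbCache used above is not shown; a minimal sketch of such a wrapper, assuming DynamoDBMapper from the AWS SDK v1 and the CachedApiResponse item sketched earlier:

class DynamoDbCache {

    private final DynamoDBMapper mapper;

    DynamoDbCache(DynamoDBMapper mapper) {
        this.mapper = mapper;
    }

    // single-item read: load by hash key, return null on a cache miss
    String get(String resourceUrl) {
        CachedApiResponse item = mapper.load(CachedApiResponse.class, resourceUrl);
        return item == null ? null : item.getApiResponse();
    }

    // single-item write
    void put(String resourceUrl, String apiResponse) {
        CachedApiResponse item = new CachedApiResponse();
        item.setResourceUrl(resourceUrl);
        item.setApiResponse(apiResponse);
        mapper.save(item);
    }
}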
Now the most interesting part. DynamoDB has batch operations. It would be no problem to call DynamoDB every time we need to read from or write to the cache, but the batch APIs are limited: up to 25 items per BatchWriteItem (PUT) and up to 100 items per BatchGetItem (GET).
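For illustration, a single BatchGetItem call for up to 100 cached keys might look like this with the low-level SDK v1 client (amazonDynamoDb and resourceUrls are assumed names; handling of UnprocessedKeys is omitted):

// at most 100 resource URLs per request
List<Map<String, AttributeValue>> keys = resourceUrls.stream()
        .map(url -> Collections.singletonMap("resourceUrl", new AttributeValue(url)))
        .collect(Collectors.toList());

BatchGetItemRequest request = new BatchGetItemRequest()
        .withRequestItems(Collections.singletonMap(
                "typical_api_client", new KeysAndAttributes().withKeys(keys)));

BatchGetItemResult result = amazonDynamoDb.batchGetItem(request);
List<Map<String, AttributeValue>> items = result.getResponses().get("typical_api_client");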
So my idea is the following: say 100 threads are calling dynamoDbCache.get(resourceUrl):
class LockableResponse {
    // URL of the API resource we call, e.g. "GET http://something.co/1233"
    private String resourceUrl;
    // index of this slot in the LockableResponses array (see the class below)
    private Integer index;
    // API response extracted from DynamoDB
    private String apiResponse;
    // semaphore used to block the incoming thread until the response is ready
    private final Semaphore semaphore = new Semaphore(0);

    public void clean() {
        index = -1;
        resourceUrl = null;
        apiResponse = null;
    }
    // getters and setters omitted
}
// We want to associate the resourceUrl each thread brings with a LockableResponse
// slot, so the thread can wait and be woken up when its response is ready.
ConcurrentHashMap<String, LockableResponse> responseMap = new ConcurrentHashMap<>();

String get(String resourceUrl) throws InterruptedException {
    // either take a free slot or sleep until there is capacity in the batch
    LockableResponse response = lockableResponses.take();
    // remember which resource this slot is waiting for
    response.setResourceUrl(resourceUrl);
    responseMap.put(resourceUrl, response);
    // block here until the batching worker (see the dispatcher sketch below) has
    // filled in apiResponse and released the semaphore
    response.getSemaphore().acquire();
    String apiResponse = response.getApiResponse();
    responseMap.remove(resourceUrl);
    lockableResponses.release(response);
    return apiResponse;
}
// A container/pool for incoming requests that wait for a response.
// Rather than allocating these objects per request, they are reused: free slots
// are tracked by index in a blocking queue, so a caller either takes an index
// or sleeps when there is no capacity left in the batch.
class LockableResponses {

    private final BlockingQueue<Integer> lockableIndexes = new LinkedBlockingQueue<>();
    private final LockableResponse[] lockableResponses;

    public LockableResponses(int batchCapacity) {
        lockableResponses = new LockableResponse[batchCapacity];
        for (int i = 0; i < batchCapacity; i++) {
            lockableResponses[i] = new LockableResponse();
            lockableIndexes.add(i);
        }
    }

    public LockableResponse take() throws InterruptedException {
        Integer availableIndex = lockableIndexes.take();
        LockableResponse response = lockableResponses[availableIndex];
        response.setIndex(availableIndex);
        return response;
    }

    public void release(LockableResponse lockableResponse) {
        int index = lockableResponse.getIndex();
        lockableResponse.clean();
        lockableIndexes.add(index); // return the slot to the pool
    }
}
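What is still missing is the component that actually issues the batch request and wakes the waiting threads. A minimal sketch of such a dispatcher, assuming the accessors on LockableResponse above and a dynamoDbCache.batchGet(urls) helper (an assumed method returning a resourceUrl-to-apiResponse map, e.g. built on BatchGetItem as shown earlier):

class BatchGetDispatcher implements Runnable {

    private final BlockingQueue<LockableResponse> pending = new LinkedBlockingQueue<>();
    private final DynamoDbCache dynamoDbCache;

    BatchGetDispatcher(DynamoDbCache dynamoDbCache) {
        this.dynamoDbCache = dynamoDbCache;
    }

    // called by get(resourceUrl) after it has reserved a LockableResponse slot
    void submit(LockableResponse response) {
        pending.add(response);
    }

    @Override
    public void run() {
        List<LockableResponse> batch = new ArrayList<>(100);
        while (!Thread.currentThread().isInterrupted()) {
            try {
                // block for the first waiter, then drain whatever else is queued,
                // capped at the 100-key BatchGetItem limit
                batch.add(pending.take());
                pending.drainTo(batch, 99);

                List<String> urls = new ArrayList<>();
                for (LockableResponse response : batch) {
                    urls.add(response.getResourceUrl());
                }
                Map<String, String> results = dynamoDbCache.batchGet(urls); // assumed helper

                for (LockableResponse response : batch) {
                    response.setApiResponse(results.get(response.getResourceUrl()));
                    response.getSemaphore().release(); // wake the thread parked in get()
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } finally {
                batch.clear();
            }
        }
    }
}

With something like this in place, get(resourceUrl) would submit its reserved slot to the dispatcher and then acquire the semaphore; the trade-off is a little extra latency while the batch fills up, in exchange for far fewer DynamoDB round trips.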
Tags: java, multithreading, database, cache