Overview
Use this policy for highly recurring and static responses to reduce latency and response times and to lighten the load on the proxy server. An incoming HTTP request to one of your Proxy endpoints returns the cached response for a specified time period.
The API owner controls the cache interval and can configure query strings or headers as cache keys, so that responses are cached for every unique key-value pair.
In the policy settings, you specify the cache key parameters that HTTP requests map to and set the cache response time and refresh interval.
Because API Versions use endpoints generated in the SnapLogic IIP, the HTTP Response Cache policy supports Proxy endpoints only by design. In the UI, the API Policy Manager menus on the API and Version Details tabs do not display this policy.
Policy Execution Order
This policy is applied after all other API policies; all security, authorization, and traffic-shaping policies run before it.
Known Issue
Both the HTTP Response Cache and HTTP Retry policies fail to evaluate expressions. As a result, do not set the When this policy should be applied field; doing so can cause issues with how the policy is applied.
Architecture
This policy stores responses in a map so that repeated requests can be served from the cache. The cache must be enabled through feature flags on the Snaplex.
You can configure the following cache key types for the client:
Protocol
Host
Path
HTTP Method
The policy supports using headers and query parameters to build the cache keys. The key is hashed with the SHA-1 algorithm.
Expired cache entries cannot be accessed, and new entries overwrite existing ones.
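The exact key construction is internal to the policy, but conceptually it combines the request attributes listed above with any configured header or query parameter keys and hashes the result with SHA-1. The following Python sketch illustrates that idea only; the function name, separator, and host are illustrative assumptions, not the policy's actual implementation.

```python
import hashlib

def build_cache_key(protocol, host, path, method, headers=None, query_params=None,
                    header_keys=(), query_keys=()):
    """Illustrative only: combine protocol, host, path, and HTTP method with any
    configured header or query parameter keys, then hash the result with SHA-1."""
    headers = headers or {}
    query_params = query_params or {}
    parts = [protocol, host, path, method]
    # Only the configured keys participate in the cache key, so requests that
    # differ only in unrelated headers or parameters map to the same entry.
    for key in header_keys:
        parts.append(f"{key}={headers.get(key, '')}")
    for key in query_keys:
        parts.append(f"{key}={query_params.get(key, '')}")
    return hashlib.sha1("|".join(parts).encode("utf-8")).hexdigest()

# Requests that differ in the configured query parameters map to different entries.
key_a = build_cache_key("https", "example.snaplogic.io", "/gateway/proxy/multiply",
                        "GET", query_params={"intA": "5", "intB": "5"},
                        query_keys=("intA", "intB"))
key_b = build_cache_key("https", "example.snaplogic.io", "/gateway/proxy/multiply",
                        "GET", query_params={"intA": "2", "intB": "2"},
                        query_keys=("intA", "intB"))
assert key_a != key_b
```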
Limitations
Each cached response can contain at most 85 MB. The policy still returns the response, but any data over the limit renders the payload incomplete.
A response cache is not effective for POST and PUT HTTP methods because these operations are meant to alter the state of data and therefore should not be cached.
Settings
| Parameter Name | Description | Default Value |
|---|---|---|
| Label | Required. The name for the API policy. | HTTP Response Cache |
| When this policy should be applied | An expression-enabled field that determines the condition to be fulfilled for the API policy to execute. Do not set this field for the HTTP Response Cache policy (see Known Issue above). | N/A |
| Cache Interval | The time period for which the current cache is held before it is refreshed. | 1 |
| Time Unit | The time unit for the Cache Interval value. | Hour |
| Use HTTP Request Headers to Create Cache Keys | Enables the use of specific headers to identify a cached response. | Unselected |
| Use HTTP Request Query Parameter to Create Cache Keys | Enables the use of query parameters to identify a cached response. | Unselected |
| Status | Specifies whether the API policy is enabled or disabled. | Enabled |
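To illustrate how the Cache Interval and Time Unit settings govern expiry, the following Python sketch computes when a cached entry would become stale. The unit mapping and function are assumptions for illustration only, not the policy's internal code.

```python
from datetime import datetime, timedelta

# Illustrative mapping of the Time Unit setting to a duration; the actual
# policy performs this calculation internally.
UNITS = {"Minute": timedelta(minutes=1), "Hour": timedelta(hours=1), "Day": timedelta(days=1)}

def expires_at(cached_at: datetime, cache_interval: int, time_unit: str) -> datetime:
    """Return the time after which a cached response is considered stale."""
    return cached_at + cache_interval * UNITS[time_unit]

# With the defaults (Cache Interval = 1, Time Unit = Hour), a response cached
# at 09:00 is served from the cache until 10:00 and refreshed afterwards.
print(expires_at(datetime(2024, 1, 1, 9, 0), 1, "Hour"))
```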
Example
Header and query string keys can be configured so that HTTP responses are cached for every unique key-value pair. For example, consider a REST GET endpoint that multiplies two integers:
GET /gateway/proxy/multiply?intA=[valueA]&intB=[valueB]
Configure the policy to use the query parameter keys intA and intB so that a response is cached for every unique key-value pair of intA and intB.
For example, the following GET requests will have cached responses when using the HTTP Response Cache policy:
/gateway/proxy/multiply?intA=5&intB=5
/gateway/proxy/multiply?intA=2&intB=2
/gateway/proxy/multiply?intA=3&intB=1
The following HTTP responses will be cached so that the proxied server does not need to perform the calculation:
Response 1
{ "operation": "5 x 5", "result": 25 }
Response 2
{ "operation": "2 x 2", "result": 4 }
Response 3
{ "operation": "3 x 1", "result": 3 }
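As a rough client-side check, you can call the same endpoint twice and compare the results and timings; the second identical request should be answered from the cache until the configured interval elapses. The base URL below is a hypothetical example, and the `requests` library is a third-party assumption, not part of the policy.

```python
import time
import requests  # third-party; pip install requests

# Hypothetical base URL for a Proxy endpoint with the HTTP Response Cache policy applied.
BASE = "https://example.snaplogic.io/gateway/proxy/multiply"

def timed_get(int_a, int_b):
    """Call the multiply endpoint and return the JSON body plus elapsed time."""
    start = time.perf_counter()
    response = requests.get(BASE, params={"intA": int_a, "intB": int_b})
    return response.json(), time.perf_counter() - start

# The first call reaches the proxied server; the second identical call should
# be served from the cache until the configured cache interval elapses.
first, t1 = timed_get(5, 5)
second, t2 = timed_get(5, 5)
print(first, f"{t1:.3f}s")   # e.g. {"operation": "5 x 5", "result": 25}
print(second, f"{t2:.3f}s")  # same body, typically faster when served from cache
```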