Overview


The HTTP Response Cache policy lets API Proxies cache recurring and/or static HTTP responses to reduce latency, response times, and load on the proxied server. By default, cached HTTP responses are keyed on the proxy's URL and request method; you can additionally include specified request header keys and query string keys in the cache key. Once the policy is configured, cached responses are returned to HTTP clients instead of fetching the resource from the proxied server.
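
To make the default behavior concrete, here is a minimal sketch (our own illustration, not the product's implementation) of a TTL cache keyed on the request method and URL, mirroring the policy's default cache key:

```python
import time

# Illustrative sketch only: a minimal TTL response cache keyed on
# (method, URL), mirroring the policy's default cache-key behavior.
# The class and method names are ours, not part of the product.
class ResponseCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # (method, url) -> (expiry_time, response)

    def get(self, method, url):
        entry = self._store.get((method, url))
        if entry is None:
            return None            # cache miss
        expiry, response = entry
        if time.monotonic() >= expiry:
            del self._store[(method, url)]
            return None            # expired entries are never served
        return response

    def put(self, method, url, response):
        self._store[(method, url)] = (time.monotonic() + self.ttl, response)

cache = ResponseCache(ttl_seconds=3600)  # e.g. a 1-hour cache interval
cache.put("GET", "https://example.com/api/orders", "cached body")
print(cache.get("GET", "https://example.com/api/orders"))   # served from cache
print(cache.get("POST", "https://example.com/api/orders"))  # different method -> miss
```

Note that a request with the same URL but a different method produces a different key, so it is treated as a separate cache entry.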

...

  • Because API Versions use endpoints generated in the SnapLogic IIP, the HTTP Response Cache policy supports Proxy endpoints only by design. In the UI, this policy does not appear in the API Policy Manager's API and Version Details tabs.

  • The entire HTTP response is cached, including headers.

Policy Execution Order

This policy runs after every request and response policy.

Known Issues

Both the HTTP Response Cache and HTTP Retry policies fail to evaluate expressions. As a result, do not set the When this policy should be applied field; doing so can cause issues with the way the policy is applied.

Architecture

Each Snaplex node maintains its own cache, so an incoming request can miss the cache on any node that has not yet stored the response; each node's cache is populated independently as that node serves requests.
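
The per-node behavior can be sketched with a simplified model (an assumption for illustration, not actual Snaplex internals): when the same request is routed round-robin across two nodes, each node misses once before its own cache is warm.

```python
# Simplified model of per-node caching: each node keeps an independent
# cache, so the first request routed to a given node is always a miss,
# even if another node has already cached the same response.
class Node:
    def __init__(self, name):
        self.name = name
        self.cache = {}

    def handle(self, key, fetch):
        if key in self.cache:
            return ("hit", self.cache[key])
        body = fetch()             # call the proxied (upstream) server
        self.cache[key] = body
        return ("miss", body)

nodes = [Node("node-a"), Node("node-b")]
fetch = lambda: "upstream response"

# Route the same request round-robin across two nodes:
results = [nodes[i % 2].handle(("GET", "/orders"), fetch) for i in range(4)]
print([status for status, _ in results])  # ['miss', 'miss', 'hit', 'hit']
```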

...

The policy supports incorporating headers and query parameters as part of the cache key (they are used to build the key, not to look it up separately). Including header and query parameter values in the generated cache key ensures that distinct requests produce unique cache entries as needed.

Usage Guidelines

  • Each policy can store up to 100 entries in its cache. When the cache is full, the policy evicts the least used entry to make room for a new one.

  • Expired cache entries cannot be accessed, and new entries overwrite existing ones. In the default configuration, the maximum payload size for a cached HTTP response is 85.83 MB.

These properties are configurable through Feature Flags. Refer to your CSM to modify these property settings.
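
The 100-entry limit described above can be sketched as a least-recently-used (LRU) cache (an assumption: "least used" is modeled here as least *recently* used, and the cap of 100 is the documented default):

```python
from collections import OrderedDict

# Sketch of the 100-entry eviction behavior: an LRU cache that drops
# the least recently used entry once the cap is exceeded.
class LRUCache:
    def __init__(self, max_entries=100):
        self.max_entries = max_entries
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None
        self._store.move_to_end(key)         # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.max_entries:
            self._store.popitem(last=False)  # evict least recently used

cache = LRUCache(max_entries=100)
for i in range(101):                         # the 101st entry forces an eviction
    cache.put(f"/resource/{i}", f"body {i}")
print(cache.get("/resource/0"))   # None: evicted as least recently used
print(cache.get("/resource/1"))   # "body 1": still cached
```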

Limitations

  • Each cached response is limited to 85 MB. The policy always returns the response, but any data over the limit is dropped, leaving the payload incomplete without notice.

  • A response cache is not effective for POST and PUT HTTP methods, because these operations are meant to alter the state of data and therefore should not be cached.

Settings

| Parameter Name | Description | Default Value |
| --- | --- | --- |
| Label | Required. The name for the API policy. | HTTP Response Cache |
| When this policy should be applied | An expression-enabled field that determines the condition to be fulfilled for the API policy to execute. For example, if the value in this field is request.method == "POST", the API policy is executed only if the request method is a POST. | N/A |
| Cache Interval | The time period that a cache entry is kept before it is evicted. | 1 |
| Time Unit | The time unit for the Cache Interval value. | Hour |
| Use HTTP Request Headers to Create Cache Keys | Enables use of all header values as part of the generated cache key. | Unselected |
| Use HTTP Request Query Parameter to Create Cache Keys | Enables use of all query parameter values as part of the generated cache key. | Unselected |
| Status | Specifies whether the API policy is enabled or disabled. | Enabled |

Best Practices

  • Do not include headers or query string parameters that carry a unique value per request, such as a timestamp, in your cache key; if you do, the key never repeats and no request is ever served from the cache. You can remove such headers with the Request Transformer policy, or include headers and query string parameters in the cache key only when they are needed to distinguish responses.

  • To invalidate cached data, use the Invalidate Response Cache API, which enables you to keep cached data up to date with the latest changes. You can define caching rules for a specific API endpoint, specifying parameters such as cache duration, cache key generation, and cache storage. The cache invalidation mechanism monitors changes in the data source and updates the cache as configured to ensure data consistency.
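
Conceptually, invalidation removes matching entries from the cache's key space. The sketch below is our own illustration of that idea only; the key shape and prefix-matching rule here are hypothetical and are not the Invalidate Response Cache API's actual contract:

```python
# Conceptual sketch of cache-side invalidation (hypothetical key shape:
# (method, path) tuples; not the real Invalidate Response Cache API).
def invalidate(cache, path_prefix):
    """Drop every cached entry whose (method, path) key matches the prefix."""
    stale = [key for key in cache if key[1].startswith(path_prefix)]
    for key in stale:
        del cache[key]
    return len(stale)

cache = {
    ("GET", "/orders/1"): "body 1",
    ("GET", "/orders/2"): "body 2",
    ("GET", "/customers/7"): "body 7",
}
removed = invalidate(cache, "/orders")
print(removed, sorted(cache))  # 2 [('GET', '/customers/7')]
```

Subsequent requests for the invalidated paths then miss the cache and are fetched fresh from the proxied server.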

Example

To demonstrate the advantage of caching recurring and/or static HTTP responses, the following proxy endpoint GET request is provided as an example.

...

When future requests are made to the proxy endpoint for Request 1, Request 2, and Request 3, the HTTP Response Cache policy can serve them with the payloads from Response 1, Response 2, and Response 3 instead of querying the upstream server, freeing up compute resources for other operations.

...