...

Info

To customize the amount of cache stored in memory, consult your CSM.

  • Because API Versions use endpoints generated in the SnapLogic IIP, the HTTP Response Cache policy supports only Proxy endpoints by design. In the UI, the API Policy Manager's API and Version Details tabs do not display this policy.

  • The entire HTTP response is cached, including headers.

Policy Execution Order

This policy runs after every request and response policy.

...

Parameter Name: Label
Description: Required. The name of the API policy.
Default Value: HTTP Response Cache

Parameter Name: When this policy should be applied
Description: An expression-enabled field that determines the condition that must be fulfilled for the API policy to execute. For example, if the value in this field is request.method == "POST", the API policy executes only if the request method is POST.
Default Value: N/A

Parameter Name: Cache Interval
Description: The length of time a cached response is retained before it is evicted.
Default Value: 1

Parameter Name: Time Unit
Description: The unit of time for the Cache Interval value.
Default Value: Hour

Parameter Name: Use HTTP Request Headers to Create Cache Keys
Description: Includes all HTTP request header values as part of the generated cache key.
Default Value: Unselected

Parameter Name: Use HTTP Request Query Parameter to Create Cache Keys
Description: Includes all query parameter values as part of the generated cache key.
Default Value: Unselected

Parameter Name: Status
Description: Specifies whether the API policy is enabled or disabled.
Default Value: Enabled
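
The two cache key options above determine which parts of an incoming request contribute to the generated cache key. The policy's internal key format is not documented here; the following Python sketch is purely illustrative (the build_cache_key helper and the key layout are assumptions) and only shows how enabling query parameters or headers changes which requests share a cached response.

Code Block
Cache key sketch (illustrative)
from urllib.parse import urlencode

def build_cache_key(method, path, query_params=None, headers=None,
                    use_query=False, use_headers=False):
    # Hypothetical helper: the method and path always contribute to the key;
    # query parameters and headers are added only when the matching policy
    # option is selected.
    parts = [method.upper(), path]
    if use_query and query_params:
        parts.append(urlencode(sorted(query_params.items())))
    if use_headers and headers:
        parts.append(urlencode(sorted((k.lower(), v) for k, v in headers.items())))
    return "|".join(parts)

# With "Use HTTP Request Query Parameter to Create Cache Keys" selected,
# different query parameter values produce different cache entries:
key_1 = build_cache_key("GET", "/gateway/proxy/multiply",
                        query_params={"intA": "5", "intB": "5"}, use_query=True)
key_2 = build_cache_key("GET", "/gateway/proxy/multiply",
                        query_params={"intA": "2", "intB": "2"}, use_query=True)
assert key_1 != key_2  # each request maps to its own cached response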


Best Practices

Do not use headers or query string parameters that carry a unique value per request, such as a timestamp, as part of your cache key; otherwise, the cache never fulfills a request because no cache key is ever reused. You can optionally remove those headers with the Request Transformer policy, or avoid using headers and query string parameters as part of the cache key whenever a unique value is present per request.
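
As an illustration of this best practice, the sketch below (an assumed key format and a hypothetical X-Request-Timestamp header, not SnapLogic code) shows why a per-request timestamp in the cache key means no two requests ever share a key, so a cached response is never reused.

Code Block
Timestamp in the cache key (illustrative)
import time

def cache_key_with_headers(method, path, headers):
    # Hypothetical key format that folds every header value into the key.
    header_parts = [f"{k.lower()}={v}" for k, v in sorted(headers.items())]
    return "|".join([method, path] + header_parts)

# Two otherwise identical requests carrying a unique timestamp header
# (X-Request-Timestamp is a hypothetical header name):
key_a = cache_key_with_headers("GET", "/gateway/proxy/multiply?intA=5&intB=5",
                               {"X-Request-Timestamp": str(time.time_ns())})
time.sleep(0.001)
key_b = cache_key_with_headers("GET", "/gateway/proxy/multiply?intA=5&intB=5",
                               {"X-Request-Timestamp": str(time.time_ns())})
assert key_a != key_b  # the keys never match, so the cache never serves a hit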

Example

To demonstrate the advantage of caching recurring or static HTTP responses, the following proxy endpoint GET request is provided as an example.

This proxy endpoint calls a calculator service that multiplies two integers, and the policy can be configured to cache responses whose query string keys are intA and intB.

Code Block
GET
/gateway/proxy/multiply?intA=[valueA]&intB=[valueB]

...

With this configuration, HTTP responses are cached for HTTP requests containing the query string keys intA and intB.

For example, the following GET requests each result in a separate cache entry:

Code Block
GET Request 1
/gateway/proxy/multiply?intA=5&intB=5

GET Request 2
/gateway/proxy/multiply?intA=2&intB=2

GET Request 3
/gateway/proxy/multiply?intA=3&intB=1

...

The cached HTTP response for each request is:

Code Block
Response 1 
{
    "operation": "5 x 5", 
    "result": 25
}

Response 2 
{
    "operation": "2 x 2", 
    "result": 4
}

Response 3
{
    "operation": "3 x 1", 
    "result": 3
}

If intC and intD were additional query string parameters, responses for them would not be cached because those parameters are not configured in the policy.

The entire HTTP response is cached, including headers.

When future requests are made to the proxy endpoint for Request 1, Request 2, or Request 3, the HTTP Response Cache policy serves the cached payloads from Response 1, Response 2, and Response 3 instead of querying the upstream server, freeing up computational resources for other operations.
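
A minimal sketch of this behavior, assuming an in-memory dictionary keyed on the intA and intB values and a stand-in function for the upstream calculator service (neither reflects the policy's actual implementation):

Code Block
Serving repeat requests from the cache (illustrative)
cache = {}  # in-memory store: (intA, intB) -> cached response payload

def upstream_multiply(int_a, int_b):
    # Stand-in for the upstream calculator service.
    return {"operation": f"{int_a} x {int_b}", "result": int_a * int_b}

def handle_request(int_a, int_b):
    key = (int_a, int_b)
    if key in cache:
        return cache[key], "served from cache"
    payload = upstream_multiply(int_a, int_b)  # only reached on a cache miss
    cache[key] = payload
    return payload, "served from upstream"

print(handle_request(5, 5))  # first call: queries the upstream service
print(handle_request(5, 5))  # repeat call: served from the cache
print(handle_request(2, 2))  # different parameters: its own cache entry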

Furthermore, the policy can be configured to evict cached responses after a specified timeframe. If cached responses are expected to change on a 40-hour cycle, set a Cache Interval of 40 and a Time Unit of HOURS so that the policy evicts previous HTTP responses and serves updated cached responses.

For example, the following GET request for billable work hours per week demonstrates a Cache Interval of 40 and a Time Unit of HOURS:

Code Block
GET
/gateway/proxy/last-weeks-billable-hours?user='admin'

The cached response is:

Code Block
Response 
{
    "week_of":  "01-01-2024",
    "department":  "Engineering", 
    "billable_hours":  3000, 
    "user:  "admin"
}

The policy continues serving the cached HTTP response from the previous example until it is evicted per the policy configuration.

GET requests to this proxy endpoint during the week of 01-08-2024 occur after the specified cache interval has elapsed, so the policy refreshes the cache and serves payloads like the following for the next interval:

Code Block
Response 
{
    "week_of":  "01-08-2024",
    "department":  "Engineering", 
    "billable_hours":  4500, 
    "user:  "admin"
}
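
The eviction behavior in this example can be sketched as a simple time-to-live check with a 40-hour interval. The storage layout and helper below are assumptions for illustration, not the policy's implementation.

Code Block
Time-based eviction (illustrative)
from datetime import datetime, timedelta

CACHE_INTERVAL = timedelta(hours=40)  # Cache Interval = 40, Time Unit = HOURS
cache = {}                            # key -> (expires_at, cached payload)

def get_response(key, fetch_upstream, now=None):
    # Serve the cached payload while it is fresh; otherwise evict it,
    # query the upstream service, and cache the new payload.
    now = now or datetime.now()
    entry = cache.get(key)
    if entry and now < entry[0]:
        return entry[1]
    payload = fetch_upstream()
    cache[key] = (now + CACHE_INTERVAL, payload)
    return payload

# During the week of 01-01-2024, repeat requests are served from the cache.
# Once 40 hours elapse, the next request fetches and caches the updated
# 01-08-2024 payload for the following interval.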