Version: ACE 5

How to optimize latency

ACE is middleware that typically composes or aggregates data from different sources (hence the name: composition engine).

A single call to ACE can therefore involve calls to multiple APIs that are accessible over the internet.

While a service being accessed may be able to handle hundreds of concurrent requests, it may never respond faster than 500ms due to network latency.

ACE scales together with the service: it can handle hundreds of parallel requests, each returning with a minimal overhead of only a couple of milliseconds.

When multiple such services are called in sequence, however, the user experience can become suboptimal, because the latencies stack up and the response from ACE will exceed 1000ms, as the sketch below illustrates.
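To make the effect concrete, here is a minimal sketch in plain TypeScript (not ACE configuration; the callService helper and the 500ms figure are illustrative assumptions): three independent 500ms calls take roughly 1500ms when awaited one after another, but only about 500ms when started together.

Sequential vs. parallel calls (TypeScript sketch)
// Simulate a downstream service that always needs ~500ms to respond.
const callService = (name: string): Promise<string> =>
  new Promise((resolve) => setTimeout(() => resolve(name), 500));

async function sequential(): Promise<void> {
  const started = Date.now();
  await callService("orders");
  await callService("quotes");
  await callService("payments");
  console.log(`sequential: ${Date.now() - started}ms`); // ~1500ms, latencies stack up
}

async function parallel(): Promise<void> {
  const started = Date.now();
  await Promise.all([
    callService("orders"),
    callService("quotes"),
    callService("payments"),
  ]);
  console.log(`parallel: ${Date.now() - started}ms`); // ~500ms, the slowest call dominates
}

sequential().then(parallel);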

Problem

The getCustomerOrdersQuotesPayments flow has to call three APIs to collect the information needed for the customer information screen.

Each of these calls has a latency of 500ms.

note

For demonstration purposes, the REST steps call the Mockbin delay API.

getCustomerOrdersQuotesPayments Flow
tags: []
steps:
  - stepType: rest-new
    config:
      endpoint:
        url: http://mockbin/delay/500
        queryParams: {}
        headers: {}
      restRequest: JSON
      expectedResponseLocation: body
      oAuthConfig: {}
      json: {}
      targetPath: orders
      disableHeadersInKey: false
    description: Load customer Orders
    name: REST Http
    condition: ""
  - stepType: rest-new
    config:
      endpoint:
        url: http://mockbin/delay/500
        queryParams: {}
        headers: {}
      restRequest: JSON
      expectedResponseLocation: body
      oAuthConfig: {}
      json: {}
      targetPath: quotes
      disableHeadersInKey: false
    description: Load customer Quotes
    name: REST Http
    condition: ""
  - stepType: rest-new
    config:
      endpoint:
        url: http://mockbin/delay/500
        queryParams: {}
        headers: {}
      restRequest: JSON
      expectedResponseLocation: body
      oAuthConfig: {}
      json: {}
      targetPath: payments
      disableHeadersInKey: false
    description: Load customer Payments
    name: REST Http
    condition: ""
sampleData: {}

Flow executed in 1524ms

Test run output
{
  "doc": {
    "orders": {
      "delay": 500
    },
    "quotes": {
      "delay": 500
    },
    "payments": {
      "delay": 500
    }
  },
  "errors": [],
  "performance": {
    "steps": [
      {
        "step": "rest-new",
        "executionTime": 507
      },
      {
        "step": "rest-new",
        "executionTime": 509
      },
      {
        "step": "rest-new",
        "executionTime": 507
      }
    ],
    "executionTimeOfFlow": 1523,
    "timeMetric": "ms"
  }
}

This is a suboptimal user experience: ACE responds only after 1523ms, because the three step execution times (507 + 509 + 507ms) add up. It would be much better if the composition did not incur this latency.

Solution

The getCustomerOrdersQuotesPayments flow has to be split into four flows: a parent flow that uses a mixed-flow step with async: true to run the three child flows in parallel, and one child flow per API call.

getCustomerOrdersQuotesPaymentsNoLatency Flow
tags: []
steps:
  - stepType: mixedflow
    config:
      mode: flow
      async: true
      flowIds:
        - processAsArray: false
          flowId: getCustomerOrders
        - processAsArray: false
          flowId: getCustomerQuotes
        - processAsArray: false
          flowId: getCustomerPayments
      concurrency: 0
    description: Load customer Orders, Quotes and Payments in parallel
    name: Mixed-flow
    condition: ""
sampleData: {}
getCustomerOrders Flow
tags: []
steps:
  - stepType: rest-new
    config:
      endpoint:
        url: http://mockbin/delay/500
        queryParams: {}
        headers: {}
      restRequest: JSON
      expectedResponseLocation: body
      oAuthConfig: {}
      json: {}
      targetPath: orders
      disableHeadersInKey: false
    name: REST Http
    description: ""
    condition: ""
sampleData: {}
getCustomerQuotes Flow
tags: []
steps:
  - stepType: rest-new
    config:
      endpoint:
        url: http://mockbin/delay/500
        queryParams: {}
        headers: {}
      restRequest: JSON
      expectedResponseLocation: body
      oAuthConfig: {}
      json: {}
      targetPath: quotes
      disableHeadersInKey: false
    name: REST Http
    description: ""
    condition: ""
sampleData: {}
getCustomerPayments Flow
tags: []
steps:
  - stepType: rest-new
    config:
      endpoint:
        url: http://mockbin/delay/500
        queryParams: {}
        headers: {}
      restRequest: JSON
      expectedResponseLocation: body
      oAuthConfig: {}
      json: {}
      targetPath: payments
      disableHeadersInKey: false
    name: REST Http
    description: ""
    condition: ""
sampleData: {}
Test run output
{
  "doc": {
    "orders": {
      "delay": 500
    },
    "quotes": {
      "delay": 500
    },
    "payments": {
      "delay": 500
    }
  },
  "errors": [],
  "performance": {
    "steps": [
      {
        "step": "mixedflow",
        "executionTime": 574
      }
    ],
    "executionTimeOfFlow": 574,
    "timeMetric": "ms"
  }
}

As you can see in the test run output, the user can now load the same information in just 574ms: the flow takes roughly as long as the slowest of the three calls instead of their sum.
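As a rough model (a sketch, not ACE code; the orchestration overhead value is an assumption inferred from the two test runs above), the sequential flow costs the sum of the step times, while the parallel flow costs only the maximum plus a small overhead.

Latency model (TypeScript sketch)
// Step execution times reported in the test runs above, in milliseconds.
const stepTimes: number[] = [507, 509, 507];

// Sequential composition: latencies add up.
const sequentialTotal = stepTimes.reduce((sum, t) => sum + t, 0); // 1523ms

// Parallel composition: only the slowest call matters, plus orchestration overhead.
// The overhead figure is an assumption, inferred from 574ms - 509ms in the second run.
const orchestrationOverhead = 65;
const parallelTotal = Math.max(...stepTimes) + orchestrationOverhead; // ~574ms

console.log({ sequentialTotal, parallelTotal });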