How to optimize latency
ACE is middleware that typically composes or aggregates data from different sources (hence the name "composition engine").
A single call to ACE can involve calls to multiple APIs accessible over the internet.
While a service being accessed may handle hundreds of concurrent requests, network latency can mean it never responds faster than 500ms.
ACE scales together with the service: it can handle hundreds of parallel requests, each returning with a minimal overhead of a couple of milliseconds.
When multiple such services are called in sequence, however, the latencies stack up, the response from ACE can exceed 1000ms, and the user experience suffers.
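The effect can be illustrated with a small simulation (a hypothetical sketch, not ACE code; the service names and delays below are illustrative): three calls with a fixed delay take roughly the sum of the delays when run in sequence, but roughly the longest single delay when run in parallel.

```python
import asyncio
import time

DELAY = 0.1  # simulated per-call latency (scaled down from 500ms)

async def call_service(name: str) -> str:
    # Stand-in for a remote API call dominated by network latency.
    await asyncio.sleep(DELAY)
    return name

async def sequential() -> float:
    # Calls stack up: total time is roughly the sum of the delays.
    start = time.perf_counter()
    for name in ("orders", "quotes", "payments"):
        await call_service(name)
    return time.perf_counter() - start

async def parallel() -> float:
    # Calls overlap: total time is roughly the slowest single delay.
    start = time.perf_counter()
    await asyncio.gather(*(call_service(n) for n in ("orders", "quotes", "payments")))
    return time.perf_counter() - start

seq = asyncio.run(sequential())  # ~3 x DELAY
par = asyncio.run(parallel())    # ~1 x DELAY
print(f"sequential: {seq:.2f}s, parallel: {par:.2f}s")
```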
Problem
The flow getCustomerOrdersQuotesPayments
has to call three APIs to collect the information needed for the customer information screen.
Each of these calls has a latency of 500ms
.
For demonstration purposes, the REST steps call the Mockbin delay API
tags: []
steps:
- stepType: rest-new
config:
endpoint:
url: http://mockbin/delay/500
queryParams: {}
headers: {}
restRequest: JSON
expectedResponseLocation: body
oAuthConfig: {}
json: {}
targetPath: orders
disableHeadersInKey: false
description: Load customer Orders
name: REST Http
condition: ""
- stepType: rest-new
config:
endpoint:
url: http://mockbin/delay/500
queryParams: {}
headers: {}
restRequest: JSON
expectedResponseLocation: body
oAuthConfig: {}
json: {}
targetPath: quotes
disableHeadersInKey: false
description: Load customer Quotes
name: REST Http
condition: ""
- stepType: rest-new
config:
endpoint:
url: http://mockbin/delay/500
queryParams: {}
headers: {}
restRequest: JSON
expectedResponseLocation: body
oAuthConfig: {}
json: {}
targetPath: payments
disableHeadersInKey: false
description: Load customer Payments
name: REST Http
condition: ""
sampleData: {}
Flow executed in 1524ms
{
"doc": {
"orders": {
"delay": 500
},
"quotes": {
"delay": 500
},
"payments": {
"delay": 500
}
},
"errors": [],
"performance": {
"steps": [
{
"step": "rest-new",
"executionTime": 507
},
{
"step": "rest-new",
"executionTime": 509
},
{
"step": "rest-new",
"executionTime": 507
}
],
"executionTimeOfFlow": 1523,
"timeMetric": "ms"
}
}
This is a suboptimal user experience (ACE responds after 1523ms
). It would be much better if the composition did not incur such latency.
Solution
The getCustomerOrdersQuotesPayments flow has to be split into four flows: a parent flow that uses a mixed-flow step to invoke the three sub-flows (getCustomerOrders, getCustomerQuotes and getCustomerPayments) in parallel.
tags: []
steps:
- stepType: mixedflow
config:
mode: flow
async: true
flowIds:
- processAsArray: false
flowId: getCustomerOrders
- processAsArray: false
flowId: getCustomerQuotes
- processAsArray: false
flowId: getCustomerPayments
concurrency: 0
    description: Load customer Orders, Quotes and Payments in parallel
name: Mixed-flow
condition: ""
sampleData: {}
tags: []
steps:
- stepType: rest-new
config:
endpoint:
url: http://mockbin/delay/500
queryParams: {}
headers: {}
restRequest: JSON
expectedResponseLocation: body
oAuthConfig: {}
json: {}
targetPath: orders
disableHeadersInKey: false
name: REST Http
description: ""
condition: ""
sampleData: {}
tags: []
steps:
- stepType: rest-new
config:
endpoint:
url: http://mockbin/delay/500
queryParams: {}
headers: {}
restRequest: JSON
expectedResponseLocation: body
oAuthConfig: {}
json: {}
targetPath: quotes
disableHeadersInKey: false
name: REST Http
description: ""
condition: ""
sampleData: {}
tags: []
steps:
- stepType: rest-new
config:
endpoint:
url: http://mockbin/delay/500
queryParams: {}
headers: {}
restRequest: JSON
expectedResponseLocation: body
oAuthConfig: {}
json: {}
targetPath: payments
disableHeadersInKey: false
name: REST Http
description: ""
condition: ""
sampleData: {}
{
"doc": {
"orders": {
"delay": 500
},
    "quotes": {
"delay": 500
},
"payments": {
"delay": 500
}
},
"errors": [],
"performance": {
"steps": [
{
"step": "mixedflow",
"executionTime": 574
}
],
"executionTimeOfFlow": 574,
"timeMetric": "ms"
}
}
As the test run output shows, the user can now load the same information in just 574ms
.
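The measured numbers line up with a simple back-of-the-envelope model: sequential composition costs roughly the sum of the per-step execution times, while parallel composition costs roughly the slowest step plus a small orchestration overhead. The step times below are taken from the sequential test run above.

```python
# Per-step execution times (ms) from the sequential run's performance output.
latencies_ms = [507, 509, 507]

sequential_ms = sum(latencies_ms)  # lower bound for calling in sequence
parallel_ms = max(latencies_ms)    # lower bound for calling in parallel

print(f"sequential lower bound: {sequential_ms}ms")  # measured: 1523ms
print(f"parallel lower bound:   {parallel_ms}ms")    # measured: 574ms (incl. overhead)
```

The gap between the 509ms lower bound and the measured 574ms is the mixed-flow orchestration overhead.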