Version: ACE 4

How to optimize latency

ACE is middleware that typically composes or aggregates data from different sources (hence the name: composition engine).

A single call to ACE can therefore involve calls to multiple APIs that are accessible over the internet.

While the service being accessed may handle hundreds of concurrent requests, it may never respond in less than 500ms because of network latency.

ACE scales together with the service: it can handle hundreds of parallel requests, all of which return with a minimal overhead of a couple of milliseconds per request.

When multiple such services are called in sequence, the user experience can become suboptimal, because the latencies stack up and the response from ACE will exceed 1000ms.
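
The sketch below (plain TypeScript rather than ACE configuration; the Mockbin delay URL is only a stand-in for any service with roughly 500ms latency) illustrates why sequential calls stack latency while parallel calls complete in roughly the time of the slowest single call.

Sequential vs. parallel calls (illustration)
async function loadSequentially(): Promise<void> {
  const start = Date.now();
  // Each await waits for the previous call to finish,
  // so the latencies add up: ~500ms + ~500ms + ~500ms ≈ 1500ms.
  await fetch("http://mockbin/delay/500"); // orders
  await fetch("http://mockbin/delay/500"); // quotes
  await fetch("http://mockbin/delay/500"); // payments
  console.log(`sequential: ${Date.now() - start}ms`); // ≈ 1500ms
}

async function loadInParallel(): Promise<void> {
  const start = Date.now();
  // All three requests start at once; the total time is roughly
  // the slowest single call, not the sum of all calls.
  await Promise.all([
    fetch("http://mockbin/delay/500"), // orders
    fetch("http://mockbin/delay/500"), // quotes
    fetch("http://mockbin/delay/500"), // payments
  ]);
  console.log(`parallel: ${Date.now() - start}ms`); // ≈ 500ms
}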

Problem

The getCustomerOrdersQuotesPayments flow has to call three APIs to collect the information necessary for the customer information screen.

Each of these calls has a latency of 500ms.

note

For demonstration purposes, the REST steps call the Mockbin delay API.

getCustomerOrdersQuotesPayments Flow
{
  "steps": [
    {
      "stepType": "rest-new",
      "color": "rgb(247,225,211)",
      "displayName": "REST Http",
      "isSelected": false,
      "config": {
        "endpoint": {
          "url": "http://mockbin/delay/500"
        },
        "queryParams": {},
        "headers": {},
        "restRequest": "JSON",
        "expectedResponseLocation": "body",
        "oAuthConfig": {},
        "json": {},
        "targetPath": "orders"
      },
      "description": "Load customer Orders"
    },
    {
      "stepType": "rest-new",
      "color": "rgb(247,225,211)",
      "displayName": "REST Http",
      "isSelected": true,
      "config": {
        "endpoint": {
          "url": "http://mockbin/delay/500"
        },
        "queryParams": {},
        "headers": {},
        "restRequest": "JSON",
        "expectedResponseLocation": "body",
        "oAuthConfig": {},
        "json": {},
        "targetPath": "quotes"
      },
      "description": "Load customer Quotes"
    },
    {
      "stepType": "rest-new",
      "color": "rgb(247,225,211)",
      "displayName": "REST Http",
      "isSelected": false,
      "config": {
        "endpoint": {
          "url": "http://mockbin/delay/500"
        },
        "queryParams": {},
        "headers": {},
        "restRequest": "JSON",
        "expectedResponseLocation": "body",
        "oAuthConfig": {},
        "json": {},
        "targetPath": "payments"
      },
      "description": "Load customer Payments"
    }
  ],
  "name": "getCustomerOrdersQuotesPayments"
}

Flow executed in 1524ms

Test run output
{
  "doc": {
    "orders": {
      "delay": 500
    },
    "quotes": {
      "delay": 500
    },
    "payments": {
      "delay": 500
    }
  },
  "errors": [],
  "performance": {
    "steps": [
      {
        "step": "rest-new",
        "executionTime": 507
      },
      {
        "step": "rest-new",
        "executionTime": 509
      },
      {
        "step": "rest-new",
        "executionTime": 507
      }
    ],
    "executionTimeOfFlow": 1523,
    "timeMetric": "ms"
  }
}

This is a suboptimal user experience: ACE responds only after 1523ms, roughly the sum of the three sequential calls (507ms + 509ms + 507ms). It would be much better if the composition did not incur this latency.

Solution

The getCustomerOrdersQuotesPayments flow has to be split into four flows: a parent flow, getCustomerOrdersQuotesPaymentsNoLatency, whose single mixed-flow step invokes the three sub-flows getCustomerOrders, getCustomerQuotes and getCustomerPayments in parallel (a conceptual sketch of this parallel execution follows the flow definitions below).

getCustomerOrdersQuotesPaymentsNoLatency Flow
{
  "steps": [
    {
      "stepType": "mixedflow",
      "color": "rgb(189,208,196)",
      "displayName": "Mixed-flow",
      "isSelected": true,
      "config": {
        "mode": "flow",
        "async": true,
        "flowIds": [
          {
            "processAsArray": false,
            "flowId": "getCustomerOrders"
          },
          {
            "processAsArray": false,
            "flowId": "getCustomerQuotes"
          },
          {
            "processAsArray": false,
            "flowId": "getCustomerPayments"
          }
        ],
        "concurrency": 0
      },
      "description": "Load customer Orders, Quotes and Payments in parallel"
    }
  ],
  "name": "getCustomerOrdersQuotesPaymentsNoLatency"
}
getCustomerOrders Flow
{
  "steps": [
    {
      "stepType": "rest-new",
      "color": "rgb(247,225,211)",
      "displayName": "REST Http",
      "isSelected": true,
      "config": {
        "endpoint": {
          "url": "http://mockbin/delay/500"
        },
        "queryParams": {},
        "headers": {},
        "restRequest": "JSON",
        "expectedResponseLocation": "body",
        "oAuthConfig": {},
        "json": {},
        "targetPath": "orders"
      }
    }
  ],
  "name": "getCustomerOrders"
}
getCustomerQuotes Flow
{
  "steps": [
    {
      "stepType": "rest-new",
      "color": "rgb(247,225,211)",
      "displayName": "REST Http",
      "isSelected": true,
      "config": {
        "endpoint": {
          "url": "http://mockbin/delay/500"
        },
        "queryParams": {},
        "headers": {},
        "restRequest": "JSON",
        "expectedResponseLocation": "body",
        "oAuthConfig": {},
        "json": {},
        "targetPath": "quotes"
      }
    }
  ],
  "name": "getCustomerQuotes"
}
getCustomerPayments Flow
{
  "steps": [
    {
      "stepType": "rest-new",
      "color": "rgb(247,225,211)",
      "displayName": "REST Http",
      "isSelected": true,
      "config": {
        "endpoint": {
          "url": "http://mockbin/delay/500"
        },
        "queryParams": {},
        "headers": {},
        "restRequest": "JSON",
        "expectedResponseLocation": "body",
        "oAuthConfig": {},
        "json": {},
        "targetPath": "payments"
      }
    }
  ],
  "name": "getCustomerPayments"
}
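
To make the behaviour of the mixed-flow step more concrete, the following conceptual sketch (TypeScript, not the ACE implementation; runFlow is a hypothetical helper standing in for sub-flow execution) shows the assumption behind "async": true: all referenced sub-flows start at the same time and their partial results are merged into one response document.

Parallel sub-flow execution (conceptual sketch)
type Doc = Record<string, unknown>;

// Hypothetical helper: each sub-flow performs its ~500ms REST call and
// returns its part of the document, e.g. { orders: { delay: 500 } }.
async function runFlow(flowId: string): Promise<Doc> {
  const response = await fetch("http://mockbin/delay/500");
  const key = flowId.replace("getCustomer", "").toLowerCase(); // "orders", "quotes", "payments"
  return { [key]: await response.json() };
}

async function getCustomerOrdersQuotesPaymentsNoLatency(): Promise<Doc> {
  const flowIds = ["getCustomerOrders", "getCustomerQuotes", "getCustomerPayments"];
  // Start all sub-flows at once, so the flow finishes in roughly the time
  // of the slowest single call (~500ms) instead of the sum (~1500ms).
  const results = await Promise.all(flowIds.map(runFlow));
  // Merge the partial documents into a single response document.
  return Object.assign({}, ...results);
}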
Test run output
{
  "doc": {
    "orders": {
      "delay": 500
    },
    "quotes": {
      "delay": 500
    },
    "payments": {
      "delay": 500
    }
  },
  "errors": [],
  "performance": {
    "steps": [
      {
        "step": "mixedflow",
        "executionTime": 574
      }
    ],
    "executionTimeOfFlow": 574,
    "timeMetric": "ms"
  }
}

As you can see in the test run output, the user can now load the same information in just 574ms: roughly the latency of the slowest single call plus a small overhead, because the three sub-flows execute in parallel instead of in sequence.