Jessica Edwards
Jessica is the Director of Content Marketing at Cockroach Labs, the makers of CockroachDB. She has a passion for open source and has been marketing to developers for 10+ years.

As Facebook, Google, and new-age fintech apps like Betterment and Robinhood have driven consumers to expect feature-rich applications, every business needs to be concerned with performance. Performance remains a top priority for financial services organizations, but one dimension of it demands particular attention in banking: keeping latency low. Cloud native applications offer banks a significant advantage here, because they can support very high volumes of transactions while keeping latency low.

Traditionally, the main drivers for reducing latency have been to speed up transactions and increase revenue, or to minimize potential lost revenue. In its 2012 Gospel of Speed, Google revealed that a 400ms delay in generating a search page caused a 0.44% drop in search traffic volume. To this day, every Google search shows how long it took to return results, down to the fraction of a second. In a presentation by a staff engineer, Amazon revealed that every 100ms of latency cost it 1% in sales. There are countless other examples of companies losing revenue to delays or downtime.

In financial services, low latency is most often associated with high-frequency trading: trading that is entirely automated and optimized to take advantage of changing market prices. These applications make millions of decisions per millisecond, so receiving data a fraction of a second faster than a competitor's systems can equate to millions of dollars. Few other industries face demands like these, let alone under such strict compliance guidelines and regulations.

But low latency should be a concern for every financial services organization. A recent study found that nearly 90% of business leaders need latency of 10ms or less to ensure their applications' success. Financial services organizations must also consider the effect of latency on new use cases like cryptocurrency, edge computing, artificial intelligence (AI), and machine learning (ML). With low latency, data scientists can make informed business intelligence decisions in real time, and banks can use AI for real-time fraud detection.

Multiregion Deployments and Geo-Partitioning

Most data in a distributed application moves between components over public networks. This means even a perfectly architected application can experience lag if it has to communicate with a database thousands of miles away. Because most banks operate across regions, countries, and even continents, they need to make infrastructure decisions with these dispersed applications and customers in mind. The solution is simple in theory: put the data closer to the application or customer. One way to do that is a multiregion deployment.
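The physics alone illustrates the problem. As a back-of-envelope sketch (the fiber-speed figure and city distances below are rough assumptions for illustration, not measurements), the theoretical floor on round-trip time grows with distance:

```python
# Estimate the minimum round-trip time (RTT) between two sites, assuming
# signals travel through fiber at roughly two-thirds the speed of light.
# Real-world RTTs are higher due to routing, switching, and queuing.

SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in a vacuum, km/s
FIBER_FRACTION = 2 / 3          # typical slowdown inside optical fiber

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time in milliseconds."""
    one_way_seconds = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FRACTION)
    return 2 * one_way_seconds * 1000

# Approximate great-circle distances, for illustration only.
print(f"New York <-> London: {min_rtt_ms(5_600):.0f} ms floor")   # ~56 ms
print(f"New York <-> Sydney: {min_rtt_ms(16_000):.0f} ms floor")  # ~160 ms
```

A query that crosses an ocean pays that toll on every round trip, before the database does any work at all.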

Consider a U.S. financial software company that was seeking a new database for its customer identity and access management (CIAM) layer; a multiregion deployment was how it achieved both high performance and consistency. The CIAM layer was initially built on Oracle, with multiregion replication handled by GoldenGate. The company soon discovered, however, that this configuration provided neither the speed nor the always-available login experience it needed: customers would experience a lag in authentication after creating an account, resulting in a poor user experience. The team decided to deploy CockroachDB across three AWS regions in the U.S., which brought resiliency by replicating data and distributing the replicas to maximize geodiversity.

However, multiregion deployments can be complicated for organizations running distributed databases, because managing state across a set of machines is never straightforward. Organizations need to determine whether the benefits outweigh the costs, bearing in mind that staying in a single region is detrimental to both speed and availability. This is where geo-partitioning of data comes in. Geo-partitioning provides row-level replication control, meaning organizations can pin data to a specific location.
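As a minimal sketch of what that row-level control can look like in practice, the statements below use CockroachDB's multiregion SQL through the PostgreSQL-compatible psycopg2 driver. The connection string, region names, and schema are hypothetical placeholders, and the cluster's nodes are assumed to have been started with matching region localities:

```python
import psycopg2  # CockroachDB speaks the PostgreSQL wire protocol

# Hypothetical DSN; replace with your own cluster's connection string.
conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
conn.autocommit = True

with conn.cursor() as cur:
    # Declare the regions the database spans.
    cur.execute('ALTER DATABASE bank SET PRIMARY REGION "us-east1"')
    cur.execute('ALTER DATABASE bank ADD REGION "us-west1"')
    cur.execute('ALTER DATABASE bank ADD REGION "europe-west1"')

    # A REGIONAL BY ROW table gives every row a home region, recorded in
    # a hidden crdb_region column -- the row-level control described above.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS accounts (
            id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
            balance DECIMAL NOT NULL DEFAULT 0
        ) LOCALITY REGIONAL BY ROW
    """)
```

The examples later in this article build on this hypothetical accounts table.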

A global financial data firm, for example, deployed CockroachDB across four GCP regions and two on-premises data centers as a single hybrid, geo-partitioned deployment in order to reduce latency. The firm had outgrown its expensive and dated Oracle database architecture. It chose CockroachDB to migrate its identity and access management microservice because geo-partitioning provided a way to authenticate entities even when those entities are strongly tied to specific geographic regions.

Geo-partitioning also keeps working when a customer moves or travels, which is crucial for payment applications.
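In the row-level model sketched above, re-homing a customer who has moved can be as simple as updating the row's region column; no schema change or downtime is needed. The table, region name, and ID below continue the earlier hypothetical sketch:

```python
import psycopg2

conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
conn.autocommit = True

customer_id = "a6f2c0de-1111-2222-3333-444455556666"  # hypothetical UUID

with conn.cursor() as cur:
    # Move the customer's row to a new home region; the database shifts
    # the row's replicas toward that region in the background.
    cur.execute(
        "UPDATE accounts SET crdb_region = 'europe-west1' WHERE id = %s",
        (customer_id,),
    )
```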

Don’t Forget Regulations

Beyond the speed complications for banks operating in multiple regions, financial organizations need to consider data regulations.

Data privacy is a hot-button issue, with new laws and regulations coming into effect every year.

At the start of 2020, more than 120 countries had enacted over 200 laws to protect data and consumer privacy. These range from newer state-level mandates, like the California Consumer Privacy Act of 2018 (CCPA), which gives consumers more control over the personal information that businesses collect about them, to sweeping regulations like the European Union's General Data Protection Regulation (GDPR), which covers everything from data collection and sharing to data storage, erasure, and destruction.

For organizations growing a broad regional or global customer base, the most important aspect of these regulations is that they often prohibit storing certain data outside specific boundaries. For example, a U.S. bank with customers in Europe may need to store those customers' data within the EU. This is where geo-partitioning offers a second benefit beyond keeping data close to the application or customer: the ability to pin data to a specific location helps ensure compliance in countries or regions that require data to be stored within their borders.
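To illustrate that pinning, the sketch below gives a European customer's row an explicit EU home region at insert time, continuing the earlier hypothetical schema. (Keeping every replica, not just the row's home, inside the region would additionally require a stricter replica placement policy.)

```python
import psycopg2

conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
conn.autocommit = True

with conn.cursor() as cur:
    # Home this row in an EU region so the customer's data is served
    # from, and primarily stored in, the required jurisdiction.
    cur.execute(
        "INSERT INTO accounts (crdb_region, balance) VALUES ('europe-west1', 0)"
    )
```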

By using a database solution that offers geo-partitioning for multiregion deployments, developers at banks and financial services organizations can designate where data should be stored at the database, table, and row level. With this, organizations can deliver their applications with the lowest possible latency while staying compliant with the latest data protection and privacy regulations.
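To make those three levels concrete, here is one minimal sketch of what each can look like in CockroachDB's multiregion SQL; the database, table, and region names are hypothetical:

```python
import psycopg2

conn = psycopg2.connect("postgresql://root@localhost:26257/bank?sslmode=disable")
conn.autocommit = True

with conn.cursor() as cur:
    # Database level: the default home region for everything in it.
    cur.execute('ALTER DATABASE bank SET PRIMARY REGION "europe-west1"')

    # Table level: pin an entire (hypothetical) table to a single region.
    cur.execute(
        'ALTER TABLE audit_log SET LOCALITY REGIONAL BY TABLE IN "us-east1"'
    )

    # Row level: let each row carry its own home region.
    cur.execute("ALTER TABLE accounts SET LOCALITY REGIONAL BY ROW")
```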

Download the eBook ‘How Financial Service Companies Can Successfully Migrate Critical Applications to the Cloud’ to learn more. 

Feature image via Pixabay.