GPU-Accelerated Databases: Addressing FRTB and Other Performance-at-Scale Challenges in Financial Services
Financial services companies have a growing need for performance-at-scale access to very large datasets (e.g. for FRTB reporting and analysis). In support of this trend, Citihub Digital has completed some initial baselining and comparative analysis of GPU-accelerated database solutions, to determine the maturity and performance potential of these tools.
Graphics Processing Units (GPUs) began life as dedicated chips in gaming PCs and consoles, but their ability to offload and parallelise computational activities from CPUs was soon hijacked by research and analytics teams, eager to repurpose existing compute grids used to run Monte Carlo simulations in industries such as financial services and energy. More recently, the role of GPUs has extended further into the database space, offering the potential to massively parallelise database jobs when run on dedicated GPU hardware.
The business challenge. Capital Markets trading firms today face both ‘interactive data’ and ‘Big Data’ challenges around market risk reporting, compounded by regulatory drivers that create a compelling need for a unified, rich and near real-time risk dataset for the front office, one that must be easy to query. In many firms, this type of unified risk reporting is currently constrained because datasets are distributed across collections of asset-class-aligned systems that typically refresh only once per day.
Interactive data challenges represent the ability to capture, process and report changes at scale and in near real-time – e.g. capturing live updates with ticking market data or intra-day limit monitoring.
Big Data challenges represent the ability to easily access the huge quantities of risk result information that are generated daily across many financial services firms without losing fidelity through aggregation. Use cases for this are typically data analytics-based, e.g. mining the data for trends, correlations and historic events. Users in these cases are able to wait minutes (or longer) for responses to these types of large-scale queries.
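To illustrate the kind of Big Data-style query involved, the sketch below aggregates a large table of per-trade risk results by desk and risk factor. The schema, column names and dataset are hypothetical, and pandas is used here purely as a stand-in for the SQL engine under test; the point is the shape of the workload, not any particular product.

```python
import numpy as np
import pandas as pd

# Generate a synthetic per-trade risk result set (hypothetical schema).
# A real FRTB result set would be far larger, refreshed daily.
rng = np.random.default_rng(42)
n = 1_000_000

results = pd.DataFrame({
    "desk": rng.choice(["rates", "credit", "fx", "equities"], size=n),
    "risk_factor": rng.choice(["delta", "vega", "curvature"], size=n),
    "sensitivity": rng.normal(0.0, 1_000.0, size=n),
})

# Equivalent to: SELECT desk, risk_factor, SUM(sensitivity), COUNT(*)
#                FROM results GROUP BY desk, risk_factor
agg = (results.groupby(["desk", "risk_factor"], as_index=False)
              .agg(total_sensitivity=("sensitivity", "sum"),
                   trades=("sensitivity", "count")))
print(agg)
```

Queries of this shape (scan, group, aggregate over hundreds of millions of rows) are exactly the workloads where GPU databases claim order-of-magnitude speed-ups, because each group's partial aggregation can be computed in parallel across thousands of GPU cores.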
In most firms, current risk technology stacks don’t lend themselves well to unifying scale and performance: supporting both interactive queries and ever-expanding, Big Data-sized datasets.
A potential new technology solution? Several new GPU database solutions are starting to make a big noise in IT circles, reporting vast performance increases over both traditional and Big Data-style SQL database solutions. Independent technologist Mark Litwintschik runs a popular blog reporting benchmarks across a range of these database solutions, which clearly demonstrates the potential of GPU solutions using generic datasets (http://tech.marksblogg.com/benchmarks.html).
So we got curious. Given the performance-at-scale reporting challenges we were seeing in leading financial services firms, we wanted to know whether GPU-accelerated databases could deliver vastly improved SQL processing times. Could the performance gains offered by GPU databases solve the interactive data and Big Data challenges facing financial services firms?
To put this potential to the test, we worked with a tier 1 investment bank to generate a pseudo financial services industry risk result dataset, tailored specifically to FRTB reporting and analysis, and set about benchmarking the GPU databases.
Our objective was to see how fast GPU databases could be and, in particular, to assess:
- query performance – as compared to alternative platforms
- scalability and relative performance – on a single node type system
- overall product enterprise readiness and functional maturity
Taking this approach, we would be able to consider interesting questions such as:
- are GPU databases candidates to support live intra-day type blotters?
- can we envisage pre-calculated OLAP cubes being replaced by GPU databases fulfilling live queries on in-bound streaming data?
- are GPU databases viable and performant alternatives to Big Data solutions?
- are GPU databases mature enough for mission-critical solutions?
To read our findings, download the paper now.