Risk Focus has been serving the capital markets and financial services industry for over 15 years, specializing in risk management systems and regulatory reporting. Clients include top-tier banks and several of the largest hedge funds. We sat down with Cary Dym, global business development manager for DevOps, and Subir Grewal, lead DevOps architect, to learn about how the firm is helping clients harness the full power of their digital transformation.
Tell us about Risk Focus and the unique services you provide to your clients.
Risk Focus is headquartered in New York City with field offices in London, Amsterdam, and Toronto. Through our extensive work and experience with banks, brokers, exchanges, prime brokers, and hedge fund managers, we have a deep understanding of the needs and constraints our customers operate under.
One of the biggest changes we have observed over the last six years is the profound business opportunities that are presented to the financial industry by cloud providers and DevOps practices. But in order to capture these opportunities, our clients need to undergo big changes in processes and culture—what is usually referred to as a DevOps transformation.
Describe the journey you take with your clients as they embrace agile methodologies and move to the cloud.
Typically, there is a specific objective driving the transition to DevOps. There might be a top-down initiative to be more efficient or a new business outcome to meet. The decision to move to the cloud is frequently cost-based. Clients might not want to reinvest in a data center, for example, or may want to minimize new investment in their existing data centers.
Clients who use our agile transformation services can get practical help with building out a CI/CD pipeline to transform how they plan and deliver projects. Anyone can use the various cloud services or IaaS offerings, but the real prize is the holistic change of mindset and methodology: the transition from waterfall to agile that leads to higher velocity and improved flexibility. To get there, we focus on tangible, practical deliverables while leading by example.
Our approach involves not only coaching and training to streamline workflows, but also implementing and helping clients optimize the required technology. Frequently, the best way to start the move to the cloud is to deliver innovation more efficiently on-premises. Once processes and patterns are adopted and proven, they can easily be moved off-site, where the value is realized more quickly, predictably, and thoroughly.
For large banks, these transformations tend to be multi-year journeys. The issue is less about the technology and more about organizational challenges that involve the need to embrace change across people, governance, and processes. It is easy to get stuck at a lower level of the maturity curve just two to three quarters into the transition. Facing an impending business deadline, clients often fall back into their old ways of doing things, which slows the adoption of automation and full-fledged DevOps pipelines.
What are some of the barriers that get in the way of a fully realized transformation?
The biggest barriers have to do with process and mindset. The smallest barrier is the technology, so we frequently start there with practical use cases. Initially, we help clients accelerate application development by building a more efficient pipeline. We coach them on where they can streamline processes, which helps them think holistically about changing the way they work on applications, from beginning to end.
As they start to mature, we work with them to introduce automated testing to get them closer to a CI/CD software delivery model. But this is often where things get stuck. For a majority of our clients, these speed bumps include access to good data for testing, the ability to spin up test environments on-demand, and concerns about data security when moving to the cloud.
It’s critical for organizations to eliminate concerns about securing and provisioning realistic data to lower-level test environments, and implementing test data masking in the release pipeline helps mitigate those concerns.
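As a generic illustration of the idea (this is not Delphix's implementation; the salt and token format are invented for the example), masking typically replaces a sensitive field with a deterministic but irreversible substitute, so that the same value masks to the same token everywhere and referential integrity survives across systems:

```python
import hashlib

def mask_value(value: str, salt: str = "per-project-salt") -> str:
    """Deterministically mask a sensitive field.

    The same input always maps to the same token, so joins across
    masked databases still line up, but the original value cannot
    be recovered from the token.
    """
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASKED-" + digest[:12]

# The same account number masks to the same token in every database:
print(mask_value("123-45-6789") == mask_value("123-45-6789"))  # True
print(mask_value("123-45-6789") == mask_value("987-65-4321"))  # False
```

Running the same masking logic consistently across every non-production copy is what avoids the referential-integrity problems that arise when different teams use different tools.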
How would you describe the impact when businesses are able to overcome their data-related challenges?
The benefit is increased velocity, which enables companies to build more functionality faster and ship more frequently. Data is a common problem for many enterprises, and we find that a typical client who leverages data virtualization and automated masking can reduce test data provisioning from a multi-week process to something that takes hours or even minutes.
Consider an application team’s need for test data in any given project. Typically, they don’t have a strong pipeline or process for provisioning test data. For example, developers will go to a DBA and request a database restore, then find someone from another team to secure the data so it can be used safely in a test environment. This takes multiple days for every request.
“If you need 10 database refreshes for a project, you can easily lose 20 days out of the timeline waiting on data.”
With Delphix, dev and test teams can deploy secure virtual databases on-demand. The APIs automate everything, and software teams can use the developer GUI and self-service tools to work independently without any need for ops support.
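To make the self-service pattern concrete, here is a hypothetical sketch of a pipeline step requesting a masked virtual database over a REST API. The hostname, endpoint path, and payload fields are all invented for illustration and do not reflect Delphix's actual API; the point is that a CI job can make one HTTP call instead of filing a ticket with a DBA:

```python
import json
import urllib.request

def build_provision_request(host: str, source_db: str, target_env: str):
    """Build a hypothetical REST request to provision a masked virtual
    database. URL and JSON shape are illustrative only; a real
    integration would follow the vendor's API documentation."""
    payload = {
        "source": source_db,        # masked "golden" copy to clone from
        "environment": target_env,  # ephemeral test environment name
        "masked": True,             # only masked data leaves production
    }
    return urllib.request.Request(
        url=f"https://{host}/api/v1/virtual-databases",  # hypothetical endpoint
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# A CI stage could issue this request for each build that needs data:
req = build_provision_request("delphix.example.com", "trades_masked", "ci-build-42")
print(req.full_url)
```

Because the request is just code, it can live in the pipeline definition alongside the build and test steps, which is what removes the multi-day wait on ops.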
How are clients specifically using Delphix to improve data security?
All of the financial applications we develop for are data-heavy. That data is extremely sensitive and requires elevated levels of security. The data must be protected not only from external parties but also from internal employees.
One of our clients, a global financial institution, must comply with the Gramm-Leach-Bliley Act. To ensure they could pass upcoming audits, the organization was looking for a solution to mask nonpublic personal information (NPPI) residing in its non-production systems, files, and databases. They had a variety of tools in use across different application teams, which not only added cost but also caused referential-integrity challenges during test cycles that spanned multiple systems.
We implemented Delphix in three data centers to support the client’s entire application catalog. We identified non-production systems with NPPI data and masked several dozen databases. We were able to scan a database and discover existing NPPI data within seconds.
“Masking jobs were hyper-efficient, safeguarding millions of records in a matter of minutes.”
At the end of the engagement, all databases in lower environments were successfully masked to meet GLBA requirements. Then, as part of the handover to internal teams, we created documentation and how-to guides and conducted hands-on training to transition ongoing operations.
Lastly, what inspired Risk Focus to team up with Delphix?
We first encountered Delphix on a data masking project we were working on. We evaluated several competing masking solutions and settled on Delphix for a number of reasons.
First, we needed a masking solution that would cover all of our client’s databases. Performance was also a huge consideration, so we created a performance comparison chart, and Delphix outperformed the competition by a wide margin. We assessed the technology by looking at the maturity of the products, the level of customer support offered, how well the products were being maintained, and what the product roadmaps looked like, among other things. Crucially, we needed a product that exposed simple RESTful APIs, so we could integrate it into automated pipelines.
We ran an impartial evaluation tailored to our clients’ problem, and Delphix came out on top. We’ve successfully stayed vendor-independent, and we shy away from reselling licenses to eliminate conflicts of interest. We like to say we’re technology-agnostic but opinionated!