About Key Group
Based in Preston, UK, Key Group is a financial services provider offering later-life products to homeowners in the United Kingdom, allowing them to unlock a better retirement by releasing the wealth tied up in their properties.
Replacing a legacy on-premise data warehouse with a modern data solution.
Key Group's goals were to shorten their processing time, improve the stability of the process, and reduce the cost of ownership.
Their existing solution was managed by a team who were keen to learn new technologies and expand their skill set to meet the needs of the business.
The data warehouse was originally designed to provide management information from Key Group’s Optimising Advice Delivery platform.
Originally implemented as a decision support system to aid company representatives in selecting products which best fit the needs of the customer, it was later determined to be a rich source of business intelligence.
The discovery stage of the project confirmed the Microsoft-recommended pathway of migrating the current solution - which consisted of several on-premise SQL Servers, a series of SSIS packages, and scripts - to Azure SQL Server and Data Factory.
The majority of the source data for the legacy warehouse was in a series of Azure SQL databases, which were managed by a third party. Azure DevOps pipelines were used to copy data on a nightly basis from the primary Azure SQL database to a secondary Azure SQL database. The on-premise warehouse then used a series of SSIS packages to retrieve the data from the secondary Azure SQL database and transform it for reporting purposes.
The remaining source data for the legacy warehouse was stored in Azure Cosmos DB using the MongoDB API. The on-premise data transfer process used a series of C# and Python scripts to retrieve data from Cosmos DB and transform it into a relational format suitable for reporting.
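To illustrate the shape of that legacy transfer, the sketch below shows the kind of flattening those scripts performed: reading documents through the MongoDB API and projecting nested fields into flat rows ready for a relational load. The connection string, database, collection, and field names are hypothetical placeholders, not Key Group's actual schema.

```python
# Minimal sketch (hypothetical names) of flattening Cosmos DB documents,
# read via the MongoDB API, into rows suitable for a relational staging load.
from pymongo import MongoClient

# Placeholder connection string and collection names.
client = MongoClient("<cosmos-mongo-connection-string>")
collection = client["advice"]["cases"]

rows = []
for doc in collection.find({}, {"_id": 1, "customer": 1, "products": 1}):
    # One output row per product recommendation within each case document.
    for product in doc.get("products", []):
        rows.append({
            "case_id": str(doc["_id"]),
            "customer_name": doc.get("customer", {}).get("name"),
            "product_code": product.get("code"),
            "recommended": product.get("recommended", False),
        })

# 'rows' can then be bulk-inserted into a staging table for reporting.
```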
The feature set used by the Optimising Advice Delivery data warehouse meant it could be migrated to an Azure SQL database without any refactoring. Due to the large volume of data, a Hyperscale tier SQL database was selected for the task. Moving the warehouse database into the cloud improved ETL times because the pipeline was no longer constrained by on-premise internet connectivity.
None of the above processes were orchestrated from a single point. The on-premise data transfer process relied on the databases being copied between Azure SQL instances on time. If this process failed, the BI team would not be able to restart it without assistance from their third-party application provider.
The third-party provider was already hosting the primary database on a business-critical tier Azure SQL Server. One of the features of the business-critical tier is scalable read-only replicas at no additional cost.
This allowed the warehouse data extract queries to be executed by the read-only replicas without impacting read-write operations, meaning that the data warehouse could be refreshed periodically throughout the day for up-to-the-minute business intelligence.
The scalability meant that more compute resource could be dynamically allocated by Azure without intervention from the customer. This improved performance without increasing costs.
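As a rough illustration of how an extract session is directed at a replica, the sketch below assumes a pyodbc connection over the Microsoft ODBC Driver for SQL Server: the ApplicationIntent=ReadOnly connection keyword routes the session to a read-only replica, keeping extract queries off the read-write primary. Server, database, and credential values are placeholders.

```python
# Minimal sketch: connect with ApplicationIntent=ReadOnly so extract queries
# run against a read-only replica of the business-critical database.
import pyodbc

# Placeholder server, database, and credentials.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<primary-server>.database.windows.net,1433;"
    "Database=<advice-db>;"
    "Uid=<extract-user>;Pwd=<secret>;"
    "Encrypt=yes;ApplicationIntent=ReadOnly;"
)

cursor = conn.cursor()
# Confirm the session landed on a read-only replica before extracting.
cursor.execute("SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability')")
print(cursor.fetchone()[0])  # 'READ_ONLY' when connected to a replica
```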
By going directly to the source data, the secondary Azure SQL database and its associated infrastructure - as well as the Azure DevOps pipelines - could be decommissioned.
We used Azure Data Factory to orchestrate the process from end-to-end. The initial plan had Data Factory running the SSIS packages in their current form. Further assessment determined that the packages would be suitable for refactoring as Data Factory pipelines, removing the additional infrastructure complexity required to run SSIS in Azure.
Due to some current limitations of Data Factory, we were unable to natively implement all of the functionality afforded by the legacy C# and Python scripts.
After exploring several options - including running the legacy code in an Azure Function - we ultimately solved the problem by using Data Factory to export the data from Azure Cosmos into Azure Blob Storage.
Azure SQL Server then used a series of stored procedures to retrieve the data from Azure Blob Storage and stage it into a relational format. Finally, the staged data was transformed into the schema required by the existing reporting solution. Again, Data Factory was used to orchestrate this multi-step process across differing cloud technologies.
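As a hedged sketch of that staging step, the example below deploys and runs a stored procedure that reads a JSON export from Blob Storage with OPENROWSET and shreds it with OPENJSON into a staging table. The procedure, table, external data source, and JSON paths are hypothetical, and the sketch assumes an external data source of TYPE = BLOB_STORAGE has already been created against the export container.

```python
# Minimal sketch (hypothetical object names) of staging a Blob Storage JSON
# export into Azure SQL via a stored procedure using OPENROWSET and OPENJSON.
import pyodbc

STAGE_CASES_SQL = """
CREATE OR ALTER PROCEDURE stg.usp_LoadCasesFromBlob
AS
BEGIN
    INSERT INTO stg.Cases (CaseId, CustomerName, ProductCode)
    SELECT j.CaseId, j.CustomerName, j.ProductCode
    FROM OPENROWSET(
            BULK 'exports/cases.json',
            DATA_SOURCE = 'CosmosExportBlob',  -- external data source on the export container
            SINGLE_CLOB
         ) AS raw
    CROSS APPLY OPENJSON(raw.BulkColumn)
        WITH (
            CaseId        NVARCHAR(50)  '$.case_id',
            CustomerName  NVARCHAR(200) '$.customer.name',
            ProductCode   NVARCHAR(20)  '$.product.code'
        ) AS j;
END
"""

# Placeholder connection string; in the described solution, Data Factory
# would invoke the staging procedures rather than a standalone script.
conn = pyodbc.connect("<azure-sql-connection-string>", autocommit=True)
conn.execute(STAGE_CASES_SQL)                    # deploy the staging procedure
conn.execute("EXEC stg.usp_LoadCasesFromBlob")   # run one staging step
```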
Greater control and faster time to data insights
The end result was a more reliable process over which the BI team had greater control.
The new solution meant the warehouse run - which originally took over six hours and could only take place outside of business hours - could now be completed in under ten minutes. It could also be refreshed throughout the day as business needs changed.
More reliable process and greater control, with a reduced reliance on third-party support
Access to up-to-date management information enables rapid reactions to changing market conditions and immediate evaluation of marketing campaigns
The latest Microsoft technology ensures greater availability, security, and scalability, allowing their data solution to flex in line with their growth strategy
Key Group now has a modern data solution that allows them to make better business decisions in a more cost-effective way, built on future-proof technology that will support their growth strategy.
We worked with Key on another project to build a new templated website, with new imagery and improved branding that could be easily managed by internal teams.
Another core component of this new digital hub was an equity release calculator.