Oracle Data Platform for Financial Services

Improve risk calculations and regulatory reporting

Regulatory reporting challenges

With the increasing complexity of reporting requirements from regulators around the globe, the cost and resource burden of regulatory reporting has soared in recent years. Struggling to keep up with the constant pace of change, financial firms must find ways to meet expanding data requirements more efficiently and accurately while strategically evolving their data architecture to improve performance and drive growth.

Many financial services organizations are still wasting a significant amount of time and skilled resources preparing regulatory reports. Without an automated system that performs data quality checks and eliminates data silos, banks can’t be confident that their regulatory submissions are accurate without spending countless hours reviewing the reports. Accessing data at the desired level of granularity is another challenge because different systems capture data at different levels—for instance, loan systems capture data at the account and transaction level, loan origination systems capture data at the enquiry level, and credit card systems capture data at the card and transaction level. Analyzing data at a consistent level of granularity allows financial institutions to gain a 360-degree understanding of their operations, customers, and markets. It enables them to view data in context and identify relationships, patterns, and trends that might be missed if the data was aggregated or disaggregated in an inconsistent manner.
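As a minimal illustration of harmonizing granularity, the following sketch rolls hypothetical card-level transactions up to the account level so they can be analyzed alongside account-level loan data. All records and field names here are invented for illustration; a real pipeline would draw from the source systems described above.

```python
from collections import defaultdict

# Hypothetical card-level transaction records. Different systems capture data
# at different granularities (card, account, loan, enquiry), so consistent
# reporting first rolls everything up to a common level -- here, the account.
card_transactions = [
    {"account_id": "A1", "card_id": "C1", "amount": 120.00},
    {"account_id": "A1", "card_id": "C2", "amount": 80.00},
    {"account_id": "A2", "card_id": "C3", "amount": 45.50},
]

def to_account_level(transactions):
    """Aggregate transaction-level records to account-level totals."""
    totals = defaultdict(float)
    for txn in transactions:
        totals[txn["account_id"]] += txn["amount"]
    return dict(totals)

account_totals = to_account_level(card_transactions)
# account_totals == {"A1": 200.0, "A2": 45.5}
```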

To address these issues, financial services organizations are redefining their approach to risk calculation, regulatory reporting, and compliance as a holistic process and seeking end-to-end automation and governance—from data capture and analysis to reporting, including the final mile submission to regulators.

Manage compliance and risk more effectively with machine learning and AI

The following architecture demonstrates how we can combine Oracle components and capabilities, including advanced analytics, AI, and machine learning, to create a comprehensive data platform for regulatory reporting and risk calculation that facilitates data integration, data quality, standardization, processing, lineage, and agility. The data platform provides financial institutions with a robust foundation to help them meet regulatory requirements, create timely and accurate reports, and perform effective risk calculations.

Risk calculation and regulatory reporting architecture diagram, description below

This image shows how Oracle Data Platform for financial services can be used to support risk calculation and regulatory reporting. The platform includes the following five pillars:

  1. Data Sources, Discovery
  2. Ingest, Transform
  3. Persist, Curate, Create
  4. Analyze, Learn, Predict
  5. Measure, Act

The Data Sources, Discovery pillar includes three categories of data.

  1. Oracle Apps includes Fusion SaaS, Oracle E-Business Suite, and EPM.
  2. Business Records (First-Party Data) comprises transactions, revenue, and margin.
  3. Third Parties includes data from foreign exchange rates, market feeds, and commodity prices.

The Ingest, Transform pillar comprises four capabilities.

  1. Bulk transfer uses OCI FastConnect, OCI Data Transfer, MFT, and OCI CLI.
  2. Batch ingestion uses OCI Data Integration, Oracle Integration Cloud, and Data Studio.
  3. Change data capture uses OCI GoldenGate and Oracle Data Integrator.
  4. Streaming ingest uses OCI Streaming, Kafka Connect, and DB Tools.

All four capabilities connect unidirectionally into the cloud storage within the Persist, Curate, Create pillar.

The Persist, Curate, Create pillar comprises five capabilities.

  1. The serving data store uses Autonomous Data Warehouse.
  2. Compute farms use HPC.
  3. Cloud storage uses OCI Object Storage.
  4. Batch processing uses OCI Data Flow.
  5. Governance uses OCI Data Catalog.

These capabilities are connected within the pillar. Cloud storage/data lake is unidirectionally connected to the serving data store; it is also bidirectionally connected to batch processing and the compute farm.

Two capabilities connect into the Analyze, Learn, Predict pillar: The serving data store connects unidirectionally to the analytics and visualization capability and is bidirectionally connected to the AI Services capability. Cloud storage connects to the AI services capability.

The Analyze, Learn, Predict pillar comprises three capabilities.

  1. Analytics and visualization use Graph Studio, Oracle Analytics Cloud, and ISVs.
  2. AI Services includes OCI Anomaly Detection, OCI Language, OCI Forecasting, and OCI Vision.
  3. The serving data store, analytics and visualization, and object storage supply metadata to OCI Data Catalog.

The Measure, Act pillar captures how the data analysis may be applied to support a risk calculation and regulatory reporting solution. These applications are divided into two groups.

  1. The first group, “People and Partners,” includes compliance and regulatory reporting along with risk aggregation and reporting.
  2. The second group, “Applications,” includes credit risk analytics, market risk analytics, value-at-risk, operational risk analytics, liquidity risk analytics, and stress testing and scenario analysis.

The three central pillars—Ingest, Transform; Persist, Curate, Create; and Analyze, Learn, Predict—are supported by infrastructure, network, security, and IAM.



There are four main ways to ingest data into an architecture to enable financial services organizations to streamline risk calculation and regulatory reporting processes while improving accuracy.

  • To start, we need to ingest data from transactional systems and core banking applications. This data can then be enriched with customer data from third-party sources, which could include unstructured data from social media, for example. Frequent real-time or near real-time extracts requiring change data capture are common, and data is regularly ingested from transactional, risk, and customer management systems using Oracle Cloud Infrastructure (OCI) GoldenGate. OCI GoldenGate is also a critical component of evolving data mesh architectures where “data products” are managed via enterprise data ledgers and polyglot data streams that perform continuous transform and load processes (rather than the batch ingest and extract, transform, and load processes used in monolithic architectures).
  • We can now use streaming ingest to ingest real-time trading data. For example, when a trade is executed, all the information attached to it is ingested and then deployed to update accounts and ledgers, recalculate risk, and kick off settlement processes. This data is ingested in raw (untransformed) form via the HDFS/S3 connector for long-term persistence, and some basic transformations/aggregations are performed before the data is stored in cloud storage. In parallel with ingestion, we can filter, aggregate, correlate, and analyze high volumes of data from multiple sources in real time using streaming analytics; OCI GoldenGate Stream Analytics is an in-memory technology that performs these real-time analytical computations on streaming data. This helps financial institutions detect business threats and risks. Correlated events and identified patterns can be fed back (manually), and the raw data can be examined using OCI Data Science. Additionally, events can be generated to trigger actions. These actions can be directly customer focused, such as notifying customers about potential fraud via email or SMS or blocking compromised debit cards, or they can streamline internal processes—for example, by notifying the compliance team that a potential issue has been identified.
  • Access to historical performance data, trends, and patterns is necessary to understand and predict risk accurately. This typically requires loading a large volume of transactional data and other operational metrics and datasets (such as market data and commodity prices) from on-premises data stores using bulk transfer methods and services, such as OCI Data Transfer Service.
  • While real-time needs are evolving, the most common extract from core banking, customer, and financial systems is a batch ingestion using an extract, transform, and load process. Batch ingestion is often used to import data from systems that can’t support streaming ingestion (for example, older mainframe systems) or data that doesn’t necessarily need to be analyzed in real time, such as loan and mortgage data. This data is highly structured with a high degree of data quality/integrity and is often processed by the transactional application/system in bulk according to a specific schedule—for example, hourly at 15 minutes past the hour or daily at noon (periods may be longer than this to accommodate complex processes). Ingestion in bulk after source processing is complete is the most computationally and network efficient manner of ingestion. Batch ingestions can be frequent, as often as every 10 or 15 minutes, but they are still bulk in nature as groups of transactions are extracted and processed rather than individual transactions. OCI offers different services to handle batch ingestion, such as the native OCI Data Integration service or Oracle Data Integrator running on an OCI Compute instance. Depending on the volumes and data types, data can be loaded into object storage or loaded directly into a structured relational database for persistent storage.
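The filter-and-aggregate step described for streaming ingest can be sketched as follows. This is a simplified, self-contained illustration: an in-memory list of invented trade events stands in for a live feed (OCI Streaming exposes a Kafka-compatible endpoint in practice), and the notional threshold is an arbitrary example value.

```python
# Hypothetical trade events standing in for a live stream.
trade_events = [
    {"instrument": "XYZ", "side": "BUY",  "qty": 100, "price": 50.0},
    {"instrument": "XYZ", "side": "SELL", "qty": 40,  "price": 51.0},
    {"instrument": "ABC", "side": "BUY",  "qty": 10,  "price": 900.0},
]

def update_positions(events, large_trade_notional=5000.0):
    """Aggregate net positions per instrument and flag large trades."""
    positions, flagged = {}, []
    for ev in events:
        signed_qty = ev["qty"] if ev["side"] == "BUY" else -ev["qty"]
        positions[ev["instrument"]] = positions.get(ev["instrument"], 0) + signed_qty
        # Simple risk filter: flag trades whose notional exceeds the threshold.
        if ev["qty"] * ev["price"] >= large_trade_notional:
            flagged.append(ev["instrument"])
    return positions, flagged

positions, flagged = update_positions(trade_events)
# positions == {"XYZ": 60, "ABC": 10}; flagged == ["XYZ", "ABC"]
```

In a production pipeline the same per-event logic would run continuously inside the streaming engine, with flagged events emitted to a downstream topic to trigger notifications or compliance review.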

Data persistence and processing is built on three (optionally four) components.

  • Ingested raw data is stored in cloud storage for algorithmic purposes; we use OCI Object Storage as the primary data persistence tier. Spark in OCI Data Flow is the primary batch processing engine for data such as transactional, location, application, and geo-mapping data. Batch processing involves several activities, including basic noise treatment, missing data management, and filtering based on defined outbound datasets. Results are written back to various layers of object storage or to a persistent relational repository based on the processing needed and the data types used.
  • These processed datasets are returned to cloud storage for onward persistence, curation, and analysis and ultimately for loading in optimized form to the serving data store, provided here by Oracle Autonomous Data Warehouse. Data is now persisted in optimized relational form for curation and query performance. Alternatively, depending on architectural preference, this can be accomplished with Oracle Big Data Service as a managed Hadoop cluster. In this use case, all the data needed to train the machine learning models is accessed in raw form from object storage. To train the models, historical patterns are combined with transaction-level records to identify and label potential risks. Combining these datasets with others, such as device data and geospatial data, lets us apply data science techniques to refine existing models and develop new ones to better manage and predict risk. This type of persistence can also be used to store data for schemas that are part of the data stores accessed via external tables and hybrid partitions.
  • As described in the ingestion section, financial services organizations deal with massive amounts of data, including historical market data, real-time trading data, economic indicators, and more. High performance computing (HPC) enables the efficient processing and analysis of large datasets, allowing for comprehensive risk assessment. Financial risk prediction involves the use of complex mathematical and statistical models, such as Monte Carlo simulations, option pricing models, and risk factor models. These models require substantial computational power to perform calculations and simulations accurately and quickly. HPC systems in a compute farm provide the necessary computational resources to handle these complex models in an extremely resource-efficient manner by using the principles of cloud computing.
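The Monte Carlo simulations mentioned above can be illustrated with a minimal one-day value-at-risk (VaR) sketch. All parameters here are hypothetical; production risk engines use fitted return distributions and run far larger simulations on HPC compute farms.

```python
import random

def monte_carlo_var(portfolio_value, mu, sigma, confidence=0.95,
                    n_sims=100_000, seed=42):
    """Return the simulated one-day loss exceeded in only
    (1 - confidence) of simulation runs."""
    rng = random.Random(seed)
    # Simulate portfolio losses under a normal return assumption.
    losses = sorted(-portfolio_value * rng.gauss(mu, sigma)
                    for _ in range(n_sims))
    return losses[int(confidence * n_sims)]

var_95 = monte_carlo_var(portfolio_value=1_000_000, mu=0.0005, sigma=0.01)
# For these parameters, 95% VaR is roughly 1.6% of portfolio value
```

Each simulation is independent, which is why this workload parallelizes so well across a compute farm: paths can be distributed over many nodes and the loss distribution merged at the end.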

The ability to analyze, learn, and predict is built on several technologies.

  • Analytics and visualization services, such as Oracle Analytics Cloud, deliver analytics based on curated data from the serving data store. This includes descriptive analytics (describes current risk identification trends and flagged activity with histograms and charts), predictive analytics, such as time series analysis (predicts future patterns, identifies trends, and determines the probability of uncertain outcomes), and prescriptive analytics (proposes suitable actions to support optimal decision-making). These analytics can be used to answer questions such as: How does flagged risk this period compare to previous periods?
  • Alongside advanced analytics, machine learning models are developed, trained, and deployed. These trained models can be run on both current and historical transactional data to help financial organizations better predict and manage risk—for example, by matching patterns of transactions and behaviors to detect money laundering—and the results can be persisted back to the serving layer and reported using analytics tools such as Oracle Analytics Cloud. To optimize model training, the model and data can also be fed into machine learning systems, such as OCI Data Science, to further train the models for more effective risk analysis. These models can be accessed via APIs, deployed within the serving data store, or embedded as part of the OCI GoldenGate streaming analytics pipeline.
  • Additionally, we can use the advanced capabilities of cloud native artificial intelligence services.
    • OCI Anomaly Detection is an artificial intelligence service that makes it easy to build business-specific anomaly detection models that flag critical incidents, speeding up detection and resolution. In this use case, we would deploy these models to identify noncompliance and monitor nonadherence to IFRS 9 and IFRS 17, CECL, LDTI, OECD, Basel, and other standards and requirements. This identification can be used along with historical resolution data for remediation and process improvement. For risk assessment, including credit, liquidity, market, and enterprise performance risk assessments, OCI Anomaly Detection can be used to monitor performance metrics to help ensure current performance and transactions aren’t increasing overall risk.
    • We can also use OCI Anomaly Detection to monitor the number of compliant/noncompliant occurrences by category to identify if any specific change in the business causes unusual compliance escalations. Furthermore, OCI Anomaly Detection can help identify the root cause of noncompliance by monitoring the use of compliance rules to check whether recent transactions show unusual usage.
    • OCI Forecasting can be used to forecast performance metrics as well as external factors, such as market conditions and customer behaviors, to analyze the likelihood of and potentially identify impending risk.
    • OCI Language and OCI Vision can ingest documents and text that can help enrich data for risk management activities.
  • Data governance is another critical component. This is delivered by OCI Data Catalog, a free service providing data governance and metadata management (for both technical and business metadata) for all the data sources in the data lakehouse ecosystem. OCI Data Catalog is also a critical component for queries from Oracle Autonomous Data Warehouse to OCI Object Storage as it provides a way to quickly locate data regardless of its storage method. This allows end users, developers, and data scientists to use a common access language (SQL) across all the persisted data stores in the architecture.
  • Finally, our now curated, tested, high-quality, and governed data and models can be exposed as a data product (API) within a data mesh architecture for distribution across the financial services organization.
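To illustrate the kind of signal an anomaly detection service surfaces when monitoring compliance metrics, here is a minimal z-score sketch. This is not the algorithm OCI Anomaly Detection actually uses (the service trains models from historical data), and the daily counts below are invented.

```python
import statistics

def find_anomalies(history, observations, threshold=3.0):
    """Flag observations more than `threshold` standard deviations
    from the historical mean."""
    mu = statistics.mean(history)
    sd = statistics.stdev(history)
    return [x for x in observations if abs(x - mu) > threshold * sd]

# Hypothetical daily counts of flagged transactions on typical days.
daily_flags = [12, 15, 11, 14, 13, 12, 16, 14]

# A day with 45 flags stands out against the historical baseline.
anomalies = find_anomalies(daily_flags, [13, 45, 15])
# anomalies == [45]
```

A surfaced anomaly like this would then be routed to the compliance team for root-cause analysis, as described above.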

Improve risk calculations and regulatory reporting with the right data platform

Oracle Data Platform can help financial services organizations keep up with the swiftly changing risk management and regulatory reporting landscape, manage the increasing complexity of reporting requirements from regulators worldwide, and ensure they have access to data at the correct level of granularity. Oracle’s solution provides an integrated environment and framework for managing risk data that reduces the valuable time and resources organizations must dedicate to preparing regulatory reports. With an automated solution that applies quality rules and eliminates data silos, organizations can have confidence in their regulatory submissions and better understand, manage, and minimize risk.
