
Digital Integration: Standardising Event Data Collection and Management

Digital transformation is imperative for all enterprises, whatever their size. Various industry bodies, including the TM Forum, have taken on several initiatives to help Communications Service Providers (CSPs) successfully complete their journey towards digital business maturity. These initiatives include digital transformation guidelines, standard Open APIs and numerous catalyst projects. The goal: turning CSPs into Digital Service Providers (DSPs).

For the last 25 years, Neural Technologies, as a Software and Consulting Service Provider, has successfully served the telecommunications and finance industries in realizing business value through digital transformation. This article highlights what we have learned from our wealth of experience in digital integration, and how CSPs can successfully propel themselves towards an open digital world.

The Digital Integration Value Proposition

Whenever a CSP embarks on the journey of becoming a Digital Service Provider, it can be difficult to know where to start. In practice, the starting question is usually: “Where can the greatest improvement in business value be achieved with the minimum effort, and how quickly?”

All eTOM (TM Forum’s Business Process Framework) business processes have associated data points, and it is these data points, in their various forms, that are the core element of any business process value chain. It is such data that delivers business value through collection, interpretation, enrichment and distribution for consumption, together with the mining and delivery of business KPIs. We have classified such data as “Event Data”.

Definition: “Event Data is generated by logging usage, events or transactions that have been generated or triggered by people, machines and services.”

Examples are switch Call Detail Records (CDRs), Internet of Things (IoT) data points, or customer-related activity data such as orders, invoices, payments and complaints.
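
To make this concrete, a single Event Data item such as a switch CDR can be modelled as a small typed record. The Python sketch below is illustrative only; the field names are assumptions, not a standard CDR layout:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass(frozen=True)
    class CallDetailRecord:
        """One Event Data item: a switch-generated Call Detail Record (CDR)."""
        event_id: str          # unique identifier assigned at collection time
        event_time: datetime   # when the network event occurred
        a_number: str          # originating subscriber
        b_number: str          # terminating subscriber
        duration_sec: int      # call duration in seconds
        cell_id: str           # serving cell, useful for later enrichment

    cdr = CallDetailRecord("cdr-000001", datetime(2020, 1, 1, 9, 30),
                           "4915550001", "4915550002", 125, "CELL-042")
    print(cdr)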

So, to answer the question posed above, a practical approach to digital integration must be developed. Any such approach must be centered on the Event Data and its associated business processes, and can be divided into four stages:

  • Event Data classification & standardization
  • Event Data collection & distribution
  • Event Data life cycle management
  • Event Data driven decision making
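
As a hedged sketch of how these four stages chain together (all function names below are invented for illustration, not Neural Technologies APIs):

    def classify(raw_event: dict) -> dict:
        """Stage 1: tag the event with a data class that drives standardization."""
        raw_event["data_class"] = "usage" if "duration_sec" in raw_event else "activity"
        return raw_event

    def collect_and_distribute(event: dict, sinks: list) -> None:
        """Stage 2: fan the classified event out to downstream consumers."""
        for sink in sinks:
            sink.append(event)

    def apply_lifecycle(event: dict, max_age_days: int = 365) -> dict:
        """Stage 3: attach the retention policy that governs archival and purging."""
        event["retention_days"] = max_age_days
        return event

    def decide(events: list) -> float:
        """Stage 4: derive a simple KPI (total usage minutes) from the events."""
        return sum(e.get("duration_sec", 0) for e in events) / 60

    warehouse, reporting = [], []
    event = apply_lifecycle(classify({"duration_sec": 125}))
    collect_and_distribute(event, [warehouse, reporting])
    print(decide(warehouse))   # -> roughly 2.08 usage minutes
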
1. Event Data Classification & Standardization

Event Data classification & standardization is the preparation step for all instances of Digital Integration.

EVENT DATA CLASSIFICATION

  • Event Data definition and quality program: Defining the format, data types, quality and content of the Event Data treated, and the compliance of that information with business needs;
  • Event Data loss prevention program: Applying administrative and technical controls based on the data’s criticality to the business;
  • Event Data security and privacy program: Ensuring the protection of sensitive or classified information (e.g. GDPR compliance requirements);
  • Event Data ageing-based retention program: Retaining data as per regulatory licensing agreements and business needs, complying with internal and external audit requirements
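
One hedged way to operationalize these four programs is to declare them as policy metadata per Event Data class; the field names in this Python sketch are assumptions, not taken from any regulatory or TM Forum schema:

    # Hypothetical classification policy per Event Data class.
    EVENT_DATA_POLICIES = {
        "voice_cdr": {
            "quality_checks": ["non_empty_a_number", "duration_non_negative"],
            "criticality": "high",           # drives loss-prevention controls
            "contains_personal_data": True,  # triggers GDPR handling (masking, consent)
            "retention_days": 180,           # ageing-based retention for audit compliance
        },
        "vlr_location_update": {
            "quality_checks": ["valid_cell_id"],
            "criticality": "medium",
            "contains_personal_data": True,
            "retention_days": 30,
        },
    }

    def policy_for(event_class: str) -> dict:
        """Look up the controls that must be applied to a given Event Data class."""
        return EVENT_DATA_POLICIES[event_class]

    print(policy_for("voice_cdr")["retention_days"])   # -> 180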

EVENT DATA STANDARDISATION

  • Organising the Event Data: ensuring proper Event Data governance over time by leveraging TM Forum’s Shared Information Data (SID) model as a reference
  • Event Data cleaning, enrichment and transformation as per the business lexicon (see the sketch below)
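
A minimal sketch of such a cleaning, enrichment and renaming step; the raw-to-target field mapping is hypothetical, and a real mapping would come from the governance catalogue rather than being hard-coded:

    # Hypothetical mapping from raw switch field names to business-lexicon names.
    FIELD_MAP = {"anum": "callingPartyNumber", "bnum": "calledPartyNumber",
                 "dur": "durationSeconds"}

    def standardize(raw: dict) -> dict:
        out = {FIELD_MAP[src]: value for src, value in raw.items() if src in FIELD_MAP}
        # Cleaning: normalize the number format, reject invalid durations.
        out["callingPartyNumber"] = out["callingPartyNumber"].lstrip("+")
        if int(out["durationSeconds"]) < 0:
            raise ValueError("negative duration rejected by quality rule")
        # Enrichment: tag the record with its Event Data class for later governance.
        out["eventClass"] = "voice_cdr"
        return out

    print(standardize({"anum": "+4915550001", "bnum": "4915550002", "dur": "125"}))
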
2. Event Data Collection & Distribution

Nowadays, managing customer data or contact center data is rarely a challenge. The difficulties start with network-related event and usage data, on both the collection and distribution sides. This is due to:

  • Volume, Velocity and Variety of the Event Data and their Input & Output format requirements

The 3 V’s of Big Data

  • Various Service Level Agreements (SLAs) covering processing speed and the expected information and formats from source/target applications and users, plus security and privacy
  • Limited resources, requiring efficient use of the Event Data (avoiding duplication and multiple processing of the same event).

Consequently, once the Event Data is classified and standardized as defined above, most CSPs are required to replace their legacy mediation platform to deal with:

Event Data Collection (across numerous sources regardless of structure and format)

  • Event Data at Rest for Batch Processing
    • Intelligent Network (IN) and Value-Added Services (VAS)/Service Delivery Platform (SDP) Dumps, Base Transceiver Station (BTS) Info and Visitor Location Register (VLR) Dumps & more
  • Event Data in Motion
    • Near Real-Time Processing: Network Elements CDRs, Rated Usage Event, Recharges & more
    • Real-Time Processing: VLR Location Updates, Deep Packet Inspection (DPI) feeds & more

Event Data distribution to downstream applications

  • Microservices, Roaming, Enterprise Data Warehouse, Management Information System (MIS) Reporting, etc.
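
The distinction between data at rest and data in motion, and the store-once/serve-many distribution pattern, can be sketched in a few lines of Python. Everything here is simplified for illustration (real sources would be switch dumps, sockets or Kafka topics, not strings):

    import csv, io

    def collect_batch(dump_file):
        """Event Data at Rest: parse a complete dump file (e.g. a VLR dump) in one pass."""
        return list(csv.DictReader(dump_file))

    def collect_stream(source):
        """Event Data in Motion: yield events one by one as they arrive."""
        for line in source:   # 'source' could be a socket or a streaming consumer
            yield dict(zip(("msisdn", "cell_id"), line.strip().split(",")))

    def distribute(event, sinks):
        """Deliver each event exactly once to every subscribed downstream application."""
        for sink in sinks.values():   # e.g. warehouse loader, roaming feed, MIS report
            sink.append(event)

    sinks = {"warehouse": [], "mis_reporting": []}
    for event in collect_stream(io.StringIO("4915550001,CELL-042\n")):
        distribute(event, sinks)
    print(sinks["warehouse"])   # -> [{'msisdn': '4915550001', 'cell_id': 'CELL-042'}]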

Obviously, storage cost-effectiveness and Service Level Agreements play a big role here. Hence, Big Data technologies and open source tools have become a recurring “requirement”, though this hides its own difficulties, such as:

  • Multiple tools required and the need to successfully integrate them
  • Stabilization and maintenance
  • Skills (Big Data & high-volume computing)

Additionally, rolling out a digital integration solution that addresses the Event Data collection and distribution challenges leads to other challenges:

  1. Complex Event Data Collection
    1.   Event Data interface driver (especially live Event Data streaming)
    2.   Event Data coherence (time- & business-wise, hierarchies)
    3.   Event Data enrichment & stitching
  2. Event Data delivery service API layer (a minimal REST sketch follows this list)
    1.   Hypertext Transfer Protocol (HTTP)
    2.   Representational State Transfer (REST)
    3.   Simple Object Access Protocol (SOAP)
    4.   File-based
    5.   Kafka
    6.   etc.
  3. High-volume & Velocity computing
    1.   Streaming & Processing to meet service level agreements
    2.   Re-processing & Backlog
    3.   High throughput Event Data delivery capability
  4. Complex Event Data retention policies
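
To make just one of these delivery options concrete, the following is a minimal REST delivery sketch using Flask; the route and payload shape are assumptions for illustration, not a TM Forum Open API:

    from flask import Flask, jsonify

    app = Flask(__name__)

    # In-memory stand-in for the Event Data Lake; a real platform would query
    # its storage layer here instead.
    EVENTS = [{"eventClass": "voice_cdr", "durationSeconds": 125}]

    @app.route("/events/<event_class>")
    def get_events(event_class):
        """Serve pre-processed Event Data to a downstream application over REST."""
        matching = [e for e in EVENTS if e["eventClass"] == event_class]
        return jsonify({"events": matching})

    if __name__ == "__main__":
        app.run(port=8080)   # e.g. GET http://localhost:8080/events/voice_cdr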

All these challenges (some often overlooked) are contributing factors to the 70% failure rate of digital transformation projects. To overcome them, CSPs must find an end-to-end solution that solves them all. Neural Technologies has named this the Event Data Lake Platform: a fully configurable platform where events are collected, pre-processed and structurally stored once, so that they can be reused as required to serve the various downstream applications.

Definition: “An Event Data Lake is a group of physical storages, built on various storage technologies, in which Event Data is stored in its natural format.”

Definition: “An Event Data Lake Platform enables and implements the end-to-end digitised business processes directly related to its associated Event Data.”

3. Event Data Life Cycle Management

To manage the Event Data Life Cycle (collection, storage, consumption and retention) within the Event Data Lake Platform, the following stages are expected:
  1. Data Collection: The act of creating Event Data values that do not yet exist and have never existed within the enterprise.
  2. Data Preparation: The preparation (Pre-Processing) of Event Data up to the points at which Data Synthesis and Data Usage occur.
  3. Data Synthesis: The creation of Event Data values via inductive logic, using other Event Data as input.
  4. Data Usage: The application of Event Data as information to tasks that the enterprise needs to run and manage itself.
  5. Data Distribution: The sending of Event Data to a location outside of the enterprise.
  6. Data Archival: The copying of data to an environment where it is stored in case it is needed again in an active production environment, and the removal of this data from all active production environments.
  7. Data Purging: The removal of every copy of a data item from the enterprise.
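
A hedged sketch of how the archival and purging steps (6 and 7) could be enforced from an ageing-based retention policy; the thresholds and in-memory “storages” are illustrative only:

    from datetime import datetime, timedelta

    def manage_lifecycle(events, active, archive, retention_days=180, purge_days=365):
        """Route each event to active storage, the archive, or purging, by age."""
        now = datetime.utcnow()
        for event in events:
            age = now - event["event_time"]
            if age > timedelta(days=purge_days):
                continue                 # step 7: purged, no copy retained anywhere
            elif age > timedelta(days=retention_days):
                archive.append(event)    # step 6: archived, removed from production
            else:
                active.append(event)     # still in active production use

    active, archive = [], []
    events = [{"event_time": datetime.utcnow() - timedelta(days=d)}
              for d in (10, 200, 400)]
    manage_lifecycle(events, active, archive)
    print(len(active), len(archive))     # -> 1 1
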
4. Event Data-Driven Decision Making

Professor Russell Ackoff defined the path to delivering business wisdom through the information value chain: data => information => knowledge => wisdom.


To ensure Event Data-driven decision making, CSPs need to standardize their enterprise reporting framework and deliver uniform business KPIs across all stakeholders.

To achieve this goal:

  • Define a common business lexicon across all business functions
  • Design a KPI business matrix (KPI-business dimensions)
  • Leverage TM Forum’s SID aligned Analytic Model
  • Define standardized reports and dashboards
  • Enable business users with Ad Hoc reporting & Self-Service Capability
  • Define information delivery service level-agreements (including Event Data extraction)
  • Define Event Data quality strategy
  • Define change control process
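
As one hedged way to make the KPI business matrix concrete, each uniform KPI can be declared once together with its business dimensions and delivery SLA; the KPI names and fields below are invented for illustration:

    # Hypothetical KPI business matrix (KPI x business dimensions).
    KPI_MATRIX = {
        "total_usage_minutes": {
            "formula": "sum(durationSeconds) / 60",
            "dimensions": ["region", "customer_segment", "day"],
            "delivery_sla_hours": 24,    # information delivery SLA
        },
        "complaint_rate": {
            "formula": "count(complaints) / count(active_subscribers)",
            "dimensions": ["region", "channel", "month"],
            "delivery_sla_hours": 72,
        },
    }

    def report_spec(kpi: str) -> str:
        """Render a uniform report definition shared by all stakeholders."""
        spec = KPI_MATRIX[kpi]
        return f"{kpi} by {', '.join(spec['dimensions'])} within {spec['delivery_sla_hours']}h"

    print(report_spec("total_usage_minutes"))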

Therefore, CSPs need to adopt a smart information delivery model: specifically, a telecommunications-specific Data Warehouse (DWH) model that is aligned to TM Forum’s SID and establishes the different levels of analytical capability needed to deliver business value, following Professor Ackoff’s path to business wisdom above. These levels of analysis are:

  • Level – 1: Descriptive Analysis
    • Delivering Management Information System (MIS) Reporting (Hourly to Yearly)
    • Ad-hoc Reporting & on-demand Information Delivery
    • Enterprise Performance Measurement: Business KPIs & Targets
  • Level – 2: Advanced (Predictive) Analysis
    • What-if and Cross-Tab Analysis
    • Waterfall Modelling
    • Point-In-Time/Rolling-Time Window Analysis
  • Level – 3: Mining Modelling & Forecasting
    • KPI Forecasting
    • Supervised & Un-supervised Modelling (predictive churn, clustering…)
  • Level – 4: Prescriptive Analysis (requires pattern recognition)
    • Contextual Marketing
    • Fraud Management
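
As a minimal, hedged example of Level 3 (KPI forecasting), the naive moving-average forecast below is far simpler than any production mining model, but shows the shape of the task:

    def moving_average_forecast(history, window=3, horizon=2):
        """Forecast the next `horizon` KPI values as the mean of the last `window`."""
        values = list(history)
        for _ in range(horizon):
            values.append(sum(values[-window:]) / window)
        return values[len(history):]

    # Hypothetical daily KPI history (e.g. total usage minutes, in thousands).
    history = [120, 131, 128, 140, 138, 145]
    print(moving_average_forecast(history))   # -> two forecasted values
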
The Ideal Event Data Lake Platform

Everything highlighted throughout this article defines the requirements of the ideal Event Data Lake Platform: one that enables and implements the end-to-end digitized business processes directly related to its associated Event Data.

The ideal Event Data Lake Platform must derive its functionality from the above classifications as part of a digital integration value proposition, and can be summarized as follows:

  • Maximum flexibility, combining complete configurability with technology plug-ins to rapidly set up and modify complete end-to-end digitized business processes
  • Scalability to handle even the most challenging current and future Event Data volumes
  • Event Data Collection, Distribution, Life Cycle Management and Event Data Driven Decision Making integrated as part of its DNA
Conclusion

Based on Neural Technologies’ experience, we believe that the digital integration steps and Event Data Lake Platform capabilities highlighted in this article are the most efficient and effective approach to overcoming the challenges most CSPs face on their individual digital transformation journeys. With this approach, Neural Technologies is confident that the failure rate of digital transformation projects will decrease, and that our collective goal of an open digital world will become a reality.

As featured in TM Forum’s Digital Transformation Tracker 3 
