Telecom giant saves millions with automated Teradata transformation to big data
Offloaded 100s of terabytes and 1000+ tables
A leading wireless communications service provider was struggling to keep Teradata costs within its projected investment budget. They also wanted to chart a step-by-step journey toward the Hadoop world.
The key challenge was to work on the highly sensitive Teradata production environment without affecting business continuity. Their earlier in-house migration efforts had been unsuccessful, as migrating fact tables with terabytes of records to Hadoop proved difficult.
As a first step, transforming the legacy Teradata workloads to modern platforms required identifying the costly, resource-consuming ETL, analytical, and reporting workloads that were candidates for migration.
The wireless communications service provider was looking for a solution that would simplify and accelerate the transformation process reliably. They wanted an automated solution to assess, identify, and recommend workloads to offload to Hadoop and free up Teradata capacity.
Impetus Technologies Inc. used the Workload Transformation (WT) toolkit to identify the top resource consumers. Resource-intensive Teradata workloads like BTEQ, mLoad, TPT, FExp, shell scripts, and associated tables were automatically identified and migrated to Hadoop, thereby offering expanded capabilities and opportunities for data exploration and analytics.
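The toolkit's identification logic is not detailed in the case study; purely as an illustration, ranking workloads by resource consumption from query-log records (Teradata's DBQL captures CPU and I/O per query) might look like the following sketch. The record fields and the weighting formula are assumptions, not the WT toolkit's actual method.

```python
from dataclasses import dataclass

@dataclass
class WorkloadStats:
    """Aggregated query-log stats for one workload (fields are assumed)."""
    name: str           # e.g. a BTEQ script or mLoad job name
    cpu_seconds: float  # total CPU time consumed over the window
    io_gb: float        # total logical I/O over the window

def top_offload_candidates(stats, cpu_weight=0.7, io_weight=0.3, limit=10):
    """Rank workloads by a weighted resource score, highest first.

    The weighting scheme is illustrative only; a real assessment would
    also consider run frequency, data volumes, and SLA constraints.
    """
    def score(w):
        return cpu_weight * w.cpu_seconds + io_weight * w.io_gb
    return sorted(stats, key=score, reverse=True)[:limit]
```

The highest-scoring workloads, and the tables they touch, become the first offload candidates.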
The migration proceeded in three phases:
- Schema replication and data migration
- Workload validation and execution
- Analytical data moved to the EDW
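The case study does not describe how schemas were replicated. As a minimal sketch of the type-mapping step that schema replication typically involves, the following converts a simplified Teradata column list into a Hive `CREATE TABLE` statement; the mapping table and column format are assumptions.

```python
# Simplified Teradata-to-Hive type mapping; a real migration handles many
# more types (DECIMAL precision/scale, PERIOD, INTERVAL, BLOB, etc.).
TD_TO_HIVE = {
    "INTEGER": "INT",
    "BYTEINT": "TINYINT",
    "SMALLINT": "SMALLINT",
    "BIGINT": "BIGINT",
    "VARCHAR": "STRING",
    "CHAR": "STRING",
    "DATE": "DATE",
    "TIMESTAMP": "TIMESTAMP",
    "FLOAT": "DOUBLE",
}

def hive_ddl(table, columns):
    """Emit Hive DDL from (column_name, teradata_type) pairs.

    Unknown types fall back to STRING, the safest Hive type.
    """
    cols = ",\n  ".join(
        f"{name} {TD_TO_HIVE.get(td_type.upper(), 'STRING')}"
        for name, td_type in columns
    )
    return f"CREATE TABLE {table} (\n  {cols}\n) STORED AS ORC;"
```

After the schema exists on Hadoop, the data itself is typically moved with a bulk-transfer tool such as a Teradata export utility feeding HDFS.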
Impetus delivered an end-to-end production-ready solution with multiple utilities for:
- Log analysis for data ingestion, processing, orchestrating, and auditing
- Splitting the data ingestion workflow to avoid performance bottlenecks
- Identification of report execution times, job run status, and the longest query execution times
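The utilities themselves are proprietary; to illustrate the log-analysis idea only, the sketch below extracts per-job run status and the longest query execution time from simple structured log lines. The line format (`<job> <status> <elapsed-seconds>`) is an assumption for the example.

```python
import re

# Assumed log-line format: "<job> <status> <elapsed-seconds>"
LOG_LINE = re.compile(
    r"^(?P<job>\S+)\s+(?P<status>SUCCESS|FAILED)\s+(?P<secs>\d+(?:\.\d+)?)$"
)

def summarize(lines):
    """Return (status per job, slowest job name, its elapsed seconds)."""
    statuses, slowest, worst = {}, None, -1.0
    for line in lines:
        m = LOG_LINE.match(line.strip())
        if not m:
            continue  # skip lines that don't match the assumed format
        job, status = m.group("job"), m.group("status")
        secs = float(m.group("secs"))
        statuses[job] = status
        if secs > worst:
            slowest, worst = job, secs
    return statuses, slowest, worst
```

A summary like this is what lets operations teams spot failed jobs and the heaviest queries at a glance after each batch window.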
- Teradata cluster: 14 active nodes, 7 standby nodes
- Assessed the entire Teradata (v14.1) footprint: 35 active nodes, 1300+ users, and 50+ databases with 700 TB of data
- Offloaded 10+ applications/users and 15 databases with 100+ TB of data across 1000+ tables