The client is a multinational enterprise information technology company that provides products and services geared toward the data center, such as servers, enterprise storage, networking and enterprise software. The purpose of the project was to accurately recognise Custom SOW (Statement of Work) orders, service contracts and service orders for forecasting and linkage to sales crediting. The model is a classic example of near-real-time reporting on recognised orders and service contracts, supporting business users from Finance and Services in deciding on the respective billing on a day-by-day basis.
An ABAP driver program written in the S4 system calls the NACE output configuration for application type ‘V1’ (Sales) when an output is triggered in immediate mode from VA01 and VA02 (orders) and VA41 and VA42 (service contracts), generating one consolidated JSON file per order or service contract.
An intermediate system (SnapLogic) pulls the JSON file and stores the data in a Kafka topic with a retention period of 7 days. Based on configuration maintained in EAP, streaming jobs ingest the data into the HIVE (EAP) system, ensuring it lands in HIVE (EAP) as soon as it becomes available.
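As a rough illustration of the raw landing layer, the ingested JSON can be exposed in Hive through an external table holding one payload per document; the database, table, column and path names below are hypothetical placeholders, not the project's actual DDL.

-- Hypothetical raw-layer table over the ingested order/contract JSON.
-- All names and the location are illustrative assumptions only.
CREATE EXTERNAL TABLE IF NOT EXISTS dor_raw.sales_document_json (
  json_payload STRING               -- one consolidated JSON document per order/contract
)
PARTITIONED BY (load_date STRING)   -- ingestion date, useful for housekeeping
STORED AS TEXTFILE
LOCATION '/data/eap/dor/raw/sales_document_json';

-- Downstream jobs can then parse individual attributes, e.g.:
SELECT get_json_object(json_payload, '$.header.vbeln') AS sales_document
FROM   dor_raw.sales_document_json
WHERE  load_date = '2024-01-01';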
The ingested JSON file is then flattened into two-dimensional tables following a layered architecture consisting of Error, Raw and Refined tables. Because S4 tables such as VBPA (partner functions) and JEST (user statuses) hold the relevant information for the various partner functions and statuses in a single attribute, a transposing mechanism is needed to split them into individual fields so that the necessary derivations can be performed.
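A minimal sketch of the transposing step, pivoting VBPA partner-function rows into one row per document with conditional aggregation; the table and column layout is an illustrative assumption (PARVW codes AG/WE/RE are the standard SAP sold-to/ship-to/bill-to values), and JEST status flattening would follow the same pattern.

-- Hypothetical transpose of VBPA rows into columns per sales document.
-- dor_raw.vbpa and dor_refined.partner_transposed are assumed names.
INSERT OVERWRITE TABLE dor_refined.partner_transposed
SELECT vbeln,                                               -- sales document number
       MAX(CASE WHEN parvw = 'AG' THEN kunnr END) AS sold_to_party,
       MAX(CASE WHEN parvw = 'WE' THEN kunnr END) AS ship_to_party,
       MAX(CASE WHEN parvw = 'RE' THEN kunnr END) AS bill_to_party
FROM   dor_raw.vbpa
GROUP BY vbeln;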
The transposed fields are then made available to the DOR consumption layer by creating a DOR-specific Refined table. In the code this Refined table is also termed the source table, for easy comparison with records already existing in the target table. The target table is the DOR-specific dimension table for reporting, structured to retain all transactions. The main DOR logic, including currency conversion, is written between the source (DOR Refined) and target (final reporting) tables, with intermediate temporary (staging) tables created to distribute the load on memory/queue while the logic is processed sequentially before the results are made available in the target table. The temporary tables store data on the go as incremental/delta records arrive, and are cleared when the next run executes.
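A minimal sketch of this incremental source-to-target pattern, assuming hypothetical table names and identical schemas across source, staging and target; Hive versions with ACID enabled could use MERGE instead, and the INSERT OVERWRITE on the staging table is what clears the previous run's delta.

-- 1) Stage only this run's delta: documents that are new or changed.
--    dor_stage, dor_refined and dor_dim are illustrative assumptions.
INSERT OVERWRITE TABLE dor_stage.delta_orders
SELECT s.*
FROM   dor_refined.dor_source s
LEFT JOIN dor_dim.dor_target t
       ON s.order_id = t.order_id
WHERE  t.order_id IS NULL                      -- brand-new documents
   OR  s.last_changed_ts > t.last_changed_ts;  -- changed documents

-- 2) Rebuild the target as untouched target rows plus the staged delta.
INSERT OVERWRITE TABLE dor_dim.dor_target
SELECT t.*
FROM   dor_dim.dor_target t
LEFT JOIN dor_stage.delta_orders d
       ON t.order_id = d.order_id
WHERE  d.order_id IS NULL
UNION ALL
SELECT d.*
FROM   dor_stage.delta_orders d;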
QlikSense is used to report the data by fetching records from the DOR dimension table in HIVE (EAP). Various sheets covering all DOR transactions, header and line-item data, and missing or error records are created according to business need.
Author:
Sponsors: