Over the past few weeks Alan & I have had the pleasure of working with a small team to build an AR data mart. The goal was simple: provide the business sponsors with the ability to monitor and trend outstanding aged AR over time. We built a periodic snapshot fact table with four conformed dimensions and one degenerate dimension. During the process I kept track of where our time was being spent. Below are the results.
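For readers unfamiliar with the pattern, a periodic snapshot fact table records the state of a measure (here, outstanding AR) at regular intervals rather than per transaction. The sketch below shows what such a table might look like; the dimension and column names are hypothetical illustrations, not the actual mart design, and SQLite stands in for the real warehouse platform.

```python
import sqlite3

# Illustrative periodic snapshot fact table for outstanding AR.
# Column names are assumptions for the example, not the real schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_ar_snapshot (
    snapshot_date_key  INTEGER NOT NULL,  -- conformed date dimension
    customer_key       INTEGER NOT NULL,  -- conformed customer dimension
    source_system_key  INTEGER NOT NULL,  -- conformed source-system dimension
    aging_bucket_key   INTEGER NOT NULL,  -- conformed aging-bucket dimension
    invoice_number     TEXT    NOT NULL,  -- degenerate dimension (no lookup table)
    outstanding_amount REAL    NOT NULL   -- measure: AR still open on snapshot date
);
""")

# One snapshot row: invoice INV-1001 has $250.00 outstanding at period end.
conn.execute(
    "INSERT INTO fact_ar_snapshot VALUES (?, ?, ?, ?, ?, ?)",
    (20240131, 42, 3, 2, "INV-1001", 250.00),
)

# The trending question the sponsors asked for: total outstanding AR per snapshot.
rows = conn.execute(
    "SELECT snapshot_date_key, SUM(outstanding_amount) "
    "FROM fact_ar_snapshot GROUP BY snapshot_date_key"
).fetchall()
print(rows)
```

The degenerate dimension (the invoice number) lives directly on the fact row with no dimension table of its own, which is typical for document identifiers.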
Task 1 – Collect Requirements
Requirements were pretty straightforward. This is our seventh iteration working with the same business sponsors, so our velocity is well defined. We are only concerned with outstanding AR, not total billed or total paid. The business wants to be able to perform both ad hoc and canned analysis across four dimensions. The challenge is that AR data is spread across six different sources, each with its own profile. This task had been on the backlog for quite a while, and some analyst work was completed prior to this iteration. Approximately half a day was spent on requirements gathering.
Task 2 – Research Source Systems
Five of the six sources for this data mart are stored in an AS/400 system; the sixth is stored in SQL Server. Over the course of three weeks we were able to determine how to extract outstanding AR from each of these systems. We had a mixture of validation reports: in some cases we had only detailed data to validate against, and in other cases we had only top-level data. Prior to moving to development we successfully tied to the validation data that was considered accurate from the business's standpoint. We also pulled together some consolidated numbers to help our business sponsors further validate our findings. Throughout this process the business was involved in key decisions to ensure that the mart to be developed would meet their needs.
Task 3 – Develop Data Mart
With our source system research complete, development went very smoothly. We spent a total of 1.5 days developing a data mart using LeapFrogBI's platform. The first half-day was spent training one of the three developers. Near the end of the second day the data mart definition was complete. We finished off the day by kicking off a build and deploying the project. Shortly thereafter our data model was created, loaded, and ready for validation to begin.
Task 4 – Develop Reports
We allocated one week for report development. This project is being implemented in an agile manner, and this iteration's priorities included frontend assets related to the AR data mart as well as to a data mart developed in earlier iterations. Approximately five days were allocated to frontend development focused specifically on AR.
In total, approximately one month was consumed in developing the backend and frontend assets related to AR. ETL development consumed 9% of the time required to build the data mart (1.5 days). If you believe the quote below from Ralph Kimball, then ETL development would have taken at least 35 days using traditional methods. A little quick math reveals that LeapFrogBI saved us 33.5 days, or 67% of the estimated traditional project cost! Perhaps even more importantly, the ongoing ability to respond quickly to the business's changing requirements is hard to quantify.
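The arithmetic behind those figures can be reconstructed as follows. This is my reading of the numbers, not an official calculation: it assumes the non-ETL work (requirements, source research, reporting) totals roughly 15 days, which is consistent with the 9% figure above, and applies Kimball's 70% rule to estimate what traditional ETL would have cost.

```python
# Assumed: non-ETL work (requirements, source research, reports) ~ 15 days.
non_etl_days = 15.0
actual_etl_days = 1.5  # ETL development time reported above

# Per the Kimball quote, ETL is ~70% of a traditional project,
# so the non-ETL 30% implies a traditional ETL estimate of:
traditional_etl_days = non_etl_days * (0.70 / 0.30)      # ~35 days

days_saved = traditional_etl_days - actual_etl_days       # ~33.5 days
traditional_total = non_etl_days + traditional_etl_days   # ~50 days
savings_pct = days_saved / traditional_total              # ~0.67

print(round(traditional_etl_days), round(days_saved, 1), round(savings_pct, 2))
```

Under these assumptions the numbers line up with the claims in the paragraph: a 35-day traditional ETL estimate, 33.5 days saved, and a 67% reduction against the traditional total.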
“The ETL system makes or breaks the data warehouse … it easily consumes 70% of the resources needed for implementation and maintenance of a data warehouse.” Ralph Kimball and Joe Caserta (The Data Warehouse ETL Toolkit)
LeapFrogBI’s goal is to transform traditional data warehouse development into a fast, agile process. Our client got their hands on a high-quality data mart faster than traditionally possible while saving tens of thousands of dollars. The benefit of the analysis it enables is yet to be measured, but it will undoubtedly provide insight that pays dividends for many months to come. I look forward to the questions this newly available insight spurs, so we can do it all over again.