Business Intelligence Career Master Plan by Eduardo Chavez & Danny Moncada
Author: Eduardo Chavez & Danny Moncada
Language: eng
Format: epub
Publisher: Packt
Published: 2023-02-15T00:00:00+00:00


Figure 5.2 – The Scrum methodology

The new approach starts with creating a “primordial SQL” query that may or may not hold all the data the user needs to build their five main reports. The data may be incomplete at first, but iterations can adjust the logic to accommodate missing requirements. The idea of building five main reports may sound simplistic, but in reality it is a pragmatic decision, and it is quite likely that those five reports will result in a data model that covers everything a proper bottom-to-top data model would contain. The primordial SQL is then plugged into a semantic layer, or directly into your BI tool of choice, and exposed to your end users to test and validate.
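As a minimal sketch of this idea, the snippet below uses Python's built-in sqlite3 module to stand in for the source database. The table names, columns, and the query itself are illustrative assumptions, not the book's actual schema; the point is that a single denormalized query can serve as the primordial SQL that the first reports are built on, with later iterations widening it as requirements surface.

```python
import sqlite3

# Hypothetical source tables standing in for the real operational database.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INTEGER, customer_id INTEGER,
                     order_date TEXT, amount REAL);
CREATE TABLE customers (customer_id INTEGER, region TEXT);
INSERT INTO orders VALUES (1, 10, '2023-01-05', 120.0),
                          (2, 11, '2023-01-06', 80.0);
INSERT INTO customers VALUES (10, 'West'), (11, 'East');
""")

# The "primordial SQL": one quick, denormalized query that may or may not
# cover all five reports yet. Missing attributes (say, a product dimension)
# get folded in on later iterations rather than designed up front.
PRIMORDIAL_SQL = """
SELECT o.order_id, o.order_date, o.amount, c.region
FROM orders AS o
JOIN customers AS c ON c.customer_id = o.customer_id
"""

# This result set is what gets plugged into the semantic layer or BI tool.
rows = con.execute(PRIMORDIAL_SQL).fetchall()
for row in rows:
    print(row)
```

In practice, the same query text would be pasted into the BI tool's custom-SQL connector or registered as a view behind the semantic layer.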

Tasks are deposited in a fishbowl, and at any point a developer picks a paper with a random task to deliver. Cross-functional skills are advised; otherwise, one fishbowl per skill would be required. Tasks in the yellow banner happen asynchronously: they are not dependent on each other and can be parallelized. Data testing is locked to a single point (the frontend). Rapid prototyping requires you to quickly analyze the sources and map the initial requirements. A semantic layer connects to the database hosting the source tables and performs quick-and-dirty dimensional modeling, or a SQL prototype is dumped directly into the BI tool.

This is a great learning opportunity for new data modelers. A quick model is created in a short working session with every member of the team: a BI/ETL developer, a data modeler, and a subject matter expert (SME). As the product is materialized with backend-supporting objects, instances in the rapid prototype are replaced with the final version of each object at the very end. There is an integration task for every handoff.

While every component of the prototype is being replaced by a formal, operationalized ETL, the user can continue building and testing reports without disruption, in a fully transparent manner. Done this way, the user never knows whether a report is fed by a prototype or by a fully fledged data pipeline.
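One common way to achieve this transparency is a layer of view indirection: the report only ever queries a stable view name, and the handoff repoints that view from the prototype object to the operationalized table. The sketch below, again using sqlite3 with hypothetical object names (`proto_sales`, `fact_sales`, `report_sales`), illustrates the swap; it is an assumed mechanism consistent with the text, not the book's prescribed implementation.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Prototype object: the quick-and-dirty source the report reads at first.
con.execute("CREATE TABLE proto_sales (month TEXT, revenue REAL)")
con.execute("INSERT INTO proto_sales VALUES ('2023-01', 100.0)")

# The report only ever queries this view, never the object behind it.
con.execute("CREATE VIEW report_sales AS SELECT month, revenue FROM proto_sales")
before = con.execute("SELECT * FROM report_sales").fetchall()

# Handoff: the operationalized ETL table lands, loaded with the same data,
# and the view is repointed at it.
con.execute("CREATE TABLE fact_sales (month TEXT, revenue REAL)")
con.execute("INSERT INTO fact_sales VALUES ('2023-01', 100.0)")
con.execute("DROP VIEW report_sales")
con.execute("CREATE VIEW report_sales AS SELECT month, revenue FROM fact_sales")
after = con.execute("SELECT * FROM report_sales").fetchall()

# The report sees identical results before and after the swap, so the user
# cannot tell whether the data came from the prototype or the pipeline.
print(before == after)
```

The design choice here is that the view name acts as the contract with the report layer; every handoff's integration task amounts to validating the new object and repointing the view.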
