

Section 7.2.2 below provides additional advice for selecting tools and database technology from a performance point of view.
A best practice mentioned is to appoint a person to the role of metadata manager, responsible for creating and implementing the metadata strategy. The metadata strategy will include:
— Surveying the landscape for the locations, formats, and uses of metadata across the DW/BI System
— Working with the data steward to educate the DW/BI team about the importance of metadata and the metadata strategy
— Identifying and/or defining metadata that needs to be captured and managed
— Deciding on the location and the version identification for each metadata element
— Creating systems to capture any business or process metadata that does not currently have a home (a minimal sketch follows this list)
— Creating programs or tools, or purchasing a tool, to share and synchronize metadata as needed
— Designing and implementing the delivery approach for getting business metadata to the user community, and managing the metadata and monitoring usage and compliance.
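To make the capture and versioning items above concrete, here is a minimal sketch of a metadata registry, assuming SQLite as the store; the metadata_element table, its columns, and the register helper are illustrative assumptions of ours, not taken from the whitepaper or from any particular repository tool.

```python
import sqlite3

# Minimal sketch of a metadata registry (illustrative only; a real DW/BI
# shop would use its chosen repository tool rather than hand-rolled SQL).
conn = sqlite3.connect("metadata_registry.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS metadata_element (
        name      TEXT NOT NULL,     -- business name of the element
        kind      TEXT NOT NULL,     -- 'business', 'technical', or 'process'
        location  TEXT NOT NULL,     -- system of record (ETL tool, BI tool, DBMS, ...)
        version   INTEGER NOT NULL,  -- version identification for the element
        PRIMARY KEY (name, version)
    )
""")

def register(name: str, kind: str, location: str) -> None:
    """Capture a metadata element, bumping its version if it already exists."""
    cur = conn.execute(
        "SELECT COALESCE(MAX(version), 0) FROM metadata_element WHERE name = ?",
        (name,))
    next_version = cur.fetchone()[0] + 1
    conn.execute(
        "INSERT INTO metadata_element VALUES (?, ?, ?, ?)",
        (name, kind, location, next_version))
    conn.commit()

# Example: record where a conformed dimension's business definition lives.
register("customer_dim.definition", "business", "ETL tool repository")
```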
We will review this metadata management issue further in section 4.4.1 below.
4.3.3 Technical Architecture at Development Stage
At the development stage, activities center on the dimensional model development (logical and physical) described in sections 5.3.3 and 5.3.4, the ETL development to populate the physical model described in section 5.3.5, the data cleansing development described in section 6.2.4, and the performance and security aspects described in sections 7.2.4 and 7.2.8. Integration of the various metadata repositories from the off-the-shelf selected tools is addressed in section 4.4.1.6 below.
We are ignoring for the moment all aspects of BI development in this first version of the document.
4.3.4 Technical Architecture at Deployment Stage
Kimball says (chapter 13, page 544) that when a developer gives you a demo working in a developer environment with the complete scope for the first time, you should consider the project only 25% done. Once it passes all unit tests, you have completed 50%. The third 25% is reached when you have validated and debugged the system in a simulated production environment, including data quality assurance, performance, and operations testing. The last 25% is delivering the system into production, which includes usability testing, documentation, and training.
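As a back-of-the-envelope illustration, here is a minimal sketch of this rule of thumb, assuming Kimball's four equal 25% increments; the milestone wording and the percent_complete helper are our paraphrase, not code or terminology from the whitepaper.

```python
# Kimball's deployment-readiness rule of thumb: four milestones,
# each worth 25% of the total project (paraphrased from chapter 13).
MILESTONES = [
    "working demo of the complete scope in a developer environment",
    "all unit tests passing",
    "validated and debugged in a simulated production environment",
    "delivered into production (usability testing, documentation, training)",
]

def percent_complete(milestones_done: int) -> int:
    """Completion under the equal-25%-increments rule of thumb."""
    if not 0 <= milestones_done <= len(MILESTONES):
        raise ValueError("milestones_done must be between 0 and 4")
    return milestones_done * 25

# Example: the first full-scope demo only puts the project at 25%.
print(percent_complete(1))  # 25
```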
This contrasts with typical testing-effort estimation in traditional application programming, which runs anywhere between 25% and 40% of development time. Under that rule, finishing development would already put a project at roughly 70-80% of total effort (development divided by development plus 25-40% testing), whereas Kimball's milestones place a first working full-scope demo at only 25%.