
Ensure that security does not impact performance and intended functionality, especially for real-time analytics. If it does, evaluate the trade-off between relaxing SLA requirements and maintaining security.

Audit logging and control

Transactional databases don’t generally store the history of changes and, as data in the data warehouse is held for much longer periods of time, it makes sense to implement data auditing in the warehouse. Data auditing also keeps track of access control events, for example how many times an unauthorized user tried to access data. It is also useful to audit the logs recording denial-of-service events.

To implement data auditing, the first step is to scope out auditing, i.e., identify the datasets that need to be audited. Do not push for auditing on every dataset: it not only requires additional processing of data, it may also end up hampering the performance of the application. Identify business needs and then create a list of datasets and the rules associated with them (e.g., who can access a dataset, a legal retention requirement of one year) in some kind of repository. Define policies and identify the data elements (such as the location of data, its condition/status, or the actual value itself) whose change values need to be collected as part of data auditing.
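As an illustration only, the sketch below shows one way such a repository of datasets and auditing rules could be expressed; the dataset names, rule fields, and retention values are hypothetical assumptions, not prescribed by this paper.

```python
# Hypothetical audit-scoping repository: datasets selected for auditing,
# the rules attached to them, and the data elements whose changes are captured.
from dataclasses import dataclass
from typing import List

@dataclass
class AuditRule:
    allowed_roles: List[str]      # who may access the dataset
    retention_days: int           # e.g. legal retention requirement
    audited_elements: List[str]   # data elements whose change values are collected

@dataclass
class AuditedDataset:
    name: str
    rule: AuditRule

# Example entry; names and values are illustrative.
audit_repository = [
    AuditedDataset(
        name="customer_orders",
        rule=AuditRule(
            allowed_roles=["finance_analyst", "dw_admin"],
            retention_days=365,   # one-year legal retention
            audited_elements=["order_status", "order_amount", "shipping_location"],
        ),
    ),
]
```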

The next step is to categorize or tag datasets in terms of their importance to the enterprise. While this will not help in searching or indexing, it does help in scoping the level of audit activity for each type of dataset.
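A minimal sketch of such tagging, assuming three hypothetical importance tiers mapped to audit levels (the tier names and audit settings are illustrative):

```python
# Hypothetical mapping of dataset importance tiers to the level of audit activity.
from enum import Enum

class ImportanceTier(Enum):
    CRITICAL = "critical"     # regulated or business-critical data
    SENSITIVE = "sensitive"   # internal data with access restrictions
    GENERAL = "general"       # low-risk, broadly shared data

# Audit level per tier: what gets logged for datasets tagged with that tier.
AUDIT_LEVEL = {
    ImportanceTier.CRITICAL:  {"log_reads": True,  "log_changes": True,  "log_exports": True},
    ImportanceTier.SENSITIVE: {"log_reads": False, "log_changes": True,  "log_exports": True},
    ImportanceTier.GENERAL:   {"log_reads": False, "log_changes": False, "log_exports": True},
}
```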

Finally, for the datasets selected for auditing, log all data access, the value changes that might be sensitive to the business, and data export transactions from the data warehouse. Logging changes may be implemented in two ways: either by copying previous versions of dataset data elements before making changes, as in traditional data warehouse slowly changing dimensions, or by making a separate note of what changes have been made, through DBMS mechanisms such as triggers, specific CDC features, or DBMS auditing extensions.
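A minimal sketch of the second approach, assuming a simple in-application change log; the table and field names are hypothetical, and in practice this role is typically played by triggers or the DBMS's CDC facility:

```python
# Hypothetical change-note logging: capture the previous value of an audited
# element before an update is applied, instead of versioning the whole row.
from datetime import datetime, timezone

def log_change(audit_log, dataset, element, old_value, new_value, user):
    """Append a change record; audit_log could be a table in the warehouse."""
    audit_log.append({
        "dataset": dataset,
        "element": element,
        "old_value": old_value,
        "new_value": new_value,
        "changed_by": user,
        "changed_at": datetime.now(timezone.utc).isoformat(),
    })

# Usage: record the prior order status before overwriting it.
audit_log = []
log_change(audit_log, "customer_orders", "order_status", "SHIPPED", "RETURNED", "dw_etl_job")
```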

Monitor that the right users see the right data, detect security breaches, and build audit-ready activity reports.

It is better to build, or identify existing, data audit frameworks to collect information about these datasets and to support the policies and rules required for auditing. Using a framework helps automate some of these processes efficiently and with lower risk. The framework may require a separate policy and rule engine to be built so that the policies applied to datasets can be rationalized and prioritized automatically based on risk and compliance requirements.
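As an illustrative sketch only (the risk scores and field names are assumptions, not part of any specific framework), policy prioritization by risk could look like this:

```python
# Hypothetical policy/rule engine step: order the policies applied to a dataset
# by compliance obligation and risk score so the most critical ones run first.
def prioritize_policies(policies):
    """policies: list of dicts with 'name', 'risk_score' (0-10), 'compliance_mandatory' (bool)."""
    return sorted(
        policies,
        key=lambda p: (p["compliance_mandatory"], p["risk_score"]),
        reverse=True,  # mandatory compliance policies first, then highest risk
    )

policies = [
    {"name": "mask_pii_on_export", "risk_score": 9, "compliance_mandatory": True},
    {"name": "log_all_reads", "risk_score": 4, "compliance_mandatory": False},
    {"name": "retain_change_log_1y", "risk_score": 7, "compliance_mandatory": True},
]
print([p["name"] for p in prioritize_policies(policies)])
```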

7.3.3 Cloud deployments

7.3.3.1 Performance

Data connectivity

Often, cloud providers will leverage APIs for data connectivity or provide pre-built connectors that work with heterogeneous systems or cloud services. These connectors expose certain characteristics (knobs) to optimize fetch performance. Understand them well before implementing; if possible, do a POC to gauge the performance. This also applies to cloud-resident ETL services such as data quality and data validation.
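A minimal POC sketch of the kind of tuning experiment described above, assuming a hypothetical connector with fetch_size and parallel_streams knobs; the connector class and parameter names are placeholders, not a specific vendor API:

```python
# Hypothetical POC: measure fetch time for different connector "knob" settings
# before committing to one configuration in production.
import time

class StubConnector:
    """Placeholder standing in for a cloud provider's pre-built connector."""
    def fetch(self, fetch_size, parallel_streams):
        # Simulate a fetch; a real POC would call the vendor connector here.
        time.sleep(0.01 / parallel_streams)
        return list(range(fetch_size))

def benchmark_fetch(connector, fetch_size, parallel_streams):
    start = time.perf_counter()
    rows = connector.fetch(fetch_size=fetch_size, parallel_streams=parallel_streams)
    elapsed = time.perf_counter() - start
    return len(rows), elapsed

connector = StubConnector()
for fetch_size in (1_000, 10_000, 50_000):
    for streams in (1, 4):
        count, secs = benchmark_fetch(connector, fetch_size, streams)
        print(f"fetch_size={fetch_size} streams={streams}: {count / secs:,.0f} rows/s")
```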