Craig S. Mullins
               


October 2000

 

Managing Database Change
By Craig S. Mullins

Although a cliché, it is also true that change is the only constant in today's complex business environment. An ever-changing market forces businesses to adapt continually. They must meet constantly shifting customer expectations while sustaining revenue growth and profitability. To keep pace, businesses must continually change and enhance their products and services to meet and exceed the offerings of competitors.

And the individuals who make up every business usually find it difficult to deal with change. Change means taking on additional roles and responsibilities, which inevitably makes our jobs more difficult. Our comfortable status quo no longer exists. So we have to change, too: either change aspects of our environment or change our way of doing things.

There are many different aspects of managing change, particularly with respect to Information Technology. Each of the following is a different facet of the "change management" experience.

  •         The physical environment or workplace changes to accommodate more employees, fewer employees, or perhaps just different employees with new and different skill sets.

  •         The organization changes, where "things" like processes or methodologies must adapt to support a quicker pace of product and service delivery.

  •         The network infrastructure changes to support a growing, and perhaps geographically dispersed, workforce.

  •         Applications and systems change to do different things with existing data or to include more or different types of data.

  •         The type and structure of data changes, requiring modifications to the underlying database schemata to accommodate the new data.

Change is inevitable, and it is necessary for business survival and success. So we had better have solutions that enable us to manage these inevitable changes effectively.

Changing the Database

Most of today’s business information is managed within the context of a complex, computerized business application. Almost every type of information you can think of -- from customer information to vendor information to product specifications to payroll information -- is managed by computer. And this information is dynamic; it is always changing. Businesses must be prepared to handle the challenge of managing and controlling their constantly changing information. To complicate matters, most businesses face this growing mountain of information with a limited number of resources.

A DBMS stores the majority of today’s computerized data. So when that data changes, the databases used to store it must also change. If the data is not reliable and available, the system does not serve the business; instead, it threatens the health of the business. So we need dependable techniques to manage database changes. But even more, we need techniques that are not just failsafe, but also automated, efficient, and easy to use.

Change Management Requirements

To implement effective change management successfully, it is imperative to understand a set of basic requirements. The following factors need to be incorporated into your change management discipline to ensure success: proactivity, intelligence, analyses (planning and impact), automation, standardization, reliability, predictability, and quick and efficient delivery.

Proactivity. Proactive change, which can eliminate future problems, is an organization's most valuable type of change. The earlier in the development cycle that required changes can be identified and implemented, the lower the overall cost of the change will be.

Intelligence. When implementing a change, every aspect of that change needs to be examined because it could result in an unanticipated cost to the company. The impact of each change must be examined and incorporated into the change process: a simple change in one area may cause a complex change in another. Intelligence in the change management process often requires a thorough analysis, including an efficient and low-risk implementation plan. True intelligence also requires the development of a contingency plan should the change, or set of changes, not perform as projected.

Planning analysis. Planning maximizes the effectiveness of change. A well-planned change saves time. It is always easier to do it right the first time than to do it again after the first attempt proves less than effective. An effective organization will have a thorough understanding of the impact of each change before allocating resources to implement it.

Impact analysis. Comprehensive impact and risk analysis allows the organization to look at the entire problem and examine the risk involved to determine the best course of action. A single change can usually be accomplished in many different ways, but the impact of each approach may be considerably different. Some carry more risk: the risk of failure, the risk associated with a more difficult change, the risk of additional change being required, the risk of causing downtime, and so on. All of these considerations are important when determining the best approach to implementing change.

Automation. With limited resources and a growing workload, automating a process reduces the human-error factor and offloads menial tasks from overburdened staff.

Standardization of procedure. Attrition, promotions, and job changes require organizations to standardize processes to maintain productivity levels. An organized and thoroughly documented approach to completing a task reduces the learning curve as well as training time.

Reliable and predictable process. When creating any deliverable, a business needs to know that all of the invested effort is not wasted. Because time is valuable, a high level of predictability will help to ensure continued success and profitability. Reliability and predictability are key factors in repeatedly producing a quality product.

Availability. Most changes require downtime to implement. To change an application, the application must come down. The same is true of databases. But high availability is required of most applications these days, especially for e-businesses. Reducing the amount of downtime required to make a change increases application availability, and that is fast becoming a requirement in the Internet age.

Quick and efficient delivery. With most products and services there is a consumer demand for quick turnaround. Profitability is at its best when a product is first-to-market. Conversely, the cost of slow or inefficient delivery of products can be enormous. In the case of implementing change, faster is better.

The Change Management Perspective of a DBA

The DBA is the custodian of database changes. Usually, the DBA is not the one to request a change; that is typically done by the application owner or business user. But there are times when the DBA will request changes, for example, to address performance problems or to take advantage of new features or technologies. Regardless of who requests the change, the DBA is charged with carrying it out.

To make those changes effectively, the DBA needs to consider each of the items discussed in the previous section: proactivity, intelligence, analyses (planning and impact), automation, standardization, reliability, predictability, and quick and efficient delivery. Without a robust, time-tested product designed to effect database changes, the DBA will find this a very difficult job. Why?

Well, today’s major RDBMS products do not support fast and efficient database structure changes. Each RDBMS provides a different level of support for making changes to its databases, but none easily supports every type of change that might be required. One quick example: most RDBMSs today do not enable a column to be added to the middle of an existing row. To do so, the table must be dropped and recreated with the new column in the middle. But what about the data? When the table is dropped, the data is deleted, unless the DBA was wise enough to unload the data first. But what about the indexes on the table? They were dropped when the table was dropped, so unless the DBA knows this and recreates the indexes, too, performance will suffer. The same is true for database security: when the table was dropped, all security for the table was dropped as well. And this is but one simple example. Other types of database change that are required from time to time include:

  •         changing the name of a database, table, view, or column

  •         modifying a stored procedure, trigger, or user-defined function

  •         changing or adding relationships using referential integrity features

  •         changing or adding database partitioning

  •         moving a table from one database, dbspace, or table space to another

  •         rearranging the order of columns in a table

  •         changing a column's data type or length

  •         adding or removing columns from a table

  •         changing the primary key without dropping and adding the key

  •         adding or removing columns from a view

  •         changing the SELECT statement on which a view is based

  •         changing the columns used for indexing

  •         changing the uniqueness specification of an index

  •         clustering the table data by a different index

  •         changing the order of an index (ascending or descending)

And this is a very incomplete listing. Adding to the dilemma is the fact that most organizations have at least two, and sometimes more, copies of each database. At the very least, a test and a production version will exist. But there may be multiple testing environments, for example, to support simultaneous development, quality assurance, unit testing, and integration testing. Each database change will need to be made to each of these copies, as well as, eventually, to the production copy. So you can see how database change can quickly monopolize a DBA’s time.
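The drop-and-recreate dance described above can be sketched in a few lines. The following is a minimal illustration using Python and SQLite; the table, column, and index names are invented for the example, and a production RDBMS would also require reissuing security grants, which SQLite does not model.

```python
import sqlite3

# Sketch of the manual steps a DBA must script when an RDBMS cannot
# add a column to the middle of a table: unload, drop, recreate, reload,
# and rebuild the indexes that the DROP silently removed.

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Original table with an index, standing in for a production object.
cur.execute("CREATE TABLE customer (cust_id INTEGER, name TEXT)")
cur.execute("CREATE INDEX ix_cust_name ON customer (name)")
cur.executemany("INSERT INTO customer VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])

# Step 1: unload the existing rows BEFORE dropping the table.
rows = cur.execute("SELECT cust_id, name FROM customer").fetchall()

# Step 2: drop the table -- this also drops the index (and, in a real
# RDBMS, the security grants) without any warning.
cur.execute("DROP TABLE customer")

# Step 3: recreate the table with the new column in the middle.
cur.execute("CREATE TABLE customer (cust_id INTEGER, region TEXT, name TEXT)")

# Step 4: reload the unloaded data, supplying a default for the new column.
cur.executemany("INSERT INTO customer VALUES (?, 'UNKNOWN', ?)", rows)

# Step 5: recreate the index (and reissue any grants) that the DROP removed.
cur.execute("CREATE INDEX ix_cust_name ON customer (name)")

print(cur.execute("SELECT * FROM customer ORDER BY cust_id").fetchall())
# → [(1, 'UNKNOWN', 'Acme'), (2, 'UNKNOWN', 'Globex')]
```

Every one of these steps must be remembered and sequenced by hand; forget one and you lose data, indexes, or authorizations.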

The solution is to use an automated product to implement database change. Such a product enables the DBA to focus on all of the issues of change management because it is built to understand not just the discipline of change management, but also the RDBMS in which the changes are to be made. This built-in intelligence shifts the burden of ensuring that a change to a database object does not cause other implicit changes from the DBA to the tool. And once the change has been identified and implemented for one system, it can easily be deployed on other database copies with minimal, or perhaps no, changes.
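The deploy-everywhere idea can be sketched simply: capture the change once as a script, then drive it against every copy of the database. This is a hypothetical illustration using SQLite in-memory databases to stand in for the separate environment copies; the statements and environment names are invented.

```python
import sqlite3

# One vetted change script, written once...
CHANGE_SCRIPT = [
    "CREATE TABLE IF NOT EXISTS product (prod_id INTEGER, descr TEXT)",
    "CREATE INDEX IF NOT EXISTS ix_prod_descr ON product (descr)",
]

# ...applied identically to every database copy, rather than being
# re-keyed by hand for each environment.
environments = {name: sqlite3.connect(":memory:")
                for name in ("test", "qa", "production")}

for name, conn in environments.items():
    for statement in CHANGE_SCRIPT:
        conn.execute(statement)   # same change, every copy
    conn.commit()
    print(f"{name}: change applied")
```

The point is not the few lines of driver code but the discipline: the change is defined in exactly one place, so the test, QA, and production copies cannot drift apart.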

Another feature of a database change management product is analysis and planning. The impact of changes can be examined prior to implementing any change, an invaluable resource for ensuring safe and efficient database changes. This type of tool also uses automation to minimize the resources required to implement database change. Instead of writing a new, complex change script from scratch for each database change, the DBA can rely on the tool to accomplish this. And application and database availability will be enhanced because the product will implement the change in the least intrusive, quickest manner possible.
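To make the impact-analysis idea concrete, here is a much-simplified sketch of what such a tool does before touching a table: it consults the database catalog to enumerate the dependent objects a change could invalidate. The example uses SQLite's `sqlite_master` catalog and invented object names; a commercial tool performs far deeper dependency analysis than this.

```python
import sqlite3

# Build a toy schema: a table plus an index and a view that depend on it.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, total REAL);
    CREATE INDEX ix_orders_total ON orders (total);
    CREATE VIEW big_orders AS SELECT order_id FROM orders WHERE total > 100;
""")

def impact_of(conn, table):
    """List catalog objects that depend on `table` and would be affected
    if it were dropped and recreated."""
    rows = conn.execute(
        "SELECT type, name FROM sqlite_master "
        "WHERE name != ? AND (tbl_name = ? OR sql LIKE ?)",
        (table, table, f"%{table}%"))
    return sorted(rows.fetchall())

print(impact_of(conn, "orders"))
# → [('index', 'ix_orders_total'), ('view', 'big_orders')]
```

Surfacing this list before the change is made is exactly what lets a DBA plan the unload/reload, index rebuilds, and view recreation instead of discovering them after the fact.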

All in all, a database change management product will improve availability, minimize errors, and speed up your time to market. And I think we all can relate to that!

 

 

 

From Database Trends, October 2000.
 
© 2000 Craig S. Mullins,  All rights reserved.