The DBA Corner by Craig S. Mullins
February 2002

Measuring DBA Effectiveness

I receive frequent e-mails from readers of this column, some offering criticism or praise, but many more asking questions and soliciting advice. I welcome this input, and every now and then I will take the opportunity to answer particularly intriguing questions in print. A common question I am asked is “What is a good way to measure how effective your DBA group is?” This
is not a very easy question to answer because a DBA has to be a "jack
of all trades." And each of these "trades" can have
multiple metrics for measuring success. For example, a metric
suggested by one reader was to measure the number of SQL statements
that are processed successfully. But what does
"successfully" mean? Does it mean simply that the statement
returned the correct results, or does it mean it returned the correct
results in a reasonable time? And what is a “reasonable” time? Two
seconds? One minute? A half hour? Unless you have established service
level agreements, it is unfair to measure the DBA on response time. And
the DBA must participate in establishing reasonable SLAs (in terms of
cost and response time) lest he be handed a task that cannot be
achieved.
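
To make that ambiguity concrete, here is a minimal sketch in Python of what a "successful within the SLA" metric might compute once such an agreement exists. The two-second threshold and the statement log are hypothetical assumptions for illustration, not anything prescribed here.

```python
# Minimal sketch: a SQL statement counts as "successful" only when it
# both returns correct results and finishes within an SLA response-time
# threshold. The threshold and the log entries below are assumptions.

SLA_SECONDS = 2.0  # hypothetical agreed-upon response-time SLA

# (statement id, returned correct results?, elapsed seconds)
statement_log = [
    ("q1", True, 0.4),
    ("q2", True, 3.1),   # correct results, but slower than the SLA
    ("q3", False, 0.2),  # fast, but wrong results
    ("q4", True, 1.7),
]

def success_rate(log, sla=SLA_SECONDS):
    """Fraction of statements that are both correct and within the SLA."""
    ok = sum(1 for _, correct, secs in log if correct and secs <= sla)
    return ok / len(log) if log else 0.0

print(f"Success rate under a {SLA_SECONDS}s SLA: {success_rate(statement_log):.0%}")
```
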
Measuring the number of incident reports was another suggested metric. Well,
this is fine if it is limited to only true problems that might have
been caused by the DBA. The DBA should not be held accountable for
bugs in the DBMS (caused by the DBMS vendor), nor for design elements
forced on him or her by an overzealous development team (happens all
the time with RAD and e-rushing around).

I like the idea of using an availability metric, but it should be
tempered against the environment and your organization’s up-time
needs. In other words, what is the availability required? Once again,
back to SLAs. And the DBA should not be judged harshly for not
achieving availability if the DBMS does not deliver the possibility of
availability (e.g., online reorg and change management) or the organization does not purchase availability solutions from a third-party vendor (such as BMC Software). Of course, the DBA can be held
accountable if it was his decision to purchase the DBMS in use, but
this is not always the case. Many DBAs are brought in well after the
DBMS has been selected.
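
As a rough illustration of how such a tempered availability metric might be computed, here is a minimal sketch in Python that excludes outages outside the DBA's control before comparing against the SLA target. The 30-day window, the outage records, and the 99.9 percent target are all assumptions.

```python
# Sketch of an availability metric tempered by the environment: outages
# beyond the DBA's control (e.g., DBMS bugs, missing tooling) are excluded
# before comparing against the SLA target. All figures are assumptions.

from datetime import timedelta

SLA_TARGET = 0.999           # hypothetical agreed-upon up-time requirement
period = timedelta(days=30)  # measurement window

# (outage duration, was it within the DBA's control?)
outages = [
    (timedelta(minutes=25), True),   # e.g., a reorg the DBA scheduled badly
    (timedelta(hours=2), False),     # e.g., a DBMS bug fixed by the vendor
]

dba_downtime = sum((d for d, dba_fault in outages if dba_fault), timedelta())
availability = 1 - dba_downtime / period

print(f"DBA-attributable availability: {availability:.4%} "
      f"(target {SLA_TARGET:.1%}): {'met' if availability >= SLA_TARGET else 'missed'}")
```
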
What about a metric based on response to problems? This metric would
not necessarily mean that the problem was resolved, but that the DBA
has responded to the "complaining" entity and is working on
a resolution. Such a metric would lean toward treating database
administration as a service or help desk type of function. I actually
think this is much too narrow a metric for measuring DBAs.

Any DBA evaluation metric must be developed with an understanding of the
environment in which the DBA works. This requires in-depth analysis of
many factors; any list of them would probably be incomplete, but it would accurately show the complexity and challenges faced by DBAs on a daily basis.

The best way to measure DBA
effectiveness is to judge the quality of all the tasks the DBA performs. Of course, many aspects of such measurement will be
subjective. Keep in mind that a DBA performs many tasks to ensure that
the organization’s data and databases are useful, usable,
available, and correct. These tasks include data modeling, logical and
physical database design, performance monitoring and tuning, assuring
availability, administering security and authorization, backup and recovery, ensuring data
integrity, and, really, anything that interfaces with the company’s
databases. Developing a consistent metric for measuring these tasks in
a non-subjective way is challenging.

You'll probably need to come up with a complex formula combining all of the above, and more, to do the job correctly, which is probably why I've never seen a fair, non-subjective, metric-based measurement program put together for DBAs. If you (or anyone else reading this) implement such a program, I'd love to hear the details and how it works out.
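
Purely as a sketch of the kind of composite such a program might compute, the following Python fragment folds several attainment figures into one weighted score. Every metric name, weight, and value here is a hypothetical assumption, not a recommendation from this column.

```python
# Hypothetical composite DBA scorecard: every metric name, weight, and
# attainment figure below is an assumption for illustration only.

# metric -> (attainment as a fraction of its target, weight)
METRICS = {
    "sla_response_time": (0.97, 0.30),   # statements meeting the response SLA
    "availability":      (0.999, 0.30),  # up-time versus the agreed target
    "incident_rate":     (0.92, 0.20),   # DBA-attributable incidents avoided
    "recovery_drills":   (1.00, 0.20),   # successful backup/recovery tests
}

def composite_score(metrics):
    """Weighted average of attainment, capping each metric at 100%."""
    total_weight = sum(weight for _, weight in metrics.values())
    weighted = sum(min(value, 1.0) * weight for value, weight in metrics.values())
    return weighted / total_weight

print(f"Composite DBA score: {composite_score(METRICS):.1%}")
```
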
From Database Trends and Applications, February 2002. © 2002 Craig S. Mullins. All rights reserved.