
Measuring your record usage in Microsoft SQL Server

Barrie Sosinsky, Contributor

In any optimization scheme, you want to determine which records change most frequently and which columns are accessed most often. The former tells you where to horizontally partition a set of records, producing a table that is wide and shallow; the latter tells you where to vertically partition a table, producing one that is narrow and deep (similar to most lookup tables).

Many database developers track transactional history by storing information such as when a record was last accessed or modified, and who modified it, in a special set of fields. Those fields live either in the table itself (in which case only the last change to the record is tracked) or in a separate security table, often maintained by a security module that you purchase. In the best case, every access to the records and every change is stored in a log table. If you have a log table, you already have the information you need to assess which records are your most active and who is accessing them. If not, you need to implement such a system to develop that information.
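One common way to build such a log table is a trigger on the table being measured. The following is a minimal sketch only; the object names (Orders, OrderAudit, trg_Orders_Audit) and the single-column key are hypothetical, not taken from any particular schema:

```sql
-- Hypothetical audit table: one narrow row per change.
CREATE TABLE OrderAudit (
    AuditID    int IDENTITY(1,1) PRIMARY KEY,
    OrderID    int      NOT NULL,                 -- key of the audited record
    ChangedBy  sysname  NOT NULL,                 -- who made the change
    ChangedAt  datetime NOT NULL DEFAULT GETDATE(),
    ChangeType char(1)  NOT NULL                  -- 'I', 'U' or 'D'
);
GO

-- Trigger that records every insert, update and delete on Orders.
CREATE TRIGGER trg_Orders_Audit
ON Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows in "inserted": new inserts, or the new image of updated rows.
    INSERT INTO OrderAudit (OrderID, ChangedBy, ChangeType)
    SELECT OrderID, SUSER_SNAME(),
           CASE WHEN EXISTS (SELECT * FROM deleted) THEN 'U' ELSE 'I' END
    FROM inserted
    UNION ALL
    -- Rows only in "deleted": pure deletes.
    SELECT OrderID, SUSER_SNAME(), 'D'
    FROM deleted
    WHERE NOT EXISTS (SELECT * FROM inserted);
END;
GO
```

Note that a trigger can only capture data modifications; read access has to be logged at the procedure or application layer, as the next paragraph describes.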

Depending upon how your database is constructed, the logging routine can simply be added as a last call in a SELECT or COMMIT statement. In instances where adding this additional overhead to your system is unacceptable, consider creating a pilot program that works with a smaller number of well-chosen users or a smaller number of records to develop the data set.
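Adding the logging routine as a last call typically means wrapping the query in a stored procedure that writes to the same kind of audit table after returning the data. Again a sketch, with hypothetical names (GetOrder, Orders, OrderAudit) and an assumed 'S' code for reads:

```sql
-- Hypothetical wrapper procedure that logs each read access.
CREATE PROCEDURE GetOrder
    @OrderID int
AS
BEGIN
    SET NOCOUNT ON;

    -- Return the record the caller asked for.
    SELECT OrderID, CustomerID, OrderDate
    FROM Orders
    WHERE OrderID = @OrderID;

    -- Log the access as the last call, per the tip above.
    INSERT INTO OrderAudit (OrderID, ChangedBy, ChangeType)
    VALUES (@OrderID, SUSER_SNAME(), 'S');   -- 'S' = selected/read
END;
GO
```

For a pilot program, the INSERT can be guarded by a check against a small table of well-chosen users or key ranges, so only the pilot population pays the logging overhead.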

Determining which columns are being accessed is more problematic, because the answer depends less on the data itself than on how it is queried. To develop this information, you need to consider which queries, views and cursors are used most often. In your analysis, don't forget columns that aren't necessarily part of the various SELECT lists but are essential to the efficient operation of the statement, particularly index columns. Any partitioning scheme must correctly include those operational columns in order to maintain the performance advantages you are seeking.
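Concretely, a vertical partition has to carry the key (and any heavily used indexed columns) into each piece, so that joins and seeks between the pieces stay efficient. The table and column names below are hypothetical, chosen only to illustrate the narrow-and-deep split:

```sql
-- Frequently read columns go in the narrow "hot" table ...
CREATE TABLE CustomerCore (
    CustomerID int PRIMARY KEY,       -- key column, repeated in both pieces
    Name       nvarchar(100) NOT NULL,
    Region     char(2)       NOT NULL -- used in most WHERE clauses
);
CREATE INDEX IX_CustomerCore_Region ON CustomerCore (Region);

-- ... and rarely read, bulky columns in the other, joined on the same key.
CREATE TABLE CustomerDetail (
    CustomerID int PRIMARY KEY
               REFERENCES CustomerCore (CustomerID),
    Notes      ntext,
    Photo      image
);
```

Dropping CustomerID or the Region index from the core table would force scans or extra lookups and erase the benefit the partition was meant to deliver.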

With a data set in hand, you then need to analyze it. Depending upon the size of the data set, you may find that this is a perfect project on which to test SQL Server's Business Intelligence tools. With the data in a BI tool, you can analyze the functional dependencies of your commonly used records and, more importantly, your most used columns in a multi-level cube.
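Even without a BI tool, a simple aggregation over the audit table (the same hypothetical OrderAudit sketched earlier) will rank the most active records and the most active users:

```sql
-- Most frequently touched records over the last 30 days.
SELECT TOP 20 OrderID, COUNT(*) AS Touches
FROM OrderAudit
WHERE ChangedAt >= DATEADD(day, -30, GETDATE())
GROUP BY OrderID
ORDER BY Touches DESC;

-- Which users generate the most activity.
SELECT ChangedBy, COUNT(*) AS Touches
FROM OrderAudit
GROUP BY ChangedBy
ORDER BY Touches DESC;
```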


Barrie Sosinsky is president of consulting company Sosinsky and Associates (Medfield MA). He has written extensively on a variety of computer topics.


This was first published in March 2005

Disclaimer: Our Tips Exchange is a forum for you to share technical advice and expertise with your peers and to learn from other enterprise IT professionals. TechTarget provides the infrastructure to facilitate this sharing of information. However, we cannot guarantee the accuracy or validity of the material submitted. You agree that your use of the Ask The Expert services and your reliance on any questions, answers, information or other materials received through this Web site is at your own risk.