In any optimization scheme, you want to determine which records change most frequently and which columns are accessed most often. The former tells you how to horizontally partition a set of records, creating a table that is wide and shallow; the latter tells you how to vertically partition a table, creating one that is narrow and deep (similar to most lookup tables).
Many database developers use a scheme that tracks transactional history, storing information such as the last time a record was accessed, the last time it was modified, and the person who modified it in a special set of fields. Those fields are stored either as part of the table itself (in which case only the last change to the record is tracked) or as part of a special security table; often that is the function of a security module that you purchase. In the best case, every access to a record and every change to it is stored in a log table. If you have a log table, then you already have the information you need to assess which records are your most active and who is accessing those records. If not, then you need to implement such logging to develop the information.
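A minimal sketch of such a log table is shown below, using Python's sqlite3 module for illustration; the table name, column names, and sample users are all hypothetical, and a production system would typically populate the log via triggers or a security module rather than application code.

```python
import sqlite3

# In-memory database for illustration; a real system would use a
# persistent log table in the production database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE access_log (
        table_name TEXT,
        record_id  INTEGER,
        action     TEXT,     -- 'SELECT', 'UPDATE', etc.
        user_name  TEXT,
        logged_at  TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def log_access(table_name, record_id, action, user_name):
    """Record one access event; called alongside each SELECT or COMMIT."""
    conn.execute(
        "INSERT INTO access_log (table_name, record_id, action, user_name) "
        "VALUES (?, ?, ?, ?)",
        (table_name, record_id, action, user_name),
    )

log_access("customers", 42, "UPDATE", "bsmith")
log_access("customers", 42, "SELECT", "jdoe")
count = conn.execute("SELECT COUNT(*) FROM access_log").fetchone()[0]
print(count)  # 2 events logged
```

Because every row carries the record, the action, and the user, this one table answers both questions at once: which records are hot, and who is touching them.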
Depending upon how your database is constructed, the logging routine can simply be added as a final call after a SELECT or COMMIT statement. In instances where this additional overhead is unacceptable, consider creating a pilot program that works with a smaller number of well-chosen users, or a smaller number of records, to develop the data set.
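One way to run such a pilot is to gate the logging call on a fixed set of users, so that only their activity pays the overhead. A sketch, with a hypothetical pilot-user list:

```python
# Cap logging overhead by recording events only for a chosen subset of
# users; the user names here are hypothetical.
PILOT_USERS = {"bsmith", "jdoe"}

events = []

def log_if_pilot(user_name, table_name, record_id, action):
    """Append a log event only when the user belongs to the pilot group."""
    if user_name in PILOT_USERS:
        events.append((user_name, table_name, record_id, action))

log_if_pilot("bsmith", "orders", 7, "SELECT")   # logged
log_if_pilot("guest",  "orders", 7, "SELECT")   # skipped: not in pilot
print(len(events))  # 1
```

The same gate could instead test the record's key range, which trades a representative user sample for a representative record sample.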
Determining which columns are being accessed is more problematic, and depends less on the data itself than on your workload. To develop this information you need to consider which queries, views, and cursors are used most often. In your analysis, don't forget to consider columns that aren't necessarily part of the various SELECT lists but are essential to the efficient operation of the statement – particularly index columns. Any partition scheme must correctly include those operational columns in order to preserve the performance advantages you are seeking.
With a data set in hand, you then need to analyze it. Depending upon the size of the data set, you may find that this is a perfect project on which to test SQL Server's Business Intelligence tools. With the data in a BI tool, you can analyze the functional dependencies of your most commonly used records and, more importantly, your most used columns in a multi-level cube.
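Even without a BI tool, the first cut of the analysis is a grouping query over the log table. A sketch, again using sqlite3 with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE access_log (table_name TEXT, record_id INTEGER, user_name TEXT)"
)
rows = [
    ("customers", 42, "bsmith"),
    ("customers", 42, "jdoe"),
    ("customers", 42, "jdoe"),
    ("customers", 7,  "bsmith"),
]
conn.executemany("INSERT INTO access_log VALUES (?, ?, ?)", rows)

# Most active records: group the log by record and count events.
hot = conn.execute("""
    SELECT record_id, COUNT(*) AS hits
    FROM access_log
    GROUP BY record_id
    ORDER BY hits DESC
""").fetchall()
print(hot)  # record 42 leads with 3 events
```

Grouping by user_name instead answers the companion question of who is generating the activity; a cube simply lets you slice both dimensions at once.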
Barrie Sosinsky is president of consulting company Sosinsky and Associates (Medfield, MA). He has written extensively on a variety of computer topics.