SQL Server code portability is, unfortunately, a lost art these days for many programmers and database administrators. Some newer professionals in the field are unaware of code portability entirely, or of its importance in maintaining a stable, scalable multi-environment infrastructure.
In its simplest terms, code portability is the practice of using environment-independent names and paths to access resources on remote machines. Portable SQL Server code can be moved from development to QA, test, stage, production and disaster recovery environments without any code modification, which eliminates a major point of failure.
There are several places within the environment to address, and several ways to address them. Larger companies may set up a separate Windows Active Directory domain for each environment so that system names are identical across all environments. This can, however, be a fairly expensive undertaking, because extra servers must be purchased to support the additional Active Directory forests. Smaller companies will opt for the less expensive method: a series of aliasing techniques and configuration files that let you easily control which servers the code connects to.
Linked servers and cross-database queries
Linked servers are probably one of the easiest features to configure for code portability, and possibly one of the least often configured that way. Typically, when setting up a linked server, SQL Server DBAs use the remote server's name as the name of the linked server. While this works fine in a single environment (where production and development happen on the same systems), it is not an ideal situation.
Once you do break up the environments, you have linked server names that do not match between environments, and code must be modified as it moves. Instead of naming your linked servers after the remote server, try naming the linked server after the application you will be accessing on the remote server, or perhaps after the database on the remote server. While this makes the code look a little redundant, it will move seamlessly from one environment to another.
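As a sketch of this approach, the script below creates an application-named linked server alias in each environment; the physical server name is the only thing that changes per environment. The server names (CIS-DEV and so on) are illustrative assumptions, not taken from a real system.

```sql
-- Hypothetical example: in each environment, create a linked server named
-- after the application (CIS_SVR) rather than after the physical server.
-- Only @datasrc changes per environment: CIS-DEV here, CIS-PRD in
-- production. Code that references CIS_SVR never changes.
EXEC sp_addlinkedserver
    @server     = N'CIS_SVR',   -- application-based alias used in code
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'CIS-DEV';   -- actual remote server for this environment
```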
CREATE PROCEDURE usp_RetrieveCustomerInfo
    @Customer_ID int
AS
SELECT Customer_ID, Customer_FName, Customer_LName, Active
FROM CIS_SVR.CIS.dbo.Customers
WHERE Customer_ID = @Customer_ID
GO
As you can see from the above stored procedure, no matter what environment this procedure is used in, no code changes are required.
Code portability with DTS
While Data Transformation Services (DTS) is being phased out in favor of SSIS, introduced in SQL Server 2005, many companies still use it heavily. Because of this, code portability needs to be addressed as companies continue to develop new DTS packages. Code portability is a little harder to achieve in DTS than in the other technologies covered here, but on the bright side, there are several options from which to choose.
You can use an ActiveX script to read a configuration file on the server's hard drive, much like an INI file. You can set up a Dynamic Properties task that logs into the local server, gets its name and, based on that, finds the other server names it needs to use. Using the same kind of technique, a database table within the DTS package's local database could hold the remote server names it needs for its connections. If you use this method, however, set up a separate database for this purpose, and never move that database between environments.
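A minimal sketch of the lookup-table technique described above might look like this. The table, column and server names are hypothetical; the database holding this table stays put and is never copied between environments.

```sql
-- Hypothetical lookup table for the DTS lookup-table approach.
-- This lives in a local configuration database that is never moved
-- between environments.
CREATE TABLE dbo.RemoteServers
(
    ConnectionName sysname NOT NULL PRIMARY KEY, -- logical name the package uses
    ServerName     sysname NOT NULL              -- physical server in this environment
);

INSERT INTO dbo.RemoteServers (ConnectionName, ServerName)
VALUES ('CIS', 'CIS-DEV');  -- in production this row would read 'CIS-PRD'

-- A DTS ActiveX script or Dynamic Properties task can then run:
SELECT ServerName
FROM dbo.RemoteServers
WHERE ConnectionName = 'CIS';
```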
Code portability with SSIS
Unlike DTS, SSIS (SQL Server Integration Services) comes with a perfect mechanism for ensuring code portability: configuration files. These are XML files that the SSIS package can be configured to read; the package can use their contents to set just about any runtime-changeable option, including the connection string of a connection object. SSIS also has an environment option where you specify in advance the different environments within the system's scope (development, QA, stage, production and so on) and select the correct environment when deploying the package. While this is a good solution, it requires that the environment be set at each deployment, which makes the configuration file mechanism the preferred option.
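For illustration, an SSIS XML configuration file that overrides a connection string looks roughly like the fragment below. The property path, connection manager name (CIS) and connection string are assumptions for this example; each environment keeps its own copy of the file with its own server name.

```xml
<!-- Illustrative SSIS package configuration file; the connection
     manager name and connection string are assumed for this example. -->
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[CIS].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=CIS-DEV;Initial Catalog=CIS;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```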
Cross-server file pathing
Anyone who has tried to do cross-server file pathing in a code-portable environment knows that it can be quite complex. Start by laying out a set of standards that identifies all the methods you'll use to access physical files on a remote server's file system. If you are lucky, you will not have to worry about this part at all, but odds are you aren't that lucky.
Using Domain Name System (DNS) aliasing for the remote machine is probably the most common method of making this code portable. I've even seen a method that combined a static server name with an environment variable placed on every server's operating system so the calls would all resolve. While this was complex, it created a code-portable system that was flexible, reliable and worked in almost every situation. Each server had an environment variable called %ENV% set up; the values were DEV, QAT and PRD (development, QA and production). So when we needed to access a file share on the CIS server, we would use the directory path \\CIS-%ENV%\Share\Directory\File.ext.
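To show how the expansion works, here is a hypothetical BCP export from a batch file. With %ENV% set to DEV on the local server, the command shell expands the UNC path before bcp ever sees it; the table, share and flag choices are illustrative.

```
REM With %ENV% = DEV, the shell expands the UNC path to
REM \\CIS-DEV\Share\Directory\File.ext before bcp runs.
bcp CIS.dbo.Customers out \\CIS-%ENV%\Share\Directory\File.ext -T -c
```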
This worked for DOS commands such as BCP as well as for SSIS packages. It did not, however, work for the T-SQL command BULK INSERT, because SQL Server does not make a direct call to the operating system to handle name resolution. It apparently uses a Windows API that goes straight to DNS, bypassing the API calls that expand environment variables. So instead of substituting DEV for %ENV%, the BULK INSERT command would look for a server literally named CIS-%ENV%. Fortunately, in this case there were not many BULK INSERT commands; otherwise a different solution would have needed to be developed.
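One possible workaround, sketched here under assumed names rather than taken from the system described above, is to resolve the environment inside T-SQL and build the BULK INSERT statement dynamically, so the path that reaches SQL Server already contains the real server name.

```sql
-- Sketch of a possible BULK INSERT workaround (names are hypothetical):
-- read the environment code from a local configuration table and build
-- the statement with the UNC path already expanded.
DECLARE @Env varchar(3), @Sql nvarchar(4000);

SELECT @Env = EnvCode                -- e.g. 'DEV', 'QAT' or 'PRD'
FROM DBAConfig.dbo.EnvironmentInfo;  -- hypothetical per-environment table

SET @Sql = N'BULK INSERT dbo.Customers
             FROM ''\\CIS-' + @Env + N'\Share\Directory\File.ext''
             WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';

EXEC (@Sql);
```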
SQL Server code portability is very important, and it is not something that should be rushed. Give careful thought to the solution you choose to ensure it is the correct one for your enterprise now and in the future. Nothing makes standards more complex than having to change them down the line because they no longer fit new techniques. That said, do not be afraid to change your standards if necessary.
ABOUT THE AUTHOR
Denny Cherry is a DBA and database architect managing one of the largest SQL Server installations in the world, supporting more than 175 million users. Denny's primary areas of expertise are system architecture, performance tuning, replication and troubleshooting.