Problem: You must insert a large amount of data (10,000 to 50,000 rows) from a file into a table. The data has to be inserted row by row because each row is processed by a complex module that cannot be executed at the database level. The target table is heavily updated by other processes, so when you run your inserts one at a time, you encounter many locks and deadlocks, and the insertion procedure takes hours. The table has a trigger on it, and the DBA tells you that, aside from your process, the trigger is not causing any problems. You examine the trigger and see that it is well written from a performance perspective: it correctly handles situations where multiple rows are inserted into the table at once. How can you improve performance and avoid locks and deadlocks as much as possible?
Solution: To minimize locking, hold locks on the busy table for as short a time as possible. I suggest inserting the rows into an intermediate (staging) table row by row, or preparing a local file and bulk inserting it into that staging table. Once this step is complete, an "INSERT … SELECT" or "INSERT … SELECT TOP X" into the original table is much quicker than inserting row by row, and the batches keep each transaction, and therefore each set of locks, short.
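The approach above can be sketched in T-SQL as follows. The table and column names (`dbo.Staging`, `dbo.Target`, `id`, `col1`, `col2`), the file path, and the batch size are hypothetical placeholders, not from the original scenario; adapt them to your schema. Note that the multi-row trigger on the target table fires once per batch rather than once per row.

```sql
-- Step 1 (optional): bulk load the pre-processed rows into the staging
-- table. The staging table has no trigger and no contention, so this
-- is fast. File path and format options are assumptions.
BULK INSERT dbo.Staging
FROM 'C:\load\processed_rows.dat'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- Step 2: move the rows to the live table in small batches, so each
-- transaction holds locks on the busy table only briefly.
DECLARE @batch int = 5000;   -- tune to your workload
DECLARE @rows  int = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    -- Copy one batch; the target table's trigger fires once for the
    -- whole batch, which it already handles (it is multi-row aware).
    INSERT INTO dbo.Target (col1, col2)
    SELECT TOP (@batch) col1, col2
    FROM dbo.Staging
    ORDER BY id;

    SET @rows = @@ROWCOUNT;

    -- Remove the same rows from staging so the next batch starts
    -- where this one left off (same TOP/ORDER BY within one
    -- transaction selects the same rows).
    ;WITH moved AS
    (
        SELECT TOP (@batch) *
        FROM dbo.Staging
        ORDER BY id
    )
    DELETE FROM moved;

    COMMIT TRANSACTION;
END;
```

Keeping each batch in its own short transaction is the key design choice: other processes can acquire their locks between batches, which is what reduces blocking and the chance of deadlocks compared with one long row-by-row load.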
Explore Microsoft SQL Server triggers
- Home Introduction
- Part 1 Why use triggers in SQL Server?
- Part 2 Designing and implementing triggers in SQL Server
- Part 3 Example: INSTEAD OF triggers vs. AFTER triggers
- Part 4 Example: One trigger vs. two for the same logic
- Part 5 Example: Nested triggers in SQL Server
- Part 6 Example: Heavy insertion process from OLTP and triggers
- Part 7 Example: Cursors in SQL Server triggers
ABOUT THE AUTHOR:
Michelle Gutzait works as a team member of the SQL Server Database Experts at www.pythian.com, a worldwide company providing Remote Administration services for databases, operating systems, SANs and networks. Gutzait has been involved in IT for 20 years as a developer, business analyst and database consultant. For the past 10 years, she has worked exclusively with SQL Server. Her skills include SQL infrastructure and database design, performance tuning, security, high availability, disaster recovery, very large databases, replication, T-SQL coding, DTS/SSIS packages, administrative and infrastructure tools development, reporting services and more.