Thursday, March 29, 2012

Deadlocks slowing down the server

Hello,
Hello,
We are doing a year-end inventory, and I have only 3 users (data-entry clerks)
inserting into the same table, working more or less at the same time
from their own computers.
The INSERT is a very short statement with just 3 basic columns.
As soon as they started entering the data, deadlock errors started popping up
once in a while.
They use a front-end web application that traps any errors encountered
during DML operations and displays the message to the user.
Given that SQL Server 2000 can handle thousands of transactions at peak times
without any problems, I am surprised, and curious what I could be doing wrong,
when even 3 users cannot use the system properly.
Since taking inventory is a tedious task and a lot of entry is needed within a
short time, I would appreciate a suggestion: because of the locks, the server's
performance degrades drastically.
If I kill some locked transactions in Enterprise Manager, it improves...
Any ideas?
--
Message posted via http://www.sqlmonster.com

Hi
A badly designed application can deadlock with even just 2 users. I have
looked after applications with thousands of simultaneous users, but the
database was architected correctly and the code was implemented correctly,
so we never had deadlocks.
If the basic rules for building multi-user, scalable applications are not
followed, you are going to have trouble.
http://msdn2.microsoft.com/en-us/library/ms191242.aspx
http://msdn.microsoft.com/library/en-us/acdata/ac_8_con_7a_3hdf.asp
http://support.microsoft.com/default.aspx?scid=kb;en-us;169960
http://www.sql-server-performance.com/blocking.asp
Regards
--
Mike Epprecht, Microsoft SQL Server MVP
Zurich, Switzerland
IM: mike@.epprecht.net
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
"Sameer via SQLMonster.com" <u4996@.uwe> wrote in message
news:591ee915763c1@.uwe...
> [snip]

Thanks Mike,
Lots of reading to do......
I am using ColdFusion and wrapping the insert statement within a
<cftransaction> block to make it behave as one single batch.
Nothing fancy... very simple, but I guess I'll have to read more of the
articles.
In pseudocode, I'm doing this:
<cftransaction>
.....<cfquery>
...............INSERT INTO Table_Name (column_list) VALUES (value_list)
.....</cfquery>
</cftransaction>
I'll have to see what more I can do to change this logic...
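One thing worth noting about that pattern: a single INSERT statement is already atomic on its own, so the explicit transaction wrapper adds nothing here except keeping a transaction (and its locks) open across the web-server round trips. A minimal sketch of the equivalent T-SQL, with hypothetical column names:

```sql
-- The wrapped version, as generated by the <cftransaction> block:
BEGIN TRANSACTION;
INSERT INTO Table_Name (col1, col2, col3) VALUES (@v1, @v2, @v3);
COMMIT TRANSACTION;

-- For a single statement this is equivalent to letting it auto-commit,
-- which releases its locks as soon as the statement finishes:
INSERT INTO Table_Name (col1, col2, col3) VALUES (@v1, @v2, @v3);
```

The explicit transaction only starts to matter once there is more than one query inside the block that must succeed or fail together.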
--
Message posted via http://www.sqlmonster.com

Sameer wrote on Tue, 20 Dec 2005 15:39:39 GMT:
> [snip]
A single statement like that in its own transaction shouldn't cause
deadlocks, should it? Are there any triggers on the table? If so, that's
where I'd look. There is also a DBCC trace flag you can enable to write
deadlock information to the SQL Server logs; it might help you locate the
source of the problem.
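The trace flag in question on SQL Server 2000 is 1204; a sketch of enabling it server-wide (3605 routes the output to the error log):

```sql
-- Write deadlock-chain details to the SQL Server error log (SQL 2000).
-- 1204 reports the sessions, resources, and lock types involved in each
-- deadlock; 3605 directs the trace output to the error log.
-- The -1 argument applies the flags to all connections.
DBCC TRACEON (1204, 3605, -1);

-- ...reproduce the deadlock, then read the error log...

-- Turn the flags off again when finished:
DBCC TRACEOFF (1204, 3605, -1);
```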
Dan

Hi Daniel,
Yes, there is a trigger on that table, and it fires after every INSERT or
UPDATE. This is the root of the delay, I think, and I'm looking into it right
now. Thanks for pointing it out.
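The thread never shows the trigger body, but a common culprit in this situation is a trigger written as if only one row is ever inserted at a time, or one that re-reads more of the base table than it needs to. A purely hypothetical sketch of the set-based pattern, with invented table and column names:

```sql
-- Hypothetical example only: names are invented for illustration.
-- A trigger that does 'SELECT @id = ID FROM inserted' handles just one
-- row; the set-based form below joins against the whole 'inserted'
-- pseudo-table, so it handles any number of rows in one statement and
-- holds its locks briefly.
CREATE TRIGGER trg_Inventory_Audit ON Inventory
AFTER INSERT, UPDATE
AS
BEGIN
    INSERT INTO InventoryAudit (ItemID, Qty, ChangedAt)
    SELECT i.ItemID, i.Qty, GETDATE()
    FROM inserted AS i;
END
```

Keeping the trigger's work narrow (only the changed rows, no broad scans of other tables) is what shortens the lock footprint that feeds deadlocks.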
--
Message posted via http://www.sqlmonster.com

OK... I modified the trigger, and now everything seems to be working fine even
when they are all entering data at the same time.
After reading the articles, should I still make the following changes?
1. Use a lower Isolation Level
2. Set READ_COMMITTED_SNAPSHOT to ON
3. Set ALLOW_SNAPSHOT_ISOLATION to ON
Can someone shed some light on how to do that and what consequences I would
face?
Thanks.
--
Message posted via http://www.sqlmonster.com

Are we talking SQL 2000 or 2005?
"Sameer via SQLMonster.com" <u4996@.uwe> wrote in message
news:592053207623d@.uwe...
> [snip]

> 1. Use a lower Isolation Level
What isolation level are you using? Read Committed should be fine for most
applications. If you are using Serializable, you should rethink it.
> 2. Set READ_COMMITTED_SNAPSHOT to ON
> 3. Set ALLOW_SNAPSHOT_ISOLATION to ON
These are 2005 features only. It looks like you are on 2000.
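For reference, on SQL Server 2005 both options are per-database settings; a minimal sketch, assuming a database named InventoryDB:

```sql
-- SQL Server 2005+ only. Both options use row versioning in tempdb so
-- that readers no longer block writers (and vice versa).

-- Allows sessions to opt in with SET TRANSACTION ISOLATION LEVEL SNAPSHOT:
ALTER DATABASE InventoryDB SET ALLOW_SNAPSHOT_ISOLATION ON;

-- Changes the behavior of the default READ COMMITTED level itself;
-- this one needs exclusive access to the database to take effect:
ALTER DATABASE InventoryDB SET READ_COMMITTED_SNAPSHOT ON;
```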
Andrew J. Kelly SQL MVP
"Sameer via SQLMonster.com" <u4996@.uwe> wrote in message
news:592053207623d@.uwe...
> [snip]

SQL 2000 SP4 with Windows 2000 Advanced Server
--
Message posted via http://www.sqlmonster.com

Since I don't know how to set an isolation level, my guess is that it is
using the default one.
How do you check the setting ?
How do you set it ?
--
Message posted via http://www.sqlmonster.com

It really depends on how you are connecting to SQL Server. The easiest way
to see what isolation level a specific connection is using is to run a
Profiler trace. The Existing Connection event will show the current
isolation level, and if the app or the driver changes it, a Batch Completed
event will show the statement. You usually set it with SET TRANSACTION
ISOLATION LEVEL xxxx. See Books Online for more details.
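A sketch of checking and setting it from a query window:

```sql
-- List the session's active SET options; the isolation level shows up
-- here once it has been set explicitly for the session.
DBCC USEROPTIONS;

-- Set it explicitly for the current session.
-- READ COMMITTED is SQL Server's default level:
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
```

Note the setting is per-session, so a connection pool's driver settings can override what you expect; that is why tracing the Existing Connection event is the reliable check.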
--
Andrew J. Kelly SQL MVP
"Sameer via SQLMonster.com" <u4996@.uwe> wrote in message
news:5920cb73e8dd5@.uwe...
> [snip]

Thank you all so much... I highly appreciate your advice :)
--
Message posted via http://www.sqlmonster.com
