We have several SQL 2000 databases on one server.
One of the applications I'm responsible for has batch jobs that run for an hour, and during that hour all activity is against that one database. Meanwhile, other applications that use other databases on the same server experience time-outs. One of my coworkers ran a count(*) on an empty table and it took 11 seconds.
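For what it's worth, an 11-second count(*) on an empty table smells more like blocking than raw CPU or I/O load. A quick way to see who is blocking whom on SQL Server 2000, assuming access to the master database, is something along these lines:

-- Show sessions that are currently blocked, and by whom.
-- sysprocesses is the SQL 2000-era system view; 'blocked' holds
-- the spid of the blocking session (0 = not blocked).
SELECT spid,
       blocked       AS blocking_spid,
       waittime      AS wait_ms,
       lastwaittype,
       DB_NAME(dbid) AS database_name
FROM   master..sysprocesses
WHERE  blocked <> 0

If that returns rows from the other databases while the batch runs, the problem is lock contention or shared-resource waits rather than something a bigger server would fix.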
We pay people to keep our servers up and running. Is this something they might solve by reconfiguring the server? It seems strange to me that a single database is allowed to hog all server resources.
We are meeting with them later this week, and I'd like to have some knowledge about this; we don't want to be BS'ed into buying a new server.|||Personally, I would look at optimizing your hour-long batch process if at all possible. If you follow the directions in Brett's sticky at the top, I am sure someone would be happy to look over your code for you.|||There's that, yes. Right now it's cursor-driven, because, "you know, if we just make some changes to the previous version we'll be done quicker."
Data is read from a file (created at a specific point in time) into a single table, and the batch process consists of two stages: stage one compares the new data to what we already have and creates an entry in the log table for every difference. Stage two changes our data so that it reflects the data in the file for that particular point in time.
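As a rough sketch of what a set-based stage one could look like (CurrentData, NewData, ChangeLog and their columns are made-up names for illustration; the real tables obviously differ):

-- Stage one, set-based: one INSERT logs every changed row.
INSERT INTO ChangeLog (KeyCol, OldValue, NewValue)
SELECT c.KeyCol, c.Value, n.Value
FROM   CurrentData c
       JOIN NewData n ON n.KeyCol = c.KeyCol
WHERE  ISNULL(c.Value, '') <> ISNULL(n.Value, '')

-- Rows present only in the new file (inserts) and rows missing
-- from it (deletes) would need two similar queries using
-- NOT EXISTS or outer joins.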
I'm pretty sure that stage one can be done set-based, but I'm not so sure about stage two. The actions required for each row in the new data depend on what's already there in our db.
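That said, stage two can often still be expressed as three set-based statements instead of a cursor: an UPDATE for rows that differ, an INSERT for rows that are new, and a DELETE for rows that disappeared from the file. SQL 2000 has no MERGE, so, using the same placeholder names as above:

-- 1. Update rows that exist in both sets but differ.
UPDATE c
SET    c.Value = n.Value
FROM   CurrentData c
       JOIN NewData n ON n.KeyCol = c.KeyCol
WHERE  ISNULL(c.Value, '') <> ISNULL(n.Value, '')

-- 2. Insert rows that are only in the new file.
INSERT INTO CurrentData (KeyCol, Value)
SELECT n.KeyCol, n.Value
FROM   NewData n
WHERE  NOT EXISTS (SELECT 1 FROM CurrentData c
                   WHERE c.KeyCol = n.KeyCol)

-- 3. Delete rows that are no longer in the file.
DELETE c
FROM   CurrentData c
WHERE  NOT EXISTS (SELECT 1 FROM NewData n
                   WHERE n.KeyCol = c.KeyCol)

Whether that covers the actual per-row rules depends on the real logic, of course, but it is the usual shape a cursor like this collapses into.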