I have a stored procedure that contains a complex query of approximately 16 joins (a mixture of inner and left joins). These joins are mostly on standard user tables, but include joins on two temporary tables and two non-indexable views.
I have rewritten the query to reduce the reads from more than 5 million to about 1.6 million, using some dynamic SQL and adding additional indexes.
Still, in the execution plan, several of the user tables have estimated vs. actual row counts that are quite different, such as 1 estimated row versus 135,000 actual rows. These tables are using what I consider the appropriate index, but, as stated, the actual vs. estimated counts are way off.
The procedure takes several arguments, and I assign those arguments to local variables in the procedure to prevent parameter sniffing.
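To illustrate, the pattern looks roughly like this (the procedure, table, and column names are made up; the real procedure is much larger):

    CREATE PROCEDURE dbo.usp_GetOrders
        @CustomerId int
    AS
    BEGIN
        -- Copy the parameter into a local variable so the optimizer
        -- does not build the plan around the sniffed parameter value.
        DECLARE @LocalCustomerId int
        SET @LocalCustomerId = @CustomerId

        SELECT o.OrderId, o.OrderDate
        FROM dbo.Orders AS o
        WHERE o.CustomerId = @LocalCustomerId
    END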
I was hoping to get a better plan by getting the actual and estimated row counts closer. I have run "UPDATE STATISTICS <table_name> WITH FULLSCAN" on all the major tables used in the query, especially on those tables whose actual vs. estimated counts are way off. This had no effect on the estimates. I have also run sp_updatestats, DBCC DBREINDEX on the tables, and sp_recompile, but in each case the estimated vs. actual is still way off.
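Concretely, the statements I have been running are along these lines (the table and procedure names are placeholders):

    -- Rebuild statistics by scanning every row instead of a sample
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN

    -- Update (sampled) statistics on all tables in the database
    EXEC sp_updatestats

    -- Rebuild all indexes on the table, which also rebuilds the index statistics
    DBCC DBREINDEX ('dbo.BigTable')

    -- Mark the procedure so it gets a fresh plan on its next execution
    EXEC sp_recompile 'dbo.usp_GetOrders'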
I have considered reordering some of the inner and left joins, but was not sure whether that would have any effect.
Would you be able to advise on what else one might do to get the actual vs.
estimated more in line?
--
Message posted via SQLMonster.com
http://www.sqlmonster.com/Uwe/Forums.aspx/sql-server/200610/1

It's difficult to get a better, more accurate plan after the actions you have already taken (updating stats, etc.), especially with left joins. The only way is to force the joins to be done the way you want: manually specify the HASH or MERGE option on the joins, set up a better order for the joins, and so on. In that case you force SQL Server to use your hints instead of its own generated plan.
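For example, something along these lines (the table names are just placeholders, not your actual schema):

    SELECT o.OrderId, c.Name
    FROM dbo.Orders AS o
    INNER HASH JOIN dbo.Customers AS c   -- force a hash join for this pair
        ON c.CustomerId = o.CustomerId
    LEFT MERGE JOIN dbo.Invoices AS i    -- force a merge join for the outer join
        ON i.OrderId = o.OrderId
    OPTION (FORCE ORDER)                 -- join the tables in the order they are written

Note that as soon as you put a join hint on any join, SQL Server also enforces the written join order for the whole query, so OPTION (FORCE ORDER) mostly makes the intent explicit.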
"cbrichards via SQLMonster.com" <u3288@.uwe> wrote in message
news:679360284efd2@.uwe...
>[quoted text clipped]

> The procedure takes several arguments and I set those arguments as local
> variables in the procedure to prevent parameter sniffing.
That way, the optimizer uses hard-coded values to determine selectivity. Something like:

  =        10% (actually, it might go by density in the statistics, if available)
  >        30%
  BETWEEN  25%

The percentage for the values differs from release to release, and can change even between service packs. So, in other words, a variable is *not* the same thing as the optimizer knowing the value you are looking for.
For the optimizer to *know* the value and have a static plan, you have to rely on parameter sniffing. That value in the plan might of course be off for subsequent usages of the plan.

Or, you would have to get a new plan for each execution. Either break out the critical parts of the procedure into their own procedures and create them WITH RECOMPILE, or in 2005 you can add a RECOMPILE hint at the query level. Or, in 2005, you can even say OPTIMIZE FOR a certain value.

You can of course add WITH RECOMPILE for the whole procedure, but then none of the DML statements in the procedure will be pre-optimized.
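As a rough sketch, those options look like this (procedure, table, and column names are placeholders):

    -- Query-level recompile hint (SQL Server 2005 and later): a fresh plan
    -- is compiled for this statement on every execution, so the current
    -- variable values can be used.
    CREATE PROCEDURE dbo.usp_GetOrders_Recompile
        @CustomerId int
    AS
    SELECT o.OrderId
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId
    OPTION (RECOMPILE)
    GO

    -- OPTIMIZE FOR (2005 and later): keep one cached plan, but build it
    -- as if the parameter had this chosen value.
    CREATE PROCEDURE dbo.usp_GetOrders_OptFor
        @CustomerId int
    AS
    SELECT o.OrderId
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId
    OPTION (OPTIMIZE FOR (@CustomerId = 42))
    GO

    -- WITH RECOMPILE on the whole procedure: no plan is cached, and every
    -- statement in the procedure is recompiled on every call.
    CREATE PROCEDURE dbo.usp_GetOrders_Always
        @CustomerId int
    WITH RECOMPILE
    AS
    SELECT o.OrderId
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId
    GO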
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"cbrichards via SQLMonster.com" <u3288@.uwe> wrote in message news:679360284efd2@.uwe...
>[quoted text clipped]

Thanks for your reply, Jeje.
In regard to your comment about setting up a better order for my joins, is the guideline to order them with the most selective join first, followed by the next most selective, and so on?
Jeje wrote:
>[quoted text clipped]
--
Message posted via SQLMonster.com
http://www.sqlmonster.com/Uwe/Forums.aspx/sql-server/200610/1