Take the following example:

```sql
SELECT
    <CalculationA> As ColA,
    <CalculationB> As ColB,
    <CalculationA> + <CalculationB> As ColC
FROM TableA
```

Would CalculationA and CalculationB each be calculated twice, or would the optimizer be clever enough to calculate them once and use the result twice? I would like to perform a test to see the result for myself; however, I am not sure how I could check something like this.

My assumption is that it would perform each calculation twice. In that case, depending upon the calculations involved, might it be better to use a derived table or nested view? Consider the following:

```sql
SELECT
    TableB.ColA,
    TableB.ColB,
    TableB.ColA + TableB.ColB AS ColC
FROM
(
    SELECT
        <CalculationA> As ColA,
        <CalculationB> As ColB
    FROM TableA
) As TableB
```

In this case, I would hope that the calculations would only be performed once?
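One way to probe this yourself is to substitute a non-deterministic expression for the placeholder calculations. The sketch below (my own illustration, using `CHECKSUM(NEWID())` as a stand-in for `<CalculationA>`/`<CalculationB>`, and a throwaway `VALUES` row source in place of `TableA`) makes separate evaluations visible: if `ColC` does not equal `ColA + ColB` in a given row, the expressions in `ColC` were evaluated independently of the ones behind `ColA` and `ColB`. Note that matching values would not prove the reverse, since the behaviour is not guaranteed either way.

```sql
-- CHECKSUM(NEWID()) returns a different value on each evaluation,
-- so repeated evaluation shows up as ColC <> ColA + ColB.
SELECT
    CHECKSUM(NEWID()) AS ColA,
    CHECKSUM(NEWID()) AS ColB,
    CHECKSUM(NEWID()) + CHECKSUM(NEWID()) AS ColC
FROM (VALUES (1), (2), (3)) AS TableA (n);
```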
SQL Server makes **no guarantees** about when, and how many times, a scalar expression will be evaluated at execution time. This was confirmed by Jim Hogg of Microsoft in response to a [bug report] filed by Itzik Ben-Gan:

> The optimizer does not guarantee timing or number of executions of scalar functions. This is a long-estabilished (sic) tenet. It's the fundamental 'leeway' tha (sic) allows the optimizer enough freedom to gain significant improvements in query-plan execution.
>
> So, our recommendations are:
>
> a) Avoid reliance on non-guaranteed timing and number-of-executions semantics.

Runtime behaviour depends on many complicated (and undocumented) interacting details within the query optimizer and execution engine. The placement, execution, and caching of scalar expressions is difficult to predict, even for people with advanced knowledge of the internals. To take one example, evaluation of a deterministic expression may be deferred from the point it appears in an execution plan, to be evaluated when a later operation requires the expression result. For more details and examples, see my article [Compute Scalars, Expressions and Execution Plan Performance].

The cost-based optimizer makes little effort to avoid repeated scalar computations. This is because scalar operations are hardly costed at all, as explained by optimizer architect Conor Cunningham in [SQL Server Scalar Operator Costing aka “umm, what costing?”].

It is best to treat the behaviour here as **undefined**. There is not much in execution plans to help you figure out what happened in detail at runtime, and it will not often be convenient to attach a debugger to examine detailed operations, as I did in my blog post. Often, it will not matter much if simple scalar expressions are evaluated more often than necessary. Where an important performance impact is suspected, the most robust way to work around the issue is to explicitly materialize the result, e.g. in a variable or temporary table.
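The materialization workaround can be sketched as follows, using the question's placeholder calculations (`#Materialized` is just an illustrative name). Because the second query reads stored column values rather than re-running the expressions, each calculation is guaranteed to run exactly once per row:

```sql
-- Evaluate each calculation once per row and store the results.
SELECT
    <CalculationA> AS ColA,
    <CalculationB> AS ColB
INTO #Materialized
FROM TableA;

-- Derive ColC from the stored values; no re-evaluation is possible here.
SELECT
    ColA,
    ColB,
    ColA + ColB AS ColC
FROM #Materialized;

DROP TABLE #Materialized;
```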
Views, derived tables, and common table expressions are not "optimization fences" in SQL Server, so while using these *may* encourage the runtime behaviour you are after, it does not come with any particular guarantee.

[bug report]: https://web.archive.org/web/20150506160803/http://connect.microsoft.com/SQLServer/feedback/details/350485/bug-with-newid-and-table-expressions
[Compute Scalars, Expressions and Execution Plan Performance]: https://www.sql.kiwi/2012/09/compute-scalars-expressions-and-execution-plan-performance.html
[SQL Server Scalar Operator Costing aka “umm, what costing?”]: https://www.sqlskills.com/blogs/conor/sql-server-scalar-operator-costing-aka-umm-what-costing/