This code completes in under a second, but it seems like I've taken the "Mr. Bean" way around getting there.
The goal is to get the average number of orders processed per day, based on how many days exist in the current database. The LoadCount column represents how many orders are processed in a single batch; each batch is keyed by the date and time during the day.
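For illustration (made-up numbers, not real data from [audit].[Tracker]), this is the kind of result I'm after:

-- hypothetical rows, just to show the expected result
-- StartRangeDateTime    LoadCount
-- 2013-10-01 08:00      60
-- 2013-10-01 14:00      40     -- day 1 total = 100
-- 2013-10-02 09:00      200    -- day 2 total = 200
-- expected average per day = (100 + 200) / 2 = 150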
-- // ditch the time to get the start/end date ranges uniform by date //--
;with makeuniform as
(
    select
        cast(startrangedatetime as date) as startdate
        ,cast(endrangedatetime as date) as enddate
        ,[LoadCount]
    from [audit].[Tracker]
)
-- // next, rollup all the loadcounts by uniform date // --
select
    startdate
    ,sum(loadcount) as TotalOrders
into #sumthedays
from makeuniform
group by startdate;

-- // since the counts have been rolled up by day, get the table count // --
declare @howmanydays int;
select @howmanydays = count(*) from #sumthedays;

-- // next get the average traffic per day // --
select sum(TotalOrders) / @howmanydays
from #sumthedays;

drop table #sumthedays;
2 Answers
This could be simplified. You don't need the temporary table to get your unique day count; you can use COUNT(DISTINCT ...) on the StartRangeDateTime column directly.
Try this:
SELECT
    TotalLoadCount / NULLIF(TotalDayCount, 0) AS AvgLoadCountByDay
FROM
(
    SELECT
        COUNT(DISTINCT CAST(StartRangeDateTime AS date)) AS TotalDayCount
        ,SUM(LoadCount) AS TotalLoadCount
    FROM [audit].[Tracker]
) x
Obviously, you can write it directly like this as well:
SELECT
    SUM(LoadCount) / NULLIF(COUNT(DISTINCT CAST(StartRangeDateTime AS date)), 0) AS AvgLoadCountByDay
FROM [audit].[Tracker]
If my understanding of your problem is correct, this query should give you the same result. Let me know if it doesn't.
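One caveat worth noting (an assumption on my part, since the post doesn't say what type LoadCount is): if LoadCount is an integer column, the division above is integer division and the fraction is truncated, so 301 orders over 2 days comes back as 150 instead of 150.5. If you want a fractional average, cast one side first; a minimal sketch, still assuming the [audit].[Tracker] table and StartRangeDateTime column:

SELECT
    CAST(SUM(LoadCount) AS decimal(18, 2))
        / NULLIF(COUNT(DISTINCT CAST(StartRangeDateTime AS date)), 0) AS AvgLoadCountByDay
FROM [audit].[Tracker]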
I personally prefer the first version with the derived table, as it helps with the readability of the query. You are not paying any additional cost, because SQL Server will generate the same execution plan for both queries.
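If you want to verify the identical-plan claim against your own data, you can ask SQL Server for the estimated plans without executing anything; a minimal sketch, assuming you run it as a script in SSMS or sqlcmd (SET SHOWPLAN_XML must be the only statement in its batch, hence the GO separators):

SET SHOWPLAN_XML ON;
GO

-- Each statement below returns its estimated plan as XML instead of running.
-- Compare the two plans; for this query they should come out the same.
SELECT
    TotalLoadCount / NULLIF(TotalDayCount, 0) AS AvgLoadCountByDay
FROM
(
    SELECT
        COUNT(DISTINCT CAST(StartRangeDateTime AS date)) AS TotalDayCount
        ,SUM(LoadCount) AS TotalLoadCount
    FROM [audit].[Tracker]
) x;

SELECT
    SUM(LoadCount) / NULLIF(COUNT(DISTINCT CAST(StartRangeDateTime AS date)), 0) AS AvgLoadCountByDay
FROM [audit].[Tracker];
GO

SET SHOWPLAN_XML OFF;
GO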
Comment: Brilliantly done! :) – plditallo, Oct 31, 2013 at 21:16
This doesn't look bad to me at all:
- I had no trouble following along.
- You said it is a fast query.
- Nothing looks overcomplicated.
I would suggest testing it more, though; see how it runs against a very large database. That is something I always try to keep in mind: what happens when my tables become very large? One quick way to check is to generate a big batch of fake rows, as in the sketch below.
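This is only a sketch, not part of the query under review: the #TrackerTest name, the one-million row count, the date range, and the random LoadCount distribution are all made-up assumptions for illustration, and only the StartRangeDateTime and LoadCount columns from the question are used.

-- Build a throwaway table with the two columns the query needs,
-- filled with roughly one million random rows.
SELECT TOP (1000000)
    DATEADD(MINUTE,
            ABS(CHECKSUM(NEWID())) % (60 * 24 * 365),    -- random minute within one year
            CAST('2013-01-01' AS datetime)) AS StartRangeDateTime,
    ABS(CHECKSUM(NEWID())) % 500 AS LoadCount             -- random batch size, 0-499
INTO #TrackerTest
FROM sys.all_objects a
CROSS JOIN sys.all_objects b;                              -- cheap row generator

-- Run the query under test against the big table and look at timing and the plan.
SELECT
    SUM(LoadCount) / NULLIF(COUNT(DISTINCT CAST(StartRangeDateTime AS date)), 0) AS AvgLoadCountByDay
FROM #TrackerTest;

DROP TABLE #TrackerTest;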