
I have a mammoth SQL statement that's taking a long time to run (27 seconds on the server). I think the issue lies with the IN subqueries towards the bottom; the IN subquery against Stock is repeated, so can its result be stored so it only runs once? Taking it out shaves off 23 seconds, and anything under 10 would be brilliant. I'm unsure, though: is there a better way of writing this?

Thanks in advance!

Dave

DECLARE @site INT = 4
DECLARE @langid INT = 9
DECLARE @checkID INT = 0
DECLARE @minstockQTY INT = 0
DECLARE @branchCount INT = (SELECT COUNT(*)
                            FROM Branch
                            WHERE ApplicationID = @site)

IF @site = 4
BEGIN
    SET @checkID = 4
END

SELECT p.*,
       STUFF((SELECT ',' + CAST(BrandID AS nvarchar(max))
              FROM Product_Brand_Mapping d
              WHERE d.ProductCode = p.ProductCode
                AND d.ApplicationID = @site
              FOR XML PATH('')), 1, 1, '') AS BrandIDs,
       STUFF((SELECT '|' + CAST(Name AS nvarchar(max))
              FROM Brands e
              WHERE e.ID IN (SELECT BrandID
                             FROM Product_Brand_Mapping f
                             WHERE f.ProductCode = p.ProductCode
                               AND f.ApplicationID = @site)
              FOR XML PATH('')), 1, 1, '') AS BrandNames,
       STUFF((SELECT ',' + CAST(DepartmentID AS nvarchar(max))
              FROM Product_Department_Mapping e
              WHERE e.ProductCode = p.ProductCode
                AND e.ApplicationID = @site
              FOR XML PATH('')), 1, 1, '') AS DepartmentIDs,
       STUFF((SELECT ',' + CAST(DepartmentID AS nvarchar(max)) + '|' + CAST(DisplayOrder AS nvarchar(max))
              FROM Product_Department_DisplayOrder e
              WHERE e.ProductCode = p.ProductCode
                AND e.ApplicationID = @site
              FOR XML PATH('')), 1, 1, '') AS DisplayOrders,
       STUFF((SELECT '|' + CAST(Size AS nvarchar(max))
              FROM Products e
              WHERE e.ProductCode = p.ProductCode
                AND e.Sell > 0
                AND e.Department > 0
                AND e.ApplicationID = @site
                AND e.Published = 1
                AND e.PrimaryPLU IN (SELECT SKU
                                     FROM Stock
                                     WHERE BranchID IN (1,2,11,12,32,31,13,14,15,0,96)
                                       AND ApplicationID = @site
                                     GROUP BY SKU
                                     HAVING (SUM(CASE WHEN Stock < 0 AND BranchID = 31 THEN Stock
                                                      WHEN BranchID = 31 THEN Stock END)) - p.MinStockQty > 0)
              FOR XML PATH('')), 1, 1, '') AS Sizes,
       STUFF((SELECT '|' + CAST(Colour AS nvarchar(max))
              FROM Products e
              WHERE e.ProductCode = p.ProductCode
                AND e.Sell > 0
                AND e.Department > 0
                AND e.ApplicationID = @site
                AND e.Published = 1
                AND e.PrimaryPLU IN (SELECT SKU
                                     FROM Stock
                                     WHERE BranchID IN (1,2,11,12,32,31,13,14,15,0,96)
                                       AND ApplicationID = @checkID
                                     GROUP BY SKU
                                     HAVING (SUM(CASE WHEN Stock < 0 AND BranchID = 31 THEN Stock
                                                      WHEN BranchID NOT IN (31) THEN Stock END)) - p.MinStockQty > 0)
              FOR XML PATH('')), 1, 1, '') AS Colours,
       (SELECT MAX(pr.Sell)
        FROM Products pr
        WHERE pr.ProductCode = p.ProductCode
          AND pr.ApplicationID = @site) AS SellTo,
       (SELECT MAX(pr.WholeSale)
        FROM Products pr
        WHERE pr.ProductCode = p.ProductCode
          AND pr.ApplicationID = @site) AS WasTo,
       (SELECT MAX(Department)
        FROM Products pr
        WHERE pr.ProductCode = p.ProductCode
          AND pr.ApplicationID = @site) AS Department
FROM ProductDescription p
WHERE p.ApplicationID = @site
  AND p.LanguageID = @langid
  AND p.Published = 1
  AND p.Deleted = 0
  AND p.ProductCode IN (SELECT DISTINCT pr.ProductCode
                        FROM Products pr
                        WHERE pr.Sell > 0
                          AND pr.Department > 0
                          AND pr.ApplicationID = @site
                          AND pr.Published = 1)
  AND ((SELECT SUM(st.Stock)
        FROM Stock st
        WHERE st.ApplicationID = 4
          AND st.SKU IN (SELECT DISTINCT pr.PrimaryPLU
                         FROM Products pr
                         WHERE p.ProductCode = pr.ProductCode
                           AND pr.ApplicationID = @site)
          AND st.BranchID IN (1,2,11,12,32,13,14,15,0,96))
       + ISNULL((SELECT SUM(st.Stock)
                 FROM Stock st
                 WHERE st.ApplicationID = 4
                   AND st.SKU IN (SELECT DISTINCT pr.PrimaryPLU
                                  FROM Products pr
                                  WHERE p.ProductCode = pr.ProductCode
                                    AND pr.ApplicationID = @site)
                   AND st.BranchID = 31
                   AND st.Stock < 0), 0)) - p.MinStockQty > 0
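
To the asker's own question about running the repeated Stock IN subquery only once: one option is to materialise the per-SKU stock sum into a temp table up front and reference that everywhere. The sketch below assumes the same tables; #AvailableStock is a made-up name, and note the two copies in the original are not quite identical (@site vs @checkID, and the CASE arms differ), so they would need reconciling first.

```sql
-- Compute the per-SKU stock sum once, instead of once per STUFF() subquery.
-- #AvailableStock is a hypothetical temp table; adjust the branch list as needed.
SELECT SKU,
       SUM(CASE WHEN Stock < 0 AND BranchID = 31 THEN Stock
                WHEN BranchID <> 31 THEN Stock END) AS AvailableStock
INTO #AvailableStock
FROM Stock
WHERE BranchID IN (1,2,11,12,32,31,13,14,15,0,96)
  AND ApplicationID = @site
GROUP BY SKU

-- The repeated predicate then becomes a simple semi-join:
-- ... AND e.PrimaryPLU IN (SELECT SKU FROM #AvailableStock
--                          WHERE AvailableStock - p.MinStockQty > 0)
```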
asked Dec 12, 2012 at 18:01

  • Please include the execution plan. Commented Dec 12, 2012 at 18:08
  • Also, please describe your data structures, mainly the primary keys for the tables involved in the query. Commented Dec 12, 2012 at 18:36
  • I have a feeling that, if we knew a little more about your relationships (entity, not personal), we might be able to figure out what bits and pieces could be simplified. Some of your CASE statements have unnecessary conditions... Commented Dec 12, 2012 at 23:12
  • If you want a really good and efficient answer, it would help most to give the community all the pieces needed to get up and running with the query. If the data is not confidential, could you generate CREATE TABLE and INSERT statements for all of the relevant tables as pasties and link them in your post? If it is confidential, then manufacture some fake data that shows the problem. Ideally, provide a SQLFiddle with your query running and showing results, and you will get lots of help. Commented Apr 18, 2013 at 13:15

2 Answers

Subqueries should not be used here. Prefer a join with the table/view:

SELECT *
FROM ProductDescription p
INNER JOIN Products pr
        ON pr.ProductCode = p.ProductCode
       AND pr.Department > 0
       AND pr.ApplicationID = @site
       AND pr.Published = 1
LEFT JOIN Stock st
       ON st.SKU = pr.PrimaryPLU
WHERE p.ApplicationID = @site
  AND p.LanguageID = @langid
  AND p.Published = 1
  AND p.Deleted = 0

This is not the complete SQL, but it might help you refactor your code.
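
In the same spirit, the three correlated MAX() subqueries in the original (SellTo, WasTo, Department) all scan Products with identical predicates, so they could be collapsed into one grouped derived table joined once. A sketch under the same schema assumptions, not a tested rewrite:

```sql
-- One pass over Products replaces three correlated scalar subqueries.
SELECT p.*, agg.SellTo, agg.WasTo, agg.Department
FROM ProductDescription p
INNER JOIN (SELECT pr.ProductCode,
                   MAX(pr.Sell)       AS SellTo,
                   MAX(pr.WholeSale)  AS WasTo,
                   MAX(pr.Department) AS Department
            FROM Products pr
            WHERE pr.ApplicationID = @site
            GROUP BY pr.ProductCode) agg
        ON agg.ProductCode = p.ProductCode
WHERE p.ApplicationID = @site
```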

answered Dec 19, 2012 at 8:53

Use the Display Estimated Execution plan and Show Execution Plan on the Query Analyzer menu.

Pay special attention to the 'Percentage of Overall Query' numbers. You can make significant improvements by focusing on any single step that is taking the most time.

The number-one no-brainer to look for is a Table Scan of any large table. A Table Scan of a small table (less than 100 rows or so) is perfectly OK.

You can hover over the connecting lines between the steps to show the amount of data being transferred at each filtering step. This can give you a clue as to what could be optimized: the more data that is being moved around and joined with other large amounts of data, the slower the query.

Be aware of the kind of join strategy being selected by the engine. The lack of appropriate indexes can force the engine to pick a very slow join strategy. As you add indexes, monitor the choices made by the engine each time you redo the execution plan. Watch for table scans to disappear and index seeks to show up. Watch for join strategies to change, and perhaps for the overall data flow to be rearranged in a more efficient way once appropriate indexes are added. Also, once you develop a baseline for the approximately correct plan for this type of data pull, you will have an easier time if performance problems come up in the future.
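
As an illustration of the kind of index that could turn the repeated Stock scans in the question into seeks (the column choices are an assumption read off the predicates shown, not a tested recommendation):

```sql
-- Hypothetical covering index for the Stock lookups: the query filters on
-- ApplicationID and BranchID, groups by SKU, and sums the Stock column.
CREATE NONCLUSTERED INDEX IX_Stock_App_Branch_SKU
    ON Stock (ApplicationID, BranchID, SKU)
    INCLUDE (Stock);
```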

answered Aug 17, 2013 at 18:30