I'm currently running into an issue with one of our databases in Azure where one table is forcing us to scale the entire database up to cope with the CPU requirements. The basic structure of the table is as follows:
column_a is unique and not nullable. column_b can be null.
This table has ~1.6 million rows in it, so it's not a large table, but it is used throughout the system we have built. We have several feeds pushing data in, each of which requires an entry in this table: for each new request we receive, we check for an existing entry, and if one doesn't exist it is created and its id returned. Once that data is received, we have subscribers which receive that data and use this table to link other information the system holds in a nice interface.
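The check-then-create step described above can be sketched roughly like this (using the redacted names from the schema below; the actual application code is not shown in the question, so the exact value and variable names are assumptions):

```sql
-- Sketch of the lookup-or-insert step, assuming the redacted schema below.
DECLARE @param1 varchar(50) = 'some-feed-key';  -- hypothetical input value

SELECT TOP (1) id FROM dbo.TableName WHERE column_a = @param1;

IF @@ROWCOUNT = 0
BEGIN
    INSERT INTO dbo.TableName (column_a) VALUES (@param1);
    SELECT SCOPE_IDENTITY() AS id;  -- return the newly created id
END
```

The SELECT here is the same statement that shows up as the most expensive query below, since it runs once for every incoming request.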
[Image: table of most expensive queries]
The table above shows the most expensive queries run against this database over a 24-hour period. The one highlighted in yellow is SELECT TOP (1) id FROM TableName WHERE column_a = @param1.
Is this just bad database architecture or are we missing a trick to optimize for the number of reads we are performing on the table? Unfortunately, the number of reads being performed is only going to increase as we are adding new data feeds every month and we are aiming to have 5 - 10x the number of feeds by the end of this year.
Any help is much appreciated and my apologies if any of the above is unclear.
Edit
Create Table query
CREATE TABLE [dbo].[TableName](
[id] [bigint] IDENTITY(1,1) NOT NULL,
[column_a] [varchar](50) NOT NULL,
[column_b] [varchar](50) NULL,
CONSTRAINT [PK_TableName] PRIMARY KEY CLUSTERED
(
[id] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY],
CONSTRAINT [AK_Column_A] UNIQUE NONCLUSTERED
(
[column_a] ASC
)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
Edit 2
Please understand that I have had to redact the names.
[Image: execution plan]
2 Answers
There is a plan which has a predicate of CONVERT_IMPLICIT(nvarchar(50), column_a, 0) = [@param1].
Someone is passing an NVARCHAR parameter, which prevents index use. Either fix the calling code or alter the column to NVARCHAR(50). Comparing a VARCHAR parameter to an NVARCHAR column is not problematic: NVARCHAR has higher data type precedence, so the parameter is converted instead of the column.
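To make the difference concrete, here is a sketch of the two parameterizations using sp_executesql (the key value is a placeholder; the real calling code isn't shown in the question):

```sql
-- Problematic: parameter typed nvarchar forces CONVERT_IMPLICIT on the
-- varchar column, so the unique index on column_a cannot be seeked.
EXEC sp_executesql
    N'SELECT TOP (1) id FROM dbo.TableName WHERE column_a = @param1',
    N'@param1 nvarchar(50)', @param1 = N'some-key';

-- Fixed: parameter typed varchar, matching the column, so the optimizer
-- can seek on the AK_Column_A unique index.
EXEC sp_executesql
    N'SELECT TOP (1) id FROM dbo.TableName WHERE column_a = @param1',
    N'@param1 varchar(50)', @param1 = 'some-key';
```

With the correctly typed parameter, the predicate is sargable and the query becomes a cheap index seek rather than a scan that converts every row.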
I would say the issue is the number of executions. When you take the time per execution for the first two rows, it's about 0.8 seconds per row, but the select is executed so often that it eats up CPU by comparison.
In addition, what is the datatype of @param1?
- I originally commented stating it was a varchar; however, I have since checked and it is indeed nvarchar. I have added the execution plan to my original post also. – goingrogue, Apr 28, 2021 at 19:31
- There is your solution. Make whatever is executing that code do it with the right datatype, varchar instead of nvarchar. Or change the data type in the table. – Tibor Karaszi, Apr 29, 2021 at 3:59
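If changing the callers is impractical, the alternative the comment mentions, changing the data type in the table, would look roughly like this (a sketch assuming the schema from the question; the unique constraint has to be dropped and recreated around the column change):

```sql
-- Alternative fix: widen column_a to nvarchar(50) to match the callers.
-- Note: this doubles the storage for the column and should be tested
-- for downstream impact before running in production.
ALTER TABLE dbo.TableName DROP CONSTRAINT AK_Column_A;

ALTER TABLE dbo.TableName ALTER COLUMN column_a nvarchar(50) NOT NULL;

ALTER TABLE dbo.TableName
    ADD CONSTRAINT AK_Column_A UNIQUE NONCLUSTERED (column_a);
```

After this change the NVARCHAR parameters compare directly against the column with no conversion, so the unique index is usable again.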
SELECT TOP (1) id FROM TableName WHERE column_a = @param1
is being reported as the highest-cost query run in a 24-hour window, as shown above. Its plan contains the predicate CONVERT_IMPLICIT(nvarchar(50), column_a, 0) = [@param1] and has significantly higher costs associated.