
I'm trying to figure out the peak load I can sustain on new hardware I procured for validating backups and performing DBCC checks. I've been using CrystalDiskMark to get throughput stats, which helped me benchmark sequential I/O for the copy/restore tasks, but I'm having trouble gauging how much random I/O I can sustain for the DBCC check. I'm thinking about using Iometer and SQLIOSim, but I'd like to know what configuration would best simulate a DBCC check.

The hardware I'm testing consists of one R720 with dual E5-2609s (8 cores total), 32 GB RAM, Windows Server 2008 R2 Standard, SQL Server 2008 R2 Standard with SP2, and a PowerVault 3620f with 24 15k SAS spindles hooked up to two dual-port HBAs on the R720. I've been experimenting with 4-, 8- and 12-spindle RAID 0 groups (I can afford to lose the fault tolerance, as the DBs have a life expectancy of minutes as part of the testing process).

I'm thinking I can run multiple simultaneous DBCC checks on the above hardware without hitting disk contention. I have the option to upgrade the RAM to 64 GB and the OS to Enterprise, but probably can't upgrade SQL Server to Enterprise due to licensing costs.

Any suggestions on how to determine the max random I/O for DBCC using Iometer, SQLIOSim, or another utility would be deeply appreciated.

asked Feb 26, 2013 at 15:52

2 Answers


SQLIO is good for raw disk I/O load testing, but a good way to simulate a comprehensive load test is to use the SQL Profiler tool included with most SQL Server installations, or the SQL Load Test utility. There are a number of different ways to use these utilities, along with other stock tools, to achieve your goal.

You can simply run Profiler during a normal load that you would like to record, save the trace off, and replay it against the server to load test under realistic conditions. If you are using SQL Server 2012, you can use the Distributed Replay utility to perform the same kind of testing, but from multiple sources to simulate load. If you are looking for a commercial option, Quest Software's Benchmark Factory can also drive SQL Server load tests.
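
As a rough illustration, here is a minimal sketch of capturing a trace server-side without the Profiler GUI. The trace path is a placeholder, and a trace you intend to replay needs the full set of events and columns from Profiler's TSQL_Replay template; only SQL:BatchCompleted/TextData is shown here:

    -- Minimal server-side trace sketch; N'D:\traces\workload' is a placeholder path.
    DECLARE @TraceID int;
    DECLARE @maxsize bigint;
    SET @maxsize = 1024;  -- MB per rollover file

    -- Option 2 = TRACE_FILE_ROLLOVER
    EXEC sp_trace_create @TraceID OUTPUT, 2, N'D:\traces\workload', @maxsize;

    -- Event 12 = SQL:BatchCompleted, column 1 = TextData
    EXEC sp_trace_setevent @TraceID, 12, 1, 1;

    EXEC sp_trace_setstatus @TraceID, 1;  -- start the trace
    -- ... run the workload you want to record ...
    EXEC sp_trace_setstatus @TraceID, 0;  -- stop the trace
    EXEC sp_trace_setstatus @TraceID, 2;  -- close it and delete the definition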

answered Jul 21, 2013 at 21:50

I'm surprised this question went unanswered for so long.

For this scenario, I'd suggest focusing less on synthetic benchmarks with tools like SQLIO and instead testing the actual RESTORE DATABASE and DBCC CHECKDB performance. SQLIO or CrystalDiskMark will give you useful information about the performance characteristics of your I/O subsystem, but what will really tell you what performance to expect for your specific needs is scripting a real-world workload with the RESTORE DATABASE and DBCC CHECKDB commands you want to run, on the databases you want to run them on.
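
A minimal sketch of what one iteration of that workload script might look like (the database name, logical file names, and paths are all placeholders for your own):

    -- Placeholder names and paths throughout; point these at your backups and scratch volumes.
    RESTORE DATABASE [ScratchDB]
        FROM DISK = N'D:\backups\ProdDB.bak'
        WITH MOVE N'ProdDB_Data' TO N'E:\scratch\ScratchDB.mdf',
             MOVE N'ProdDB_Log'  TO N'F:\scratch\ScratchDB.ldf',
             REPLACE, STATS = 10;

    -- Full integrity check; NO_INFOMSGS keeps the output limited to actual errors.
    DBCC CHECKDB (N'ScratchDB') WITH NO_INFOMSGS, ALL_ERRORMSGS;

    DROP DATABASE [ScratchDB];

Run several copies of that in parallel against different databases on different RAID groups to find out how many concurrent checks your spindles will actually sustain.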

Monitor with Perfmon while you run the load test, and watch for the point where throughput (LogicalDisk\Disk Bytes/sec) stays constant while latency (LogicalDisk\Avg. Disk sec/Transfer) keeps increasing; that's when you've hit a bottleneck, at which point you either need to do some research and tuning or decide whether the performance is acceptable. Note also that DBCC CHECKDB will intentionally saturate your I/O subsystem, even with only one concurrent integrity check, but this isn't necessarily a bad thing, as it helps ensure the I/O subsystem is kept busy.
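
If you'd rather sample from inside SQL Server, sys.dm_io_virtual_file_stats exposes the same signals per database file. A rough sketch; the counters are cumulative since startup, so take two snapshots a known interval apart and diff them to get throughput and average latency:

    -- Cumulative I/O and stall figures per file; diff two snapshots to get rates.
    SELECT DB_NAME(vfs.database_id)                         AS database_name,
           vfs.file_id,
           vfs.num_of_bytes_read + vfs.num_of_bytes_written AS total_bytes,
           vfs.num_of_reads + vfs.num_of_writes             AS total_ios,
           vfs.io_stall_read_ms + vfs.io_stall_write_ms     AS total_stall_ms
    FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
    ORDER BY total_stall_ms DESC;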

answered Jul 21, 2013 at 23:49
