
I've got about 150,000 keys in a Redis cache, and need to delete >95% of them - all keys matching a specific key prefix - as part of a cache rebuild. As I see it, there are three ways to achieve this:

  1. Use server.Keys(pattern) to pull out the entire key list matching my prefix pattern, and iterate through the keys calling KeyDelete for each one.
  2. Maintain a list of keys in a Redis set - each time I insert a value, I also insert the key in the corresponding key set, and then retrieve these sets rather than using Keys. This would avoid the expensive Keys() call, but still relies on deleting tens of thousands of records one by one.
  3. Isolate all of my volatile data in a specific numbered database, and just flush it completely at the start of a cache rebuild.

I'm using .NET and the StackExchange.Redis client - I've seen solutions elsewhere that use the CLI or rely on Lua scripting, but nothing that seems to address this particular use case - have I missed a trick, or is this just something you're not supposed to do with Redis?

(Background: Redis is acting as a view model in front of the Microsoft Dynamics CRM API, so the cache is populated on first run by pulling around 100K records out of CRM, and then kept in sync by publishing notifications from within CRM whenever an entity is modified. Data is cached in Redis indefinitely and we're dealing with a specific scenario here where the CRM plugins fail to fire for a period of time, which causes cache drift and eventually requires us to flush and rebuild the cache.)
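For concreteness, here's roughly what I have in mind for option 1 - a minimal sketch only, assuming a single non-clustered server and a made-up "viewmodel:" key prefix. (As I understand it, IServer.Keys pages with SCAN rather than issuing one blocking KEYS call on servers that support it, but it still means deleting the keys one chunk at a time.)

using System.Collections.Generic;
using System.Linq;
using StackExchange.Redis;

public static class CacheCleanup
{
    // Deletes every key under a hypothetical "viewmodel:" prefix.
    public static void DeleteByPrefix(ConnectionMultiplexer muxer)
    {
        IDatabase db = muxer.GetDatabase();
        IServer server = muxer.GetServer(muxer.GetEndPoints().First());

        var batch = new List<RedisKey>(1000);
        // IServer.Keys enumerates matching keys page by page (SCAN under the hood
        // where supported), so the server is never asked for the whole keyspace at once.
        foreach (RedisKey key in server.Keys(pattern: "viewmodel:*", pageSize: 1000))
        {
            batch.Add(key);
            if (batch.Count == 1000)
            {
                db.KeyDelete(batch.ToArray());   // one DEL per 1000 keys
                batch.Clear();
            }
        }
        if (batch.Count > 0)
        {
            db.KeyDelete(batch.ToArray());       // delete the final partial batch
        }
    }
}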

asked Apr 7, 2016 at 15:21

2 Answers


Use scanStream instead of KEYS and it will work like a charm. Docs: https://redis.io/commands/scan. The code below collects an array of keys starting with LOGIN:: - you can then loop through the array and run the Redis DEL command to delete the corresponding keys.

Example code in Node.js:

const Redis = require('ioredis');
const redis = new Redis(); // defaults to 127.0.0.1:6379

const keysArray = [];
const stream = redis.scanStream({
  match: 'LOGIN::*' // only keys with this prefix
});

stream.on('data', (keys = []) => {
  // each 'data' event delivers one page of the SCAN cursor
  for (const key of keys) {
    if (!keysArray.includes(key)) {
      keysArray.push(key); // push is synchronous, no await needed
    }
  }
});

stream.on('end', async () => {
  // scan finished - delete everything we collected
  if (keysArray.length > 0) {
    await redis.del(keysArray);
  }
});
answered Aug 19, 2020 at 8:20


Both options 2 & 3 are reasonable.

Steer clear of option 1. KEYS really is slow and only gets slower as your keyspace grows.

I'd normally go for 2 (without Lua - adding Lua would increase the learning curve needed to support the solution, which of course is fine when justified and as long as its existence is clearly documented), but 3 could definitely be a contender: fast and simple, as long as you can be sure you won't exceed the configured database limit.
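For illustration, option 2 with StackExchange.Redis could look something like the sketch below - the "viewmodel:index" set name is made up, and it assumes the index set is updated on every cache write:

using System.Linq;
using StackExchange.Redis;

public static class KeySetCleanup
{
    private const string IndexSet = "viewmodel:index";   // hypothetical index set

    // Call this for every cache write so the index stays complete.
    public static void CacheSet(IDatabase db, string key, string value)
    {
        db.StringSet(key, value);
        db.SetAdd(IndexSet, key);
    }

    // On rebuild: read the tracked keys from the set and delete them in chunks.
    public static void DeleteTracked(IDatabase db)
    {
        RedisKey[] tracked = db.SetMembers(IndexSet)
                               .Select(member => (RedisKey)(string)member)
                               .ToArray();

        for (int i = 0; i < tracked.Length; i += 1000)
        {
            db.KeyDelete(tracked.Skip(i).Take(1000).ToArray());   // one DEL per 1000 keys
        }

        db.KeyDelete(IndexSet);   // finally drop the index set itself
    }
}

Option 3 is even simpler if the volatile data really does live alone in its own numbered database: IServer.FlushDatabase(dbIndex) clears it in a single call, though you'll need allowAdmin=true in the connection string for that.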

answered Apr 7, 2016 at 20:09

1 Comment

Was this helpful? What solution did you go with? Thanks
