
I am trying to learn basic rate limiting in an ASP.NET Core 8.0 Web API. I set the permit limit to 5, the window to 1 minute, and the queue limit to 2.

For context, I have several other services registered in my Program.cs, e.g. for Swagger, a global exception handler, the Couchbase database, the Serilog logger, and so forth.

Now I am testing the default endpoint in WeatherForecastController by applying [EnableRateLimiting("fixed")] to it, but it isn't working as expected. The first 5 requests return the data as they should. On the 6th request, Swagger displays a 'loading' spinner for what seems like an indefinite period (I didn't time it, so I don't know whether it waits out the 1-minute window), and then it fetches and shows the data again with a 200 OK status.

If my understanding is correct, the 6th request should immediately return a 429 status code with a message, right? I have a hunch that this is due to the wrong placement of AddFixedRateLimiter() and UseRateLimiter() in my Program.cs file.

Let me share some code - RateLimiterExtension.cs:

using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

public static class RateLimiterExtension
{
    public static IServiceCollection AddFixedRateLimiter(this IServiceCollection services)
    {
        services.AddRateLimiter(options =>
        {
            options.AddFixedWindowLimiter("fixed", builder =>
            {
                builder.PermitLimit = 5;                                         // Number of requests allowed per window
                builder.Window = TimeSpan.FromMinutes(1);                        // Length of the window
                builder.QueueProcessingOrder = QueueProcessingOrder.OldestFirst; // Order in which queued requests are processed
                builder.QueueLimit = 2;                                          // Maximum number of queued requests
            });

            options.OnRejected = async (context, token) =>
            {
                context.HttpContext.Response.StatusCode = 429;
                await context.HttpContext.Response.WriteAsync("Rate limit exceeded!", token);
            };
        });

        return services;
    }
}

Program.cs:

var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddTransient<GlobalExceptionHandlingMiddleware>();
builder.Services.AddControllers();
builder.Services.AddSwaggerExplorer()
 .InjectCouchbaseContext(builder.Configuration);
builder.Services.AddApplicationServices();
Log.Logger = new LoggerConfiguration()
 .ReadFrom.Configuration(builder.Configuration)
 .CreateLogger();
builder.Services.AddFixedRateLimiter();
var app = builder.Build();
app.ConfigureSwaggerExplorer()
 .UseHttpsRedirection()
 .UseAuthorization();
app.UseMiddleware<GlobalExceptionHandlingMiddleware>();
app.UseRateLimiter();
app.MapControllers();
app.Lifetime.ApplicationStopped.Register(() =>
{
 app.Services.GetRequiredService<ICouchbaseLifetimeService>().Close();
});
app.Run();

And finally in the controller:

[ApiController]
[Route("[controller]")]
[EnableRateLimiting("fixed")]
public class WeatherForecastController : ControllerBase
{
    // ...

    [HttpGet(Name = "GetWeatherForecast")]
    public IEnumerable<WeatherForecast> Get()
    {
        // ...
    }
}

Please take a close look at my Program.cs, especially the placement of the AddFixedRateLimiter() and UseRateLimiter() calls. Have I implemented this correctly? Why does Swagger show the loading spinner instead of returning a 429 response? Or am I entirely mistaken and this implementation is actually working fine?

Kindly guide me in the right direction.

  • When you debug the 6th request, does the rate limiter's OnRejected handler get invoked at all? Does any server-side code get invoked? If so, what does it do from there while debugging? Commented Oct 10, 2025 at 17:08
  • Hi @David, I applied a breakpoint on the options.OnRejected line. No, it does not get hit at all when I send the 6th request. The code block in my rate limiter policy file executes only once, when Swagger loads for the first time. Commented Oct 10, 2025 at 17:15
  • "on the options.OnRejected line" - To be clear... Did you put a breakpoint on the line which defines the handler function, or on the first line within the handler function? The former would execute once when the application starts, the latter would (should) be executed when the event is handled. Commented Oct 10, 2025 at 17:20
  • @David Yes, on the first line within the handler function, i.e. services.AddRateLimiter(options => ...... Commented Oct 10, 2025 at 17:23
    You are using a queue there, so: "With queuing enabled, when a request exceeds the rate limit, it's placed in a queue where the request waits until a permit becomes available or until a timeout occurs." learn.microsoft.com/en-us/aspnet/core/performance/… The default timeout for an endpoint call is 100 seconds, I think. (And you get another 5 requests in before that happens?) Commented Oct 10, 2025 at 19:53

1 Answer


I managed to find the solution for this.

I removed the two lines setting QueueProcessingOrder and QueueLimit from the rate limiting logic in my RateLimiterExtension.cs file.
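
For reference, the registration then looks roughly like this (same policy name and limits as in the question, just without the queue settings, so requests over the limit are rejected immediately instead of being queued):

using Microsoft.AspNetCore.RateLimiting;

public static class RateLimiterExtension
{
    public static IServiceCollection AddFixedRateLimiter(this IServiceCollection services)
    {
        services.AddRateLimiter(options =>
        {
            options.AddFixedWindowLimiter("fixed", builder =>
            {
                builder.PermitLimit = 5;                  // 5 requests allowed...
                builder.Window = TimeSpan.FromMinutes(1); // ...per 1-minute window
                // QueueLimit defaults to 0, so the 6th request is rejected right away
            });

            options.OnRejected = async (context, token) =>
            {
                context.HttpContext.Response.StatusCode = 429;
                await context.HttpContext.Response.WriteAsync("Rate limit exceeded!", token);
            };
        });

        return services;
    }
}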

I also added app.UseRouting() to my Program.cs file.

I also found out that order matters when configuring the pipeline, so I put builder.Services.AddFixedRateLimiter() after all the other service registrations, i.e. just before var app = builder.Build();.
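
The relevant part of Program.cs then looks roughly like this (other registrations and middleware omitted; the exact spot for app.UseRouting() may vary, the key point is that it runs before app.UseRateLimiter()):

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
// ... Swagger, Couchbase, Serilog and other registrations ...
builder.Services.AddFixedRateLimiter(); // last, just before Build()

var app = builder.Build();

// ... Swagger, HTTPS redirection, authorization, exception middleware ...

app.UseRouting();     // endpoint routing has to run before the limiter
app.UseRateLimiter(); // so the "fixed" policy from [EnableRateLimiting] is applied

app.MapControllers();
app.Run();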

My rate limiting now works as desired and returns a 429 status code with the message once the number of HTTP requests exceeds the limit.


2 Comments

Congrats! I think the issue was with the middleware order.
The order, as well as the code inside the rate-limiter extension function.
