In modern web applications, managing traffic and protecting APIs from abuse is critical. Rate limiting and throttling control how often a client can call your API within a time frame—preventing overload, reducing costs, and improving reliability. In ASP.NET Core Web API, you can build resilient systems with built-in features, third-party libraries, or custom middleware.
This guide explains what rate limiting and throttling are, why they matter, and how you can implement them in ASP.NET Core Web API with real code examples.
What Are Rate Limiting and Throttling?
Rate limiting is a technique that restricts the number of requests a client can make over a set period (for example, 100 requests per minute).
Throttling is a broader concept where you dynamically slow down or deny requests when usage thresholds are hit. Both protect your API from overuse, bots, and abusive clients.
Why You Need Rate Limiting in ASP.NET Core
Implementing rate limiting helps you:
- Prevent Denial of Service (DoS) attacks
- Ensure fair use of APIs across clients
- Minimize server overload and resource exhaustion
- Protect backend systems and databases
- Enforce subscription tiers in paid APIs
With ASP.NET Core, you can apply rate limiting at different levels: global, per endpoint, per user, or by API key.
Built-In Rate Limiting in ASP.NET Core
Starting from .NET 7, ASP.NET Core offers built-in rate limiting middleware that simplifies configuration.
Step 1 — Confirm Your Target Framework
The rate limiting middleware ships with the ASP.NET Core shared framework, so there is no separate NuGet package to install. Just make sure your project targets .NET 7 or later, for example in the .csproj:
<TargetFramework>net7.0</TargetFramework>
Step 2 — Configure Rate Limiting in Program.cs
using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

// Add rate limiting services
builder.Services.AddRateLimiter(options =>
{
    options.AddFixedWindowLimiter(policyName: "GlobalPolicy", limiterOptions =>
    {
        limiterOptions.PermitLimit = 100;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
        limiterOptions.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
        limiterOptions.QueueLimit = 2;
    });
});

var app = builder.Build();

// Enable rate limiting middleware
app.UseRateLimiter();

app.MapGet("/", () => "Hello World!");

app.Run();
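Note that a named policy such as "GlobalPolicy" only applies to endpoints that opt in with RequireRateLimiting (shown later). If you want one limit enforced on every request, you can configure a global limiter instead. The following is a minimal sketch, assuming you want to partition requests by client IP; the partition key choice and the numbers are illustrative:

builder.Services.AddRateLimiter(options =>
{
    // One shared fixed-window limit for every request, partitioned per client IP
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            httpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown",
            _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 100,
                Window = TimeSpan.FromMinutes(1)
            }));
});

The global limiter runs for every request, in addition to any named policies you attach to individual endpoints.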
Fixed Window vs Sliding Window
- Fixed Window: Resets counters at fixed intervals (e.g., every minute)
- Sliding Window: Tracks requests over a trailing time window for smoother enforcement
Beyond fixed and sliding windows, ASP.NET Core also ships token bucket and concurrency limiters, so you can match the algorithm to your traffic pattern.
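For example, these alternative strategies are registered the same way as the fixed window policy and can live in the same AddRateLimiter call shown earlier (shown standalone here for clarity); the policy names and numbers are illustrative:

builder.Services.AddRateLimiter(options =>
{
    // Sliding window: the window is split into segments, so counts decay gradually
    options.AddSlidingWindowLimiter("SlidingPolicy", o =>
    {
        o.PermitLimit = 100;
        o.Window = TimeSpan.FromMinutes(1);
        o.SegmentsPerWindow = 6;
    });

    // Token bucket: tokens refill at a steady rate, allowing short bursts
    options.AddTokenBucketLimiter("BucketPolicy", o =>
    {
        o.TokenLimit = 100;
        o.TokensPerPeriod = 20;
        o.ReplenishmentPeriod = TimeSpan.FromSeconds(10);
    });

    // Concurrency limiter: caps how many requests run at the same time
    options.AddConcurrencyLimiter("ConcurrencyPolicy", o =>
    {
        o.PermitLimit = 10;
        o.QueueLimit = 5;
    });
});

Token bucket suits bursty traffic because unused capacity accumulates as tokens, while the concurrency limiter caps simultaneous in-flight requests rather than requests per time window.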
Customizing Rate Limits per Endpoint
You can apply rate limits to specific endpoints, whether minimal API routes or controller actions. For a minimal API route, attach a named policy with RequireRateLimiting:
app.MapGet("/api/limited", () => "Limited endpoint")
.RequireRateLimiting("GlobalPolicy");
This gives you control over which routes are protected and how strict the limits should be.
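For controller-based APIs, the same named policies can be attached with attributes instead of the fluent call. A brief sketch, assuming controllers are mapped with app.MapControllers() and UseRateLimiter() is in the pipeline (the controller and routes are illustrative):

using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.RateLimiting;

[ApiController]
[Route("api/[controller]")]
[EnableRateLimiting("GlobalPolicy")]   // apply the named policy to every action
public class OrdersController : ControllerBase
{
    [HttpGet]
    public IActionResult Get() => Ok("Rate-limited endpoint");

    [HttpGet("health")]
    [DisableRateLimiting]              // opt this action out of rate limiting
    public IActionResult Health() => Ok("Not rate limited");
}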
Rate Limiting by Client or API Key
For APIs with different users or subscription plans, you can rate limit per API key:
1. Extract the API key from headers.
2. Define a policy keyed by the API key.
3. Use custom logic to enforce limits per client.
builder.Services.AddRateLimiter(options =>
{
    options.AddPolicy("ApiKeyPolicy", context =>
    {
        // Requests without an X-Api-Key header all share the empty-string partition
        var apiKey = context.Request.Headers["X-Api-Key"].ToString();

        return RateLimitPartition.GetFixedWindowLimiter(apiKey, _ => new FixedWindowRateLimiterOptions
        {
            PermitLimit = 1000,
            Window = TimeSpan.FromHours(1)
        });
    });
});
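As with any named policy, it only takes effect on endpoints that opt in; for example (the /api/orders route here is illustrative):

app.MapGet("/api/orders", () => "Per-key limited")
    .RequireRateLimiting("ApiKeyPolicy");

You may also want to reject requests with a missing or unknown API key before they reach the limiter, or give that shared partition a much stricter limit.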
Handling Rate Limit Responses
When a client exceeds the limit, your API should return:
HTTP/1.1 429 Too Many Requests
Retry-After: 60
Content-Type: application/json
You can customize the rejection response, including the Retry-After header and body:
options.OnRejected = async (context, token) =>
{
    context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
    // Surface the Retry-After hint when the limiter provides one
    if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
        context.HttpContext.Response.Headers.RetryAfter = ((int)retryAfter.TotalSeconds).ToString();
    await context.HttpContext.Response.WriteAsync("Rate limit exceeded. Try again later.", token);
};
Using Third-Party Libraries
If you need features beyond the built-in middleware, such as IP filtering, per-client rules, or counters shared across instances, consider libraries such as:
- AspNetCoreRateLimit: IP- and client-ID-based rules with in-memory or distributed (for example, Redis-backed) counter stores
- Polly: rate limiter and resilience policies, typically applied to outbound HTTP calls
Choose based on where the limit needs to live: AspNetCoreRateLimit protects your inbound endpoints, while Polly is better suited to throttling the calls your API makes to other services.
Testing and Monitoring
Always test your rate-limited API (see the sketch after this list) to ensure:
- Limits are applied accurately
- Responses include correct headers
- Legitimate users are not blocked unfairly
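A minimal smoke test is to hammer an endpoint past its limit and confirm the 429 behavior. The sketch below assumes the API is running locally at https://localhost:5001 and reuses the illustrative /api/limited route from earlier:

// Send more requests than the policy allows and confirm the API starts
// returning 429 together with a Retry-After header.
using var client = new HttpClient { BaseAddress = new Uri("https://localhost:5001") };

var succeeded = 0;
for (var i = 0; i < 120; i++)
{
    var response = await client.GetAsync("/api/limited");

    if ((int)response.StatusCode == 429)
    {
        Console.WriteLine($"Throttled after {succeeded} successful requests. Retry-After: {response.Headers.RetryAfter}");
        break;
    }
    succeeded++;
}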
Monitor metrics using tools like Application Insights, Prometheus, or logging middleware.
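For quick visibility without extra tooling, you can expand the OnRejected callback from earlier so it also logs each rejection; a sketch, assuming the default logging setup (the logger category name is illustrative):

options.OnRejected = async (context, token) =>
{
    // Resolve a logger from the request services and record who was throttled
    var logger = context.HttpContext.RequestServices
        .GetRequiredService<ILoggerFactory>()
        .CreateLogger("RateLimiting");

    logger.LogWarning("Request to {Path} from {IP} was rate limited",
        context.HttpContext.Request.Path,
        context.HttpContext.Connection.RemoteIpAddress);

    context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
    await context.HttpContext.Response.WriteAsync("Rate limit exceeded. Try again later.", token);
};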
Best Practices
- Start with reasonable limits (e.g., 100–500 requests/minute)
- Use Retry-After headers to inform clients
- Log throttled requests for insight into abuse
- Scale policies for different user tiers
- Consider distributed stores like Redis in microservices
Conclusion
Rate limiting and throttling protect your ASP.NET Core Web API from abuse and ensure fairness among users. With .NET’s built-in middleware and clear customization options, you can implement policies that scale with your application’s needs.
Whether you use fixed window counters, sliding windows, or external libraries, enforcing limits is a vital part of building resilient APIs in today’s distributed web.