There's no official maximum URL length, but browsers, servers, and CDNs impose practical limits. Exceeding these limits causes silent failures, errors, or truncation.
Key Takeaways
1. Keep URLs under 2,000 characters for maximum compatibility
2. Modern Chrome supports 2MB+ URLs, but servers often reject them
3. Internet Explorer had a 2,083-character limit (legacy concern)
4. CDNs and proxies may impose their own limits (often 8-16KB)
5. Use POST requests for data that would create long URLs
> “HTTP does not place a predefined limit on the length of a request-line... A server that receives a request-target longer than any URI it wishes to parse MUST respond with a 414 (URI Too Long) status code.”
>
> — RFC 7230, Section 3.1.1
Browser Limits
Browser URL limits are more generous than you might expect from the Internet Explorer era. Modern browsers handle URLs well into the megabytes—but that doesn't mean you should use them. The practical limit is determined by the weakest link in your infrastructure chain.
| Browser | URL Length Limit | Notes |
|---|---|---|
| Chrome | ~2MB | Very generous but server limits apply |
| Firefox | ~65,000 chars | May vary by version |
| Safari | ~80,000 chars | May vary by version |
| Edge (Chromium) | ~2MB | Same as Chrome |
| IE 11 | 2,083 chars | Hard limit, truncates silently |
The IE 11 limit is worth noting if you still support legacy systems—it silently truncates URLs without warning. For modern browsers, you'll hit server limits long before browser limits become a problem.
Server Limits
Server limits are often stricter than browser limits and vary by platform. These defaults are configurable, but if you're hitting them regularly, it's a sign you should rethink your URL design rather than raising limits.
| Server/Service | Default Limit | Configurable |
|---|---|---|
| Apache | 8,190 bytes | LimitRequestLine directive |
| Nginx | 8KB (8,192 bytes) | large_client_header_buffers |
| IIS | 16,384 bytes | maxUrl setting |
| Node.js | 16KB (8KB in some older versions) | --max-http-header-size flag |
| Cloudflare | 16KB | Not configurable |
| AWS ALB | 16KB | Not configurable |
CDN and cloud provider limits are particularly important because you often can't change them. Cloudflare's 16KB limit applies to all plans, and AWS Application Load Balancers have a hard 16KB cap. Design your URLs with these constraints in mind from the start.
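Where you do control the server, the defaults above can be raised. These snippets show the relevant directives for Apache and Nginx; the values are illustrative, not recommendations:

```nginx
# Apache (httpd.conf) — raise the request-line cap from the 8,190-byte default
LimitRequestLine 16384

# Nginx (nginx.conf, http or server block) — allow larger request headers
# (number of buffers, then size per buffer)
large_client_header_buffers 4 16k;
```

Remember that raising a server limit does nothing for fixed CDN and load-balancer caps upstream of it.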
Error Responses
When a URL exceeds server limits, the error response tells you something went wrong—but not always clearly. Understanding the common error codes helps you diagnose URL length issues quickly.
| HTTP Status | Meaning |
|---|---|
| 414 URI Too Long | Server explicitly rejects the URL |
| 400 Bad Request | Generic error (check server logs) |
| 500 Internal Server Error | Server crashed processing the URL |
The 414 status code is the clearest signal, but many servers return generic 400 errors. If you see unexplained 400s with large parameter payloads, URL length should be your first suspect. Check server logs for more specific error messages.
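A small helper can make these diagnoses explicit in client code. This is a sketch based on the table above; the message strings are our own:

```javascript
// Map HTTP status codes to likely URL-length diagnoses (per the table above).
// Returns null for statuses not typically related to URL length.
function diagnoseUrlLengthError(status) {
  switch (status) {
    case 414: return 'URI too long: server explicitly rejected the URL';
    case 400: return 'Bad request: possibly URL length; check server logs';
    case 500: return 'Server error: may have crashed processing the URL';
    default:  return null;
  }
}
```

Calling this from your fetch error path turns a vague failure into an actionable log line.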
Safe Limit Recommendation
Rather than pushing limits, build your application to detect and handle long URLs gracefully. The 2,000 character threshold works across all modern browsers and most server configurations without adjustment.
```javascript
// Check URL length before sending
function isUrlSafe(url) {
  // 2,000 chars is safe for all modern browsers and most servers
  const MAX_SAFE_LENGTH = 2000;
  return url.length <= MAX_SAFE_LENGTH;
}

// Handle long URLs
function buildUrl(base, params) {
  const url = new URL(base);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  if (!isUrlSafe(url.href)) {
    console.warn(`URL too long: ${url.href.length} chars. Consider POST.`);
    // Either truncate, use POST, or chunk the request
  }
  return url.href;
}
```

This helper function gives you early warning when URLs approach dangerous lengths. The console warning helps during development, but in production you should implement a proper fallback strategy: either switching to POST or notifying the user.
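That fallback can be sketched as a single decision function that plans the request instead of just warning. The endpoint and parameter names here are illustrative:

```javascript
// Sketch: fall back to POST when the GET URL would exceed the safe limit.
const MAX_SAFE_LENGTH = 2000;

function planRequest(base, params) {
  const url = new URL(base);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  if (url.href.length <= MAX_SAFE_LENGTH) {
    return { method: 'GET', url: url.href };
  }
  // Too long: send the same parameters in a POST body instead.
  return { method: 'POST', url: base, body: JSON.stringify(params) };
}
```

The caller then passes the returned plan straight to `fetch`, so the GET-versus-POST decision lives in one place.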
When you genuinely need to pass large amounts of data, there are several proven alternatives to long query strings.
Handling Long Data
These patterns let you work with complex data while keeping URLs manageable. Choose the approach that best fits your architecture and caching requirements.
```javascript
// 1. Use POST instead of GET
// Instead of:
fetch('/api/search?q=' + encodeURIComponent(veryLongData));

// Do:
fetch('/api/search', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ q: veryLongData })
});
```

```javascript
// 2. Compress and encode
import pako from 'pako';

const compressed = pako.deflate(JSON.stringify(data));
const base64 = btoa(String.fromCharCode(...compressed));
const url = `/api/search?data=${encodeURIComponent(base64)}`;
```

```javascript
// 3. Store server-side and pass an ID
const { searchId } = await fetch('/api/searches', {
  method: 'POST',
  body: JSON.stringify(complexSearchParams)
}).then(r => r.json());

const url = `/api/searches/${searchId}/results`;
```

Each approach has trade-offs. POST requests break bookmarkability and change caching behavior. Compression adds complexity and CPU overhead. Server-side storage requires state management. Pick the pattern that matches your use case: for most applications, switching to POST for complex queries is the simplest solution.