Go's net/url package provides a clean, type-safe way to work with URLs. The URL struct represents a parsed URL, and url.Values handles query parameters. This guide covers everything you need for URL manipulation in Go.
Key Takeaways
- Use url.Parse() to parse URLs into a *url.URL struct
- Use url.Values for building and manipulating query strings safely
- Use QueryEscape() for query values and PathEscape() for path segments
- Always handle the error from url.Parse() in production code
- The URL struct fields like Host, Path, RawQuery are directly accessible
Parsing URLs
Unlike JavaScript's URL constructor that throws on invalid input, Go's approach uses explicit error handling. The url.Parse() function returns both a URL struct and an error, letting you handle parsing failures gracefully.
Use url.Parse() to parse a URL string into a *url.URL struct:
```go
package main

import (
	"fmt"
	"net/url"
)

func main() {
	rawURL := "https://user:pass@api.example.com:8080/v1/users?status=active&limit=10#section"
	u, err := url.Parse(rawURL)
	if err != nil {
		panic(err)
	}

	fmt.Println("Scheme:", u.Scheme)       // "https"
	fmt.Println("Host:", u.Host)           // "api.example.com:8080"
	fmt.Println("Hostname:", u.Hostname()) // "api.example.com"
	fmt.Println("Port:", u.Port())         // "8080"
	fmt.Println("Path:", u.Path)           // "/v1/users"
	fmt.Println("RawQuery:", u.RawQuery)   // "status=active&limit=10"
	fmt.Println("Fragment:", u.Fragment)   // "section"

	// User info (avoid in production URLs!)
	if u.User != nil {
		fmt.Println("Username:", u.User.Username())
		password, _ := u.User.Password()
		fmt.Println("Password:", password)
	}
}
```

The code parses a URL string and accesses its components. Notice how Go provides both fields (like Host) and methods (like Hostname()). The methods are useful when you need processed values: Hostname() strips the port, while Host includes it.
The URL Struct
The URL struct provides direct access to all components. The table below shows each field and method. Note that Path is already decoded, while RawPath preserves the encoded form.
| Field/Method | Type | Description |
|---|---|---|
| Scheme | string | Protocol (http, https, ftp) |
| Host | string | Host with optional port (example.com:8080) |
| Hostname() | string | Host without port |
| Port() | string | Port number or empty string |
| Path | string | Path component (decoded) |
| RawPath | string | Path component (encoded, if different from Path) |
| RawQuery | string | Query string (without ?) |
| Fragment | string | Fragment (without #) |
| User | *Userinfo | Username and password (if present) |
Parse vs ParseRequestURI
Go offers two parsing functions with different strictness levels. Choosing the right one depends on whether you're processing user input or validating request URLs.
```go
import (
	"fmt"
	"net/url"
)

// url.Parse() is lenient - accepts relative references
u1, _ := url.Parse("page?id=1")
fmt.Println(u1.Path) // "page"

// url.ParseRequestURI() is strict - requires an absolute URL
// or a rooted (absolute) path, as found in an HTTP request line
u2, err := url.ParseRequestURI("page?id=1")
// err: "invalid URI for request"

// Use ParseRequestURI for validating incoming request URLs
u3, err := url.ParseRequestURI("https://example.com/path")
if err != nil {
	// Handle invalid URL
}

// Parse() splits off fragments; ParseRequestURI() does not
u4, _ := url.Parse("https://example.com/page#section")
fmt.Println(u4.Fragment) // "section"

u5, _ := url.ParseRequestURI("https://example.com/page#section")
fmt.Println(u5.Fragment) // "" — the "#section" stays in the path,
// since browsers strip fragments before sending requests
```

The key distinction: Parse() accepts relative references and splits off fragments, while ParseRequestURI() requires an absolute URL or a rooted path and treats # as ordinary data (HTTP requests don't carry fragments). Use ParseRequestURI() when you need stricter validation of incoming request URLs.
With URLs parsed, the next step is usually working with query parameters. Go's url.Values type makes this straightforward.
Working with Query Strings
url.Values Type
Go handles query strings through the url.Values type, which is a map[string][]string with helper methods. Like Python, values are slices because the same key can appear multiple times.
```go
import "net/url"

// Create new Values
params := url.Values{}

// Set a single value (replaces any existing)
params.Set("q", "golang tutorials")

// Add a value (appends to existing)
params.Add("tag", "programming")
params.Add("tag", "backend")

// Get a single value (first one if multiple exist)
query := params.Get("q") // "golang tutorials"

// Get all values for a key
tags := params["tag"] // []string{"programming", "backend"}

// Check if key exists
if params.Has("q") {
	// key exists
}

// Delete a key
params.Del("tag")

// Encode to query string
queryString := params.Encode()
// "q=golang+tutorials"
// Note: Encode() sorts keys alphabetically for consistent output
```

The Values type provides Set() for single values (replacing any existing), Add() for appending values, and Get() for retrieving the first value. Access the underlying map directly with params["key"] to get all values as a slice.
Parsing Query Strings
You can parse query strings either from a URL object or from a standalone string. Both approaches use the same underlying function.
```go
import (
	"fmt"
	"net/url"
)

// From a URL
rawURL := "https://example.com/search?q=hello&tags=go&tags=web"
u, _ := url.Parse(rawURL)

// Method 1: Parse the RawQuery
params, err := url.ParseQuery(u.RawQuery)
if err != nil {
	panic(err)
}
fmt.Println(params.Get("q")) // "hello"
fmt.Println(params["tags"])  // ["go" "web"]

// Method 2: Use URL.Query() (returns new Values each time)
params = u.Query()
fmt.Println(params.Get("q")) // "hello"

// Parse a standalone query string
queryString := "name=John&age=30&city=NYC"
params, _ = url.ParseQuery(queryString)
fmt.Println(params.Get("name")) // "John"
```

Note that u.Query() returns a new Values instance each time you call it. If you modify the result, you must encode it back into u.RawQuery for the changes to take effect.
Modifying URL Query Parameters
A common task is adding or changing query parameters on an existing URL. Here's the pattern: get the values, modify them, encode them back.
```go
import (
	"fmt"
	"net/url"
)

rawURL := "https://example.com/products?category=electronics"
u, _ := url.Parse(rawURL)

// Get current params
params := u.Query()

// Modify params
params.Set("sort", "price")
params.Set("order", "asc")
params.Add("filter", "in_stock")
params.Del("category")

// Update the URL
u.RawQuery = params.Encode()
fmt.Println(u.String())
// "https://example.com/products?filter=in_stock&order=asc&sort=price"
```

This modify-encode-assign pattern is important to remember. Since Query() returns a copy, you must explicitly write changes back. The Encode() method handles all the escaping and formatting.
URL Encoding
Like Python and JavaScript, Go provides separate functions for encoding different parts of a URL. The difference is how they handle spaces and which characters they consider safe.
| Function | Encodes Space As | Best For |
|---|---|---|
| QueryEscape() | + | Query parameter values |
| PathEscape() | %20 | Path segments |
Use QueryEscape() for query parameter values (spaces become +) and PathEscape() for URL path segments (spaces become %20). This matches the HTML form encoding vs. RFC 3986 distinction.
```go
import (
	"fmt"
	"net/url"
)

text := "Hello World & Friends"

// QueryEscape - for query values
queryEncoded := url.QueryEscape(text)
fmt.Println(queryEncoded)
// "Hello+World+%26+Friends"

// PathEscape - for path segments
pathEncoded := url.PathEscape(text)
fmt.Println(pathEncoded)
// "Hello%20World%20%26%20Friends"

// Decoding
decoded, _ := url.QueryUnescape("Hello+World")
fmt.Println(decoded) // "Hello World"
decoded, _ = url.PathUnescape("Hello%20World")
fmt.Println(decoded) // "Hello World"

// url.Values.Encode() handles encoding automatically
params := url.Values{}
params.Set("query", "Tom & Jerry")
fmt.Println(params.Encode())
// "query=Tom+%26+Jerry"
```

The key insight is that Values.Encode() handles query encoding automatically, so you rarely need to call QueryEscape() directly. Use it when building URLs manually or working with individual parameter values.
Building URLs
Go gives you direct control over URL construction through struct initialization. This is more explicit than JavaScript's approach but avoids surprising behaviors.
Building from Scratch
You can construct URLs by creating a URL struct directly or by parsing a base and modifying it.
```go
import (
	"fmt"
	"net/url"
)

// Method 1: Construct URL struct directly
u := &url.URL{
	Scheme: "https",
	Host:   "api.example.com:8080",
	Path:   "/v1/users",
}

// Add query parameters
params := url.Values{}
params.Set("status", "active")
params.Set("limit", "10")
u.RawQuery = params.Encode()
fmt.Println(u.String())
// "https://api.example.com:8080/v1/users?limit=10&status=active"

// Method 2: Parse base and modify
base, _ := url.Parse("https://api.example.com")
base.Path = "/v1/users"
base.RawQuery = url.Values{
	"status": {"active"},
	"limit":  {"10"},
}.Encode()
fmt.Println(base.String())
```

Both approaches work well. The struct literal is cleaner for building from scratch, while parse-and-modify is better when you start with an existing URL. Notice how Values can be initialized inline using a map literal.
Resolving Relative URLs
When processing links from web pages or API responses, you often need to resolve relative URLs against a base. Go's ResolveReference() method follows RFC 3986 rules.
```go
import (
	"fmt"
	"net/url"
)

base, _ := url.Parse("https://example.com/blog/posts/article")

// Relative path
relative, _ := url.Parse("../about")
resolved := base.ResolveReference(relative)
fmt.Println(resolved.String())
// "https://example.com/blog/about"

// Absolute path
absolute, _ := url.Parse("/contact")
resolved = base.ResolveReference(absolute)
fmt.Println(resolved.String())
// "https://example.com/contact"

// Same directory
sibling, _ := url.Parse("other-article")
resolved = base.ResolveReference(sibling)
fmt.Println(resolved.String())
// "https://example.com/blog/posts/other-article"

// Full URL (base ignored)
full, _ := url.Parse("https://other.com/page")
resolved = base.ResolveReference(full)
fmt.Println(resolved.String())
// "https://other.com/page"
```

The method handles all relative URL patterns: same-directory, parent directory (..), root-relative (/path), and full URLs that ignore the base entirely. This is the same behavior browsers use when resolving links.
Using with net/http
Go's net/http package integrates naturally with net/url. Here's how to build URLs for API calls and form submissions.
```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
)

func main() {
	// Build URL with query parameters
	base, _ := url.Parse("https://api.example.com/search")
	params := url.Values{}
	params.Set("q", "golang")
	params.Set("page", "1")
	base.RawQuery = params.Encode()

	// Make request
	resp, err := http.Get(base.String())
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body))
}

// For POST with form data
func postForm() {
	data := url.Values{
		"username": {"john"},
		"password": {"secret123"},
	}
	resp, err := http.PostForm("https://example.com/login", data)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
}
```

For GET requests, build the URL with query parameters and call http.Get(). For POST with form data, use http.PostForm() which takes a url.Values directly and sets the content type automatically.
Let's look at some complete examples that bring these concepts together.
Practical Examples
API Client with URL Builder
This example shows a reusable API client that encapsulates URL building logic. It demonstrates Go idioms like method receivers and the builder pattern.
```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

type APIClient struct {
	baseURL *url.URL
	apiKey  string
	client  *http.Client
}

func NewAPIClient(baseURL, apiKey string) (*APIClient, error) {
	u, err := url.Parse(baseURL)
	if err != nil {
		return nil, err
	}
	return &APIClient{
		baseURL: u,
		apiKey:  apiKey,
		client:  &http.Client{},
	}, nil
}

func (c *APIClient) buildURL(path string, params url.Values) string {
	u := *c.baseURL // Copy base URL
	// Append to the base path ("/v1" + "/search"); assigning u.Path = path
	// directly would silently drop the base path
	u.Path = c.baseURL.Path + path
	// Add API key to all requests
	if params == nil {
		params = url.Values{}
	}
	params.Set("api_key", c.apiKey)
	u.RawQuery = params.Encode()
	return u.String()
}

func (c *APIClient) Search(query string, filters map[string]string) ([]byte, error) {
	params := url.Values{}
	params.Set("q", query)
	for k, v := range filters {
		params.Set(k, v)
	}
	reqURL := c.buildURL("/search", params)
	resp, err := c.client.Get(reqURL)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	// json.RawMessage holds the raw response bytes after validating
	// that the body is well-formed JSON
	var result json.RawMessage
	if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
		return nil, err
	}
	return result, nil
}

func main() {
	client, _ := NewAPIClient("https://api.example.com/v1", "your-api-key")
	result, _ := client.Search("golang", map[string]string{
		"category": "programming",
		"sort":     "date",
	})
	fmt.Println(string(result))
}
```

The client stores a parsed base URL and builds request URLs by copying it, appending to the base path, and adding query parameters. Notice how the API key is injected into every request automatically.
URL Validator
When accepting URLs from users, you need to validate them before use. This validator checks for common security issues and returns detailed error messages.
```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

type ValidationError struct {
	Field   string
	Message string
}

func ValidateURL(rawURL string) (*url.URL, []ValidationError) {
	var errors []ValidationError

	u, err := url.Parse(rawURL)
	if err != nil {
		return nil, []ValidationError{{
			Field:   "url",
			Message: "Invalid URL format",
		}}
	}

	// Must have scheme
	if u.Scheme == "" {
		errors = append(errors, ValidationError{
			Field:   "scheme",
			Message: "Missing protocol (http:// or https://)",
		})
	} else if u.Scheme != "http" && u.Scheme != "https" {
		errors = append(errors, ValidationError{
			Field:   "scheme",
			Message: fmt.Sprintf("Invalid protocol: %s", u.Scheme),
		})
	}

	// Must have host
	if u.Host == "" {
		errors = append(errors, ValidationError{
			Field:   "host",
			Message: "Missing domain name",
		})
	}

	// Check for suspicious content
	if strings.Contains(rawURL, "<") || strings.Contains(rawURL, ">") {
		errors = append(errors, ValidationError{
			Field:   "content",
			Message: "URL contains potentially dangerous characters",
		})
	}

	if len(errors) > 0 {
		return nil, errors
	}
	return u, nil
}

func main() {
	testURLs := []string{
		"https://example.com/page",
		"example.com",
		"javascript:alert(1)",
		"https://example.com/<script>",
	}

	for _, rawURL := range testURLs {
		u, errors := ValidateURL(rawURL)
		if len(errors) > 0 {
			fmt.Printf("Invalid: %s\n", rawURL)
			for _, e := range errors {
				fmt.Printf("  - %s: %s\n", e.Field, e.Message)
			}
		} else {
			fmt.Printf("Valid: %s\n", u.String())
		}
	}
}
```

The validator uses a struct to return multiple errors, making it easy to show all issues at once rather than one at a time. It checks scheme, host presence, and dangerous characters that might indicate XSS attempts.
URL Normalizer
URL normalization is essential for comparing, caching, and deduplicating URLs. This normalizer handles case, default ports, trailing slashes, and parameter ordering.
```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

func NormalizeURL(rawURL string) (string, error) {
	u, err := url.Parse(rawURL)
	if err != nil {
		return "", err
	}

	// Lowercase scheme and host
	u.Scheme = strings.ToLower(u.Scheme)
	u.Host = strings.ToLower(u.Host)

	// Remove default ports
	if (u.Scheme == "http" && u.Port() == "80") ||
		(u.Scheme == "https" && u.Port() == "443") {
		u.Host = u.Hostname()
	}

	// Normalize path
	if u.Path == "" {
		u.Path = "/"
	}

	// Remove trailing slash (except for root)
	if u.Path != "/" && strings.HasSuffix(u.Path, "/") {
		u.Path = strings.TrimSuffix(u.Path, "/")
	}

	// Sort query parameters: Encode() emits keys in sorted order,
	// so a parse-and-re-encode round trip is all that's needed
	if u.RawQuery != "" {
		params, err := url.ParseQuery(u.RawQuery)
		if err != nil {
			return "", err
		}
		u.RawQuery = params.Encode()
	}

	// Remove fragment
	u.Fragment = ""

	return u.String(), nil
}

func main() {
	urls := []string{
		"HTTPS://Example.COM:443/path/?b=2&a=1#section",
		"https://example.com/path?a=1&b=2",
		"https://example.com:443/path/?a=1&b=2",
	}

	normalized := make(map[string]bool)
	for _, rawURL := range urls {
		norm, _ := NormalizeURL(rawURL)
		fmt.Printf("%s -> %s\n", rawURL, norm)
		normalized[norm] = true
	}
	fmt.Printf("\nUnique URLs: %d\n", len(normalized))
}
```

The normalizer demonstrates several Go idioms: modifying the URL in place, using the strings package for manipulation, and relying on Encode() to emit query keys in sorted order. After normalization, semantically equivalent URLs become string-identical.
Common Patterns
Here are utility functions you'll find useful in many projects. Each follows Go's convention of returning an error alongside the result.
Add Parameters to URL
A helper function to add query parameters while preserving existing ones.
```go
func AddParams(rawURL string, newParams map[string]string) (string, error) {
	u, err := url.Parse(rawURL)
	if err != nil {
		return "", err
	}
	params := u.Query()
	for k, v := range newParams {
		params.Set(k, v)
	}
	u.RawQuery = params.Encode()
	return u.String(), nil
}

// Usage
newURL, _ := AddParams(
	"https://example.com/search?q=go",
	map[string]string{"page": "2", "sort": "date"},
)
// "https://example.com/search?page=2&q=go&sort=date"
```

This function parses the URL, gets the current query parameters, adds the new ones, and encodes everything back. Note that Set() replaces existing values with the same key.
Extract Domain
Extract just the domain or root domain from a URL. Useful for grouping, security checks, and analytics.
```go
import (
	"net/url"
	"strings"
)

func GetDomain(rawURL string) (string, error) {
	u, err := url.Parse(rawURL)
	if err != nil {
		return "", err
	}
	return u.Hostname(), nil
}

func GetRootDomain(rawURL string) (string, error) {
	hostname, err := GetDomain(rawURL)
	if err != nil {
		return "", err
	}
	parts := strings.Split(hostname, ".")
	if len(parts) >= 2 {
		return strings.Join(parts[len(parts)-2:], "."), nil
	}
	return hostname, nil
}

// Usage
domain, _ := GetDomain("https://docs.api.example.com/page")
// "docs.api.example.com"
rootDomain, _ := GetRootDomain("https://docs.api.example.com/page")
// "example.com"
```

The GetRootDomain function is simplified and won't handle all TLDs correctly (like .co.uk). For production use, consider the golang.org/x/net/publicsuffix package.
Safe Path Joining
Joining URL paths safely requires handling leading/trailing slashes correctly. This helper uses the path package to normalize paths.
```go
import (
	"net/url"
	"path"
)

func JoinPath(baseURL string, paths ...string) (string, error) {
	u, err := url.Parse(baseURL)
	if err != nil {
		return "", err
	}
	// Join all path segments
	allPaths := append([]string{u.Path}, paths...)
	u.Path = path.Join(allPaths...)
	return u.String(), nil
}

// Usage
fullURL, _ := JoinPath("https://api.example.com/v1/", "users", "123", "posts")
// "https://api.example.com/v1/users/123/posts"

// Handles edge cases
fullURL, _ = JoinPath("https://example.com", "/absolute/path")
// "https://example.com/absolute/path"
```
// "https://example.com/absolute/path"The path.Join() function cleans up redundant slashes and handles edge cases like absolute paths. This is safer than string concatenation, which can produce invalid URLs with double slashes.