Are you looking to specify a custom Golang user agent? You're in the right place. With websites employing various techniques to regulate bot traffic, you must mimic natural browser behavior to access your desired data.
In this article, you'll learn how to configure your net/http client's user agent in Golang to emulate an actual browser.
What Is a Net/http User Agent?
The net/http User Agent (UA) string is a critical part of the HTTP headers sent with every HTTP request. These headers are metadata that convey additional information to the web server, such as authentication credentials, content type, and caching directives.
Most importantly, the user agent provides details about various components of the web client, such as browser name, version, device type, and operating system.
Here's a sample Google Chrome browser UA string:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36
It indicates that the user runs Chrome version 109.0.0.0 on a 64-bit Windows 10 (Windows NT 10.0) platform and advertises the WebKit rendering engine.
However, your default Golang net/http user agent typically looks like this:

Go-http-client/1.1

You can see yours by making a basic request to https://httpbin.io/user-agent.
The clear difference between the two UA samples above shows how easily websites can tell a net/http request apart from an actual browser. That's why specifying a custom user agent is essential.
Let's see how.
How to Set Up a Custom Golang User Agent in Net/http
To specify a User-Agent header in your Go application using net/http, follow the steps below:
1. Prerequisites
The net/http package is part of Go's standard library, included with every Go installation by default. So, there's no need for a separate installation. You only need to create your Go project and import the necessary dependencies.
Once you have everything set up, you're ready to write your code. Here's a basic script that makes a GET request to https://httpbin.io/user-agent and retrieves its text content:
package main

// import the necessary dependencies
import (
    "fmt"
    "io"
    "net/http"
)

func main() {
    // make an HTTP GET request
    response, err := http.Get("https://httpbin.io/user-agent")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    defer response.Body.Close()

    // read the response body
    body, err := io.ReadAll(response.Body)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // print the text content
    fmt.Println(string(body))
}
2. Customize the UA
net/http provides a Header.Set() method that allows you to set HTTP headers. This method takes two parameters: the header's name and its value. In this case, you'd pass "User-Agent" as the header name and your desired UA string as the value.
To do that, you must create a custom http.Client. This gives you more control over your requests and lets you manipulate headers according to your needs.
Let's break it down into smaller steps using the Google Chrome UA sample provided earlier and httpbin as the target website.
Import the necessary dependencies and create a custom HTTP client:
package main

import (
    "fmt"
    "io"
    "net/http"
)

func main() {
    // create custom HTTP client
    client := &http.Client{
        Transport: &http.Transport{},
    }
}
After that, create your GET request and set the custom user agent using Header.Set().
func main() {
    //...

    // create HTTP request
    req, err := http.NewRequest("GET", "https://httpbin.io/user-agent", nil)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // set User-Agent header
    req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36")
}
Next, make the HTTP request, close the response body, read it, and print its text content to the console.
func main() {
    //...

    // make HTTP request
    response, err := client.Do(req)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    // close the response body when done
    defer response.Body.Close()

    // read the response body
    body, err := io.ReadAll(response.Body)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // print the text content
    fmt.Println(string(body))
}
Putting it all together, here's the complete code:
package main

import (
    "fmt"
    "io"
    "net/http"
)

func main() {
    // create custom HTTP client
    client := &http.Client{
        Transport: &http.Transport{},
    }

    // create HTTP request
    req, err := http.NewRequest("GET", "https://httpbin.io/user-agent", nil)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // set User-Agent header
    req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36")

    // make HTTP request
    response, err := client.Do(req)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    // close the response body when done
    defer response.Body.Close()

    // read the response body
    body, err := io.ReadAll(response.Body)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // print the text content
    fmt.Println(string(body))
}
Run it, and the response should contain your predefined custom user agent:
{
    "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"
}
Congrats! You've set your first custom Golang user agent in net/http.
Not to spoil the party, but specifying a single UA may not be enough in most cases, as websites can eventually identify your scraper. However, you can fix this issue by rotating between user agents.
3. Use a Random User Agent in Net/http
Websites interpret multiple requests from the same user agent as automated traffic and block them accordingly. But by rotating UAs, you can appear to the web server as though your requests come from different browsers (users).
When making HTTP requests using the net/http package, you can rotate between UAs by randomly selecting from a predefined list.
Here's how you can modify your previous code to achieve this.
Start by importing the necessary dependencies. Then, define a list of User-Agent strings. You can include various UAs to emulate different browsers, operating systems, and devices. We've selected a few from this list of web-scraping user agents for this example.
package main

// import the necessary dependencies
import (
    "fmt"
    "io"
    "math/rand"
    "net/http"
)

func main() {
    // list of User-Agent strings
    userAgents := []string{
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
        // add more UA strings as needed
    }
}
Next, select a random UA from the list. To do this, generate a random index with math/rand and use it to choose one from the list. (On Go versions before 1.20, seed the generator first, e.g., with rand.Seed(time.Now().UnixNano()); Go 1.20 and later seed it automatically.)
func main() {
    //...

    // select a random UA from the list
    randomIndex := rand.Intn(len(userAgents))
    randomUA := userAgents[randomIndex]
}
After that, create a client and an HTTP request. Then, specify the randomly selected UA using Header.Set().
func main() {
    //...

    // create custom HTTP client with custom Transport
    client := &http.Client{
        Transport: &http.Transport{},
    }

    // create HTTP request
    req, err := http.NewRequest("GET", "https://httpbin.io/user-agent", nil)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // set User-Agent header
    req.Header.Set("User-Agent", randomUA)
}
Lastly, combine the steps above into the following complete code:
package main

// import the necessary dependencies
import (
    "fmt"
    "io"
    "math/rand"
    "net/http"
)

func main() {
    // list of User-Agent strings
    userAgents := []string{
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
        // add more UA strings as needed
    }

    // select a random UA from the list
    randomIndex := rand.Intn(len(userAgents))
    randomUA := userAgents[randomIndex]

    // create custom HTTP client with custom Transport
    client := &http.Client{
        Transport: &http.Transport{},
    }

    // create HTTP request
    req, err := http.NewRequest("GET", "https://httpbin.io/user-agent", nil)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // set User-Agent header
    req.Header.Set("User-Agent", randomUA)

    // make HTTP request
    response, err := client.Do(req)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    // close the response body when done
    defer response.Body.Close()

    // read the response body
    body, err := io.ReadAll(response.Body)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // print the text content
    fmt.Println(string(body))
}
Every time you run the script, a different UA will be used to make your request. For example, here are the results for three requests:
{
    "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"
}

{
    "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36"
}

{
    "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"
}
Awesome! You've successfully rotated your Golang http client user agent.
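If you make many requests, setting the UA on each one by hand gets repetitive. One way to automate rotation, sketched below under the assumption that you maintain your own UA pool, is a custom http.RoundTripper that stamps a random UA onto every outgoing request:

```go
package main

import (
    "fmt"
    "math/rand"
    "net/http"
)

// pickUA returns a random entry from the given User-Agent list.
func pickUA(userAgents []string) string {
    return userAgents[rand.Intn(len(userAgents))]
}

// uaTransport wraps another RoundTripper and sets a random
// User-Agent on every outgoing request, so rotation happens
// automatically for any request sent through the client.
type uaTransport struct {
    next       http.RoundTripper
    userAgents []string
}

func (t *uaTransport) RoundTrip(req *http.Request) (*http.Response, error) {
    // clone the request: RoundTrippers shouldn't mutate the original
    clone := req.Clone(req.Context())
    clone.Header.Set("User-Agent", pickUA(t.userAgents))
    return t.next.RoundTrip(clone)
}

func main() {
    client := &http.Client{
        Transport: &uaTransport{
            next: http.DefaultTransport,
            userAgents: []string{
                "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
                "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
            },
        },
    }

    // every request made through this client gets a random UA
    response, err := client.Get("https://httpbin.io/user-agent")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    defer response.Body.Close()
    fmt.Println("Status:", response.Status)
}
```

This keeps the rotation logic in one place: any code that uses the client picks it up for free, instead of remembering to call Header.Set() before each request.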
It's worth noting that real-world use cases require a much larger UA list, and how you construct it matters: malformed or inconsistent UAs can themselves trigger anti-bot measures.
For example, your UA string must be consistent with your other HTTP headers. If the User-Agent identifies the client as a particular browser and version while other headers suggest different characteristics or behaviors, the mismatch raises suspicion.
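To illustrate that consistency, here's a sketch of a request whose companion headers all describe the same browser as the UA string (Chrome 109 on 64-bit Windows). The exact values are illustrative; for production use, capture them from a real browser session:

```go
package main

import (
    "fmt"
    "net/http"
)

// newBrowserRequest builds a GET request whose User-Agent and companion
// headers all tell the same story: Chrome 109 on 64-bit Windows.
func newBrowserRequest(url string) (*http.Request, error) {
    req, err := http.NewRequest("GET", url, nil)
    if err != nil {
        return nil, err
    }
    req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36")
    // typical browser negotiation headers
    req.Header.Set("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8")
    req.Header.Set("Accept-Language", "en-US,en;q=0.9")
    // client hints a real Chrome 109 on Windows would also send
    req.Header.Set("Sec-CH-UA", `"Not_A Brand";v="99", "Google Chrome";v="109", "Chromium";v="109"`)
    req.Header.Set("Sec-CH-UA-Platform", `"Windows"`)
    req.Header.Set("Sec-CH-UA-Mobile", "?0")
    return req, nil
}

func main() {
    req, err := newBrowserRequest("https://httpbin.io/headers")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    // print the headers the request would carry
    for name, values := range req.Header {
        fmt.Println(name+":", values[0])
    }
}
```

If you rotate to a macOS UA string, the Sec-CH-UA-Platform hint (and any other platform-dependent header) must change with it, or the two contradict each other.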
That said, constructing and maintaining a list of well-formed UAs can take time and effort. But no worries, the following section provides a more straightforward solution.
Avoid Getting Blocked When Scraping With Net/http
Building a reliable User Agent rotation system is harder than it looks. You need to update browser versions regularly, ensure they match operating systems correctly, and remove outdated combinations.
Additionally, websites check more than just User Agents to detect bots. They analyze request patterns, header consistency, connection fingerprints, and more. Even with perfect User Agent rotation, your requests might still get blocked.
The most effective solution is to use a web scraping API like ZenRows. It provides auto-rotating, up-to-date User Agents, premium proxies, JavaScript rendering, CAPTCHA auto-bypass, and everything else you need to avoid getting blocked.
Let's see how ZenRows performs against a protected page like the Antibot Challenge page.
Start by signing up for a new account, and you'll get to the Request Builder.

Paste the target URL, enable JS Rendering, and activate Premium Proxies.
Next, select Go and click on the API connection mode. Then, copy the generated code and paste it into your script.
package main

import (
    "io"
    "log"
    "net/http"
)

func main() {
    client := &http.Client{}
    req, err := http.NewRequest("GET", "https://api.zenrows.com/v1/?apikey=<YOUR_ZENROWS_API_KEY>&url=https%3A%2F%2Fwww.scrapingcourse.com%2Fantibot-challenge&js_render=true&premium_proxy=true", nil)
    if err != nil {
        log.Fatalln(err)
    }

    resp, err := client.Do(req)
    if err != nil {
        log.Fatalln(err)
    }
    defer resp.Body.Close()

    body, err := io.ReadAll(resp.Body)
    if err != nil {
        log.Fatalln(err)
    }
    log.Println(string(body))
}
Run the code, and you'll successfully access the page:
<html lang="en">
<head>
    <!-- ... -->
    <title>Antibot Challenge - ScrapingCourse.com</title>
    <!-- ... -->
</head>
<body>
    <!-- ... -->
    <h2>You bypassed the Antibot challenge! :D</h2>
    <!-- other content omitted for brevity -->
</body>
</html>
Congratulations! 🎉 You’ve successfully bypassed the anti-bot challenge page using ZenRows. This works for any website.
Conclusion
This guide has shown you key points about User Agents in Go's net/http:
- What User Agents are and how they work.
- How to set custom User Agents in your requests.
- Ways to rotate between different User Agents.
- Why User Agent management alone isn't enough.
Keep in mind that many websites use different anti-bot mechanisms to prevent web scraping. Integrate ZenRows to make sure you extract all the data you need without getting blocked. Try ZenRows for free!