Have you ever gotten blocked while web scraping with Axios? Chances are the target site identified you as a bot because of your User Agent. The UA is a string of data your browser sends to the website's server, indicating information such as the browser and operating system you're using.
You need to change the User-Agent in Axios to avoid detection and access the information you want, and we'll explore how to do that in this tutorial.
Let's get started!
What Is the Axios User Agent?
The User Agent is an essential fingerprint for a server to identify clients and deliver the appropriate content (e.g., display a mobile version of a page for mobile browsers). Every HTTP request includes it as part of its headers.
For instance, here's what a Chrome User Agent string would look like:
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36
The above string indicates that the client is a Chrome browser running on a 64-bit Linux operating system. It complies with Mozilla's standards, uses the AppleWebKit rendering engine, and is compatible with the Safari browser.
For most websites, the first step in bot detection is checking the UA string and blocking any client without a valid web browser User Agent.
So, how do you find out your Axios User Agent? HTTPBin's user-agent endpoint returns the UA it receives with each GET request, so let's make a request and see what we get.
// npm install axios
const axios = require("axios");

axios
  .get("https://httpbin.io/user-agent")
  .then(({ data }) => console.log(data));
You should have a response similar to this:
{ 'user-agent': 'axios/1.7.2' }
The User Agent here is axios/1.7.2, the default UA string Axios sends for every request. As you can imagine, any website can easily recognize it's not a valid browser client and block it. To avoid this outcome, specify a real User-Agent in Axios.
Set User Agent in Axios
You can set a custom Axios User Agent by specifying it in your request configuration's headers option.
Let's say this is the Chrome UA string we want to set:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36
Update your Axios code to add the User Agent in the headers:
const axios = require("axios");
const headers = {
"User-Agent":
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36",
};
axios
.get("https://httpbin.io/user-agent", {
headers,
})
.then(({ data }) => console.log(data));
Run it, and you'll get a similar response to the one below:
{
  'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36'
}
Awesome! The UA has changed to the one you set.
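If every request in your script should carry the same custom UA, you can also set it once on a dedicated Axios instance instead of passing the headers option on each call. Here's a minimal sketch of that approach; the client variable name is just an example:

const axios = require("axios");

// create an instance whose requests all carry the custom User Agent by default
const client = axios.create({
  headers: {
    "User-Agent":
      "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36",
  },
});

// no per-request headers option needed anymore
client
  .get("https://httpbin.io/user-agent")
  .then(({ data }) => console.log(data));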
How to Rotate User Agents in Axios
As a next step, let's pick a random User Agent for each request. Start by creating an array of UA strings (you can find suitable options on our list of top User Agents for web scraping).
const userAgents = [
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
  // ...
];
Modify your code to rotate the User Agents in your Axios script:
const axios = require("axios");
// list of user agent strings to rotate
const userAgents = [
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
// ...
];
// select a random user agent from the list
const ua = userAgents[Math.floor(Math.random() * userAgents.length)];
// set the user agent in the headers and make a get request
const headers = {
"User-Agent": ua,
};
axios
.get("https://httpbin.io/user-agent", {
headers,
})
.then(({ data }) => console.log(data));
You'll get a random UA each time you run the script. Here's the result from running the above code three times:
// request 1
{
  'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36'
}

// request 2
{
  'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36'
}

// request 3
{
  'user-agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36'
}
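Note that the snippet above picks only one UA per script run. If your scraper fires many requests from the same process, one possible refinement is an Axios request interceptor that grabs a fresh UA from the list right before each request goes out. Here's a minimal sketch, reusing the userAgents array from above (truncated here):

const axios = require("axios");

// same list as above (truncated for brevity)
const userAgents = [
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
];

// pick a random UA right before every outgoing request
axios.interceptors.request.use((config) => {
  config.headers["User-Agent"] =
    userAgents[Math.floor(Math.random() * userAgents.length)];
  return config;
});

// each of these requests can now go out with a different UA
axios.get("https://httpbin.io/user-agent").then(({ data }) => console.log(data));
axios.get("https://httpbin.io/user-agent").then(({ data }) => console.log(data));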
While randomly changing the Axios User-Agent can work, it's not the most effective approach on its own.
Let's explore how to scale this method for web scraping and overcome its limitations.
Change the Axios User Agent At Scale
Managing a User Agent rotation system takes more work than you might think. You need to keep track of new browser versions, match them with the right operating systems, and remove old combinations.
Beyond that, modern websites don't just check User Agents. They look at your connection patterns, browser behavior, network details, and more. Even with perfect User Agent rotation, your Axios requests might still get blocked.
A better solution is to use the ZenRows Universal Scraper API. It automatically manages User Agents, handles JavaScript rendering, rotates premium proxies, auto-bypasses CAPTCHAs, and provides everything you need to avoid getting blocked.
Let's test ZenRows against a website that usually blocks Axios requests, such as the Antibot Challenge page.
Start by signing up for a new account to get to the Request Builder.

Insert the target URL into the link box, enable JS Rendering, and activate Premium Proxies.
Next, choose Node.js and then click on the API connection mode. After that, copy the generated code and paste it into your script.
const axios = require('axios');

const params = {
  url: 'https://www.scrapingcourse.com/antibot-challenge',
  apikey: '<YOUR_ZENROWS_API_KEY>',
  js_render: 'true',
  premium_proxy: 'true',
};

axios
  .get('https://api.zenrows.com/v1/', { params })
  .then(({ data }) => console.log(data))
  .catch((error) => console.error(error));
When you run this code, you'll successfully access the page:
<html lang="en">
<head>
<!-- ... -->
<title>Antibot Challenge - ScrapingCourse.com</title>
<!-- ... -->
</head>
<body>
<!-- ... -->
<h2>
You bypassed the Antibot challenge! :D
</h2>
<!-- other content omitted for brevity -->
</body>
</html>
Congratulations! 🎉 You've accessed a protected page without any complex User Agent setup.
Conclusion
In this guide, you've learned the key points about User Agents in Axios:
- What User Agents are and why they matter.
- How to set custom User Agents in your requests.
- Ways to rotate between different User Agents.
- Why User Agent management isn't enough by itself.
Keep in mind that websites use many ways to detect bots. Instead of handling everything yourself, let ZenRows manage it so you can extract all the data you need without getting blocked. Try ZenRows for free!