Set User Agent in Axios: Step-by-Step Guide

July 17, 2024 · 5 min read

Have you ever gotten blocked while web scraping with Axios? Chances are the target site identified you as a bot because of your User Agent. The UA is a string of data your browser sends to the website's server, indicating information such as the browser and operating system you're using.

You need to change the User Agent in Axios to avoid detection and access the information you want, and we'll explore how to do that in this tutorial.

Let's get started!

What Is the User Agent in Axios?

The User Agent is an essential fingerprint for a server to identify clients and deliver the appropriate content (e.g., display a mobile version of a page for mobile browsers). Every HTTP request includes it as part of its headers.

For instance, here's what a Chrome User Agent string would look like:

Example
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36

The above string indicates that the client is a Chrome browser running on a 64-bit Linux operating system. The Mozilla/5.0 prefix is a legacy compatibility token that virtually all modern browsers include, while the AppleWebKit and Safari tokens advertise the rendering engine and browser compatibility.

For most websites, the first step in bot detection is checking the UA string and blocking any client without a valid web browser User Agent.

So, how do you find out your Axios User Agent? HTTPBin's /user-agent endpoint echoes back the User Agent of any request it receives. Let's make a GET request and see what we get.

scraper.js
// npm install axios
const axios = require("axios");
axios
  .get("https://httpbin.io/user-agent")
  .then(({ data }) => console.log(data));

You should have a response similar to this:

Output
{ 'user-agent': 'axios/1.7.2' }

The User Agent here is axios/1.7.2, the default UA string Axios sends for every request. As you can imagine, any website can easily recognize it's not a valid browser client and block it. To avoid this outcome, specify a real User Agent for Axios.

Set User Agent in Axios

You can set a custom Axios User Agent by specifying it in your request configuration's headers option.

Let's say this is the Chrome UA string we want to set:

Example
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36

Update your Axios code to add the User Agent in the headers:

scraper.js
const axios = require("axios");
const headers = {
  "User-Agent":
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36",
};
axios
  .get("https://httpbin.io/user-agent", {
    headers,
  })
  .then(({ data }) => console.log(data));

Run it, and you'll get a similar response to the one below:

Output
{
  'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36'
}

Awesome! The UA has changed to the one we set.

How to Rotate User Agents in Axios

As a next step, let's get a random User Agent for each request. Start by creating an array of UA strings (you can find suitable options on our list of top User Agents for web scraping).

scraper.js
const userAgents = [
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
  // ...
];

Modify your code to rotate the User Agents in your Axios script:

scraper.js
const axios = require("axios");

// list of user agent strings to rotate
const userAgents = [
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36",
  // ...
];

// select a random user agent from the list
const ua = userAgents[Math.floor(Math.random() * userAgents.length)];

// set the user agent in the headers and make a get request
const headers = {
  "User-Agent": ua,
};
axios
  .get("https://httpbin.io/user-agent", {
    headers,
  })
  .then(({ data }) => console.log(data));

You'll get a random UA on each run of the script. Here's the result from running the above code three times:

Output
// request 1
{
  'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36'
}

// request 2
{
  'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36'
}

// request 3
{
  'user-agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36'
}

While randomly changing the Axios User Agent can work, it's not the most effective approach: you must maintain an up-to-date list of valid UA strings yourself, and advanced anti-bot systems check far more than the User Agent header.

Let's explore how to scale this method for web scraping and overcome these limitations.


Change the Axios User Agent at Scale and Avoid Getting Blocked

While a simple User Agent rotator might seem effective for web scraping, it's not the best choice for at-scale operations due to the challenges of maintaining an extensive and up-to-date list of valid User Agents.

What's more, combining the custom rotator with other solutions, such as proxies, still isn't 100% foolproof against advanced anti-bot systems. These systems often employ sophisticated anti-scraping challenges.

That's why the best way to get foolproof protection is to use a web scraping API, such as ZenRows. ZenRows offers a comprehensive scraping solution that auto-rotates User Agents, provides anti-CAPTCHA capabilities, and includes all other anti-bot bypass tools you might need, all under the hood.

Let's use ZenRows to scrape a heavily protected G2 Reviews webpage, a target the previous script would fail against.

First, sign up on ZenRows to get a free API key. You'll be redirected to the Request Builder page. Enter the target URL in the URL to Scrape input box, check the Premium Proxies checkbox, and enable the JS Rendering boost mode. Then, select Node.js and click the API tab.

ZenRows Request Builder

Copy-paste the generated code into your script:

scraper.js
// npm install axios
const axios = require("axios");

const url = "https://www.g2.com/products/visual-studio/reviews";
const apikey = "<YOUR_ZENROWS_API_KEY>";
axios({
  url: "https://api.zenrows.com/v1/",
  method: "GET",
  params: {
    url: url,
    apikey: apikey,
    js_render: "true",
    premium_proxy: "true",
  },
})
  .then((response) => console.log(response.data))
  .catch((error) => console.log(error));

Run the code, and you'll get the target page's HTML:

Output
<!DOCTYPE html>
<head>
    <title>Visual Studio Reviews 2024: Details, Pricing, &amp; Features | G2</title>
    <!--
    ...
    -->
</head>

When you make a request, ZenRows will handle Axios User Agent randomization and other anti-bot challenges. Awesome, right?

Fix Error in Axios: Refused to Set Unsafe Header User-Agent

If you set a custom User-Agent header in an Axios request that runs in the browser, you'll hit the "Refused to set unsafe header User-Agent" error. That's because browsers treat User-Agent as a forbidden header: client-side JavaScript isn't allowed to modify it.

To resolve this error, remove the custom UA header from your browser-side Axios request or set it on the server side instead. If you need a different User Agent for requests triggered from client-side code, route them through a tool like ZenRows, which randomizes the UA on the server side.

Conclusion

Randomizing your User Agent in Axios makes it harder for websites to detect your scraping activities, as each request appears to come from a different browser or device. Therefore, it's essential to configure the UA and other HTTP headers properly to make your requests look like they come from a real user.

Additionally, apart from bot detection using User Agents, you need to deal with other obstacles, like IP address reputation and JavaScript challenges. Check our article on web scraping without getting blocked to learn the best methods to avoid detection.

Overall, using a web scraping API such as ZenRows will save you a lot of effort and resources. ZenRows offers premium rotating proxies, headless browser automation, CAPTCHA solving, and more to make web scraping easier and more effective.
