
Change Playwright User Agent: Steps & Best Practices

Idowu Omisola
Updated: January 26, 2025 · 5 min read

Scraping a website involves implementing mechanisms to avoid being detected as a bot. In this tutorial, you'll learn how to change the User Agent in Playwright and the best practices for doing so.

What Is the Playwright User Agent?

The Playwright User Agent is a string that identifies you to the websites you visit when using the headless browser. It's sent in the User-Agent HTTP request header and shares information about your operating system and browser.

A typical User Agent (UA) looks like this:

Example
Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/109.0

From it, we can tell that the user is accessing the website with Firefox version 109 on Mac OS X 10.15.
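To make that structure concrete, here's a rough sketch that pulls the platform and browser tokens out of the Firefox UA above (illustrative only; real-world UA parsing has many edge cases and is usually left to a dedicated library):

```javascript
// Rough breakdown of a User-Agent string into its main tokens.
// Illustrative only: real UA parsing is messier than this.
const ua = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/109.0';

// The parenthesized section describes the platform.
const platform = ua.match(/\(([^)]+)\)/)[1];

// The trailing product token names the browser and its version.
const [, browser, version] = ua.match(/(\w+)\/([\d.]+)$/);

console.log(platform); // "Macintosh; Intel Mac OS X 10.15; rv:109.0"
console.log(browser, version); // "Firefox 109.0"
```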

However, the default Playwright UA is different. For headless Chromium, it looks something like the one below, with HeadlessChrome in place of the regular Chrome browser token:

Example
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/115.0.5790.75 Safari/537.36

Clearly, it's easy to spot that you're not a regular user. Anti-bot systems use this and other techniques to detect and block web scrapers. Thus, it's critical to change the default Playwright User Agent to avoid detection.
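To see why this matters, here's a simplified sketch of the kind of server-side check an anti-bot system might run against the User-Agent header (illustrative only; real systems combine many more signals than this):

```javascript
// Flag requests whose User-Agent reveals an automated browser.
// Simplified illustration, not a real anti-bot implementation.
function looksAutomated(userAgent) {
  const botTokens = ['HeadlessChrome', 'Playwright', 'Puppeteer'];
  return botTokens.some((token) => userAgent.includes(token));
}

const defaultUA =
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/115.0.5790.75 Safari/537.36';
const realUA =
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/109.0';

console.log(looksAutomated(defaultUA)); // true: flagged as a bot
console.log(looksAutomated(realUA)); // false: passes this particular check
```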

How to Set a Custom User Agent in Playwright

Let's learn how to set a custom User Agent (UA) in Playwright so you can scrape while flying under the radar!

1. Getting Started

To get started, create a project folder on your device. Then, create a JavaScript file (e.g., scraping.js) in the folder.

Next, install Playwright in your project using npm (the Node.js package manager). Depending on your setup, you may also need to run npx playwright install afterward to download the browser binaries.

Terminal
npm install playwright

Then, import Playwright.

scraping.js
const { chromium } = require('playwright');

To scrape a webpage with Playwright, your initial code would look like this:

scraping.js
const { chromium } = require('playwright');
 
(async () => {
  // Launch the Chromium browser
  const browser = await chromium.launch();
 
  const context = await browser.newContext();
 
  // Create a new page in the browser context and navigate to target URL
  const page = await context.newPage();
  await page.goto('https://httpbin.io/user-agent');
  
  // Get the entire page content
  const pageContent = await page.content();
  console.log(pageContent);
 
  // Close the browser
  await browser.close();
})();

Run it, and you'll get an output similar to the one below, displaying the default Playwright User Agent.

Output
<html><head><meta name="color-scheme" content="light dark"></head><body><pre style="word-wrap: break-word; white-space: pre-wrap;">{
  "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) HeadlessChrome/115.0.5790.75 Safari/537.36"
}
</pre></body></html>

If you want to review your fundamentals in this headless browser, take a look at our Playwright tutorial.

Frustrated that your web scrapers get blocked again and again?
ZenRows API handles rotating proxies and headless browsers for you.
Try for FREE

2. Customize UA

To change your Playwright User Agent, specify a custom one when creating the browser context. We'll use the real Firefox UA we displayed earlier.

Example
Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/109.0

Add it via the userAgent option:

scraping.js
const { chromium } = require('playwright');
 
(async () => {
  // Launch the Chromium browser
  const browser = await chromium.launch();
 
  const context = await browser.newContext({
    userAgent: 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/109.0',
  });
  
  // Create a new page in the browser context and navigate to target URL
  const page = await context.newPage();
  await page.goto('https://httpbin.io/user-agent');
  
  // Get the entire page content
  const pageContent = await page.content();
  console.log(pageContent);
 
  // Close the browser
  await browser.close();
})();

Run the script with the terminal command node scraping.js. You should see the following:

Output
<html><head><meta name="color-scheme" content="light dark"></head><body><pre style="word-wrap: break-word; white-space: pre-wrap;">{
  "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/109.0"
}
</pre></body></html>

That's it! You've successfully set a custom Playwright User Agent.

However, using a single UA across many requests makes it easy for anti-bot systems to detect you. Therefore, you need to randomize it so your requests appear to come from different users.

3. Use a Random User Agent in Playwright

By rotating your Playwright User Agent randomly, you can mimic user behavior, making it more challenging for websites to identify and block your automated activities.

To rotate the User Agent, first create a list of UAs. We'll take some examples from our list of User Agents for web scraping. Here's how to add them:

scraping.js
const { chromium } = require('playwright');
 
// An array of user agent strings for different versions of Chrome on Windows and Mac
const userAgentStrings = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36',
];

Next, use the Math.random() function to select random UAs from your list for each request. It generates a random floating-point number between 0 and 1. By multiplying it by the length of the User Agent list and using Math.floor(), you get a random index that selects a UA.
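In isolation, that selection logic can be wrapped in a small helper (a sketch; the pickRandomUA name is ours, not a Playwright API):

```javascript
// Pick a random entry from a list of User-Agent strings.
function pickRandomUA(userAgentStrings) {
  return userAgentStrings[Math.floor(Math.random() * userAgentStrings.length)];
}

const userAgentStrings = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
];

console.log(pickRandomUA(userAgentStrings)); // one of the entries above
```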

scraping.js
(async () => {
  // Launch the Chromium browser
  const browser = await chromium.launch();
 
  const context = await browser.newContext({
    userAgent: userAgentStrings[Math.floor(Math.random() * userAgentStrings.length)],
  });

Bringing it all together:

scraping.js
const { chromium } = require('playwright');
 
// An array of user agent strings for different versions of Chrome on Windows and Mac
const userAgentStrings = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36',
];
 
(async () => {
  // Launch the Chromium browser
  const browser = await chromium.launch();
 
  const context = await browser.newContext({
    userAgent: userAgentStrings[Math.floor(Math.random() * userAgentStrings.length)],
  });
  
  // Create a new page in the browser context and navigate to target URL
  const page = await context.newPage();
  await page.goto('https://httpbin.io/user-agent');
  
  // Get the entire page content
  const pageContent = await page.content();
  console.log(pageContent);
 
  // Close the browser
  await browser.close();
})();

With that, a different User Agent is used each time the script runs. Run node scraping.js a couple of times in your terminal, and you should see the UA change:

Output
<html><head><meta name="color-scheme" content="light dark"></head><body><pre style="word-wrap: break-word; white-space: pre-wrap;">{
  "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"
}
</pre></body></html>
 
=====
 
<html><head><meta name="color-scheme" content="light dark"></head><body><pre style="word-wrap: break-word; white-space: pre-wrap;">{
  "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"
}
</pre></body></html>

Excellent! You've now successfully rotated your User Agent.

As you grow your list (you'll need a bigger one for real-world scraping), you'll find it's surprisingly easy to create incorrectly formed UAs. For example, if you add a Chrome v83 UA for Windows while the other headers in the HTTP request indicate a Mac, your target websites may detect the discrepancy and block your requests. Such mismatches aren't always obvious when assembling UAs by hand.
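One way to reduce such mismatches is to store each UA together with the platform it claims and derive any related headers from the same entry. Here's a sketch (the uaProfiles structure and the choice of the sec-ch-ua-platform client-hint header are ours, for illustration):

```javascript
// Keep each User-Agent paired with the platform it claims,
// so related headers can be derived consistently from one entry.
const uaProfiles = [
  {
    ua: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
    platform: 'Windows',
  },
  {
    ua: 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
    platform: 'macOS',
  },
];

// Build consistent browser-context options from one random profile.
function randomContextOptions() {
  const profile = uaProfiles[Math.floor(Math.random() * uaProfiles.length)];
  return {
    userAgent: profile.ua,
    extraHTTPHeaders: {
      // Client-hint header matching the UA's claimed platform.
      'sec-ch-ua-platform': `"${profile.platform}"`,
    },
  };
}

const options = randomContextOptions();
console.log(options.userAgent); // UA and headers always agree
```

You'd then pass these options to browser.newContext(options) so the UA string and the accompanying headers never contradict each other.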

It's also important to keep your User Agent list up to date to avoid being flagged as a bot. However, this is a tedious process that's difficult to keep up with.

But is there any easier and automated way to do all of that?

Change User Agents in Playwright at Scale

Having a reliable User Agent rotation system requires constant attention. Beyond just creating a list, you need to regularly check browser versions, make sure they match with correct operating systems, and clean up outdated combinations.

More importantly, using different User Agents alone won't stop websites from detecting your Playwright automation. Modern websites check many other things like mouse movements, IP reputation, connection details, and more to spot bots.

A better solution is to use ZenRows' Universal Scraper API. It handles User Agents automatically, supports JavaScript rendering, auto-bypasses any CAPTCHA, and provides everything you need to avoid getting blocked.

Let's see how ZenRows performs against a protected page like the Antibot Challenge page.

Start by signing up for a new account to get to the Request Builder.

[Screenshot: building a scraper with ZenRows]

Insert the target URL, enable JS Rendering, and activate Premium Proxies.

Next, choose Node.js and then click on the API connection mode. After that, copy the generated code and paste it into your script.

scraper.js
const axios = require('axios');

const params = {
  url: 'https://www.scrapingcourse.com/antibot-challenge',
  apikey: '<YOUR_ZENROWS_API_KEY>',
  js_render: 'true',
  premium_proxy: 'true',
};

axios.get('https://api.zenrows.com/v1/', { params })
  .then(({ data }) => console.log(data))
  .catch((error) => console.error(error));

The generated code uses the Axios library as the HTTP client. You can install this library using the following command:

Terminal
npm install axios

When you run this code, you'll successfully access the page:

Output
<html lang="en">
<head>
    <!-- ... -->
    <title>Antibot Challenge - ScrapingCourse.com</title>
    <!-- ... -->
</head>
<body>
    <!-- ... -->
    <h2>
        You bypassed the Antibot challenge! :D
    </h2>
    <!-- other content omitted for brevity -->
</body>
</html>

Congratulations! 🎉 You've accessed a protected page without any complex Playwright setup.

Conclusion

This guide has shown you the important aspects of User Agents in Playwright:

  • What makes a good User Agent string.
  • How to set custom User Agents.
  • Ways to rotate User Agents.
  • Why User Agent management isn't enough on its own.

Remember that websites use many different ways to detect automation. Instead of managing everything yourself, try ZenRows to make sure you extract all the data you need without getting blocked. Try ZenRows for free!

Ready to get started?

Up to 1,000 URLs for free are waiting for you