Add a single line of code to your existing browser automation script.
Manage concurrency and get instantly available browsers, letting you scale to thousands of requests.
Eliminate expensive cloud setups and time-consuming maintenance while optimizing your scraping operations.
const puppeteer = require('puppeteer-core');

// Connect to a remote ZenRows browser over WebSocket instead of launching Chrome locally.
const connectionURL = 'wss://browser.zenrows.com?apikey=<YOUR_ZENROWS_API_KEY>';

(async () => {
  const browser = await puppeteer.connect({ browserWSEndpoint: connectionURL });
  const page = await browser.newPage();
  await page.goto('https://example.com');
  console.log(await page.title());
  await browser.close();
})();
const { chromium } = require('playwright');

// Same endpoint, connected via the Chrome DevTools Protocol.
const connectionURL = 'wss://browser.zenrows.com?apikey=<YOUR_ZENROWS_API_KEY>';

(async () => {
  const browser = await chromium.connectOverCDP(connectionURL);
  const page = await browser.newPage();
  await page.goto('https://example.com');
  console.log(await page.title());
  await browser.close();
})();
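Scaling to thousands of requests means capping how many browser sessions are open at once. One way to sketch that is a small concurrency limiter around per-URL scrape tasks; the `runWithLimit` helper, the `scrape` function, and the limit of 5 below are illustrative assumptions, not part of the ZenRows API:

```javascript
// Run async task functions with at most `limit` in flight at a time.
async function runWithLimit(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  // Each worker pulls the next unclaimed task index until none remain.
  const worker = async () => {
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  };
  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, worker)
  );
  return results;
}

// Usage sketch: each task would connect, scrape one URL, and close,
// e.g. const tasks = urls.map((url) => () => scrape(url));
// const titles = await runWithLimit(tasks, 5);
```

Keeping the limit below your plan's session cap avoids queueing or rejected connections while still keeping all allowed browsers busy.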
The biggest pro was the fast support responses while I was testing their service and learning more about the scraping process.