How to Use Superagent-Proxy

August 9, 2023 · 3 min read

Looking to avoid detection when web scraping? In this tutorial, you'll learn how to set up superagent-proxy step by step.

What Is Superagent-proxy in NodeJS?

Superagent-proxy is a NodeJS module that extends the functionality of Superagent, an HTTP request library, by adding proxy support. That can be helpful to avoid getting blocked while web scraping in NodeJS.

How Superagent-proxy Works

Under the hood, superagent-proxy relies on the proxy-agent module, which provides http.Agent implementations that let Superagent and other NodeJS HTTP clients, such as Axios, route traffic through proxy servers. The agent handles the proxy communication, so the underlying JavaScript client can reach the remote server through a proxy.
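To make that concrete, here's a simplified sketch of what a proxy agent does for plain-HTTP targets (our own illustration, not superagent-proxy's actual source): instead of sending a relative path to the origin server, the client sends the absolute target URL in the request line to the proxy, which forwards the request on its behalf.

```javascript
// Simplified illustration of HTTP proxying (not superagent-proxy's real code).
// For a plain-HTTP target, a proxy-aware agent connects to the proxy host
// and puts the *absolute* target URL in the request line, so the proxy
// knows where to forward the request.
function buildProxyRequestLine(targetUrl) {
  const url = new URL(targetUrl);
  return `GET ${url.href} HTTP/1.1`;
}

console.log(buildProxyRequestLine('http://httpbin.io/ip'));
// GET http://httpbin.io/ip HTTP/1.1

// For HTTPS targets, the agent instead issues a CONNECT request to the
// proxy to open a tunnel, then performs the TLS handshake with the origin.
```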

How to Set a Superagent Proxy

Setting up a Superagent proxy involves two steps: building a scraper with Superagent and then adding superagent-proxy. Let's go step by step, starting with the base scraper!

1. Prerequisites

First of all, make sure you have NodeJS installed.

Then, create a new project folder (e.g. Superagent_Scraper) and start a NodeJS project inside using the following commands.

Terminal
mkdir Superagent_Scraper
cd Superagent_Scraper
npm init -y

Next, install Superagent.

Terminal
npm install superagent

2. Create a Superagent Scraper

To create your Superagent scraper, open a new JavaScript file (e.g. call it superagent-proxy.js), import Superagent using the require function, and define an asynchronous function that'll contain the scraper's logic. Also, insert a try-catch block within your asynchronous function to easily handle errors.

superagent-proxy.js
const request = require('superagent');
 
(async () => {
  try {
    // Function content will be added in the next steps.
  } catch (error) {
    console.error(error);
  }
})

Now, make a GET request to your target URL and log the response body. If you send a request to https://httpbin.io/ip, you'll get your IP address as a response. Lastly, add (); to call the async function. Here's the complete code.

superagent-proxy.js
const request = require('superagent');
 
(async () => {
  try {
    const response = await request.get('https://httpbin.io/ip');
    console.log(response.body);
  } catch (error) {
    console.error(error);
  }
})();

Run it, and your result should be your IP address.

Output
{ origin: '197.100.236.106:25609' }

3. Add a Superagent Proxy

Install the superagent-proxy module with the following command:

Terminal
npm install superagent-proxy

Then, import the module, which patches Superagent with a Request#proxy() method, and define your proxy URL. You can get a free proxy from FreeProxyList. Your previous code should now look like this:

superagent-proxy.js
const request = require('superagent');
 
// import and enable superagent-proxy using Request#proxy()
require('superagent-proxy')(request);
 
// Define proxy URL
const proxyUrl = 'http://20.89.38.178:3333'; 
 
(async () => {
  try {
    const response = await request.get('https://httpbin.io/ip');
    console.log(response.body);
  } catch (error) {
    console.error(error);
  }
})();

Next, within the async function, apply the proxy using the .proxy() method. You should have the following complete code. 

superagent-proxy.js
const request = require('superagent');
 
// import and enable superagent-proxy using Request#proxy()
require('superagent-proxy')(request);
 
// Define proxy URL
const proxyUrl = 'http://20.89.38.178:3333'; 
 
(async () => {
  try {
    const response = await request
      .get('https://httpbin.io/ip')
      .proxy(proxyUrl); // Apply the proxy
 
    console.log(response.body);
  } catch (error) {
    console.error(error);
  }
})();

And here's the output:

Output
{ origin: '20.89.38.178:4763' }

Awesome, your result this time is the proxy's IP address, meaning you've successfully set up your first Superagent proxy.

Now, a few considerations:

HTTP vs. HTTPS Proxies

HTTP proxies are designed to handle HTTP requests (http://), while HTTPS ones work for websites using SSL/TLS encryption (https://). We recommend HTTPS proxies, as they work for both secure (HTTPS) and non-secure (HTTP) websites.
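Before handing a URL to .proxy(), you can check which kind of proxy it points to using Node's built-in URL class. The helper below is our own illustration, not part of superagent-proxy:

```javascript
// Illustrative helper: report a proxy URL's scheme ('http' or 'https').
function proxyScheme(proxyUrl) {
  // URL is a global in modern NodeJS; protocol includes a trailing colon.
  return new URL(proxyUrl).protocol.replace(':', '');
}

console.log(proxyScheme('http://20.89.38.178:3333'));       // http
console.log(proxyScheme('https://example-proxy.com:8443')); // https
```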

Authentication with superagent-proxy

Some proxies require authentication with a username and password. This is common with paid proxies to ensure only permitted users can access their proxy servers.

In superagent-proxy, you can pass the authentication details by supplying an options object with an auth property as the second argument to the .proxy() method, like in the example below.

Example
.proxy(proxyUrl, { 
    auth: `${username}:${password}`
});

Your script would look like this:

superagent-proxy.js
const request = require('superagent');
 
// import and enable superagent-proxy using Request#proxy()
require('superagent-proxy')(request);
 
// Define proxy URL and auth details
const proxyUrl = 'http://20.89.38.178:3333';
const username = '<YOUR_USERNAME>';
const password = '<YOUR_PASSWORD>'; 
 
(async () => {
  try {
    const response = await request
      .get('https://httpbin.io/ip')
      .proxy(proxyUrl, { 
        auth: `${username}:${password}`
      }); // Apply the proxy
 
    console.log(response.body);
  } catch (error) {
    console.error(error);
  }
})();
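As an alternative, proxy URLs commonly carry credentials embedded directly in the `http://username:password@host:port` form, which proxy-agent (used internally by superagent-proxy) understands. A sketch, with hypothetical placeholder credentials:

```javascript
// Alternative approach (sketch): embed credentials in the proxy URL itself.
// <YOUR_USERNAME> and <YOUR_PASSWORD> are placeholders, not real values.
const username = '<YOUR_USERNAME>';
const password = '<YOUR_PASSWORD>';

// encodeURIComponent guards against special characters in credentials.
const proxyUrl =
  `http://${encodeURIComponent(username)}:${encodeURIComponent(password)}@20.89.38.178:3333`;

// Node's URL class parses the embedded parts back out:
const parsed = new URL(proxyUrl);
console.log(parsed.hostname, parsed.port); // 20.89.38.178 3333
```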

Add More Proxies to Superagent

Websites can flag proxies as bots and block them accordingly. Therefore, you need to rotate proxies to increase your chances of avoiding detection.

To rotate proxies with superagent-proxy, first grab a few more free proxies from FreeProxyList.

Then, replace your single proxy with a proxy list like the one below.

superagent-proxy.js
const request = require('superagent');
require('superagent-proxy')(request);
 
// Define a list of proxy URLs
const proxyList = [
  'http://20.89.38.178:3333',
  'http://198.199.70.20:31028',
  'http://8.219.97.248:80',
  // Add more proxy URLs as needed
];

Next, create a function that randomly selects a proxy from your proxy list. You can use Math.random() to achieve this.

superagent-proxy.js
// Function to select a random proxy from the list
function getRandomProxy() {
  const randomIndex = Math.floor(Math.random() * proxyList.length);
  return proxyList[randomIndex];
}
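As a quick sanity check (our own snippet, repeating the definitions above so it can run on its own), you can confirm that the selector only ever returns entries from the list:

```javascript
// Standalone sanity check for the random proxy selector.
const proxyList = [
  'http://20.89.38.178:3333',
  'http://198.199.70.20:31028',
  'http://8.219.97.248:80',
];

function getRandomProxy() {
  const randomIndex = Math.floor(Math.random() * proxyList.length);
  return proxyList[randomIndex];
}

// Every selection should come from the list, whatever Math.random() returns.
for (let i = 0; i < 1000; i++) {
  const proxy = getRandomProxy();
  if (!proxyList.includes(proxy)) {
    throw new Error(`unexpected proxy: ${proxy}`);
  }
}
console.log('getRandomProxy() only returns proxies from proxyList');
```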

In your async function, call getRandomProxy() to get a random proxy URL and use it with the .proxy() method. 

superagent-proxy.js
(async () => {
  try {
    const proxyUrl = getRandomProxy(); // Select a random proxy URL from the list
 
    const response = await request
      .get('https://httpbin.io/ip')
      .proxy(proxyUrl);
 
    console.log('Using proxy:', proxyUrl);
    console.log(response.body);
  } catch (error) {
    console.error(error);
  }
})();

Your final code should look like this:

superagent-proxy.js
const request = require('superagent');
require('superagent-proxy')(request);
 
// Define a list of proxy URLs
const proxyList = [
  'http://20.89.38.178:3333',
  'http://198.199.70.20:31028',
  'http://8.219.97.248:80',
  // Add more proxy URLs as needed
];
 
// Function to select a random proxy from the list
function getRandomProxy() {
  const randomIndex = Math.floor(Math.random() * proxyList.length);
  return proxyList[randomIndex];
}
 
(async () => {
  try {
    const proxyUrl = getRandomProxy(); // Select a random proxy URL from the list
 
    const response = await request
      .get('https://httpbin.io/ip')
      .proxy(proxyUrl);
 
    console.log('Using proxy:', proxyUrl);
    console.log(response.body);
  } catch (error) {
    console.error(error);
  }
})();

To verify it works, run the code multiple times using the following command.

Terminal
node superagent-proxy.js

You should get a different IP address every time you run it. 

Here's the result for three runs:

Output
Using proxy: http://198.199.70.20:31028
{ origin: '198.199.70.20' }
 
Using proxy: http://20.89.38.178:3333
{ origin: '20.89.38.178' }
 
Using proxy: http://8.219.97.248:80
{ origin: '8.219.97.248' }

Awesome! You've successfully rotated proxies using Superagent-proxy.

Premium Proxy to Avoid Getting Blocked

Free proxies present major hurdles for web scraping. Their unstable performance, security risks, and low reliability make them unsuitable for production environments. Target websites quickly identify and block these free proxies, making your HTTP requests fail frequently.

Premium proxies deliver a more robust solution for avoiding detection. With high-quality IPs and smart rotation capabilities, premium proxies can effectively handle your web requests. Features like intelligent routing and geo-location targeting dramatically improve your success rate.

ZenRows' Residential Proxies stands out as a premium solution, offering access to 55M+ residential IPs across 185+ countries. With features like dynamic IP rotation, intelligent proxy selection, and flexible geo-targeting, all backed by 99.9% uptime, it's perfect for reliable web scraping with Superagent.

Let's integrate ZenRows' Residential Proxies with Superagent.

First, sign up and access the Proxy Generator dashboard. Your proxy credentials will be generated automatically.

[Image: generating residential proxy credentials in the ZenRows Proxy Generator dashboard]

Copy your proxy credentials and use them in this Node.js code:

scraper.js
const request = require("superagent");

// import and enable superagent-proxy using Request#proxy()
require("superagent-proxy")(request);

// define proxy URL and auth details
const proxyUrl = "http://superproxy.zenrows.com:1337";
const username = "<ZENROWS_PROXY_USERNAME>";
const password = "<ZENROWS_PROXY_PASSWORD>";

(async () => {
  try {
    const response = await request
      .get("https://httpbin.io/ip")
      .proxy(proxyUrl, {
        auth: `${username}:${password}`,
      });

    console.log(response.body);
  } catch (error) {
    console.error(error);
  }
})();

Here's the result after running the code multiple times:

Output
// request 1
{
  "origin": "167.71.192.85:44521"
}
// request 2
{
  "origin": "104.248.56.197:51892"
}

Perfect! The different IP addresses confirm that your Superagent requests are successfully routing through ZenRows' residential proxy network. Your HTTP client is now equipped with premium proxies that significantly reduce the risk of blocks during web scraping.

Conclusion

The Superagent-proxy module allows you to route your requests through different IP addresses to reduce the risk of getting blocked while web scraping. Keep in mind that free proxies are unreliable, so premium residential proxies are the better option. Even then, proxies alone aren't always enough against websites with advanced anti-bot protection.

Whatever the case, you can consider ZenRows as a toolkit to bypass all anti-bot measures. Sign up now for free!

Frequent Questions

What Is SuperAgent Used for?

SuperAgent is used for making HTTP requests in both NodeJS and browser environments. Its intuitive API makes it easy to handle responses, set headers, and manage query parameters. Additionally, it supports promises and asynchronous requests.
