How to Use a Proxy in PHP (2025)

Idowu Omisola
March 27, 2025 · 4 min read

Are you getting blocked from scraping your desired data due to IP bans or geo-restrictions? You're not alone! Websites often block scrapers by tracking IPs, but proxies can help you stay undetected by masking your original IP address.

In this tutorial, you'll learn how to set up a proxy with cURL in PHP, choose a reliable provider, and apply best practices to maximize stealth and avoid detection.

Let's get started!

Setting Up a Single Proxy With PHP

In this section, you'll set up a free proxy with PHP's cURL library and test the connection on HTTPBin's IP endpoint. First, grab a free proxy from the Free Proxy List.

Create a scraper function. Then, initialize a cURL request and set the target URL. Add the proxy address as a cURL option and print the target site's HTML to show the proxy's IP address:

scraper.php
<?php

function scraper(){
    // initialize the cURL request
    $curl = curl_init();

    // set the URL with a GET HTTP request
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.io/ip");

    // set proxy
    curl_setopt($curl, CURLOPT_PROXY, "http://50.223.246.237:80");

    // get the data returned by the cURL request as a string
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

    // follow eventual redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);

    // get the HTML of the page
    $html = curl_exec($curl);

    // catch connection errors before printing
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        // echo the HTML
        echo $html;
    }

    // release the cURL resources
    curl_close($curl);

}
// execute the function
scraper();
?>

The above code prints the test endpoint's response, which shows the proxy's IP:

Output
{
    "origin": "50.223.246.237:15642"
}

Good job! Now, you know the basics of configuring a proxy with cURL in PHP. You'll learn how to handle authenticated proxies in the next section.


Proxy Authentication in PHP

Paid proxies usually require authentication with usernames and passwords, ensuring only authorized users can access the service. An authenticated proxy usually takes the following format:

Example
<PROXY_PROTOCOL>://<YOUR_USERNAME>:<YOUR_PASSWORD>@<PROXY_ADDRESS>:<PROXY_PORT>

If your proxy service requires authentication, you only need to include your credentials in another cURL option:

scraper.php
<?php

function scraper(){
    // initialize the cURL request
    $curl = curl_init();
    // set the URL with a GET HTTP request
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.io/ip");
    
    // set the proxy address
    curl_setopt($curl, CURLOPT_PROXY, "http://54.37.214.253:8080");
    
    // specify your proxy credentials
    curl_setopt($curl, CURLOPT_PROXYUSERPWD, "<YOUR_USERNAME>:<YOUR_PASSWORD>");

    // get the data returned by the cURL request as a string
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

    // follow eventual redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);

    // get the HTML of the page as a string
    $html = curl_exec($curl);

    // catch connection errors before printing
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        // echo the HTML
        echo $html;
    }
    // release the cURL resources
    curl_close($curl);
}
// execute the function
scraper();
?>

You just configured an authenticated PHP proxy. However, proxy rotation can anonymize your requests even further. You'll learn how it works in the next section.

Using Rotating Proxies With PHP

Proxy rotation is a technique for switching IPs per request from a pool. It's handy in large-scale scraping involving multiple requests, as each request appears to come from a different location. This way, you can avoid rate limiting and IP bans at scale.

To set up rotating proxies with your PHP web scraper, create a proxy rotator function (proxyRotator) that returns a random proxy from a list. Then, specify a list of free proxy addresses from the previous Free Proxy List website in the scraper function. Finally, update cURL's proxy option to read random proxies from the proxy rotation function:

scraper.php
<?php

// proxy rotation function
function proxyRotator($proxy_list) {
    // select a random proxy from the list
    $proxy = $proxy_list[array_rand($proxy_list)];
    return $proxy;
}

function scraper(){
    // create a proxy list
    $proxies = [
        'http://203.115.101.51:82',
        'http://50.207.199.82:80',
        'http://188.68.52.244:80',
    ];

    // get a random proxy from the list
    $proxy = proxyRotator($proxies);

    // initialize the cURL request
    $curl = curl_init();

    // set the URL with a GET HTTP request
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.io/ip");

    // set the proxy address
    curl_setopt($curl, CURLOPT_PROXY, $proxy);

    // get the data returned by the cURL request as a string
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

    // follow eventual redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);

    // get the HTML of the page as a string
    $html = curl_exec($curl);

    // catch connection errors before printing
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        // echo the HTML
        echo $html;
    }

    // release the cURL resources
    curl_close($curl);
}

scraper();
?>

Here are the outputs for three consecutive requests:

Output
// request 1
{
    "origin": "188.68.52.244:29371"
}
// request 2
{
    "origin": "203.115.101.51:7352"
}
// request 3
{
    "origin": "50.207.199.82:8265"
}

Awesome! Your proxy rotation logic works!

However, free proxies have a high fail rate. So, they're only suitable for testing and not real-life projects. Besides, manually managing a pool of free proxies becomes challenging at scale, as some proxies tend to fail, resulting in potential IP bans.
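While testing with free proxies, you can soften the impact of dead ones by adding simple failover: if a request through one proxy fails, drop that proxy from the pool and retry with another. Below is a minimal sketch of this idea; the timeout values are illustrative choices, and the function names (`removeProxy`, `fetchWithFailover`) are hypothetical helpers, not a standard API:

```php
<?php

// remove a dead proxy from the pool
function removeProxy(array $proxies, string $bad): array {
    return array_values(array_filter($proxies, fn($p) => $p !== $bad));
}

// try random proxies from the pool until one succeeds or the pool is empty
function fetchWithFailover(string $url, array $proxies): ?string {
    while (count($proxies) > 0) {
        // pick a random proxy from the remaining pool
        $proxy = $proxies[array_rand($proxies)];

        $curl = curl_init();
        curl_setopt($curl, CURLOPT_URL, $url);
        curl_setopt($curl, CURLOPT_PROXY, $proxy);
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
        // fail fast on dead proxies (illustrative timeouts)
        curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 5);
        curl_setopt($curl, CURLOPT_TIMEOUT, 10);

        $html = curl_exec($curl);
        $failed = curl_errno($curl);
        curl_close($curl);

        if (!$failed) {
            return $html;
        }

        // this proxy failed: drop it and try another
        $proxies = removeProxy($proxies, $proxy);
    }

    // pool exhausted
    return null;
}
```

Calling `fetchWithFailover("https://httpbin.io/ip", $proxies)` returns `null` once every proxy in the pool has failed, which is your cue to refresh the list.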

You'll learn an efficient alternative in the next section.

Use Premium Proxies for Reliability

Premium proxies are the most reliable options, offering trusted IP addresses with high anonymity. When choosing a premium service, opt for residential proxies, which are the most suitable for web scraping.

Residential proxies offer IPs assigned to real users by network providers, allowing you to mimic human traffic. Most providers also offer advanced features like proxy rotation and geo-location.

The ZenRows Residential Proxies service is one of the top providers, offering a 99.9% uptime with 55M+ residential IPs distributed across 185+ countries. It features proxy rotation to prevent IP tracking or detection. You also get a geo-location feature to bypass geo-restrictions and access localized content at scale.

Let's integrate ZenRows' Residential Proxies with PHP.

Sign up to access the Proxy Generator dashboard.

Generating residential proxies in the ZenRows dashboard (screenshot).

You'll see your essential credentials (username, password) and proxy server details (proxy domain and proxy port). Copy and paste these into your scraper as shown below:

scraper.php
<?php
function scraper(){
    // initialize the cURL request
    $curl = curl_init();
    // set the URL with a GET HTTP request
    curl_setopt($curl, CURLOPT_URL, "https://httpbin.io/ip");
    // set up proxy
    curl_setopt($curl, CURLOPT_PROXY, "http://superproxy.zenrows.com:1337");

    // specify your proxy credentials
    curl_setopt($curl, CURLOPT_PROXYUSERPWD, "<ZENROWS_PROXY_USERNAME>:<ZENROWS_PROXY_PASSWORD>");

    // get the data returned by the cURL request as a string
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    // follow eventual redirects
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
    // get the HTML of the page as a string
    $html = curl_exec($curl);

    // catch connection errors before printing
    if (curl_errno($curl)) {
        echo 'cURL error: ' . curl_error($curl);
    } else {
        // echo the HTML
        echo $html;
    }
    // release the cURL resources
    curl_close($curl);
}
scraper();
?>

Here are sample outputs for three consecutive requests, confirming automatic IP rotation:

Output
// request 1
{
    "origin": "88.213.219.102:58167"
}

// request 2
{
    "origin": "157.254.15.69:42668"
}

// request 3
{
    "origin": "73.67.246.90:56817"
}

That was easy! You've integrated the ZenRows Residential Proxies into your PHP cURL request. Your scraper now benefits from efficient residential IP rotation.

Best Practices for Using Proxies in PHP

Beyond setting up a proxy, keeping the following scraping best practices in mind will help you build robust and stealthy web scrapers:

  • Rotate IPs and Use Proper Request Headers: In addition to rotating proxies, sending suitable web scraping headers increases your chances of avoiding anti-bot detection. These headers mimic a real browser environment, making it harder for fingerprinting checks to flag your scraper.
  • Optimize Request Frequency: Even with rotating IPs, sending requests too quickly can trigger rate-limiting mechanisms, since rapid bursts look like bot behavior. Throttle your scraper and adapt its speed to the server's responses. For instance, implement exponential backoff to increase the interval between retries after failed requests.
  • Use Retries and Fallbacks Adequately: Retries help recover from temporary failures by resending failed HTTP requests. Fallbacks, however, define an alternative course of action if a request fails even after the maximum retry attempts. Implementing both techniques ensures your scraper doesn't halt unexpectedly. You should also consider using logs to track failed requests, so you can schedule reruns when necessary.
  • Ensure Secure and Ethical Scraping Practices: Adherence to website security policies and ethics is essential during web scraping. Avoid scraping activities that disrupt the target site's functionality and user experience. Additionally, ensure you obey the crawling rules in the robots.txt file.
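To make the header and backoff advice above concrete, here's a minimal sketch combining both. The header values, timeout, and retry count are arbitrary illustrative choices, and `backoffDelay` and `fetchWithRetries` are hypothetical helper names:

```php
<?php

// exponential backoff: 1s, 2s, 4s, 8s, ... for attempts 0, 1, 2, 3, ...
function backoffDelay(int $attempt, int $baseSeconds = 1): int {
    return $baseSeconds * (2 ** $attempt);
}

// retry a request with browser-like headers and growing delays between attempts
function fetchWithRetries(string $url, int $maxRetries = 3): ?string {
    for ($attempt = 0; $attempt <= $maxRetries; $attempt++) {
        $curl = curl_init();
        curl_setopt($curl, CURLOPT_URL, $url);
        curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl, CURLOPT_TIMEOUT, 10);
        // mimic a real browser with common request headers (example values)
        curl_setopt($curl, CURLOPT_HTTPHEADER, [
            'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36',
            'Accept: text/html,application/xhtml+xml',
            'Accept-Language: en-US,en;q=0.9',
        ]);

        $html = curl_exec($curl);
        $failed = curl_errno($curl);
        curl_close($curl);

        if (!$failed) {
            return $html;
        }

        // wait before the next retry, doubling the delay each time
        sleep(backoffDelay($attempt));
    }

    // fallback: give up after the maximum retry attempts
    return null;
}
```

In a real scraper, you'd also log each failed attempt here so you can schedule reruns later, as the retries-and-fallbacks tip suggests.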

Conclusion

You've learned how to set up a proxy in PHP using cURL, including proxy rotation, authentication, and best practices. Routing your scraping requests through proxies helps you avoid IP bans and reduces the chances of getting blocked by anti-bot measures.

While setting up proxies, remember that the free ones are often unreliable and unsuitable for real-world applications. We recommend a reliable premium solution like ZenRows for better success rates. Try ZenRows for free!
