Free US Proxy List for Your Next Project
Scraping web data, testing geo-specific features, or just poking around the internet programmatically often requires one simple thing: a proxy. Finding a reliable, free list of proxies, especially ones with US IP addresses, can be a real chore. You either stumble into sketchy forums or hit paywalls. What if you could just grab a clean, structured list and get on with building?
That's the itch this free proxy list from Oxylabs aims to scratch. It's a public GitHub repository that provides a regularly updated collection of free US proxy servers, ready to be fed into your scripts and applications.
What It Does
In a nutshell, this GitHub repo hosts a proxy_list.json file containing an array of proxy objects, each describing a free proxy server with a US IP address. Every entry carries the essentials: IP address, port, protocol (HTTP/HTTPS), and country code. It's a simple, machine-readable resource that gets straight to the point.
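To make that concrete, a single entry looks roughly like the Python dict below. The field names and values here are illustrative, inferred from the description above, so check the actual file for the exact schema.
# One entry from proxy_list.json, shown as a Python dict.
# Field names and values are placeholders, not copied from the repo.
sample_entry = {
    "ip": "192.0.2.10",     # placeholder IPv4 address
    "port": "8080",
    "protocol": "http",
    "country": "US"
}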
Why It's Cool
The value here isn't in some complex API or a fancy client library; it's in the simplicity and reliability. Instead of wasting time scraping proxy list websites yourself, you can just pull this JSON file. The list is updated regularly, which matters because free proxies are notoriously short-lived.
For developers, this is a low-friction option for projects that don't need the high reliability of a paid proxy service. Think of it as a handy addition to your development toolkit, perfect for quick scripts, tests, or proof-of-concept features that require a US IP.
How to Try It
Getting started is as simple as it gets. You don't need to install anything. Just head over to the GitHub repository and grab the proxy_list.json file.
You can quickly fetch and use the list in a Python script, for example:
import requests
# Fetch the proxy list from the raw JSON file
proxy_list_url = "https://raw.githubusercontent.com/oxylabs/free-proxy-list/main/proxy_list.json"
response = requests.get(proxy_list_url)
proxies = response.json()
# Use the first proxy in the list for a request
first_proxy = proxies[0]
proxy_url = f"{first_proxy['protocol']}://{first_proxy['ip']}:{first_proxy['port']}"
proxies_dict = {
    first_proxy['protocol']: proxy_url
}
try:
    test_response = requests.get("http://httpbin.org/ip", proxies=proxies_dict, timeout=5)
    print(test_response.json())
except requests.exceptions.RequestException as e:
    print(f"Proxy failed: {e}")
Final Thoughts
While you probably wouldn't run your mission-critical, production-grade scraping pipeline on a list of free proxies, this resource is incredibly handy for everything else. It saves you the initial legwork and provides a clean, programmatic way to access a pool of US-based IPs. For hobby projects, experiments, and light-duty tasks, it's a solid find that respects your time as a developer.
Check out the repo, clone it, and maybe even star it for the next time you need a quick proxy.
@githubprojects