When to use sticky sessions

Sticky sessions keep the same residential exit IP for a chain of requests. Enable them when:
  • Your target site uses session cookies tied to IP (logged-in scraping)
  • The target has per-IP anti-bot scoring that punishes IP rotation (Cloudflare Bot Management, Akamai)
  • You’re doing multi-step checkout flows (add-to-cart → checkout → payment — same IP throughout)
  • You’re scraping a pagination chain and don’t want the site to see “different user” for each page
Don’t use them when:
  • The target site rate-limits per-IP — rotating IPs is the correct strategy (omit the session tag)
  • You’re scraping a static resource — no session state to preserve

Syntax

Append `-session-<tag>` to the username portion of your proxy auth:

```
avp_live_<key>-session-<tag>:<secret>
```

Combine with country pinning:

```
avp_live_<key>-country-us-session-<tag>:<secret>
```
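Assembling the username by hand is error-prone, so a small helper can help. A sketch — `proxy_url` is a hypothetical convenience function, not part of any SDK; the key, secret, and tag values are placeholders:

```python
from typing import Optional

# Hypothetical helper: builds the proxy URL from its parts.
# Host and port come from the docs; everything else is caller-supplied.
def proxy_url(key: str, secret: str, tag: str, country: Optional[str] = None) -> str:
    username = f"avp_live_{key}"
    if country:
        username += f"-country-{country}"
    username += f"-session-{tag}"
    return f"http://{username}:{secret}@api.atlasvpn.live:7777"
```

Pass the result as the `proxies` value in `requests`, e.g. `{"https": proxy_url(key, secret, "run1", country="us")}`.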

Tag rules

| Rule | Value |
| --- | --- |
| Allowed chars | `[a-z0-9_-]`, case-insensitive |
| Max length | 64 chars |
| Scope | `(keyId, tag)` pair: different keys with the same tag are separate sessions |
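An invalid tag fails silently (the proxy just stops being sticky), so it's worth validating tags client-side before they reach the proxy. A minimal sketch of that check, assuming the rules above:

```python
import re

# Matches the documented tag rules: [a-z0-9_-], case-insensitive, 1-64 chars.
TAG_RE = re.compile(r"^[a-z0-9_-]{1,64}$", re.IGNORECASE)

def valid_tag(tag: str) -> bool:
    """Return True if the tag is safe to use in a session username."""
    return TAG_RE.fullmatch(tag) is not None
```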

Lifetime

A sticky binding lives for 10 minutes after the last request that used it. Every request with the same tag refreshes the 10-minute window, so a continuously used session never expires. If the underlying exit node goes offline (its WebSocket disconnects), the binding is dropped and the next request picks a fresh node. Your session tag keeps working; it just silently binds to a different exit node.
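To keep a long-lived but idle session inside the 10-minute TTL, fire a cheap request every 5 minutes. A sketch, assuming `ping` is any zero-arg callable that makes one request through the proxy (e.g. `lambda: requests.get(url, proxies={"https": proxy}, timeout=10)`):

```python
import threading

def start_keepalive(ping, interval: float = 300.0) -> threading.Event:
    """Run `ping` every `interval` seconds in a daemon thread.

    Returns an Event; call .set() on it to stop the keep-alive.
    """
    stop = threading.Event()

    def loop():
        # wait() returns False on timeout, True once stop is set
        while not stop.wait(interval):
            try:
                ping()
            except Exception:
                pass  # a failed ping just means the binding may lapse and rebind

    threading.Thread(target=loop, daemon=True).start()
    return stop
```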

Examples

One sticky session per scraper run

```python
from datetime import date

import requests

run_id = f"product-scrape-{date.today().isoformat()}"
proxy = f"http://avp_live_<key>-country-us-session-{run_id}:<secret>@api.atlasvpn.live:7777"

# All requests today use the same US exit IP
for sku in skus:
    requests.get(f"https://target.com/sku/{sku}", proxies={"https": proxy})
```

One sticky session per logged-in account

```python
import requests

accounts = [("alice", "pw1"), ("bob", "pw2")]
for username, password in accounts:
    # Unique session per account — each one looks like a different user
    auth = f"avp_live_<key>-country-us-session-acct-{username}:<secret>"
    proxy = f"http://{auth}@api.atlasvpn.live:7777"
    session = requests.Session()
    session.proxies = {"https": proxy}
    session.post("https://target.com/login", data={"u": username, "p": password})
    # Continue scraping as this account through a single consistent exit IP
    for url in my_scrape_plan:
        session.get(url)
```

Per-worker session in a concurrent scraper

```python
import concurrent.futures
import threading
import uuid

import requests

# Thread-local storage gives each worker thread one stable tag,
# rather than a fresh tag (and fresh IP) for every URL it fetches.
local = threading.local()

def fetch(url):
    if not hasattr(local, "tag"):
        local.tag = uuid.uuid4().hex[:16]
    proxy = f"http://avp_live_<key>-session-{local.tag}:<secret>@api.atlasvpn.live:7777"
    return requests.get(url, proxies={"https": proxy})

with concurrent.futures.ThreadPoolExecutor(max_workers=20) as ex:
    results = list(ex.map(fetch, urls))
```
Each worker gets a fresh tag → fresh exit IP → parallelism without stepping on the rate limit.

Troubleshooting

| Symptom | Cause | Fix |
| --- | --- | --- |
| Each request shows a different IP despite the tag | Tag contains invalid chars | Stick to `[a-z0-9_-]`: no spaces, no slashes |
| Sticky session “stops working” after ~10 min | TTL expired | Fire a keep-alive request every 5 min, or refresh on demand |
| 502 after the session was working | Node went offline | Retry; the next request binds to a fresh node |
| Different sessions seem to share an IP | You ran out of nodes in that country | Pool depth is beta-scale; expect this in thin-pool countries |
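When debugging the first symptom, it helps to verify stickiness directly by comparing the exit IP across a few probes. A sketch, assuming `get_ip` is any callable that returns the exit IP the target sees (e.g. `lambda: requests.get("https://api.ipify.org", proxies={"https": proxy}, timeout=10).text` — the IP-echo service is an example, not part of the proxy API):

```python
def is_sticky(get_ip, probes: int = 3) -> bool:
    """Return True if several consecutive probes all report the same exit IP."""
    ips = {get_ip() for _ in range(probes)}
    return len(ips) == 1
```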