Issues - joshthecoder/tweepy - GitHub
Description: twitter api library for python
Feature Request
Here's a changeset with one approach: https://github.com/brianmichelich/tweepy/commit/5cd1b3d1ce5d760e6c4e08af353c4c1ce406b475
This allows you to set a proxy_host and proxy_port on the API() and have binder.py pick them up when creating a connection.
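For illustration, a rough sketch of that approach as described in the comment: the parameter names proxy_host and proxy_port come from the comment above, while the class and function bodies are simplified stand-ins for tweepy's actual API class and binder.py internals, not the code in the linked commit.

```python
import http.client  # the 2009-era code used Python 2's httplib

class API(object):
    """Simplified stand-in for tweepy.API with the proposed proxy options."""
    def __init__(self, host='twitter.com', proxy_host=None, proxy_port=None):
        self.host = host
        self.proxy_host = proxy_host  # forward proxy to route requests through
        self.proxy_port = proxy_port

def open_connection(api):
    """Sketch of what binder.py would do with those options: connect to the
    proxy when one is configured, otherwise connect directly to the API host."""
    if api.proxy_host:
        return http.client.HTTPConnection(api.proxy_host, api.proxy_port)
    return http.client.HTTPConnection(api.host)
```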
Comments
This feature is coming soon. Sit tight!
We can already provide a proxy host via the "host" parameter in the API init method.
There is not a way to provide a port, though that could be easily added. I will look into providing this in the next release. Thanks!
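For example, a reverse-proxy setup with that existing parameter might look like the following; proxy.example.com is a placeholder, and the exact keyword arguments accepted depend on the tweepy version in use.

```python
import tweepy

# Route API calls through a reverse proxy that forwards to twitter.com.
# "host" is the parameter mentioned above; other arguments are omitted.
api = tweepy.API(host='proxy.example.com')
```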
Unless I missed something, a changed host parameter would allow for a reverse proxy, whereas I was setting up a forward proxy.
The reverse proxy worked when I sent the request to foo.com and it was forwarded to twitter.com. Setting a host on the connection as well as on the request allowed the forward proxy to work. This let me use a forward proxy without setting one for the whole Python environment.
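For illustration, the forward-proxy pattern described here can be reproduced with the standard library alone: open the connection to the proxy host, but request the absolute twitter.com URL and keep the Host header pointed at twitter.com. The proxy address, port, and endpoint path below are placeholders.

```python
import http.client  # httplib in the Python 2 of this era

# Open the TCP connection to the forward proxy itself...
conn = http.client.HTTPConnection('proxy.example.com', 8080)
# ...but ask it for the absolute twitter.com URL, with the Host header
# still naming twitter.com, so the proxy relays the request onward.
conn.request('GET', 'http://twitter.com/statuses/public_timeline.json',
             headers={'Host': 'twitter.com'})
resp = conn.getresponse()
print(resp.status, resp.reason)
```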