adamfast's python-tweetar at master - GitHub
name | age | message
---|---|---
LICENSE | Sun Jan 31 20:45:17 -0800 2010 | Initial commit of license/readme. [adamfast]
README | Sun Jan 31 21:18:50 -0800 2010 | Added advanced example of how to set it up in a... [adamfast]
tweetar.py | Sun Jan 31 20:44:00 -0800 2010 | Initial commit. [adamfast]
README
This project has a very simple purpose: post the current METAR, as reported by the National Weather Service, to Twitter, where any interested parties can subscribe to it. This library is not intended to be used by its users or their end users for flight planning purposes. Remember your training and get your weather briefing from a certified source. If you don't know where to get weather, you can find a flight instructor at https://www.aopa.org/learntofly/findcfi/ who will be happy to teach you about aviation weather and other aspects of aviation.

Dependencies on your PythonPath:

* python-twitter https://code.google.com/p/python-twitter/

Procedure:

* Edit tweetar.py and replace <station_id>, <twitter_user> and <twitter_pass> with the appropriate values. If you want to report multiple stations, copy and paste the line as many times as necessary and change the values as appropriate.
* retrieve_and_post({'station': '<station_id>', 'twitter_user': '<twitter_user>', 'twitter_password': '<twitter_pass>'})
* Schedule "/path/to/python /path/to/tweetar.py" to run. My recommendation is once an hour - if you follow the METARs of your local station you'll know when it updates. It's best to schedule a pull 5-10 minutes after that time, since the update across NWS is not instant. Scheduling it that far out will miss SPECI updates, but it makes you a good internet citizen, since those occur so rarely on average (at least where I'm from). If you schedule it to run more often, note that Twitter's API will filter updates identical to the last post, so you won't flood the subscribers to the feed - but I wouldn't consider shifting that burden to them entirely fair. Could this script cache the last version and see if there's been a change? Sure, but you're still making a lot of no-op requests to the NWS. Do what you will; that's my stance on it. Your mileage and experience may vary.

Advanced Procedure:

Alternately, if you'd like to avoid editing the tweetar.py file in case there are ever updates to it, create a new Python script of your own (mine's called tweetar_custom.py) and do something like this, adding as many retrieve_and_post calls as necessary for the stations you're "hosting":

BEGIN FILE
from tweetar import * # this is so I don't have to change the raw tweetar.py to my users/passwords

retrieve_and_post({'station': 'station1', 'twitter_user': 'station1_user', 'twitter_password': 'station1_password'})
retrieve_and_post({'station': 'station2', 'twitter_user': 'station2_user', 'twitter_password': 'station2_password'})
END FILE

I'm doing it this way so I don't accidentally commit usernames/passwords into public sight.
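tweetar.py itself is not reproduced in this capture, so the following is only a rough sketch of what a retrieve_and_post-style function could look like: fetch the station's latest METAR text from the NWS and post it with python-twitter. The NWS URL pattern and the basic-auth twitter.Api usage are assumptions about the 2010-era services, not code taken from this repository.

```python
# Hypothetical sketch only - not the actual tweetar.py from this repository.
# Assumes the 2010-era basic-auth python-twitter API and the public NWS
# per-station METAR text files (the URL pattern below is an assumption).
import urllib2
import twitter

NWS_METAR_URL = 'https://weather.noaa.gov/pub/data/observations/metar/stations/%s.TXT'

def retrieve_and_post(conf):
    """Fetch the latest METAR for conf['station'] and post it to Twitter."""
    # The NWS file has two lines: an observation timestamp, then the raw METAR.
    raw = urllib2.urlopen(NWS_METAR_URL % conf['station'].upper()).read()
    metar = raw.strip().splitlines()[-1]

    api = twitter.Api(username=conf['twitter_user'],
                      password=conf['twitter_password'])
    # Twitter filters an update identical to the previous post, so re-posting
    # an unchanged METAR between observation updates is harmless.
    api.PostUpdate(metar)
```

Whatever the real implementation does, the scheduling advice above maps to a cron entry along the lines of `7 * * * * /path/to/python /path/to/tweetar_custom.py`, which runs the script a few minutes past every hour.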