Google Cookbook
Posted by dhruvbird on Tue 23 Mar 2010 in Memcache API
I am using a simple mutex implementation to be able to operate on entities that don't belong to the same entity group. Of course, this shouldn't be done too often and the lock shouldn't be held for too long, but sometimes you need this functionality. So, if anyone wants it, here is something I feel is robust enough.
It handles the case of clients (lockers) failing after acquiring a lock (and hence never unlocking). However, the best way to use this would be:
m = Mutex("Some Key", 2)
try:
    m.lock()
    # Do stuff here....
except:
    # Handle errors here....
    pass
finally:
    m.unlock()
The code:
from google.appengine.api.memcache import Client as mcClient
import datetime as dt
import time
import random

class Mutex:
    def __init__(self, key, maxTimeOut=1.5):
        self.key = key
        self.maxTimeOut = maxTimeOut
        self.locked = False

    def lock(self):
        if self.locked:
            return
        mc = mcClient()
        # Record when the lock was last taken so that stale locks
        # (from lockers that died without unlocking) can be broken later.
        mc.add(key=self.key + "@lockedAt", value=dt.datetime.now(), namespace="Mutex")
        # incr() is atomic: exactly one client sees the counter go 0 -> 1.
        v = mc.incr(self.key, namespace="Mutex", initial_value=1)
        while v != 1:
            # We didn't get the lock; undo our increment.
            mc.decr(self.key, namespace="Mutex")
            lastLockedAt = mc.get(self.key + "@lockedAt", namespace="Mutex")
            if lastLockedAt is None or \
               (dt.datetime.now() - lastLockedAt > dt.timedelta(seconds=self.maxTimeOut * 4)):
                # The holder appears to have died; break the stale lock.
                mc.set(key=self.key, value=0, namespace="Mutex")
            else:
                # Back off for a random fraction of maxTimeOut before retrying.
                random.seed(time.time())
                time.sleep(random.random() * self.maxTimeOut)
            v = mc.incr(self.key, namespace="Mutex", initial_value=1)
        mc.set(key=self.key + "@lockedAt", value=dt.datetime.now(), namespace="Mutex")
        self.locked = True

    def unlock(self):
        if self.locked:
            mc = mcClient()
            mc.decr(self.key, namespace="Mutex")
            self.locked = False

    def __del__(self):
        if self.locked:
            self.unlock()
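As a usage sketch, here is a hypothetical cross-entity-group update serialized by the mutex (the transfer_points function, the points property, and the key format are illustrative assumptions, not part of the original recipe):
from google.appengine.ext import db

def transfer_points(from_key, to_key, amount):
    # Serialize concurrent transfers that touch entities in different
    # entity groups, which a single datastore transaction cannot span.
    m = Mutex("transfer:%s:%s" % (from_key, to_key), 2)
    try:
        m.lock()
        src = db.get(from_key)
        dst = db.get(to_key)
        src.points -= amount
        dst.points += amount
        db.put([src, dst])
    finally:
        m.unlock()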
Posted by tikurahul on Mon 22 Mar 2010 in Datastore
Over the past couple of days, I have been experimenting a lot with Google App Engine; I am thrilled with the platform in general (both the Python and the Java runtimes).
I use the lower-level Java Datastore API because it gives me the convenience of storing data in a completely schema-less way. One caveat, however, is that while the Datastore is schema-less, index definitions need to be uploaded at deployment time.
The Google App Engine team recommends that you use the development server to run the required queries, which generates a file 'datastore-indexes-auto.xml'. This file is your index configuration, which the appcfg.cmd command then uses to create indexes on Google's datastore. For me this was a major deal breaker, as I did not know in advance which entity properties to index: a soft schema would be generated on the fly depending on the user using my application.
If you look at the current feature requests for the App Engine runtime, the ability to programmatically create indexes is on the list, but it is unfortunately low priority, or so it seems. However, I was not going to give up; I tried to determine what appcfg.cmd was doing to generate these indexes at deployment time. To my surprise, I found that the App Engine datastore exposes a set of RESTful endpoints, one of which is https://appengine.google.com/api/datastore/index/add, which appcfg.cmd uses to add the indexes at deployment time.
On further analysis, I found that most of the code you need to take advantage of this endpoint already exists in the 'google.appengine.tools' package (for the Python runtime). All you need to do is shown in the script below:
from google.appengine.tools import appengine_rpc

host = 'appengine.google.com'

def auth_function():
    return ('yourusername', 'yourpassword')

source = 'yourapplication.appspot.com'
rpc_server = appengine_rpc.HttpRpcServer(host, auth_function, 'Python 2.6', source)

# Authenticate
rpc_server._Authenticate()
if rpc_server.authenticated:
    print 'Authentication Successful.'
    for cookie in rpc_server.cookie_jar:
        if cookie.name and cookie.name == 'ACSID':
            print 'Authentication Token Obtained (%s => %s)' % (cookie.name, cookie.value)

# index_payload is a description of your indexes in YAML -
# this can be generated in various different ways.
index_payload = ('indexes:\n'
                 '- kind: Comment\n'
                 '  properties:\n'
                 '  - name: name\n'
                 '    direction: asc\n'
                 '  - name: comment\n'
                 '    direction: asc')

# Uploading Index
print rpc_server.Send(request_path='/api/datastore/index/add',
                      payload=index_payload,
                      app_id='<yourApplicationId>',
                      version='<yourVersionString>')
print 'Uploading Index Successful'
I deployed the above as a Python application and gave it the version string 'tasks'. Therefore the URL of this application became 'tasks.myapp.appspot.com'. This application had access to the same datastore, and thus I could generate indexes dynamically!
All I needed was the YAML for my index configuration, which could be generated in a variety of ways. I used JAXB to generate the bindings for 'datastore-indexes.xsd' and used that to generate the YAML. Once I had the YAML, I passed it on to tasks.myapp.appspot.com and I was done!
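If you are not on the JVM, the payload can also be built in Python; here is a minimal sketch (the make_index_yaml helper and its argument format are illustrative assumptions, not part of the original post):
def make_index_yaml(kind, properties):
    # properties is a list of (name, direction) pairs, in query order.
    lines = ['indexes:', '- kind: %s' % kind, '  properties:']
    for name, direction in properties:
        lines.append('  - name: %s' % name)
        lines.append('    direction: %s' % direction)
    return '\n'.join(lines) + '\n'

index_payload = make_index_yaml('Comment', [('name', 'asc'), ('comment', 'asc')])
print index_payload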
Posted by guy@multiniche.org on Mon 22 Mar 2010 in Python
This Python module and Django template demonstrate how cookies can be read and set.
You can view the cookies in the incoming request and the request headers. You can set a cookie value using a form in the Django template. You can view the server's response page, which sets the cookie in the browser.
This demonstration can be incorporated into an existing webapp project. You'll need to point a URL path from your WSGIApplication to the DisplayCookies RequestHandler, and you'll need to edit the template rendering at the end of the Python module to point correctly to the Django template in your directory tree.
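The module itself was posted as an attachment and is not preserved in this capture; below is a rough sketch matching the description above (the template filename, route, and form field names are assumptions):
import os
from google.appengine.ext import webapp
from google.appengine.ext.webapp import template
from google.appengine.ext.webapp.util import run_wsgi_app

class DisplayCookies(webapp.RequestHandler):
    def get(self):
        # self.request.cookies is a dict of the cookies the browser sent
        values = {
            'cookies': self.request.cookies.items(),
            'headers': self.request.headers.items(),
        }
        path = os.path.join(os.path.dirname(__file__), 'display_cookies.html')
        self.response.out.write(template.render(path, values))

    def post(self):
        # Set the cookie value submitted through the template's form
        name = self.request.get('name')
        value = self.request.get('value')
        self.response.headers.add_header(
            'Set-Cookie', str('%s=%s; path=/' % (name, value)))
        self.response.out.write('Cookie set: %s=%s' % (name, value))

application = webapp.WSGIApplication([('/cookies', DisplayCookies)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()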
Posted by recrabtree23 on Fri 12 Mar 2010 in Datastore
Ingredients
3 cups chicken stock
1 pound chicken tenders
1 bay leaf, fresh if available
1 tablespoon extra-virgin olive oil, 1 turn of the pan
4 slices thick, smoky center cut bacon, chopped
1 onion, finely chopped
4 cloves garlic, chopped
2 chipotles in adobo, chopped, plus 2 tablespoons sauce
1 (28-ounce) can crushed fire roasted tomatoes
Salt
4 cups lightly crushed corn tortilla chips
2 cups shredded fresh smoked mozzarella or smoked sharp white Cheddar, 3/4 pound
1 lime, cut into wedges
1/2 red onion, chopped
Freshly chopped cilantro leaves, for garnish
Directions
Bring broth to a simmer and add chicken tenders, poach 6 to 7 minutes with a bay leaf.
While chicken poaches, heat extra-virgin olive oil in a medium soup pot or deep skillet over medium-high heat. Add bacon and cook until crisp then remove with slotted spoon. Drain off excess fat, leaving 2 to 3 tablespoons in the pan. Add onions and garlic to the skillet and cook 5 minutes then stir in chipotles and tomatoes.
Remove chicken from stock, dice and then add to soup. Pass stock through a strainer then add to the soup.
Place a pile of crushed tortilla chips in the bottom of each soup bowl. Cover liberally with smoked cheese then ladle the hot soup down over the top. Serve with lime, raw onions and cilantro at table to finish the soup.
Posted by casualnews on Thu 11 Mar 2010 in Python
#! /usr/bin/python --
"""
reg.cgi by Michael Thornburgh.
This file is in the public domain.
IMPORTANT: This script is for illustrative purposes only. It does
not have user authentication or other access control measures that
a real production service would have.
This script should be placed in the cgi-bin location according to
your web server installation. The database is an SQLite3 database.
Edit the location of the database in variable "dbFile".
Create it with the following schema:
.schema
CREATE TABLE registrations (
m_username VARCHAR COLLATE NOCASE,
m_identity VARCHAR,
m_updatetime DATETIME,
PRIMARY KEY (m_username)
);
CREATE INDEX registrations_updatetime ON registrations (m_updatetime ASC);
"""
# CHANGE THIS
dbFile = '.../registrations.db'
import cgi
import sqlite3
import xml.sax.saxutils
query = cgi.parse()
db = sqlite3.connect(dbFile)
user = query.get('username', [None])[0]
identity = query.get('identity', [None])[0]
friends = query.get('friends', [])
# respond with a small XML document describing the update and the friends
print 'Content-type: text/plain\n'
print '<result>'
if user:
    try:
        c = db.cursor()
        c.execute("insert or replace into registrations values (?, ?, datetime('now'))", (user, identity))
        print '\t<update>true</update>'
    except:
        print '\t<update>false</update>'
for f in friends:
    print "\t<friend>\n\t\t<user>%s</user>" % (xml.sax.saxutils.escape(f), )
    c = db.cursor()
    c.execute("select m_username, m_identity from registrations where m_username = ? and m_updatetime > datetime('now', '-1 hour')", (f, ))
    for result in c.fetchall():
        eachIdent = result[1]
        if not eachIdent:
            eachIdent = ""
        print "\t\t<identity>%s</identity>" % (xml.sax.saxutils.escape(eachIdent), )
        if f != result[0]:
            print "\t\t<registeredAs>%s</registeredAs>" % (xml.sax.saxutils.escape(result[0]), )
    print "\t</friend>"
db.commit()
print "</result>"
Posted by davide.rognoni on Mon 08 Mar 2010 in Datastore
In my app I have this "dynamic query":
q = model.Message.all()
q.filter("status =", "visible")
if sex != "":
    q.filter("sex =", sex)
if sexual_orientation != "":
    q.filter("sexual_orientation =", sexual_orientation)
if age != "":
    birth_year = now.year - int(age)
    q.filter("birth_year =", birth_year)
if residency != "":
    q.filter("residency =", residency)
if birth_country != "":
    q.filter("birth_country =", birth_country)
q.order("-last_update")
Now using this form:
https://lms4milano.appspot.com/latest_messages
I see errors like this:
NeedIndexError: no matching index found.
This query needs this index:
- kind: Message
  properties:
  - name: sex
  - name: status
  - name: last_update
    direction: desc
This is my solution to AUTOGENERATE all necessary indexes:
INDEX = '''\
- kind: Message
  properties:
%s  - name: status
  - name: last_update
    direction: desc
'''

INDEX_NAME = '''\
  - name: %s
'''

def bin(n, p):
    """Return n as a binary string, zero-padded to p digits."""
    pad = "%%0%is" % p
    s = ''
    while n != 0:
        if n % 2 == 0:
            bit = '0'
        else:
            bit = '1'
        s = bit + s
        n >>= 1
    s = pad % s
    s = s.replace(" ", "0")  # "%0Ns" pads strings with spaces, not zeros
    return s

def names(l, bin):
    """Emit one "- name: ..." line for each bit set in the combination."""
    bin = bin[::-1]
    text = ""
    count = 0
    for b in bin:
        if b == "1":
            text += INDEX_NAME % l[count]
        count += 1
    return text

def main():
    filters = ["sex", "sexual_orientation", "birth_year", "residency", "birth_country"]
    filters_size = len(filters)
    combinations = 2**filters_size  # every subset of the optional filters
    text = ""
    for i in range(combinations):
        text += INDEX % names(filters, bin(i, filters_size))
    # paste the output under "indexes:" in your index.yaml
    print text

main()
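For example, the combination with only the sex bit set (i = 1) emits exactly the index that the error above asked for:
- kind: Message
  properties:
  - name: sex
  - name: status
  - name: last_update
    direction: desc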
Posted by Brandon.J.Thomson on Tue 23 Feb 2010 in Python
Add this snippet to your deployment script to automatically tag the current HEAD in git with the current version from app.yaml. If you deploy a new version with the same name, it will overwrite the old tag.
Very useful for keeping track of where your deployed versions are in your VCS without any extra work.
import os
import yaml

with open("app.yaml") as f:
    data = yaml.load(f, Loader=yaml.Loader)

os.system("git tag -f -a 'v%s' -m 'automatically set by deployment script'" % data['version'])
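For example, with version: 42 in app.yaml this runs git tag -f -a 'v42' -m 'automatically set by deployment script' against your current HEAD. Note that -f only moves the tag locally; if you mirror tags to a remote, you will also need to push them with git push --force --tags.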
Posted by mike@robellard.com on Tue 16 Feb 2010 in Webapp Framework
I modified the example profiling code from here:
https://code.google.com/appengine/kb/commontasks.html#profiling
to be a WSGI middleware. It's very basic and could be expanded to allow configuration options; I was in a hurry, but thought I should share.
import cProfile, pstats, StringIO
import logging

class ProfileWSGIMiddleware(object):
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        # Profile the wrapped WSGI application for this request
        prof = cProfile.Profile()
        result = prof.runcall(self.app, environ, start_response)
        stream = StringIO.StringIO()
        stats = pstats.Stats(prof, stream=stream)
        stats.sort_stats("time")  # Or "cumulative"
        stats.print_stats(80)  # 80 = how many entries to print
        # The rest is optional.
        # stats.print_callees()
        # stats.print_callers()
        logging.info("Profile data:\n%s", stream.getvalue())
        # Return the wrapped app's response so the request still completes
        return result

application = ProfileWSGIMiddleware(application)
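For context, a minimal sketch of wiring this into a webapp application (the handler and route are illustrative assumptions):
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainHandler(webapp.RequestHandler):
    def get(self):
        self.response.out.write('hello')

# Wrap the whole application so every request gets profiled
application = ProfileWSGIMiddleware(webapp.WSGIApplication([('/', MainHandler)]))

def main():
    run_wsgi_app(application)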
Posted by bruno.braga on Wed 03 Feb 2010 in Datastore
This is a short and simple article recording the format needed to query the Datastore Viewer (in the Google App Engine Admin panel) using the GQL form. Among other properties, the one that intrigued me was the user property.
For instance:
SELECT * FROM Model WHERE user='me@gmail.com'
or
SELECT * FROM Model WHERE user='me'
just don't work.
I didn't find any other report about this, so I decided to leave it here in case it helps others (sometimes you want to run queries directly against the datastore without writing backdoor scripts, right?).
The solution that will work is:
SELECT * FROM Model WHERE user=User('me@gmail.com')
Here is one example used to query data in the application I am working on:
https://www.brunobraga.net/gae-help/files/gae_datastore_viewer_gql_user.jpeg
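The same filter works from code as well; a minimal sketch (the Model kind and email address follow the examples above):
from google.appengine.ext import db
from google.appengine.api import users

class Model(db.Model):
    user = db.UserProperty()

q = db.GqlQuery("SELECT * FROM Model WHERE user = :1",
                users.User('me@gmail.com'))
for entity in q:
    print entity.user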