This page provides Python code examples for using memcache. Memcache is a high-performance, distributed memory object caching system that provides fast access to cached data. To learn more about memcache, read the Memcache Overview.
The memcache Pattern
Memcache is typically used with the following pattern:
- The application receives a query from the user or the application.
- The application checks whether the data needed to satisfy that query is in memcache.
- If the data is in memcache, the application uses that data.
- If the data is not in memcache, the application queries the Datastore and stores the results in memcache for future requests.
The pseudocode below represents a typical memcache request:
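A minimal sketch of that request, assuming a hypothetical query_for_data() helper that runs the underlying Datastore query and a 60-second expiry chosen only for illustration:

```python
from google.appengine.api import memcache


def get_data():
    # Check memcache first.
    data = memcache.get('key')
    if data is not None:
        return data
    # Cache miss: run the (hypothetical) Datastore query and cache the result
    # for 60 seconds so later requests can skip the query.
    data = query_for_data()
    memcache.add('key', data, 60)
    return data
```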
ndb internally uses memcache to speed up queries. However, if you wish, you can also add explicit memcache calls to gain more control over caching.
Caching data
The following example demonstrates several ways to set values in memcache using the Python API.
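A minimal sketch of these calls, with illustrative keys, values, and a one-hour expiry (none of which come from the Guestbook sample):

```python
from google.appengine.api import memcache


def cache_weather_examples():
    # add() stores a value only if the key is not already cached.
    memcache.add(key='weather_USA_98105', value='raining', time=3600)

    # set() stores a value whether or not the key already exists.
    memcache.set(key='weather_USA_94105', value='partly cloudy', time=3600)

    # set_multi() stores several values in a single round trip.
    memcache.set_multi(
        {'USA_98115': 'cloudy', 'USA_94105': 'foggy', 'USA_94043': 'sunny'},
        key_prefix='weather_', time=3600)
```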
To learn more about the add(), set_multi(), and set() methods, see the memcache Python API documentation.
Modifying guestbook.py to use memcache
The Guestbook application queries the Datastore on every request (via ndb, so it already benefits from some memcache caching). You can modify the Guestbook application to check memcache explicitly before falling back to a Datastore query.
First we'll import the memcache module and create the method that checks memcache before running a query.
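One way this could look, shown as a module-level function for a self-contained sketch (in the Guestbook sample it could equally be a handler method); it relies on the render_greetings() helper sketched in the next step, and the 10-second expiry is arbitrary:

```python
import logging

from google.appengine.api import memcache


def get_greetings(guestbook_name):
    """Return the greetings HTML, checking memcache before the Datastore.

    Args:
        guestbook_name: name of the guestbook entity group (string).

    Returns:
        A string of HTML containing greetings.
    """
    greetings = memcache.get('{}:greetings'.format(guestbook_name))
    if greetings is None:
        # Cache miss: build the HTML from the Datastore and cache it briefly.
        greetings = render_greetings(guestbook_name)
        if not memcache.add('{}:greetings'.format(guestbook_name),
                            greetings, 10):
            logging.error('Memcache add failed.')
    return greetings
```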
Next we'll separate out the querying and creation of the HTML for the page. When we don't hit the cache, we'll call this method to query the Datastore and build the HTML string that we'll store in memcache.
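A sketch of that helper, assuming the Greeting model and guestbook_key() function from the existing Guestbook sample:

```python
import cgi
import cStringIO


def render_greetings(guestbook_name):
    """Query the Datastore for the latest greetings and build their HTML."""
    greetings = Greeting.query(
        ancestor=guestbook_key(guestbook_name)).order(
            -Greeting.date).fetch(10)
    output = cStringIO.StringIO()
    for greeting in greetings:
        if greeting.author:
            output.write('<b>{}</b> wrote:'.format(greeting.author))
        else:
            output.write('An anonymous person wrote:')
        output.write('<blockquote>{}</blockquote>'.format(
            cgi.escape(greeting.content)))
    return output.getvalue()
```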
Finally, we'll update the MainPage handler to call get_greetings() and display statistics about how many times the cache was hit or missed.
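A sketch of the updated handler, assuming webapp2 and the helpers above; memcache.get_stats() supplies the hit and miss counts:

```python
import webapp2

from google.appengine.api import memcache


class MainPage(webapp2.RequestHandler):

    def get(self):
        self.response.headers['Content-Type'] = 'text/html'
        guestbook_name = self.request.get('guestbook_name')

        # Fetch the greetings (served from memcache when possible).
        greetings = get_greetings(guestbook_name)

        # get_stats() may return None if statistics are unavailable.
        stats = memcache.get_stats() or {}

        self.response.write('<b>Cache Hits: {}</b><br>'.format(
            stats.get('hits', 0)))
        self.response.write('<b>Cache Misses: {}</b><br><br>'.format(
            stats.get('misses', 0)))
        self.response.write(greetings)
```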