deb_control_files:
- control
- md5sums
- postinst
- prerm
deb_fields:
Architecture: all
Depends: python3:any
Description: |-
multi backend asyncio cache
Asyncio cache supporting multiple backends (memory, redis and memcached).
This library aims for simplicity over specialization.
All caches share the same minimum interface, which consists of the following
functions:
.
- ``add``: Adds the key/value pair only if the key does not already exist.
- ``get``: Retrieves the value identified by key.
- ``set``: Sets a key/value pair.
- ``multi_get``: Retrieves multiple key/values.
- ``multi_set``: Sets multiple key/values.
- ``exists``: Returns True if the key exists, False otherwise.
- ``increment``: Increments the value stored at the given key.
- ``delete``: Deletes the key and returns the number of deleted items.
- ``clear``: Clears the items stored.
- ``raw``: Executes the specified command using the underlying client.
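.
As a minimal, illustrative sketch, the minimum interface above can be modelled as a plain in-memory class (hypothetical ``MiniCache``, not part of aiocache; real backends additionally handle TTLs, namespaces, serializers and plugins):
.
.. code-block:: python
.
import asyncio
class MiniCache:
    """Hypothetical in-memory model of the minimum interface (illustration only)."""
    def __init__(self):
        self._store = {}
    async def add(self, key, value):
        # Only adds key/value if key does not already exist.
        if key in self._store:
            raise ValueError("key already exists")
        self._store[key] = value
        return True
    async def get(self, key, default=None):
        return self._store.get(key, default)
    async def set(self, key, value):
        self._store[key] = value
        return True
    async def multi_get(self, keys):
        return [self._store.get(k) for k in keys]
    async def multi_set(self, pairs):
        self._store.update(dict(pairs))
        return True
    async def exists(self, key):
        return key in self._store
    async def increment(self, key, delta=1):
        self._store[key] = self._store.get(key, 0) + delta
        return self._store[key]
    async def delete(self, key):
        return 1 if self._store.pop(key, None) is not None else 0
    async def clear(self):
        self._store.clear()
        return True
async def demo():
    cache = MiniCache()
    await cache.multi_set([("a", 1), ("b", 2)])
    assert await cache.multi_get(["a", "b"]) == [1, 2]
    assert await cache.increment("a") == 2
    assert await cache.delete("b") == 1
asyncio.run(demo())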
.
.. role:: python(code)
:language: python
.
.. contents::
.
.. section-numbering:
.
Usage
=====
.
Using a cache is as simple as
.
.. code-block:: python
.
>>> import asyncio
>>> from aiocache import Cache
>>> cache = Cache(Cache.MEMORY)  # Cache.REDIS and Cache.MEMCACHED are also available; the default is Cache.MEMORY
>>> runner = asyncio.Runner()
>>> runner.run(cache.set('key', 'value'))
True
>>> runner.run(cache.get('key'))
'value'
>>> runner.close()
.
Or as a decorator
.
.. code-block:: python
.
import asyncio
.
from collections import namedtuple
.
from aiocache import cached, Cache
from aiocache.serializers import PickleSerializer
# With this we can store python objects in backends like Redis!
.
Result = namedtuple('Result', "content, status")
.
.
@cached(
    ttl=10, cache=Cache.REDIS, key="key", serializer=PickleSerializer(), port=6379, namespace="main")
async def cached_call():
    print("Sleeping for three seconds zzzz.....")
    await asyncio.sleep(3)
    return Result("content", 200)
.
.
async def run():
    await cached_call()
    await cached_call()
    await cached_call()
    cache = Cache(Cache.REDIS, endpoint="127.0.0.1", port=6379, namespace="main")
    await cache.delete("key")
.
if __name__ == "__main__":
    asyncio.run(run())
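.
The ``PickleSerializer`` used above is what makes storing a ``namedtuple`` in a backend like Redis possible: the value is pickled to bytes before being sent to the backend and unpickled on the way back. A stdlib-only sketch of that round trip (illustrative, not aiocache's actual implementation):
.
.. code-block:: python
.
import pickle
from collections import namedtuple
Result = namedtuple("Result", "content, status")
# A pickle-based serializer turns Python objects into bytes and back:
raw = pickle.dumps(Result("content", 200))  # bytes, storable in any binary backend
restored = pickle.loads(raw)                # back to the original namedtuple
assert isinstance(raw, bytes)
assert restored == Result("content", 200)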
.
The recommended way to instantiate a new cache is through the ``Cache`` constructor.
However, you can also instantiate a backend directly with ``aiocache.RedisCache``,
``aiocache.SimpleMemoryCache`` or ``aiocache.MemcachedCache``.
.
.
You can also set up cache aliases, making it easy to reuse configurations
.
.. code-block:: python
.
import asyncio
.
from aiocache import caches
.
# You can use either classes or strings for referencing classes
caches.set_config({
    'default': {
        'cache': "aiocache.SimpleMemoryCache",
        'serializer': {
            'class': "aiocache.serializers.StringSerializer"
        }
    },
    'redis_alt': {
        'cache': "aiocache.RedisCache",
        'endpoint': "127.0.0.1",
        'port': 6379,
        'timeout': 1,
        'serializer': {
            'class': "aiocache.serializers.PickleSerializer"
        },
        'plugins': [
            {'class': "aiocache.plugins.HitMissRatioPlugin"},
            {'class': "aiocache.plugins.TimingPlugin"}
        ]
    }
})
.
.
async def default_cache():
    cache = caches.get('default')  # This always returns the SAME instance
    await cache.set("key", "value")
    assert await cache.get("key") == "value"
.
.
async def alt_cache():
    cache = caches.create('redis_alt')  # This creates a NEW instance on every call
    await cache.set("key", "value")
    assert await cache.get("key") == "value"
.
.
async def test_alias():
    await default_cache()
    await alt_cache()
.
    await caches.get("redis_alt").delete("key")
.
.
if __name__ == "__main__":
    asyncio.run(test_alias())
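.
The difference between ``caches.get`` and ``caches.create`` above comes down to a cached-singleton versus a factory pattern. A stdlib-only sketch of that registry idea (hypothetical ``Registry`` class, not aiocache's actual code):
.
.. code-block:: python
.
class Registry:
    """Hypothetical sketch of the alias registry idea (illustration only)."""
    def __init__(self, config):
        self._config = config
        self._instances = {}
    def create(self, alias):
        # Builds a fresh object from the stored config on every call.
        return dict(self._config[alias])
    def get(self, alias):
        # Builds once, then returns the same cached instance thereafter.
        if alias not in self._instances:
            self._instances[alias] = self.create(alias)
        return self._instances[alias]
registry = Registry({"default": {"cache": "memory"}})
assert registry.get("default") is registry.get("default")
assert registry.create("default") is not registry.create("default")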
Homepage: https://github.com/aio-libs/aiocache
Installed-Size: '125'
Maintainer: Gianfranco Costamagna <locutusofborg@debian.org>
Package: python3-aiocache
Priority: optional
Recommends: python3-aiomcache, python3-marshmallow, python3-memcache, python3-msgpack,
python3-redis
Section: python
Source: aiocache
Version: 0.12.3-2
srcpkg_name: aiocache
srcpkg_version: 0.12.3-2