rez.utils.memcached#

class rez.utils.memcached.Client#

Bases: object

Wrapper for memcache.Client instance.

Adds the following features:

  • unlimited key length;

  • hard/soft flushing;

  • ability to cache None.

miss = <rez.utils.memcached.Client._Miss object>#
logger = <rez.utils.logging_._Printer object>#
__init__(servers, debug=False)#

Create a memcached client.

Parameters:
  • servers (str or list of str) – Server URI(s), e.g. ‘127.0.0.1:11211’.

  • debug (bool) – If True, quasi human-readable keys are used. This helps debugging: run ‘memcached -vv’ in the foreground to see the keys being get/set/stored.
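A minimal construction sketch (the server URI below is illustrative):

from rez.utils.memcached import Client

client = Client(["127.0.0.1:11211"], debug=True)
client.set("greeting", "hello")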

property client#

Get the native memcache client.

Returns:

memcache.Client instance.

test_servers()#

Test that memcached servers are servicing requests.

Returns:

URIs of servers that are responding.

Return type:

set
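For example, assuming the client instance constructed above:

responding = client.test_servers()
if not responding:
    raise RuntimeError("no memcached servers are responding")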

set(key, val, time=0, min_compress_len=0)#

See memcache.Client.

get(key)#

See memcache.Client.

Returns:

A value if cached, else self.miss. Note that this differs from memcache.Client, which returns None on cache miss, and thus cannot cache the value None itself.

Return type:

object
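A typical cache-miss check, sketched against the client instance above (compute_value is a hypothetical helper):

value = client.get("expensive_key")
if value is client.miss:
    value = compute_value()  # hypothetical expensive computation
    client.set("expensive_key", value)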

delete(key)#

See memcache.Client.

flush(hard=False)#

Drop existing entries from the cache.

Parameters:

hard (bool) – If True, all current entries are flushed from the server(s), which affects all users. If False, only the local process is affected.
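A sketch of the two flush modes, using the client instance above:

client.flush()           # soft: only this process ignores its existing entries
client.flush(hard=True)  # hard: entries are dropped on the server(s), affecting all users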

get_stats()#

Get server statistics.

Returns:

A list of tuples (server_identifier, stats_dictionary).

reset_stats()#

Reset the server stats.

disconnect()#

Disconnect from server(s). Behaviour is undefined after this call.

rez.utils.memcached.memcached_client(servers=[], debug=False)#

Get a shared memcached instance.

This function shares the same memcached instance across nested invocations. This is done so that memcached connections are kept to a minimum, while unnecessary extra reconnections are avoided. Typically an initial scope (using the ‘with’ construct) is made around parts of code that hit the cache server many times, such as a resolve, or executing a context. On exit of the topmost scope, the memcached client is disconnected.

Returns:

Memcached instance.

Return type:

Client
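A usage sketch of the shared-client scope described above (the server URI is illustrative):

from rez.utils.memcached import memcached_client

with memcached_client(["127.0.0.1:11211"]) as client:
    client.set("foo", "bar")
    value = client.get("foo")
# on exit of the topmost scope, the client is disconnected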

rez.utils.memcached.pool_memcached_connections(func)#

Function decorator to pool memcached connections.

Use this to wrap functions that might make multiple calls to memcached. This will cause a single memcached client to be shared for all connections.
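A decorator usage sketch (resolve_many and resolve are hypothetical):

from rez.utils.memcached import pool_memcached_connections

@pool_memcached_connections
def resolve_many(requests):
    # all memcached calls made during this function share a single pooled client
    return [resolve(request) for request in requests]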

rez.utils.memcached.memcached(servers, key=None, from_cache=None, to_cache=None, time=0, min_compress_len=0, debug=False)#

memcached memoization function decorator.

The wrapped function is expected to return a value that is stored to a memcached server, first translated by to_cache if provided. In the event of a cache hit, the data is translated by from_cache if provided, before being returned. If you do not want a result to be cached, wrap the return value of your function in a DoNotCache object.

Examples:

import os

@memcached('127.0.0.1:11211')
def _listdir(path):
    return os.listdir(path)

Note

If using the default key function, ensure that repr() is implemented on all your arguments and that they are hashable.

Note

from_cache and to_cache both accept the value as the first parameter; the target function’s arguments follow (see the sketch after the parameter list below).

Parameters:
  • servers (str or list of str) – Memcached server URI(s), e.g. ‘127.0.0.1:11211’. This arg can also be None, in which case memcaching is disabled.

  • key (Optional[Callable]) – Function that, given the target function’s args, returns the string key to use in memcached.

  • from_cache (Optional[Callable]) – If provided, and a cache hit occurs, the cached value will be translated by this function before being returned.

  • to_cache (Optional[Callable]) – If provided, and a cache miss occurs, the function’s return value will be translated by this function before being cached.

  • time (int) – Tells memcached the time at which this value should expire, either as a delta number of seconds, or an absolute unix time-since-the-epoch value. See the memcached protocol docs section “Storage Commands” for more info on <exptime>. We default to 0, meaning cache forever.

  • min_compress_len (int) – The threshold length to kick in auto-compression of the value using the zlib.compress() routine. If the value being cached is a string, then the length of the string is measured, else if the value is an object, then the length of the pickle result is measured. If the resulting attempt at compression yields a larger string than the input, then it is discarded. For backwards compatibility, this parameter defaults to 0, indicating don’t ever try to compress.

  • debug (bool) – If True, memcache keys are kept human readable, so you can read them if running a foreground memcached proc with ‘memcached -vv’. However, this increases the chance of key clashes, so it should not be left turned on.
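A fuller sketch showing the key and to_cache hooks described above (the helper functions and expiry value are illustrative):

import os

def _key(path):
    return "listdir:%s" % path

def _to_cache(result, path):  # the value comes first, then the target function's args
    return sorted(result)

@memcached('127.0.0.1:11211', key=_key, to_cache=_to_cache, time=300)
def cached_listdir(path):
    return os.listdir(path)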

class rez.utils.memcached.DoNotCache#

Bases: object

__init__(result)#
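A sketch of skipping the cache for selected results (read_settings and the fallback value are illustrative):

from rez.utils.memcached import memcached, DoNotCache

@memcached('127.0.0.1:11211')
def read_settings(path):
    try:
        with open(path) as f:
            return f.read()
    except IOError:
        # a transient failure - return a value, but do not let it be cached
        return DoNotCache(None)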