Class-based decorators use a class with __init__ and __call__ instead of nested functions. They excel at maintaining state across calls -- counting invocations, accumulating timing data, managing caches with custom eviction. But because the class instance replaces the function in the namespace, the same metadata loss problem exists: __name__, __doc__, and the function signature disappear. The fix is functools.update_wrapper, the function that @functools.wraps calls under the hood, adapted for use inside a class __init__.
How a Class-Based Decorator Works
A class-based decorator is a class that implements the __call__ method, making its instances callable. When applied with the @ syntax, Python creates an instance of the class, passing the decorated function to __init__. Every subsequent call to the decorated function invokes __call__ on that instance.
```python
class MyDecorator:
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        print(f"Before {self.func.__name__}")
        result = self.func(*args, **kwargs)
        print(f"After {self.func.__name__}")
        return result

@MyDecorator
def greet(name):
    """Return a greeting."""
    return f"Hello, {name}"

# greet is now a MyDecorator instance, not a function
greet("Alice")  # Before greet / Hello, Alice / After greet
```
After decoration, the name greet points to a MyDecorator instance. That instance stores the original function as self.func and calls it inside __call__. The decorator works, but the function's identity is gone.
Writing @MyDecorator above a function definition is syntactic sugar. Python executes it as greet = MyDecorator(greet). The original function object is passed to __init__, and the name greet in the current scope is rebound to the new MyDecorator instance. From that point forward, every call to greet() routes through __call__ on that instance. The original function still exists -- it is just stored privately as self.func inside the instance.
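The equivalence can be checked by hand. The sketch below (a pared-down MyDecorator without the print statements) applies the class manually instead of using the @ syntax:

```python
class MyDecorator:
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)

def greet(name):
    """Return a greeting."""
    return f"Hello, {name}"

# Manual application -- exactly what @MyDecorator does at definition time
greet = MyDecorator(greet)

print(type(greet).__name__)  # MyDecorator
print(greet("Alice"))        # Hello, Alice
print(greet.func.__name__)   # greet -- the original function survives as self.func
```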
The Metadata Problem
```python
print(greet.__doc__)   # None (should be "Return a greeting.")
print(type(greet))     # <class '__main__.MyDecorator'> (not a function)
print(greet.__name__)  # AttributeError: 'MyDecorator' object has no attribute '__name__'
```
Accessing __name__ raises an AttributeError rather than returning anything useful: the instance has no __name__ of its own, and a class's __name__ lives on the metaclass, so it is not reachable through normal instance attribute lookup. The docstring is gone. help(greet) describes a MyDecorator object with no useful information about the original function. This is the same metadata loss that function-based decorators suffer, but the fix is slightly different because there is no wrapper function to apply @functools.wraps to.
Metadata loss is not just an aesthetic problem. Automated documentation tools (Sphinx, pdoc) read __name__ and __doc__ to generate API references. Debugging tools and stack trace formatters use __name__ to identify the function in an error message. Type checkers and IDEs use __annotations__ to surface parameter hints. inspect.signature() -- used by frameworks like FastAPI to extract route parameters -- follows the __wrapped__ chain to find the original signature. Strip the metadata and all of these tools see a featureless class instance instead of the function you wrote.
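To see the tooling impact directly, run inspect against a bare class-based decorator (a minimal sketch without the fix applied):

```python
import inspect

class MyDecorator:
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)

@MyDecorator
def greet(name: str) -> str:
    """Return a greeting."""
    return f"Hello, {name}"

# Tools that introspect the callable see the wrapper's generic
# signature, not the original function's:
print(inspect.signature(greet))  # (*args, **kwargs)
print(greet.__doc__)             # None
```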
Applying functools.update_wrapper in __init__
The fix is a single line in __init__:
```python
import functools

class MyDecorator:
    def __init__(self, func):
        functools.update_wrapper(self, func)  # <-- the fix
        self.func = func

    def __call__(self, *args, **kwargs):
        print(f"Before {self.func.__name__}")
        result = self.func(*args, **kwargs)
        print(f"After {self.func.__name__}")
        return result

@MyDecorator
def greet(name: str) -> str:
    """Return a greeting."""
    return f"Hello, {name}"

print(greet.__name__)     # greet
print(greet.__doc__)      # Return a greeting.
print(greet.__wrapped__)  # <function greet at 0x...>
```
functools.update_wrapper(self, func) copies __name__, __doc__, __module__, __qualname__, __annotations__, and (on Python 3.12+) __type_params__ from the original function onto the class instance. It also merges the function's __dict__ and sets __wrapped__ to reference the original function. This is exactly what @functools.wraps(func) does in function-based decorators — @wraps just calls update_wrapper internally. The full list of copied attributes is defined in functools.WRAPPER_ASSIGNMENTS; the Python documentation at docs.python.org/3/library/functools.html lists the current defaults for each Python version.
Think of functools.update_wrapper(wrapper, wrapped) as the imperative form and @functools.wraps(wrapped) as the declarative shorthand. They produce identical results. The shorthand exists because decorating a function definition with @wraps reads naturally above a def. In a class, there is no def to decorate -- the instance itself is the replacement -- so you call the underlying function directly. The arguments are reversed from what you might expect: the first argument is the object receiving the copied attributes (self), the second is the source (func).
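A quick check of the equivalence, using two throwaway factory functions (make_wrapper_with_wraps and make_wrapper_with_update are illustrative names, not library APIs):

```python
import functools

def original():
    """Original docs."""

def make_wrapper_with_wraps(func):
    @functools.wraps(func)          # declarative shorthand
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def make_wrapper_with_update(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    functools.update_wrapper(wrapper, func)  # imperative form, same effect
    return wrapper

a = make_wrapper_with_wraps(original)
b = make_wrapper_with_update(original)
print(a.__name__, b.__name__)                      # original original
print(a.__wrapped__ is b.__wrapped__ is original)  # True
```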
Call functools.update_wrapper(self, func) before assigning self.func = func or any other instance attributes. update_wrapper merges the original function's __dict__ into self.__dict__. If you set instance attributes first and the function's __dict__ happens to contain a key with the same name, the merge will overwrite your attribute. Calling it first means your subsequent assignments win.
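A sketch of the collision, using a hypothetical custom attribute (count) planted in the function's __dict__:

```python
import functools

class BadOrder:
    def __init__(self, func):
        self.count = 0                        # set BEFORE the merge...
        functools.update_wrapper(self, func)  # ...which can clobber it

class GoodOrder:
    def __init__(self, func):
        functools.update_wrapper(self, func)  # merge first
        self.count = 0                        # our attribute wins

def task():
    pass

task.count = 99  # hypothetical custom attribute in the function's __dict__

print(BadOrder(task).count)   # 99 -- clobbered by the __dict__ merge
print(GoodOrder(task).count)  # 0  -- assignment after the merge wins
```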
A Stateful Decorator: Call Counter
The primary advantage of class-based decorators is persistent state. Instance attributes survive across calls, making patterns like call counting trivial:
```python
import functools

class CountCalls:
    """Track how many times the decorated function is called."""
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

@CountCalls
def process_order(order_id: int) -> str:
    """Process a customer order."""
    return f"Order {order_id} processed"

process_order(101)
process_order(102)
process_order(103)

print(process_order.count)     # 3
print(process_order.__name__)  # process_order
```
The count attribute persists on the class instance and increments with each call. In a function-based decorator, you would need a mutable container in the closure (like a list) or the nonlocal keyword to achieve the same thing. The class approach is more readable because the state is an explicit attribute with a clear name.
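For contrast, here is a sketch of the closure-based equivalent (count_calls is an illustrative name); the nonlocal declaration and the attribute mirroring are exactly the boilerplate the class version avoids:

```python
import functools

def count_calls(func):
    count = 0
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        nonlocal count             # required to rebind the enclosing variable
        count += 1
        wrapper.count = count      # mirror closure state onto an attribute
        return func(*args, **kwargs)
    wrapper.count = 0
    return wrapper

@count_calls
def process_order(order_id: int) -> str:
    return f"Order {order_id} processed"

process_order(101)
process_order(102)
print(process_order.count)     # 2
print(process_order.__name__)  # process_order
```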
When Python evaluates @CountCalls, it creates one CountCalls instance and binds it to the function name. That single instance lives for the lifetime of the decorated name -- it is not re-created on every call. So self.count accumulates across every invocation, just like a counter in a module-level variable, but scoped cleanly to that specific decorated function. If you decorate two separate functions with @CountCalls, each gets its own instance with its own independent count.
A Stateful Decorator: Timing Accumulator
A more advanced example accumulates execution times across calls and provides summary statistics:
```python
import functools
import time

class TimingStats:
    """Accumulate execution timing statistics across calls."""
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.call_count = 0
        self.total_time = 0.0

    def __call__(self, *args, **kwargs):
        start = time.perf_counter()
        result = self.func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        self.call_count += 1
        self.total_time += elapsed
        return result

    @property
    def avg_time(self):
        if self.call_count == 0:
            return 0.0
        return self.total_time / self.call_count

@TimingStats
def compute_report(data: list) -> dict:
    """Generate a summary report from raw data."""
    return {"count": len(data), "total": sum(data)}

for _ in range(100):
    compute_report(list(range(10000)))

print(f"Calls: {compute_report.call_count}")
print(f"Total: {compute_report.total_time:.4f}s")
print(f"Average: {compute_report.avg_time:.6f}s")
print(f"Name: {compute_report.__name__}")  # compute_report
```
The class instance acts as both the callable replacement and the statistics container. Custom properties like avg_time provide computed summaries that would be awkward to expose from a function-based decorator's closure.
Parameterized Class-Based Decorators
When a class-based decorator needs its own configuration, the pattern changes. The class __init__ receives the parameters, and __call__ receives the function. This means __call__ must return a wrapper, and update_wrapper goes on that wrapper:
```python
import functools

class Repeat:
    """Call the decorated function n times."""
    def __init__(self, n=2):
        self.n = n  # __init__ receives the parameter

    def __call__(self, func):   # __call__ receives the function
        @functools.wraps(func)  # @wraps works here -- it's a function def
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(self.n):
                result = func(*args, **kwargs)
            return result
        return wrapper

@Repeat(n=3)
def say_hello(name: str) -> None:
    """Print a greeting."""
    print(f"Hello, {name}")

say_hello("Alice")
# Hello, Alice
# Hello, Alice
# Hello, Alice

print(say_hello.__name__)  # say_hello
```
Notice that in this pattern, __call__ returns a regular wrapper function, so the standard @functools.wraps(func) syntax works on that function's def line. The update_wrapper form is only needed when the class instance itself is the callable replacement (the non-parameterized pattern in the sections above).
The rule for which metadata technique to use: if the class instance replaces the function, use functools.update_wrapper(self, func) in __init__. If the class's __call__ returns a function that replaces the original, use @functools.wraps(func) on that function.
The Method Binding Problem and __get__
A class-based decorator that works on standalone functions may fail silently when applied to a method inside a class. The problem is that regular functions in Python are descriptors -- they implement __get__, which is how Python binds self to methods. A class instance does not implement __get__ by default, so the method binding mechanism breaks.
```python
# This will fail when used on a method
class Greeter:
    @CountCalls
    def say_hi(self, name):
        return f"Hi, {name}"

g = Greeter()
g.say_hi("Alice")
# TypeError: say_hi() missing 1 required positional argument: 'name'
# ("Alice" lands in the self slot because no instance was ever bound)
```
The fix is to implement the __get__ method on the decorator class, making it a descriptor that participates in Python's method binding protocol:
```python
import functools

class CountCalls:
    """Track call count -- works on functions AND methods."""
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

class Greeter:
    @CountCalls
    def say_hi(self, name):
        """Greet someone by name."""
        return f"Hi, {name}"

g = Greeter()
print(g.say_hi("Alice"))     # Hi, Alice
print(g.say_hi("Bob"))       # Hi, Bob
print(Greeter.say_hi.count)  # 2
```
The __get__ method is called when the descriptor is accessed as an attribute of an instance. When obj is not None (accessed from an instance), it returns a functools.partial that pre-binds obj as the first argument, simulating the normal method binding that Python performs for regular functions. When accessed from the class directly (obj is None), it returns the decorator instance unchanged.
Normal Python functions implement __get__ internally. When you write instance.method, Python finds the function in the class's namespace and calls its __get__(instance, type), which returns a bound method with self pre-filled. A class-based decorator instance is just an object -- it does not inherit this __get__ behavior unless you explicitly add it. The functools.partial(self, obj) pattern recreates what function binding does natively: it produces a callable where the first argument (self of the enclosing class) is already filled in.
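The binding machinery can be invoked by hand to see it work (greet and Host are throwaway names for this sketch):

```python
def greet(self, name):
    return f"{type(self).__name__} greets {name}"

class Host:
    pass

h = Host()

# What Python does behind h.method access: call the function's __get__
bound = greet.__get__(h, Host)
print(type(bound).__name__)  # method
print(bound.__self__ is h)   # True -- self is already filled in
print(bound("Alice"))        # Host greets Alice
```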
If you plan to use your class-based decorator on methods inside classes, you must implement __get__. Without it, the decorator will fail with a TypeError about missing the self argument. This is the single largest pitfall with class-based decorators and catches many developers off guard.
Complete Production Template
This template includes metadata preservation, descriptor protocol support, and explicit state tracking. Copy it as a starting point for any class-based decorator:
```python
import functools

class MyClassDecorator:
    """Template for a production-grade class-based decorator."""
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        # Initialize your state attributes here

    def __call__(self, *args, **kwargs):
        # YOUR PRE-CALL LOGIC HERE
        result = self.func(*args, **kwargs)
        # YOUR POST-CALL LOGIC HERE
        return result

    def __get__(self, obj, objtype=None):
        """Support instance method binding."""
        if obj is None:
            return self
        return functools.partial(self, obj)
```
- Import `functools`. Both `functools.update_wrapper` and `functools.partial` live in this standard library module.
- Call `functools.update_wrapper(self, func)` as the first line of `__init__`. This copies `__name__`, `__doc__`, `__module__`, `__qualname__`, `__annotations__`, and `__type_params__` (Python 3.12+) from the original function onto the class instance, merges `__dict__`, and sets `__wrapped__`.
- Assign `self.func = func` after the `update_wrapper` call. Placing your own instance attributes after the call ensures they are not overwritten by the original function's `__dict__` merge.
- Implement `__call__(self, *args, **kwargs)` with your wrapper logic. Call `self.func(*args, **kwargs)` inside it and return the result. Add state mutations, logging, or guards around that call as needed.
- Add `__get__(self, obj, objtype=None)` if the decorator will be used on instance methods. Return `self` when `obj is None` and return `functools.partial(self, obj)` otherwise to replicate Python's method binding behavior.
What Happens When the Wrapped Function Raises?
Every stateful example in this article assumes the wrapped function returns normally. In production that assumption fails. If self.func(*args, **kwargs) raises an exception, execution exits __call__ immediately -- and any state update written after that line never runs.
Compare how the timing accumulator behaves depending on where you write the state update:
```python
def __call__(self, *args, **kwargs):
    start = time.perf_counter()
    result = self.func(*args, **kwargs)  # raises here?
    elapsed = time.perf_counter() - start
    self.call_count += 1                 # never reached if func raises
    self.total_time += elapsed           # never reached if func raises
    return result
```
If self.func raises, call_count and total_time are not updated. That may be exactly what you want -- failed calls arguably should not count. But if your intent is to record every invocation including failures, you need a try/finally block:
```python
def __call__(self, *args, **kwargs):
    start = time.perf_counter()
    try:
        result = self.func(*args, **kwargs)
        return result
    finally:
        # Runs whether the function returns or raises
        elapsed = time.perf_counter() - start
        self.call_count += 1
        self.total_time += elapsed
```
The finally block runs unconditionally. The exception still propagates to the caller -- you are not suppressing it -- but the timing and count are always recorded. For a CountCalls-style decorator, the same decision applies: increment before the call (counts attempts), after (counts successes), or in finally (counts all, regardless of outcome). There is no universally correct answer. The right choice depends on what the counter is for.
For every stateful __call__, ask three questions before shipping: Should failures update the state? Should partial updates be visible to other threads mid-call? Should state be resettable from outside the decorator? Each answer changes the implementation.
Is the State Thread-Safe?
Instance attributes in a class-based decorator are shared state. If two threads call the same decorated function concurrently, they both write to the same self.count or self.total_time. In CPython, integer increment (self.count += 1) is not atomic at the bytecode level -- it compiles to a read, an add, and a write, and a thread switch can occur between any two of those steps.
```python
import functools
import threading

class CountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0
        self._lock = threading.Lock()  # add a lock

    def __call__(self, *args, **kwargs):
        with self._lock:
            self.count += 1  # now atomic
        return self.func(*args, **kwargs)  # run outside the lock

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)
```
Notice that the lock wraps only the counter update, not the entire function call. Holding a lock across a potentially slow operation blocks other threads for the full duration. If the wrapped function is fast, locking around the whole call is acceptable. If it is slow -- a network request, a file read, a database query -- keep the lock scope as tight as possible around the state mutation only.
Single-threaded scripts, WSGI servers running one worker process, and any code where the decorated function is only called from one thread at a time do not need locks. Async code using asyncio is also single-threaded by default -- coroutines do not preempt each other the way threads do, so standard += is safe within a single event loop. Locks become necessary in multi-threaded environments: web servers using threading workers (Gunicorn with --worker-class=gthread), background task runners, or any code that calls threading.Thread. If you are not sure, adding a lock costs almost nothing in the single-threaded case and prevents a subtle class of bugs in the multi-threaded one.
Stacking Class-Based Decorators
Decorators stack from bottom to top. When you write two decorators above a function, the innermost one (closest to def) is applied first, and the outermost wraps the result of the inner one. With class-based decorators, the outermost decorator receives a class instance -- not the original function -- as its argument.
```python
@TimingStats  # applied second -- wraps the CountCalls instance
@CountCalls   # applied first -- wraps the original function
def process_order(order_id: int) -> str:
    """Process a customer order."""
    return f"Order {order_id} processed"

# Equivalent to:
# process_order = TimingStats(CountCalls(process_order))
```
The TimingStats instance receives a CountCalls instance as func. When it calls functools.update_wrapper(self, func), it copies the metadata from the CountCalls instance -- which already had update_wrapper applied to it, and therefore already carries the original function's __name__, __doc__, and __wrapped__. The chain is preserved because update_wrapper also sets __wrapped__ on the outer wrapper to point to the inner one, so inspect.signature() can follow the entire chain back to the original function.
```python
print(process_order.__name__)              # process_order
print(process_order.__wrapped__.__name__)  # process_order (the CountCalls instance)
print(type(process_order))                 # <class 'TimingStats'>
print(type(process_order.__wrapped__))     # <class 'CountCalls'>

# Each layer's state is accessible on its own instance
process_order(101)
print(process_order.call_count)         # 1 (TimingStats)
print(process_order.__wrapped__.count)  # 1 (CountCalls)
```
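To confirm that introspection survives stacking, the sketch below re-creates the situation with a minimal Layer class (an illustrative stand-in for CountCalls and TimingStats, keeping only the update_wrapper call):

```python
import functools
import inspect

class Layer:
    """Minimal stand-in for a metadata-preserving class decorator."""
    def __init__(self, func):
        functools.update_wrapper(self, func)  # sets __wrapped__ on each layer
        self.func = func

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)

@Layer  # outer -- wraps the inner Layer instance
@Layer  # inner -- wraps the original function
def process_order(order_id: int) -> str:
    """Process a customer order."""
    return f"Order {order_id} processed"

# inspect.signature follows the __wrapped__ chain through both layers
print(inspect.signature(process_order))  # (order_id: int) -> str
print(process_order.__name__)            # process_order
```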
@classmethod / @staticmethod
The __get__ fix that makes a class-based decorator work on instance methods does not extend to @classmethod or @staticmethod. Both of those have their own descriptor machinery. If you place a class-based decorator on a method that is also decorated with @classmethod, order matters and the result may be surprising -- the class-based decorator should generally go inside (closer to def) so that @classmethod wraps the already-decorated callable. Test explicitly; do not assume the stacking order is safe.
Controlling What update_wrapper Copies
functools.update_wrapper accepts two optional parameters that control which attributes are copied: assigned and updated. Their defaults are the module-level constants functools.WRAPPER_ASSIGNMENTS and functools.WRAPPER_UPDATES.
```python
import functools

# Default values -- shown here for reference (Python 3.12+)
print(functools.WRAPPER_ASSIGNMENTS)
# ('__module__', '__name__', '__qualname__', '__annotations__',
#  '__type_params__', '__doc__')
print(functools.WRAPPER_UPDATES)
# ('__dict__',)
```
WRAPPER_ASSIGNMENTS lists the attributes that are directly copied (set) on the wrapper. WRAPPER_UPDATES lists the attributes that are merged with update() -- meaning the wrapper's existing __dict__ is updated with the wrapped function's __dict__, not replaced. The __wrapped__ attribute is set unconditionally by update_wrapper regardless of these parameters.
You can pass custom values to either parameter when a specific attribute should be excluded or when you want to copy something not in the defaults:
```python
import functools

class MyDecorator:
    def __init__(self, func):
        # Skip __doc__ -- this decorator intentionally hides the original docstring
        functools.update_wrapper(
            self, func,
            assigned=('__module__', '__name__', '__qualname__',
                      '__annotations__', '__type_params__')
        )
        self.func = func
        self.__doc__ = "Wrapped by MyDecorator. See __wrapped__ for original docs."
```
In practice, the defaults are correct for the overwhelming majority of decorators. Missing attributes are not a problem: if the wrapped object lacks one of the default attributes -- a functools.partial object has no __name__, for example -- update_wrapper silently skips it rather than raising. The main reasons to override assigned are to deliberately withhold an attribute, as in the docstring example above, or to copy a custom attribute that is not in the defaults.
Think of assigned as a list of fields to stamp directly onto the wrapper -- each one overwrites whatever was there before. Think of updated as a list of dictionaries to merge, not overwrite -- the wrapper's existing keys survive, and only new keys from the wrapped function are added. That is why __dict__ is in WRAPPER_UPDATES rather than WRAPPER_ASSIGNMENTS: you want to add the wrapped function's custom attributes to the wrapper without erasing the wrapper's own state attributes like count or total_time.
Wrapping Async Functions
The standard class-based decorator template misbehaves subtly when applied to a coroutine function. Calling self.func(*args, **kwargs) on a coroutine function does not execute the body -- it merely creates and returns a coroutine object. A plain pass-through __call__ still returns something the caller can await, but any logic placed after that call runs before the coroutine body has executed, timing measurements capture only coroutine creation, and inspect.iscoroutinefunction() reports False for the decorated object -- which breaks frameworks that use it to decide whether to await.
```python
import functools
import asyncio
import inspect

class CountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

@CountCalls
def fetch(url: str) -> str:  # imagine this were declared async def
    return f"result from {url}"

# Subtly wrong on an async function: __call__ returns a coroutine object.
# Awaiting it still works, but count increments before the body runs, any
# post-call logic executes too early, and iscoroutinefunction() on the
# decorated object is False -- so frameworks that check it will not await.
print(inspect.iscoroutinefunction(fetch))  # False
```
The correct solution is to use two separate classes — one with a synchronous __call__, one with an async def __call__ — and a factory function that returns the right variant. This is the pattern used by production libraries such as wrapt. Setting self.__call__ = async_fn inside __init__ does not work: Python resolves __call__ through the type's C-level slot, not through the instance's __dict__, so the instance attribute is silently ignored when the object is called with ().
On Python 3.12+, call inspect.markcoroutinefunction(self) in the async class's __init__. This sets a marker on the instance that inspect.iscoroutinefunction() checks, ensuring that frameworks like FastAPI — which use iscoroutinefunction to decide whether to await a route handler — correctly identify the decorated function as a coroutine:
```python
import functools
import inspect
import asyncio

# Setting self.__call__ = async_func in __init__ does NOT work.
# Python resolves __call__ through the type's C-level slot, not the
# instance __dict__ -- the class-level __call__ always wins.
# The correct approach: two concrete classes, one factory function.

class _SyncCountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

class _AsyncCountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0
        inspect.markcoroutinefunction(self)  # makes iscoroutinefunction() return True

    async def __call__(self, *args, **kwargs):
        self.count += 1
        return await self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

def CountCalls(func):
    """Factory: returns the async or sync variant based on the wrapped function."""
    if inspect.iscoroutinefunction(func):
        return _AsyncCountCalls(func)
    return _SyncCountCalls(func)

@CountCalls
async def fetch(url: str) -> str:
    """Fetch content from a URL."""
    await asyncio.sleep(0)
    return f"result from {url}"

@CountCalls
def process(data: list) -> dict:
    """Process data synchronously."""
    return {"count": len(data)}

print(inspect.iscoroutinefunction(fetch))         # True
print(asyncio.run(fetch("https://example.com")))  # result from https://example.com
print(fetch.count)                                # 1
print(fetch.__name__)                             # fetch
print(process([1, 2, 3]))                         # {'count': 3}
print(process.count)                              # 1
print(process.__name__)                           # process
```
Compared with a single class that branches between sync and async code paths inside __call__, the two-class factory is more explicit but yields cleaner definitions -- the async class has no dead synchronous code path, and each variant can be unit tested independently.
When using the factory pattern, _AsyncCountCalls.__get__ still returns functools.partial(self, obj). The partial is a regular callable that, when invoked, calls _AsyncCountCalls.__call__ — which is async def. The caller still needs to await the result. Method binding and async behavior compose correctly without any special handling in __get__.
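A sketch of that composition (AsyncCountCalls and Client are illustrative names; the markcoroutinefunction call is guarded with hasattr so the example also runs on Python versions before 3.12):

```python
import asyncio
import functools
import inspect

class AsyncCountCalls:
    """Async-only counting decorator (illustrative sketch)."""
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0
        if hasattr(inspect, "markcoroutinefunction"):  # Python 3.12+
            inspect.markcoroutinefunction(self)

    async def __call__(self, *args, **kwargs):
        self.count += 1
        return await self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

class Client:
    @AsyncCountCalls
    async def ping(self, host: str) -> str:
        """Round-trip to a host (simulated)."""
        await asyncio.sleep(0)
        return f"pong from {host}"

async def main():
    c = Client()
    # c.ping resolves through __get__, returning partial(decorator, c);
    # calling the partial invokes the async __call__, so it must be awaited
    print(await c.ping("localhost"))  # pong from localhost
    print(Client.ping.count)          # 1

asyncio.run(main())
```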
Pickling and Multiprocessing Compatibility
Standard class-based decorator instances are not picklable by default. This matters in real-world code more than documentation suggests: Python's multiprocessing module serializes functions passed to worker pools via pickle, and any decorated function that cannot be pickled will raise a PicklingError when submitted to a Pool.map or concurrent.futures.ProcessPoolExecutor.
```python
import pickle
import functools

class CountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

@CountCalls
def process_item(x: int) -> int:
    return x * 2

# Pickling fails even for module-level functions: the instance's
# __dict__ holds the original function, which pickle serializes by
# qualified name. Looking up "process_item" in the module now finds
# the CountCalls instance, not the raw function, so pickle raises
# "it's not the same object as __main__.process_item".
try:
    pickle.dumps(process_item)
except Exception as e:
    print(f"Pickling failed: {e}")
```
Two issues combine here. First, the instance's __dict__ drags the original function into the pickle, and pickle's save-by-name identity check fails because the module-level name now refers to the decorator instance rather than the function. Second, even once serialization is fixed, state cannot be preserved across the process boundary -- each worker process receives a fresh instance with all counters reset to zero. And if the wrapped function is not itself picklable (a lambda or a closure), no serialization strategy can succeed.
The robust solution is to implement __reduce__ or __reduce_ex__ on the decorator class to control how it serializes. For decorators where cross-process state consistency is required, a shared-memory or manager-based approach is necessary -- instance attributes alone cannot span process boundaries without explicit coordination:
```python
import functools
import importlib

def _lookup(module_name, qualname):
    """Unpickling helper: re-import the module and return the object at
    qualname. For a decorated module-level function, that object is the
    CountCalls instance created when the worker imported the module."""
    obj = importlib.import_module(module_name)
    for part in qualname.split("."):
        obj = getattr(obj, part)
    return obj

class CountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

    def __reduce__(self):
        # Serialize by name, not by value: pickling self.func directly
        # fails because the module-level name now refers to this instance.
        # State is intentionally not serialized -- each worker process
        # reconstructs the instance with its own counter starting at 0.
        return (_lookup, (self.__module__, self.__qualname__))

# Pickling now works for module-level decorated functions; workers
# receive the instance re-created at import time (count=0).
# For cross-process count aggregation, use multiprocessing.Value or
# a Manager().Value -- plain instance attributes cannot span processes.
```
Processes do not share memory. When a function is pickled for a worker process, Python serializes the object's state and reconstructs it independently in the worker's address space. Any increments the worker makes to its local self.count are invisible to the parent process. If aggregated counters across workers are required, the state must live in a multiprocessing.Manager().Value or a multiprocessing.Value with a lock -- not in a plain instance attribute. This is not a flaw of class-based decorators specifically; it is a constraint of process-based parallelism in Python.
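A sketch of the shared-memory variant, assuming a Unix platform where the fork start method is available (SharedCountCalls is a hypothetical name; with the spawn start method you would additionally need an `if __name__ == "__main__"` guard and pickle support as above):

```python
import functools
import multiprocessing

class SharedCountCalls:
    """Counter stored in shared memory so all processes see it (sketch)."""
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = multiprocessing.Value("i", 0)  # 'i' = C int; carries its own lock

    def __call__(self, *args, **kwargs):
        with self.count.get_lock():  # increment under the Value's lock
            self.count.value += 1
        return self.func(*args, **kwargs)

@SharedCountCalls
def work(x: int) -> int:
    return x * 2

def worker():
    for i in range(10):
        work(i)

# fork: children inherit the decorated function and the shared Value,
# so no pickling of the decorator instance is required
ctx = multiprocessing.get_context("fork")
procs = [ctx.Process(target=worker) for _ in range(4)]
for p in procs:
    p.start()
for p in procs:
    p.join()
work(0)  # one call in the parent as well

print(work.count.value)  # 41 -- 4 workers x 10 calls + 1 parent call
```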
Type-Safe Metadata with ParamSpec and TypeVar
functools.update_wrapper solves the runtime metadata problem -- __name__, __doc__, and __wrapped__ are preserved. But it does not solve the static type-checking problem. Type checkers like mypy and pyright see the decorator replacing a function with a class instance, and they cannot infer that the instance is callable with the same signature as the original function. The result is false positives: the type checker warns about incorrect argument types when calling the decorated function, or it refuses to infer the return type correctly.
```python
import functools
from typing import TypeVar, ParamSpec, Callable, Generic

P = ParamSpec("P")
R = TypeVar("R")

class CountCalls(Generic[P, R]):
    """Type-safe class-based decorator preserving the full call signature."""
    def __init__(self, func: Callable[P, R]) -> None:
        functools.update_wrapper(self, func)
        self.func = func
        self.count: int = 0

    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R:
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj: object, objtype: type | None = None) -> "CountCalls[P, R]":
        if obj is None:
            return self
        return functools.partial(self, obj)  # type: ignore[return-value]

@CountCalls
def add(x: int, y: int) -> int:
    """Add two integers."""
    return x + y

# mypy and pyright now correctly infer:
#   add(1, 2) -> int (correct)
#   add("a", "b") -> error (correctly flagged)
#   add.count -> int (correct)
reveal_type(add(1, 2))  # Revealed type: int
# (reveal_type is understood by type checkers only -- remove before running)
```
ParamSpec captures the exact parameter specification of the original function and re-expresses it on __call__. TypeVar captures the return type. Together they let a type checker understand that a decorated function accepts the same arguments as the original and returns the same type -- the class instance is transparent to the type system.
ParamSpec was introduced in Python 3.10 (PEP 612). For codebases targeting 3.9 and below, import it from typing_extensions instead of typing. The runtime behavior is identical; only the import differs. The type | None union syntax in __get__ also requires 3.10+ -- use Optional[type] from typing for older targets.
Descriptor Name Awareness with __set_name__
When a class-based decorator is used as a class attribute (decorating a method), the decorator instance is stored under a specific name in the class namespace. By default, the decorator does not know its own attribute name -- it only knows the function's __name__, which may differ from the attribute name in edge cases like aliasing. Python 3.6 introduced __set_name__, a method called automatically on descriptors when the enclosing class is created.
import functools

class CountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0
        self.owner_class = None  # set by __set_name__
        self.attr_name = None    # set by __set_name__

    def __set_name__(self, owner, name):
        """Called when the descriptor is assigned to a class attribute."""
        self.owner_class = owner
        self.attr_name = name

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

class DataPipeline:
    @CountCalls
    def run(self, data: list) -> dict:
        """Execute the pipeline."""
        return {"processed": len(data)}

# __set_name__ was called during class creation
print(DataPipeline.run.owner_class)  # <class '__main__.DataPipeline'>
print(DataPipeline.run.attr_name)    # run
# Practical use: better error messages
class CountCalls:
    # ... __init__, __set_name__, and __get__ as above ...
    def __call__(self, *args, **kwargs):
        self.count += 1
        if self.count > 1000:
            loc = f"{self.owner_class.__name__}.{self.attr_name}" if self.owner_class else self.__name__
            raise RuntimeError(f"{loc} called more than 1000 times -- possible runaway loop")
        return self.func(*args, **kwargs)
__set_name__ is only called when the decorator is applied inside a class body. When applied to a standalone function, __set_name__ is never invoked -- so self.owner_class and self.attr_name remain None. Any code using those attributes must guard against that case, as shown in the error message example above. This is also the mechanism functools.cached_property uses internally to learn its own attribute name without requiring the class author to repeat the name as a string argument.
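To make the guard concrete, here is a condensed version of the class above applied in both contexts (the Service class and handler method are illustrative names):

```python
import functools

class CountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0
        self.owner_class = None
        self.attr_name = None

    def __set_name__(self, owner, name):
        self.owner_class = owner
        self.attr_name = name

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

@CountCalls
def standalone():          # decorated outside a class body:
    return "ok"            # __set_name__ never runs, both names stay None

class Service:
    @CountCalls
    def handler(self):     # decorated inside a class body:
        return "handled"   # __set_name__ fills in Service and "handler"
```

After class creation, Service.handler.attr_name is "handler" and Service.handler.owner_class is Service, while both attributes on standalone remain None.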
Class-Based vs. Function-Based: When to Use Which
| Criterion | Function-Based | Class-Based |
|---|---|---|
| Metadata preservation | `@functools.wraps(func)` on wrapper | `functools.update_wrapper(self, func)` in `__init__` |
| State across calls | Requires mutable closure or `nonlocal` | Natural -- use instance attributes |
| Methods and properties | Transparent -- functions are descriptors | Requires a `__get__` implementation |
| Readability for simple cases | Compact -- two nested functions | More verbose -- class definition overhead |
| Exposing custom attributes | Set on wrapper function after definition | Natural -- define methods and properties |
| Parameterized version | Three-layer nesting | `__init__` takes params, `__call__` takes func |
| Testability | Bypass via `func.__wrapped__` | Bypass via `instance.__wrapped__` or `instance.func` |
| Async compatibility | Transparent -- write the wrapper with `async def` | Two classes (`_Sync`/`_Async`) plus a factory; `async def __call__` and `inspect.markcoroutinefunction(self)` |
| Pickling / multiprocessing | Works for module-level functions automatically | Requires a `__reduce__` implementation |
| Static type safety | `ParamSpec` + `TypeVar` on the wrapper function | `Generic[P, R]` with `ParamSpec` + `TypeVar` |

For decorators that do not need state -- such as logging, timing a single call, or access-control checks -- function-based decorators are simpler and should be the default choice. For decorators that accumulate data across calls -- counters, accumulators, caches with custom eviction, or rate limiters with token buckets -- the class-based approach is cleaner because the state lives in named instance attributes rather than mutable closures.
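The "mutable closure or nonlocal" row of the table looks like this in practice -- a sketch of the same call counter written function-based, with count_calls and ping as illustrative names:

```python
import functools

def count_calls(func):
    count = 0  # state lives in a closure variable, not an instance attribute

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        nonlocal count            # required just to mutate the enclosing variable
        count += 1
        wrapper.count = count     # re-exposed manually as an attribute
        return func(*args, **kwargs)

    wrapper.count = 0
    return wrapper

@count_calls
def ping():
    return "pong"
```

The class-based version expresses the same state as a plain self.count attribute, with no nonlocal and no manual attribute mirroring.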
Spot the Bug
The following decorator is almost correct, but it contains two bugs. One is subtle and will silently produce wrong behavior. The other will cause an outright failure in a specific context. Read carefully before answering.
The decorator is intended to retry a function up to n times if it raises an exception. It compiles and runs without a syntax error. Identify both bugs.
# Supposed to retry on exception, up to n attempts
import functools

class Retry:
    def __init__(self, func, n=3):
        functools.update_wrapper(self, func)  # line 6
        self.func = func
        self.n = n

    def __call__(self, *args, **kwargs):
        for attempt in range(self.n):
            try:
                return self.func(*args, **kwargs)
            except Exception:
                if attempt == self.n - 1:
                    raise

    def __get__(self, obj, objtype=None):
        return functools.partial(self, obj)  # line 19
@Retry
def fetch_data(url: str) -> dict:
    """Fetch JSON from a URL."""
    ...

What are the two bugs in this code?
The corrected version handles both the parameterized form and class-level attribute access:

class Retry:
    def __init__(self, func=None, *, n=3):
        self.n = n
        if func is not None:
            functools.update_wrapper(self, func)  # correctly placed
            self.func = func

    def __call__(self, *args, **kwargs):
        if not hasattr(self, 'func'):
            # Parameterized path: __call__ receives the function
            func = args[0]
            functools.update_wrapper(self, func)
            self.func = func
            return self
        for attempt in range(self.n):
            try:
                return self.func(*args, **kwargs)
            except Exception:
                if attempt == self.n - 1:
                    raise

    def __get__(self, obj, objtype=None):
        if obj is None:  # guard against class-level access
            return self
        return functools.partial(self, obj)
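As a sanity check, the corrected class can be exercised in its parameterized form against a deliberately flaky function (the flaky function and attempts counter are illustrative; the Retry class is restated so the sketch is self-contained):

```python
import functools

class Retry:
    def __init__(self, func=None, *, n=3):
        self.n = n
        if func is not None:                  # bare form: @Retry
            functools.update_wrapper(self, func)
            self.func = func

    def __call__(self, *args, **kwargs):
        if not hasattr(self, "func"):         # parameterized form: @Retry(n=5)
            func = args[0]
            functools.update_wrapper(self, func)
            self.func = func
            return self
        for attempt in range(self.n):
            try:
                return self.func(*args, **kwargs)
            except Exception:
                if attempt == self.n - 1:
                    raise

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return functools.partial(self, obj)

attempts = {"count": 0}

@Retry(n=5)
def flaky():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ValueError("transient failure")
    return "ok"
```

Calling flaky() succeeds on the third internal attempt, well within the five allowed, and flaky.__name__ still reports "flaky" thanks to update_wrapper.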
Key Takeaways
- Use `functools.update_wrapper(self, func)` in `__init__`. This is the class-based equivalent of `@functools.wraps(func)`. It copies `__name__`, `__doc__`, `__module__`, `__qualname__`, `__annotations__` (and `__type_params__` on Python 3.12+), merges `__dict__`, and sets `__wrapped__`.
- Call `update_wrapper` before assigning other instance attributes. `update_wrapper` merges the original function's `__dict__` into `self.__dict__`. Attributes you set after the call will not be overwritten by the merge. Attributes you set before it can be.
- Implement `__get__` for method compatibility. Without `__get__`, a class-based decorator fails when used on instance methods because Python cannot bind `self` from the enclosing class. The `functools.partial(self, obj)` pattern restores correct method binding.
- Class-based decorators excel at stateful behavior. Call counters, timing accumulators, caches, rate limiters, and any decorator that needs to remember information between calls benefit from instance attributes that persist naturally.
- Parameterized class-based decorators flip the pattern. `__init__` receives the parameters, `__call__` receives and wraps the function. In this case, `__call__` returns a regular wrapper function, so `@functools.wraps(func)` applies to that function's `def` line.
- Use the production template. Every class-based decorator you write should start from the three-method template: `__init__` with `update_wrapper`, `__call__` with the wrapper logic, and `__get__` with `functools.partial` binding.
- Use a factory function for async compatibility. A synchronous `__call__` applied to a coroutine function returns the coroutine object unawaited -- a silent failure with no immediate error. The correct fix is two separate classes (`_SyncVariant` with a regular `__call__`, `_AsyncVariant` with an `async def __call__`) and a factory function that returns the right one. Setting `self.__call__ = async_fn` in `__init__` does not work: Python resolves `__call__` through the type, not the instance `__dict__`. On Python 3.12+, call `inspect.markcoroutinefunction(self)` in the async class's `__init__` so that `inspect.iscoroutinefunction()` returns `True` on the decorated name.
- Implement `__reduce__` if the decorator needs to be picklable. Module-level decorated functions submitted to `multiprocessing.Pool` or `ProcessPoolExecutor` are pickled before transmission to worker processes. Without `__reduce__`, the pickle either fails outright or loses all accumulated state. Implement it to define explicitly what serializes and what does not.
- Use `ParamSpec` and `TypeVar` for type-safe decorators. `functools.update_wrapper` preserves runtime metadata but does not preserve the function signature for static type checkers. Making the decorator class generic over a `ParamSpec` `P` and a `TypeVar` `R` lets mypy and pyright correctly infer argument types and return types through the class instance.
- Implement `__set_name__` when the decorator needs to know its attribute name. Python calls `__set_name__(owner, name)` automatically when a descriptor is assigned inside a class body. This gives the decorator instance access to its owner class and its own attribute name -- useful for error messages, logging, and validation that refers to the method by its fully qualified location.
- Decide consciously what a raising function means for your state. State updates after the function call are skipped if it raises. Use `try`/`finally` if you need state to update regardless of outcome, and keep the state update before the call if you want to count attempts rather than successes.
- Add a `threading.Lock` for multi-threaded environments. Integer increment is not atomic. Wrap state mutation with a lock, keeping its scope as tight as possible -- around the state write only, not the entire function call.
- Stacking preserves the metadata chain. Each `update_wrapper` call sets `__wrapped__`, so `inspect.signature()` follows the chain to the original function. Each layer's state is accessible via its own instance in the chain. Do not stack a class-based decorator outside `@classmethod` or `@staticmethod` without testing explicitly.
- Override `assigned` when wrapping objects that lack standard attributes. Built-ins and some C extensions do not have `__annotations__`. Pass a custom `assigned` tuple excluding the missing attribute to prevent an `AttributeError` from `update_wrapper`.
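The async takeaway above resists a one-class solution, so here is a minimal sketch of the two-class-plus-factory pattern (the `_SyncVariant`/`_AsyncVariant` names and the pass-through wrapper logic are illustrative; a real decorator would add timing, logging, and so on):

```python
import asyncio
import functools
import inspect

class _SyncVariant:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func

    def __call__(self, *args, **kwargs):
        # wrapper logic for plain functions would go here
        return self.func(*args, **kwargs)

class _AsyncVariant:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        if hasattr(inspect, "markcoroutinefunction"):  # Python 3.12+
            inspect.markcoroutinefunction(self)

    async def __call__(self, *args, **kwargs):
        # wrapper logic for coroutine functions would go here
        return await self.func(*args, **kwargs)

def wrap(func):
    """Factory: pick the variant that matches the wrapped function."""
    if inspect.iscoroutinefunction(func):
        return _AsyncVariant(func)
    return _SyncVariant(func)

@wrap
def sync_job():
    return "sync"

@wrap
async def async_job():
    return "async"

result = asyncio.run(async_job())  # the coroutine is actually awaited
```

Without the factory, a synchronous __call__ applied to async_job would hand back the coroutine object unawaited.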
Class-based decorators add a bit more structure than their function-based counterparts, but the payoff is cleaner state management and the ability to expose custom properties and methods on the decorated function. The functools.update_wrapper call preserves identity, and the __get__ method preserves method binding. Together, they make a class-based decorator behave as transparently as a function-based one while carrying persistent state that function closures struggle to express clearly.
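One caveat of that persistent state, flagged in the takeaways above, is thread safety. A minimal sketch of the tight-lock pattern (the work function and the 8 x 1000 exercise are illustrative):

```python
import functools
import threading

class CountCalls:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.count = 0
        self._lock = threading.Lock()

    def __call__(self, *args, **kwargs):
        with self._lock:       # held only for the increment...
            self.count += 1
        return self.func(*args, **kwargs)  # ...not for the call itself

@CountCalls
def work():
    pass

# Hammer the counter from 8 threads, 1000 calls each
threads = [
    threading.Thread(target=lambda: [work() for _ in range(1000)])
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

With the lock in place the counter is exact; without it, interleaved read-modify-write cycles can silently drop increments.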
Frequently Asked Questions
- Why can't you use @functools.wraps directly in a class-based decorator?
  `@functools.wraps` is designed to decorate a function definition -- it goes above a `def` line. In a class-based decorator, there is no standalone wrapper function. The class instance itself replaces the original function, and `__call__` is a method on that instance, not a standalone function being returned. Instead, call `functools.update_wrapper(self, func)` inside `__init__` to achieve the same result.
- What is the difference between functools.wraps and functools.update_wrapper?
  `functools.wraps` is a convenience decorator that internally calls `functools.update_wrapper`. Writing `@functools.wraps(func)` above a wrapper function is equivalent to calling `functools.update_wrapper(wrapper, func)` after the wrapper is defined. The only difference is syntax: `@wraps` is used as a decorator on function definitions, while `update_wrapper` is called as a regular function, making it suitable for class-based decorators where there is no function definition to decorate.
- Why do class-based decorators fail when applied to class methods?
  A class-based decorator that only implements `__init__` and `__call__` does not participate in Python's descriptor protocol. When used on an instance method, the class instance replaces the function in the class namespace, but without a `__get__` method Python cannot bind `self` from the enclosing class to the method call. The fix is to implement `__get__` so the decorator returns a bound version of itself using `functools.partial`.
- When should you use a class-based decorator instead of a function-based one?
  Class-based decorators are the right choice when the decorator needs to maintain state across multiple calls -- for example, counting invocations, caching results with custom eviction, tracking timing statistics, or enforcing rate limits with a token bucket. The class instance's attributes persist between calls naturally, while function-based decorators would need mutable closures or `nonlocal` variables to achieve the same thing.
- Does functools.update_wrapper set the __wrapped__ attribute on class instances?
  Yes. `functools.update_wrapper` sets `__wrapped__` on whatever object it receives as the wrapper, including class instances. After calling `functools.update_wrapper(self, func)`, the class instance gains a `__wrapped__` attribute that points to the original function. This allows `inspect.signature()` to follow the chain and display the correct parameter signature.
- Are class-based decorator instance attributes thread-safe?
  No. Instance attributes like counters and accumulators are shared mutable state. In a multi-threaded environment, concurrent calls can corrupt them because operations like `self.count += 1` are not atomic at the bytecode level. The fix is to protect state mutations with a `threading.Lock`, keeping the lock scope tight around the state write rather than around the entire function call.
- How do you make a class-based decorator work with async functions?
  Use two separate classes -- one with a synchronous `__call__`, one with `async def __call__` -- and a factory function that returns the right variant based on `inspect.iscoroutinefunction(func)`. Setting `self.__call__ = async_fn` inside `__init__` does not work because Python resolves `__call__` through the type's C-level slot, not the instance `__dict__`. On Python 3.12+, call `inspect.markcoroutinefunction(self)` in the async class's `__init__` so that `inspect.iscoroutinefunction()` returns `True` for the decorator instance.
- Does functools.update_wrapper make a class-based decorator visible to static type checkers?
  No. `functools.update_wrapper` preserves runtime attributes like `__name__` and `__doc__` but does not tell mypy or pyright that the class instance accepts the same arguments as the original function. Make the class generic over a `ParamSpec` (capturing the parameter specification) and a `TypeVar` (capturing the return type), then annotate `__call__` using `P.args` and `P.kwargs`.
- What happens to a stateful decorator's state if the wrapped function raises an exception?
  Any state update written after the function call in `__call__` is skipped if the function raises, because the exception unwinds the stack immediately. Whether that is the desired behavior depends on the decorator's purpose. Use a `try`/`finally` block if state should update regardless of success or failure. The `finally` block runs unconditionally and the exception still propagates to the caller.
- Why are class-based decorator instances not picklable by default?
  Python's pickle needs to reconstruct an object from scratch in a separate process. By default it tries to serialize all instance attributes, which may include non-picklable objects or state that should not cross process boundaries. Implementing `__reduce__` on the decorator class lets you define exactly how it serializes -- typically by returning `(DecoratorClass, (self.func,))` to reconstruct a fresh instance, consciously discarding accumulated state like counters.
- What do functools.WRAPPER_ASSIGNMENTS and functools.WRAPPER_UPDATES control?
  `WRAPPER_ASSIGNMENTS` lists the attributes copied directly from the wrapped function to the wrapper. The defaults are `__module__`, `__name__`, `__qualname__`, `__annotations__`, `__doc__`, and `__type_params__` (added in Python 3.12). `WRAPPER_UPDATES` lists attributes that are merged rather than overwritten -- by default just `__dict__`, so the wrapper's own instance attributes survive. Both can be overridden when calling `update_wrapper` directly.
- What is __set_name__ and when does a class-based decorator need it?
  `__set_name__(owner, name)` is called automatically by Python when a descriptor is assigned to an attribute inside a class body. It gives the decorator instance access to its own attribute name and the class that owns it -- useful for error messages or logging that refers to the method by its fully qualified location. `__set_name__` is only called during class body execution, not when the decorator is applied to a standalone function.