Writing decorators from scratch every time invites inconsistency. You forget functools.wraps on one, omit the return value on another, and break type checking on a third. This article provides five decorator templates that you can copy directly into any project. Each template satisfies the same three requirements: it works with any function signature, it preserves function metadata for help() and debuggers, and it passes the original return value through to the caller. The templates progress from the simplest possible decorator to a fully type-annotated version using ParamSpec from PEP 612.
A "well-behaved" decorator is one that is transparent to both the caller and the developer reading the code. After decoration, the function's __name__, __doc__, __qualname__, and __module__ still match the original function. help() shows the correct signature and docstring. inspect.signature() returns the original parameter list. The return value passes through unchanged. And if the developer uses a type checker, the decorated function's type signature matches the original. When a decorator achieves all of this, it is invisible in every way except the behavior it adds.
What @decorator Actually Does
The @ syntax is shorthand. When Python encounters this:
@my_decorator
def greet(name):
    return f"Hello, {name}"
It executes exactly this:
def greet(name):
    return f"Hello, {name}"

greet = my_decorator(greet)
The decorator is called with the original function as its argument, and whatever it returns replaces the name greet in the current scope. This is not a special Python mechanism — it is ordinary function call syntax. The @ is evaluated at class or module definition time, not at call time. By the time any code calls greet(), the name greet already refers to whatever my_decorator returned.
This equivalence explains every template in this article. The "wrapper" that my_decorator returns is the new greet. That is why the wrapper must accept any arguments — it will be called wherever the original function would have been called. It is why the wrapper must return the original function's result — callers expect a return value. And it is why functools.wraps is necessary — without it, greet.__name__ would return 'wrapper', because that is the name of the function that actually lives at that binding.
Think of greet as a label on a jar. Before decoration, the label is on the jar holding your original function. After greet = my_decorator(greet), the label has been peeled off and stuck on the wrapper's jar. The wrapper still holds a reference to the original function (via the closure variable func), but anyone reaching for the greet label now gets the wrapper. functools.wraps reprints the label so it reads "greet" instead of "wrapper."
The Three Rules Every Decorator Must Follow
Before looking at the templates, review the three rules that define the baseline for any decorator that does not silently break the code it touches.
Rule 1: Accept any function signature. The wrapper function must use *args and **kwargs so it works with any function regardless of how many positional or keyword arguments it accepts. Decorators that hardcode specific parameter names only work with one specific function shape and break when applied to anything else.
Rule 2: Preserve metadata with @functools.wraps(func). Without this, the decorated function loses its __name__, __doc__, __qualname__, __module__, and __annotations__. This breaks help(), debugger output, documentation generators, serialization, and framework routing that depends on function names. functools.wraps also adds a __wrapped__ attribute that inspect.signature() follows to report the correct parameter list.
Rule 3: Return the original function's result. The wrapper must capture the return value of func(*args, **kwargs) and explicitly return it. Omitting the return statement is one of the most common decorator bugs. It causes every decorated function to silently return None, regardless of what the original function returns.
Picture a decorator as a sandwich: your original function is the filling, and the wrapper is the bread. The wrapper runs code before and after the filling, but it must pass the plate — the return value — all the way to the person eating. If the bread swallows the plate, the caller gets nothing. functools.wraps is the label on the wrapper that says "this sandwich contains X" — without it, the person holding the sandwich cannot read what is inside.
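A side-by-side sketch makes the rules concrete. The broken decorator below (an illustrative example, not one of the templates) violates Rules 2 and 3; the fixed version follows all three:

```python
import functools

def broken(func):
    def wrapper(*args, **kwargs):
        func(*args, **kwargs)         # Rule 3 violated: return value is dropped
    return wrapper                    # Rule 2 violated: no functools.wraps

def fixed(func):
    @functools.wraps(func)            # Rule 2: metadata preserved
    def wrapper(*args, **kwargs):     # Rule 1: any signature accepted
        return func(*args, **kwargs)  # Rule 3: result passed through
    return wrapper

@broken
def shout(text):
    """Uppercase the input."""
    return text.upper()

@fixed
def whisper(text):
    """Lowercase the input."""
    return text.lower()

print(shout("hi"))        # None: the return value was swallowed
print(shout.__name__)     # wrapper: the function's identity is lost
print(whisper("HI"))      # hi
print(whisper.__name__)   # whisper
```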
Five Templates, From Simple to Advanced
Template 1: Basic Decorator
This is the foundation. Every other template builds on this structure.
import functools

def my_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # logic before the function runs
        result = func(*args, **kwargs)
        # logic after the function runs
        return result
    return wrapper
Usage:
import functools
import time

def timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

@timer
def compute_primes(limit):
    """Return all primes below the given limit."""
    sieve = [True] * limit
    sieve[0] = sieve[1] = False
    for i in range(2, int(limit**0.5) + 1):
        if sieve[i]:
            for j in range(i*i, limit, i):
                sieve[j] = False
    return [i for i, is_prime in enumerate(sieve) if is_prime]

primes = compute_primes(1_000_000)
# compute_primes took 0.0732s

print(compute_primes.__name__)  # compute_primes
print(compute_primes.__doc__)   # Return all primes below the given limit.
This template covers the vast majority of decorator use cases. Use it for logging, timing, access control, input validation, or any behavior that runs before and/or after the function.
Template 2: Parameterized Decorator (Decorator Factory)
When a decorator needs configuration, it requires a third layer of nesting. The outer function accepts the configuration parameters and returns the actual decorator.
import functools

def my_decorator(param1, param2="default"):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # use param1 and param2 here
            result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator
Usage:
import functools
import time

def retry(max_attempts, delay=1.0):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise  # bare raise on the last attempt preserves the original traceback
                    time.sleep(delay)
        return wrapper
    return decorator

@retry(max_attempts=3, delay=0.5)
def fetch_price(ticker):
    """Fetch the current price for a stock ticker."""
    import random
    if random.random() < 0.6:
        raise ConnectionError("Service unavailable")
    return 142.50

print(fetch_price.__name__)  # fetch_price
The parentheses are required when using this template: @retry(max_attempts=3) calls the outer function, which returns the decorator, which then wraps the function. Writing @retry without parentheses passes the function itself as max_attempts. The decoration appears to succeed, but the decorated name is now the inner decorator waiting for a function, so calls return the wrong thing until a TypeError finally surfaces deep inside the wrapper.
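To see the failure mode concretely, here is a sketch of what happens when the parentheses are omitted (reusing the retry decorator from above; the bare @retry line is the bug):

```python
import functools
import time

def retry(max_attempts, delay=1.0):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise
                    time.sleep(delay)
        return wrapper
    return decorator

@retry  # bug: missing parentheses, so the function is passed as max_attempts
def fetch_price(ticker):
    return 142.50

# fetch_price is now the inner `decorator`, waiting to receive a function.
result = fetch_price("AAPL")  # silently returns a wrapper, not a price
print(callable(result))       # True
try:
    result()  # only now does range(1, max_attempts + 1) fail
except TypeError as e:
    print(e)  # unsupported operand type(s) for +: 'function' and 'int'
```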
Template 3: Optional-Argument Decorator
This template allows a decorator to be used both with and without parentheses: @my_decorator, @my_decorator(), and @my_decorator(option=value) all work. The trick is using a sentinel check on the first argument.
import functools

def my_decorator(func=None, *, option_a="default", option_b=False):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # use option_a and option_b here
            result = func(*args, **kwargs)
            return result
        return wrapper
    if func is not None:
        # Called as @my_decorator without parentheses
        return decorator(func)
    # Called as @my_decorator() or @my_decorator(option_a="value")
    return decorator
Usage:
import functools
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)

def log_calls(func=None, *, level=logging.INFO):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logger.log(level, "Calling %s", func.__name__)
            result = func(*args, **kwargs)
            logger.log(level, "%s returned %r", func.__name__, result)
            return result
        return wrapper
    if func is not None:
        return decorator(func)
    return decorator

# All three forms work:
@log_calls
def add(a, b):
    """Add two numbers."""
    return a + b

@log_calls()
def subtract(a, b):
    """Subtract b from a."""
    return a - b

@log_calls(level=logging.DEBUG)
def multiply(a, b):
    """Multiply two numbers."""
    return a * b

print(add(3, 4))        # 7
print(subtract(10, 3))  # 7
print(multiply(5, 6))   # 30
The bare * after func in the signature forces all configuration arguments to be keyword-only. This prevents ambiguity: when the decorator is called as @log_calls, Python passes the decorated function as func. When called as @log_calls(level=logging.DEBUG), func is None and the function arrives later through the returned decorator.
This pattern relies on the fact that a function object will never be None. The check if func is not None distinguishes between bare usage (@log_calls) and parameterized usage (@log_calls() or @log_calls(level=...)).
Template 4: Class-Based Decorator (Stateful)
When a decorator needs to maintain state across calls, a class-based decorator is cleaner than using nonlocal variables in nested closures. Use functools.update_wrapper instead of functools.wraps.
import functools

class MyDecorator:
    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        # initialize state here

    def __call__(self, *args, **kwargs):
        # logic before
        result = self.func(*args, **kwargs)
        # logic after
        return result
Usage:
import functools
import time

class RateLimit:
    """Enforce a maximum call frequency on a function."""

    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.last_called = 0.0
        self.min_interval = 1.0  # seconds

    def __call__(self, *args, **kwargs):
        now = time.time()
        elapsed = now - self.last_called
        if elapsed < self.min_interval:
            wait = self.min_interval - elapsed
            raise RuntimeError(
                f"Rate limited. Try again in {wait:.2f}s"
            )
        self.last_called = now
        return self.func(*args, **kwargs)

@RateLimit
def send_alert(message):
    """Send an alert notification."""
    return f"Alert sent: {message}"

print(send_alert.__name__)  # send_alert
print(send_alert.__doc__)   # Send an alert notification.
print(send_alert("Disk full"))

try:
    send_alert("CPU high")  # called too soon
except RuntimeError as e:
    print(e)
    # Rate limited. Try again in 0.99s
The state (last_called) lives as an instance attribute on the decorator object. Each decorated function gets its own independent state because each @RateLimit application creates a new class instance.
Template 5: Type-Safe Decorator With ParamSpec (Python 3.10+)
PEP 612 introduced ParamSpec, a type variable that captures a function's entire parameter list. Using it alongside TypeVar in your decorator's type annotations allows type checkers like mypy and Pyright to verify that the decorated function's signature is preserved through the decorator, enabling correct autocompletion and catching type errors at call sites.
import functools
from typing import Callable, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

def my_decorator(func: Callable[P, R]) -> Callable[P, R]:
    @functools.wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        # logic before
        result = func(*args, **kwargs)
        # logic after
        return result
    return wrapper
Usage:
import functools
import time
from typing import Callable, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

def timer(func: Callable[P, R]) -> Callable[P, R]:
    @functools.wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

@timer
def calculate_discount(price: float, percentage: float) -> float:
    """Apply a percentage discount to a price."""
    return price * (1 - percentage / 100)

# Type checker sees: calculate_discount(price: float, percentage: float) -> float
result: float = calculate_discount(99.99, 15)

# This would be caught by mypy:
# calculate_discount("not a number", 15)  # error: Argument 1 has incompatible type
The Callable[P, R] -> Callable[P, R] annotation tells the type checker that the output callable has the exact same parameter list (P) and return type (R) as the input. The P.args and P.kwargs annotations on the wrapper ensure that the type checker can trace argument types through the wrapper to the original function call.
For Python 3.8 and 3.9, ParamSpec is available through typing_extensions: from typing_extensions import ParamSpec. This backport provides the same type-checking behavior in older Python versions.
| Template | Use When | Nesting Levels |
|---|---|---|
| Basic | No configuration needed | 2 |
| Parameterized | Decorator requires arguments | 3 |
| Optional-Argument | Arguments should be optional | 2-3 (dynamic) |
| Class-Based | Decorator maintains state | 1 (class) |
| Type-Safe (ParamSpec) | Type checking is required | 2 |
Stacking Decorators
Any number of decorators can be applied to the same function. Python evaluates them bottom-up at decoration time and calls them top-down at call time.
@decorator_a
@decorator_b
@decorator_c
def my_func():
    pass

# Equivalent to:
my_func = decorator_a(decorator_b(decorator_c(my_func)))
The innermost decorator (decorator_c) wraps the original function first. decorator_b then wraps that result. decorator_a wraps that result last. When my_func() is called, decorator_a's wrapper runs first, then decorator_b's, then decorator_c's, then the original function.
import functools

def bold(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return "<b>" + func(*args, **kwargs) + "</b>"
    return wrapper

def italic(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return "<i>" + func(*args, **kwargs) + "</i>"
    return wrapper

@bold
@italic
def greet(name):
    return f"Hello, {name}"

print(greet("world"))  # <b><i>Hello, world</i></b>
italic wraps greet first, producing a function that returns italic HTML. bold then wraps that, producing a function that wraps the italic output in bold tags. The order on the page is top-down execution order during the call — what you read first is what runs first.
When stacking a logging decorator and a caching decorator, the order determines what gets logged. Place the logging decorator above the caching decorator if you want to log every call including cache hits. Place it below if you only want to log cache misses that actually execute the function.
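A minimal sketch of the two orderings, using functools.lru_cache as the caching layer and an illustrative log_calls decorator:

```python
import functools

def log_calls(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}{args}")
        return func(*args, **kwargs)
    return wrapper

# Logging above caching: the log wrapper runs on every call, even cache hits.
@log_calls
@functools.lru_cache(maxsize=None)
def square_logged(n):
    return n * n

# Logging below caching: only cache misses reach the log wrapper.
@functools.lru_cache(maxsize=None)
@log_calls
def square_quiet(n):
    return n * n

square_logged(3)  # logged
square_logged(3)  # logged again
square_quiet(3)   # logged (cache miss)
square_quiet(3)   # silent: served from the cache, the log wrapper never runs
```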
Making Any Template Async-Compatible
Any of the five templates above can be extended to work with both synchronous and asynchronous functions. The key is checking whether the decorated function is a coroutine function and defining the appropriate wrapper type.
import asyncio
import functools
import time

def timer(func):
    if asyncio.iscoroutinefunction(func):
        @functools.wraps(func)
        async def async_wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = await func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            print(f"{func.__name__} took {elapsed:.4f}s")
            return result
        return async_wrapper
    else:
        @functools.wraps(func)
        def sync_wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            elapsed = time.perf_counter() - start
            print(f"{func.__name__} took {elapsed:.4f}s")
            return result
        return sync_wrapper

# Works with sync functions
@timer
def compute(n):
    """Compute the sum of range(n)."""
    return sum(range(n))

# Works with async functions
@timer
async def fetch(url):
    """Fetch data from a URL."""
    await asyncio.sleep(0.1)  # simulating network I/O
    return f"Response from {url}"

print(compute(1_000_000))
# compute took 0.0234s

asyncio.run(fetch("https://example.com"))
# fetch took 0.1003s
asyncio.iscoroutinefunction(func) returns True if the function was defined with async def. When it is, the decorator defines an async wrapper that uses await to call the original function. When it is not, the decorator defines a standard synchronous wrapper. Both paths apply @functools.wraps(func) and return the result.
A common mistake is wrapping an async function with a synchronous wrapper that calls func(*args, **kwargs) without await. This does not raise an error immediately. The call returns a coroutine object rather than the expected result, which causes confusing bugs downstream. Always check asyncio.iscoroutinefunction(func) and await accordingly.
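A short sketch of the failure (bad_timer is an illustrative broken decorator):

```python
import asyncio
import functools

def bad_timer(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)  # bug: no await, so this returns a coroutine
    return wrapper

@bad_timer
async def fetch(url):
    """Fetch data from a URL."""
    await asyncio.sleep(0)
    return f"Response from {url}"

result = fetch("https://example.com")
print(type(result).__name__)  # coroutine, not the expected string
print(asyncio.run(result))    # the caller must run it manually to get the value
```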
Decorating Class Methods
The same templates work on instance methods, but there are two details worth knowing before applying them to classes.
Instance methods receive self as their first argument. Because the wrapper uses *args, **kwargs, this passes through automatically — you do not need to handle it explicitly.
import functools

def log_call(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

class PaymentService:
    @log_call
    def charge(self, amount):
        """Process a payment charge."""
        return f"Charged {amount}"

svc = PaymentService()
svc.charge(99.00)  # Calling charge
Stack order matters when combining with @classmethod or @staticmethod. Always place @classmethod or @staticmethod above your custom decorator, not below. Python applies decorators bottom-up, so when @classmethod is above your decorator, your decorator receives the raw function first and wraps it, then @classmethod wraps the result. Placing your custom decorator above @classmethod means Python applies @classmethod first, and your decorator receives a classmethod descriptor object rather than a callable — which causes a TypeError at call time.
# Correct: @classmethod above your custom decorator
class UserFactory:
    @classmethod
    @log_call  # log_call receives the raw function — works correctly
    def from_dict(cls, data):
        return cls()

# Wrong: your custom decorator above @classmethod
class UserFactory:
    @log_call     # log_call receives a classmethod descriptor, not a function
    @classmethod  # → TypeError: 'classmethod' object is not callable
    def from_dict(cls, data):
        return cls()
If you are writing a decorator specifically intended for use on instance methods and need access to the instance (self), extract it from args[0] inside the wrapper: when the method is called on an instance, self always arrives as the first positional argument.
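As a sketch, this illustrative audit decorator pulls the instance out of args[0] to record calls on the object itself (the name audit and the _audit_log attribute are assumptions made up for the example):

```python
import functools

def audit(func):
    """Record each call on the owning instance (instance methods only)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        instance = args[0]  # self: the first positional argument on instance-method calls
        if not hasattr(instance, "_audit_log"):
            instance._audit_log = []
        instance._audit_log.append(func.__name__)
        return func(*args, **kwargs)
    return wrapper

class Account:
    @audit
    def deposit(self, amount):
        return f"Deposited {amount}"

    @audit
    def withdraw(self, amount):
        return f"Withdrew {amount}"

acct = Account()
acct.deposit(100)
acct.withdraw(30)
print(acct._audit_log)  # ['deposit', 'withdraw']
```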
Exceptions, Testing, and Performance
How to Handle Exceptions Inside a Decorator
A decorator can observe, suppress, transform, or re-raise exceptions from the decorated function. The default — and safest — behavior is to let exceptions propagate naturally by not catching them at all. The wrapper calls func(*args, **kwargs), and any exception travels up the call stack to the caller, exactly as if the decorator were not there.
When a decorator needs to log exceptions before re-raising them, the pattern is:
import functools
import logging

logger = logging.getLogger(__name__)

def log_exceptions(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as exc:
            logger.exception(
                "%s raised %s: %s",
                func.__name__, type(exc).__name__, exc
            )
            raise  # re-raise so the caller still receives the exception
    return wrapper
The critical detail is the bare raise at the end of the except block. It re-raises the original exception without modifying it, preserving the original traceback. Writing raise exc instead of bare raise would reset the traceback to this line, making the error harder to trace.
Exception Chaining in Decorators
When a decorator catches one exception and raises a different one — for example, translating a low-level ConnectionError into a domain-specific exception — use raise NewException(...) from original_exc rather than a bare raise NewException(...). The from clause explicitly chains the exceptions, setting the __cause__ attribute on the new exception and preserving the full causal chain in tracebacks.
import functools

class ServiceUnavailable(RuntimeError):
    pass

def translate_errors(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except ConnectionError as exc:
            raise ServiceUnavailable(
                f"{func.__name__} could not reach the service"
            ) from exc  # preserves original traceback as __cause__
    return wrapper

@translate_errors
def fetch_user(user_id):
    raise ConnectionError("TCP connection refused")

# Traceback shows both exceptions:
# ConnectionError: TCP connection refused
#
# The above exception was the direct cause of the following exception:
# ServiceUnavailable: fetch_user could not reach the service
The traceback phrase "The above exception was the direct cause of the following exception" is produced automatically when from exc is present. Without it, Python still attaches the original exception to __context__ implicitly, but the traceback reads "During handling of the above exception, another exception occurred" — which implies the second exception is accidental rather than intentional. Using from exc communicates intent clearly to both the runtime and the developer reading the traceback. To suppress the chain entirely and hide the original as an implementation detail, use raise NewException() from None, which sets __suppress_context__ = True.
Catching BaseException vs Exception in Decorators
Decorators should catch Exception, not BaseException, unless there is a deliberate reason to intercept process-level signals. BaseException includes KeyboardInterrupt, SystemExit, and GeneratorExit — exceptions that signal process-level termination rather than application errors. A decorator that catches BaseException and delays or suppresses re-raising will prevent clean shutdown, trap signal handlers, and interfere with context managers that depend on GeneratorExit for cleanup.
import functools
import logging

logger = logging.getLogger(__name__)

def log_exceptions(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception as exc:  # correct: application errors only
            logger.exception(     # KeyboardInterrupt and SystemExit propagate freely
                "%s raised %s: %s",
                func.__name__, type(exc).__name__, exc
            )
            raise
    return wrapper
A decorator that catches Exception and returns a fallback value (like None or False) without logging or re-raising is one of the hardest bugs to diagnose. The function appears to return successfully, and the error disappears. Only suppress exceptions deliberately, with logging, and only when the caller is explicitly designed to handle a sentinel return value.
Testing a Decorated Function Without the Decorator
functools.wraps adds a __wrapped__ attribute to the wrapper that holds a direct reference to the original function. You can use this in tests to bypass the decorator entirely and test the raw function in isolation.
import functools
import time

def rate_limit(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        time.sleep(1)  # enforce a 1-second delay
        return func(*args, **kwargs)
    return wrapper

@rate_limit
def get_user(user_id):
    """Fetch a user record by ID."""
    return {"id": user_id, "name": "Alice"}

# In production: rate_limit enforced
result = get_user(42)

# In tests: bypass the decorator entirely
result = get_user.__wrapped__(42)  # no sleep, immediate return
assert result == {"id": 42, "name": "Alice"}
With stacked decorators, __wrapped__ only unwraps one layer. inspect.unwrap(func) traverses the entire chain and returns the original function at the bottom, regardless of how many decorators are stacked.
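A sketch with two stacked layers shows the difference (deco_a and deco_b are illustrative no-op decorators):

```python
import functools
import inspect

def deco_a(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def deco_b(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@deco_a
@deco_b
def target(x):
    return x

one_layer = target.__wrapped__            # deco_b's wrapper: itself still wrapped
print(hasattr(one_layer, "__wrapped__"))  # True
raw = inspect.unwrap(target)              # walks the whole __wrapped__ chain
print(hasattr(raw, "__wrapped__"))        # False: this is the original function
print(raw(5))                             # 5
```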
Do Decorators Add Performance Overhead?
Yes, but the overhead is negligible for almost all real-world use cases. Each decorated call adds one extra function call — the wrapper — on top of the original. On modern Python (3.11+), an empty wrapper call costs roughly 100–200 nanoseconds. For functions called hundreds of times per second, this is undetectable. For functions called millions of times in a tight inner loop, it can accumulate.
If profiling reveals that a decorator is a measurable bottleneck in a hot path, the standard approach is to hoist the decorated call out of the loop where possible, or to use the __wrapped__ attribute to call the raw function directly in the performance-critical section, rather than removing the decorator.
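You can measure the overhead on your own machine with timeit. This sketch compares a plain call against the same function behind an illustrative pass-through decorator (absolute numbers vary widely by machine and Python version):

```python
import functools
import timeit

def passthrough(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def plain(x):
    return x

decorated = passthrough(plain)

n = 100_000
t_plain = timeit.timeit(lambda: plain(1), number=n)
t_decorated = timeit.timeit(lambda: decorated(1), number=n)
print(f"plain:     {t_plain / n * 1e9:.0f} ns/call")
print(f"decorated: {t_decorated / n * 1e9:.0f} ns/call")
# The gap between the two numbers is the wrapper's per-call cost.
```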
Reducing Memory Overhead in Class-Based Decorators
Class-based decorators with __init__ and __call__ create a new instance for every decorated function. Each instance stores its attributes in a __dict__ by default, which allocates a separate hash table per instance. When the same decorator is applied to hundreds of functions — common in large codebases with widespread rate limiting, metering, or access control — the per-instance dictionary overhead adds up.
Adding __slots__ to the decorator class eliminates the per-instance __dict__ and replaces it with a fixed-size array of slot descriptors. The memory saving per instance is modest (typically 200–300 bytes depending on platform), but across hundreds of decorated functions the aggregate reduction is measurable.
import functools
import threading
import time

class RateLimit:
    __slots__ = ("func", "last_called", "min_interval", "_lock",
                 "__wrapped__", "__doc__", "__name__", "__qualname__",
                 "__module__", "__annotations__", "__dict__")

    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.last_called = 0.0
        self.min_interval = 1.0
        self._lock = threading.Lock()

    def __call__(self, *args, **kwargs):
        with self._lock:
            now = time.time()
            elapsed = now - self.last_called
            if elapsed < self.min_interval:
                raise RuntimeError(
                    f"Rate limited. Try again in {self.min_interval - elapsed:.2f}s"
                )
            self.last_called = now
        return self.func(*args, **kwargs)
One important detail: functools.update_wrapper writes several attributes (__wrapped__, __doc__, __name__, __qualname__, __module__, __annotations__) onto the instance. These must be explicitly listed in __slots__ for the assignment to succeed; omitting them causes an AttributeError. Including __dict__ in __slots__ preserves the ability to add arbitrary attributes at runtime, which update_wrapper also uses when merging __dict__.
True Transparent Wrapping with wrapt
The templates in this article use functools.wraps to copy metadata, but there is a category of transparency that functools.wraps cannot achieve: correct descriptor protocol behavior. The function-based templates happen to work on instance methods only because the wrapper is itself a plain function, and plain functions implement __get__ and therefore bind self. The class-based template does not: a class instance without a __get__ method is not a descriptor, so applying it to an instance method produces an attribute that never binds self, and inspect.ismethod() reports False. Introspection tools and frameworks that inspect the descriptor chain rather than just the callable see these differences.
The wrapt library (available via pip) provides a @wrapt.decorator interface that preserves descriptor protocol behavior in addition to all metadata. The decorated function continues to behave correctly when accessed as an attribute on a class, when used with inspect.ismethod(), and when frameworks like mock and pytest-mock interact with it as a bound method.
import wrapt

@wrapt.decorator
def log_call(wrapped, instance, args, kwargs):
    # wrapped  — the original function
    # instance — self for instance methods, the class for classmethods, None for plain functions
    # args     — positional arguments passed to the call
    # kwargs   — keyword arguments passed to the call
    print(f"Calling {wrapped.__name__}")
    return wrapped(*args, **kwargs)

class PaymentService:
    @log_call
    def charge(self, amount):
        """Process a payment charge."""
        return f"Charged {amount}"

svc = PaymentService()

# inspect.ismethod correctly reports True for the bound method
import inspect
print(inspect.ismethod(svc.charge))  # True (would be False with a class-based
                                     # decorator like Template 4, which is not a descriptor)
For library code that will be consumed by frameworks performing detailed introspection, wrapt eliminates an entire class of subtle descriptor-related bugs. For application-level decorators where no such introspection occurs, the standard templates are sufficient.
The Registration Pattern
Not every decorator wraps behavior. A registration decorator records the function in a data structure and then returns the original function unchanged. No wrapper is involved. This is the pattern behind Flask routes, pytest fixtures, and CLI command registries.
from typing import Callable, TypeVar

F = TypeVar("F", bound=Callable)

class Registry:
    def __init__(self):
        self._handlers: dict[str, Callable] = {}

    def register(self, event: str) -> Callable[[F], F]:
        def decorator(func: F) -> F:
            self._handlers[event] = func
            return func  # original function, no wrapper
        return decorator

    def dispatch(self, event: str, *args, **kwargs):
        handler = self._handlers.get(event)
        if handler is None:
            raise KeyError(f"No handler registered for event: {event!r}")
        return handler(*args, **kwargs)

bus = Registry()

@bus.register("user.created")
def handle_user_created(user_id: int) -> str:
    return f"Welcome email sent to user {user_id}"

@bus.register("order.paid")
def handle_order_paid(order_id: int) -> str:
    return f"Fulfillment triggered for order {order_id}"

print(bus.dispatch("user.created", 42))
# Welcome email sent to user 42

print(handle_user_created.__name__)  # handle_user_created (unchanged)
Because the registration decorator returns func directly, the original function is still accessible at its original name with its full signature and metadata intact. functools.wraps is not needed — there is nothing to copy onto. The decorator's sole job is the side effect of inserting the function into the registry.
Use this pattern when the goal is discovery or mapping rather than behavioral modification. Common uses: routing tables (URL to handler), event buses (event name to handler), plugin systems (name to implementation), and command registries (CLI command string to function). If the decorator adds behavior at call time, use a wrapper instead.
What functools.wraps Copies Exactly
The article so far refers to functools.wraps copying "metadata," but the exact list matters when writing decorators for frameworks that inspect specific attributes.
functools.wraps is itself a decorator that calls functools.update_wrapper(wrapper, wrapped) with two configurable attribute lists:
import functools
# What functools.WRAPPER_ASSIGNMENTS contains (Python 3.12+):
functools.WRAPPER_ASSIGNMENTS
# ('__module__', '__name__', '__qualname__', '__annotations__',
# '__type_params__', '__doc__')
# What functools.WRAPPER_UPDATES contains:
functools.WRAPPER_UPDATES
# ('__dict__',)
# The difference between "assigned" and "updated":
# - ASSIGNED: the wrapper's attribute is replaced by the original's value
# - UPDATED: the wrapper's dict is merged with the original's dict (not replaced)
In addition to copying the ASSIGNED attributes, functools.update_wrapper adds a __wrapped__ attribute pointing directly to the original function. This is what inspect.signature() follows to report the original parameter list, and what inspect.unwrap() traverses.
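A short sketch showing __wrapped__ doing its job for introspection (transfer is an illustrative function):

```python
import functools
import inspect

def my_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@my_decorator
def transfer(source: str, dest: str, amount: float = 0.0) -> bool:
    """Move funds between accounts."""
    return True

# The wrapper's own signature is (*args, **kwargs), but inspect follows
# __wrapped__ and reports the original parameter list:
print(inspect.signature(transfer))  # (source: str, dest: str, amount: float = 0.0) -> bool
print(transfer.__wrapped__("a", "b"))  # True: direct access to the original
```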
You can override which attributes are copied by passing explicit assigned= or updated= arguments:
import functools

def my_decorator(func):
    @functools.wraps(func, assigned=('__module__', '__name__', '__qualname__', '__doc__'))
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

# This copies only the four listed attributes, not __annotations__ or __type_params__.
# Useful when the wrapper intentionally changes the function's annotations
# (for example, to reflect a different return type).
If an attribute listed in WRAPPER_ASSIGNMENTS does not exist on the wrapped object (for example, a functools.partial object has no __name__), functools.update_wrapper silently skips it rather than raising an AttributeError. This is why decorating callables that lack the usual function attributes works without error — the missing attributes are simply not copied.
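For instance, a functools.partial object carries no __name__, yet wrapping one works without error (a sketch; power and the decorator name are illustrative):

```python
import functools

def my_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def power(base, exp):
    """Raise base to exp."""
    return base ** exp

square = functools.partial(power, exp=2)
print(hasattr(square, "__name__"))  # False: partial objects have no __name__

decorated = my_decorator(square)    # no AttributeError: missing attributes are skipped
print(decorated.__name__)           # wrapper: nothing was copied, so it keeps its own name
print(decorated(3))                 # 9
```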
Changing the Return Type in a Typed Decorator
The ParamSpec template in this article annotates the decorator as Callable[P, R] -> Callable[P, R]. That annotation is correct when the decorator is transparent — it preserves both the parameter list and the return type. However, some decorators intentionally change the return type. Applying Callable[P, R] -> Callable[P, R] to a type-changing decorator will mislead type checkers.
When the return type changes, annotate the wrapper's return type explicitly rather than reusing R:
import functools
from typing import Callable, Optional, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

# Decorator that wraps the return value in Optional (None on any exception)
def safe_call(func: Callable[P, R]) -> Callable[P, Optional[R]]:
    @functools.wraps(func)
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> Optional[R]:
        try:
            return func(*args, **kwargs)
        except Exception:
            return None
    return wrapper
@safe_call
def divide(a: float, b: float) -> float:
    return a / b

# Type checker sees: divide(a: float, b: float) -> Optional[float]
result = divide(10.0, 0.0)  # returns None instead of raising ZeroDivisionError
print(result)  # None
The key difference from Template 5 is the return annotation on both the decorator and the wrapper: Callable[P, Optional[R]] and Optional[R] instead of Callable[P, R] and R. The type checker now correctly infers that the decorated function may return None, and any code that treats the result as a guaranteed float will be flagged.
Writing Callable[P, R] -> Callable[P, R] is a promise to the type checker that the return type is preserved. If your decorator changes the return type but keeps this annotation, mypy and Pyright will both accept incorrect call sites without warning. Annotate accurately — the annotation is documentation for every caller, not just the type checker.
Thread Safety in Stateful Decorators
The RateLimit class-based decorator in Template 4 stores state (last_called) as an instance attribute and reads and writes it on every call. In a single-threaded application this is fine. In a multi-threaded application — a web server handling concurrent requests, a thread pool processing jobs — multiple threads may call the decorated function simultaneously. The check-then-act sequence (now - self.last_called followed by self.last_called = now) is not atomic, and two threads can both pass the rate-limit check before either updates the timestamp.
import functools
import threading
import time
class ThreadSafeRateLimit:
    """Enforce a maximum call frequency. Thread-safe."""

    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.last_called = 0.0
        self.min_interval = 1.0
        self._lock = threading.Lock()

    def __call__(self, *args, **kwargs):
        with self._lock:
            now = time.time()
            elapsed = now - self.last_called
            if elapsed < self.min_interval:
                wait = self.min_interval - elapsed
                raise RuntimeError(
                    f"Rate limited. Try again in {wait:.2f}s"
                )
            self.last_called = now
        # The call itself happens outside the lock so decorated functions
        # can run concurrently; only the state check is serialized.
        return self.func(*args, **kwargs)
@ThreadSafeRateLimit
def send_alert(message):
    """Send an alert notification."""
    return f"Alert sent: {message}"
The with self._lock block ensures that the read-check-write sequence on last_called is atomic from the perspective of any other thread. Only one thread can hold the lock at a time, so two threads cannot both pass the rate-limit check for the same enforcement window. The actual function call (self.func(*args, **kwargs)) happens outside the lock so that the decorated function itself can run concurrently — only the state inspection is serialized.
Keep the lock scope as narrow as possible: protect only the state read and write, not the entire function call. Holding the lock while the original function executes would serialize all calls entirely, defeating the purpose of using threads. Acquire, check, update, release — then call the function.
Re-entrant Locks and Recursive Decorated Calls
A standard threading.Lock is not re-entrant: if the same thread attempts to acquire it a second time before releasing it, the thread deadlocks. This becomes a problem when a decorated function calls another function that is decorated with the same decorator instance, or when a decorator wraps a method that calls itself recursively.
The solution is threading.RLock (re-entrant lock), which allows the same thread to acquire the lock multiple times. Each acquisition must be paired with a release. The lock is fully released only when the acquisition count reaches zero.
import functools
import threading
class CallCounter:
    """Count how many times a function has been called. Thread-safe."""

    # Slots hold the decorator's own state. "__dict__" is included so that
    # functools.update_wrapper has somewhere to copy the wrapped function's
    # metadata. Dunder names that already exist in the class namespace, such
    # as __module__, or __doc__ on a class with a docstring, must not be
    # listed in __slots__; doing so raises ValueError at class creation.
    __slots__ = ("func", "call_count", "_lock", "__dict__")

    def __init__(self, func):
        functools.update_wrapper(self, func)
        self.func = func
        self.call_count = 0
        # Use RLock to allow recursive calls from within the decorated function
        self._lock = threading.RLock()

    def __call__(self, *args, **kwargs):
        with self._lock:
            self.call_count += 1
            # Calling func inside the lock means a recursive call re-acquires
            # the lock from the same thread, which is why RLock is required.
            return self.func(*args, **kwargs)
@CallCounter
def factorial(n):
    """Compute n! recursively."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)  # re-enters CallCounter.__call__ on each recursion

factorial(5)
print(factorial.call_count)  # 5 (one increment per recursive call)
The distinction matters most in two scenarios: decorators that wrap recursive functions, and shared decorator instances applied across a call graph where one decorated function calls another. In both cases, threading.Lock will deadlock on the second acquisition from the same thread, while threading.RLock allows the re-entry. When in doubt about whether your decorator might be entered recursively, use RLock — the performance difference is minor and the protection against deadlock is significant.
threading.Lock and threading.RLock protect against OS-level thread preemption. If you are using asyncio, use asyncio.Lock instead — a threading lock acquired inside a coroutine will block the entire event loop. The two lock types are not interchangeable. There is no async equivalent of RLock in the standard library; re-entrant async locking requires a custom implementation or a third-party package.
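A minimal async sketch of the same counting idea, using asyncio.Lock (the fetch coroutine is an illustrative assumption; requires Python 3.10+ so the lock binds lazily to the running loop):

```python
import asyncio
import functools

def count_calls_async(func):
    """Async-aware call counter using asyncio.Lock instead of threading.Lock."""
    lock = asyncio.Lock()
    state = {"count": 0}

    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        async with lock:   # cooperative: suspends the task, never blocks the loop
            state["count"] += 1
        return await func(*args, **kwargs)

    wrapper.call_count = state
    return wrapper

@count_calls_async
async def fetch(x):
    await asyncio.sleep(0)
    return x * 2

print(asyncio.run(fetch(21)))     # 42
print(fetch.call_count["count"])  # 1
```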
Check Your Understanding
Work through these questions one at a time — understanding why wrong answers are wrong builds stronger intuition than only knowing the right one.
- A decorator omits @functools.wraps(func). What is the first concrete symptom a developer is likely to notice?
- A parameterized decorator is meant to be applied as @retry(max_attempts=3). What happens if a developer mistakenly writes @retry without parentheses?
- A regular def wrapper is placed around an async def function and calls func(*args, **kwargs) without await. What does the caller actually receive?
Spot the Bug
The decorator below was written with good intentions but contains a single structural mistake. Read it carefully — the code will run without raising an exception, so the failure is silent. Identify exactly what is wrong.
import functools
import logging

logger = logging.getLogger(__name__)

def log_result(func):
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        logger.info("%s returned %r", func.__name__, result)
        return result
    return wrapper

@log_result
def calculate_tax(amount, rate):
    """Calculate tax on an amount at a given rate."""
    return amount * rate
How to Use These Templates
Every well-behaved Python decorator follows the same construction sequence regardless of which template variant you use. The steps below apply to Template 1. Templates 2 through 5 add layers around this core, but the inner wrapper is always built the same way.
- Import functools. Add import functools at the top of your module. Without it, functools.wraps is not available and step 3 cannot be completed.
- Define the outer decorator function. Write def my_decorator(func):. This function receives the target callable when the @ syntax is evaluated at definition time.
- Apply @functools.wraps(func) to the wrapper. Place this decorator immediately above the inner wrapper definition. It copies __name__, __doc__, __qualname__, __module__, __annotations__, and __type_params__ from the original function onto the wrapper, and adds a __wrapped__ attribute.
- Define the wrapper with *args and **kwargs. Write def wrapper(*args, **kwargs):. This signature ensures the wrapper accepts any function regardless of how many positional or keyword arguments it takes.
- Place pre-call logic, call the original function, and capture the result. Write result = func(*args, **kwargs). Any logic that runs before the original function goes above this line; any logic that runs after goes below it.
- Return the result. End the wrapper with return result. This step is required. Omitting it causes every decorated function to silently return None.
- Return the wrapper from the decorator. End the outer function with return wrapper. This is what the @ syntax consumes — the wrapper replaces the original function name in the current scope.
- Choose the right variant for your use case. If the decorator needs configuration arguments, add a third outer layer (Template 2) or use the sentinel pattern for optional arguments (Template 3). If the decorator needs to maintain state across calls, use a class with __init__ and __call__ (Template 4). If type checking is part of your workflow, add ParamSpec and TypeVar annotations (Template 5).
Frequently Asked Questions
What does the @ symbol actually do in a Python decorator?
The @ syntax is shorthand for name rebinding. Writing @my_decorator above a function definition is exactly equivalent to writing func = my_decorator(func) immediately after the definition. The decorator is called with the original function as its argument, and whatever it returns replaces the original name in the current scope. This happens at definition time, not at call time.
What are the minimum requirements for a well-behaved Python decorator?
A well-behaved decorator must do three things: accept any function signature using *args and **kwargs on the wrapper, preserve the original function's metadata by applying @functools.wraps(func) to the wrapper, and explicitly return the result of calling the original function so the caller receives the correct return value.
How do I write a decorator that accepts optional arguments?
Use a sentinel pattern: define the decorator with func as the first parameter defaulting to None, followed by keyword-only arguments after a bare *. If func is None, the decorator was invoked with parentheses (with or without arguments), so return an inner decorator for Python to apply next. If func is not None, the decorator was applied bare, without parentheses, and func is the decorated function itself, so wrap it directly. This allows @my_decorator, @my_decorator(), and @my_decorator(option=value) to all work correctly.
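A sketch of the sentinel pattern (the repeat option is a made-up example parameter):

```python
import functools

def my_decorator(func=None, *, repeat=1):
    if func is None:
        # Called with parentheses: @my_decorator() or @my_decorator(repeat=3).
        # Return the real decorator for Python to apply to the function.
        return functools.partial(my_decorator, repeat=repeat)

    # Called bare: @my_decorator. func is the target function itself.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = None
        for _ in range(repeat):
            result = func(*args, **kwargs)
        return result
    return wrapper

@my_decorator
def one():
    return 1

@my_decorator(repeat=3)
def two():
    return 2

print(one(), two())  # 1 2
```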
Should I use a function-based or class-based decorator?
Use function-based decorators for stateless behavior like logging, timing, or retry logic. Use class-based decorators when the decorator needs to maintain state across calls — such as counting invocations, enforcing rate limits, or accumulating metrics. Class-based decorators store state as instance attributes and call functools.update_wrapper(self, func) in __init__ instead of using functools.wraps.
How do I test a decorated function without the decorator's behavior running?
Use the __wrapped__ attribute added by functools.wraps. It holds a direct reference to the original function. Call func.__wrapped__(*args) in tests to bypass all decorator side effects. For functions with multiple stacked decorators, use inspect.unwrap(func) to traverse the entire chain and return the original function at the bottom.
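For example (double is an arbitrary decorator chosen for demonstration):

```python
import functools
import inspect

def double(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return 2 * func(*args, **kwargs)
    return wrapper

@double
@double
def base():
    """Return the base value."""
    return 10

print(base())                  # 40: both decorators applied
print(base.__wrapped__())      # 20: bypasses only the outermost wrapper
print(inspect.unwrap(base)())  # 10: traverses the whole chain to the original
```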
Are class-based stateful decorators thread-safe by default?
No. Class-based decorators that store mutable state as instance attributes are not thread-safe by default. If multiple threads call the decorated function concurrently, they may read and write state simultaneously, producing race conditions. Protect all state mutations with a threading.Lock acquired inside __call__. Use threading.RLock instead of threading.Lock when the decorated function may call itself recursively or call another function decorated with the same instance — a standard lock will deadlock on a second acquisition from the same thread.
What is ParamSpec and why should I use it in decorator type hints?
ParamSpec, introduced in PEP 612 for Python 3.10, is a type variable that captures a function's entire parameter list. When you annotate a decorator as Callable[P, R] -> Callable[P, R] using ParamSpec P and TypeVar R, type checkers like mypy and Pyright can verify that the decorated function preserves the original function's parameter types and return type. This enables correct IDE autocompletion and catches type errors at the call site. For Python 3.8 and 3.9, ParamSpec is available through typing_extensions.
Key Takeaways
- Every decorator is syntactic sugar for name rebinding. @my_decorator above a function definition is exactly equivalent to func = my_decorator(func) immediately after it. Understanding this makes every template structure self-explanatory: the wrapper returned by the decorator replaces the original name.
- Every decorator must follow three rules. Accept any function signature with *args, **kwargs. Preserve metadata with @functools.wraps(func). Return the original function's result with an explicit return statement. Breaking any of these rules produces a decorator that silently corrupts the functions it touches.
- Choose the template that matches your needs. Use Template 1 (basic) for stateless behavior with no configuration. Use Template 2 (parameterized) when the decorator requires arguments that must be provided at decoration time. Use Template 3 (optional-argument) when you want the decorator to work both with and without parentheses. Use Template 4 (class-based) when the decorator needs to maintain state across calls. Use Template 5 (type-safe) when type checking is part of your workflow.
- Place @functools.wraps(func) on the innermost function. In two-level decorators, it goes on the wrapper. In three-level parameterized decorators, it goes on the innermost wrapper, not the middle decorator function. In class-based decorators, call functools.update_wrapper(self, func) in __init__.
- Stacked decorators execute bottom-up at decoration time, top-down at call time. The decorator closest to the function definition wraps first. During a call, the outermost wrapper runs first. Order matters when decorators interact — a logging decorator above a caching decorator logs every call including hits; below it, only misses.
- Use ParamSpec for type safety. PEP 612 introduced ParamSpec in Python 3.10 to allow type checkers to verify that decorators preserve parameter types. Annotating your decorator as Callable[P, R] -> Callable[P, R] enables correct IDE autocompletion and catches type mismatches at call sites. For Python 3.8-3.9, use typing_extensions.
- Handle async functions explicitly. Check asyncio.iscoroutinefunction(func) and define separate async and sync wrapper paths. Wrapping an async function with a sync wrapper that forgets await returns a coroutine object instead of the expected result, causing bugs that are difficult to trace.
- Place @classmethod and @staticmethod above your custom decorator, never below. Python applies decorators bottom-up. When @classmethod is above your decorator, your decorator receives the raw function and wraps it correctly, then @classmethod wraps the result. When your decorator is above @classmethod, it receives a classmethod descriptor object, not a plain callable, and fails with a TypeError.
- Re-raise exceptions with bare raise, not raise exc. Bare raise preserves the original traceback. raise exc resets it to the decorator line, making production errors harder to diagnose. When translating exceptions, use raise NewException() from original_exc to chain them explicitly. Use from None only when the original is a private implementation detail. Always catch Exception, not BaseException — catching BaseException intercepts KeyboardInterrupt, SystemExit, and GeneratorExit and prevents clean process shutdown.
- Use __slots__ on class-based decorators applied at scale. Each class-based decorator instance normally pays per-attribute dictionary overhead; declaring __slots__ for the decorator's own state attributes reduces that cost across large numbers of decorated functions. Include "__dict__" in __slots__ so that functools.update_wrapper has somewhere to copy the wrapped function's metadata. Dunder names already present in the class namespace, such as __module__, or __doc__ on a class with a docstring, cannot be listed as slots without raising a ValueError.
- Use threading.RLock when the decorated function may call itself recursively or call another function decorated with the same instance. A standard threading.Lock deadlocks on a second acquisition from the same thread. threading.RLock allows re-entry from the same thread without deadlock. For async code, use asyncio.Lock instead — threading locks block the entire event loop when acquired inside a coroutine.
- Use wrapt for library-level decorators that must survive framework introspection. The standard templates preserve metadata but not the descriptor protocol. When a framework calls inspect.ismethod() or accesses the descriptor chain on a decorated method, a wrapper built with functools.wraps returns incorrect results. wrapt.decorator preserves full descriptor behavior. For application-level decorators, the standard templates are sufficient.
- Use __wrapped__ to test raw functions in isolation. functools.wraps adds __wrapped__ as a direct reference to the original function. Call func.__wrapped__(*args) in tests to bypass the decorator's side effects entirely. Use inspect.unwrap(func) to traverse a full stack of decorators.
- Not every decorator wraps a function. Registration decorators record a function in a registry and return it unchanged. No wrapper, no functools.wraps. The function remains fully accessible at its original name with its original signature. This is the pattern behind Flask routes, pytest fixtures, and event buses.
- functools.wraps copies six named attributes and merges __dict__. The ASSIGNED attributes are __module__, __name__, __qualname__, __annotations__, __type_params__, and __doc__. Missing attributes are silently skipped. Pass custom assigned= or updated= arguments when a decorator intentionally changes annotations or return type.
- Annotate return-type-changing decorators accurately. Callable[P, R] -> Callable[P, R] is a promise that the return type is preserved. If your decorator wraps results in Optional[R], a result container, or any other transformed type, annotate the wrapper's return type explicitly. Reusing R when the type changes will mislead the type checker and every caller.
- Stateful class-based decorators are not thread-safe by default. Any read-check-write sequence on an instance attribute is a potential race condition in multi-threaded code. Protect state mutations with threading.Lock, keeping the lock scope as narrow as possible — acquire around the state access only, not around the entire function call. Use asyncio.Lock instead of threading.Lock in async contexts.
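The decorator-ordering rule for @classmethod can be verified directly (log_calls is a stand-in for any wrapping decorator):

```python
import functools

def log_calls(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

class Config:
    @classmethod      # correct order: classmethod applied last, on top
    @log_calls        # receives the raw function and wraps it normally
    def default(cls):
        return cls()

print(isinstance(Config.default(), Config))  # True

# Reversing the order (@log_calls above @classmethod) hands log_calls a
# classmethod descriptor instead of a function; invoking it from inside
# the wrapper then fails, since the descriptor is not directly callable.
```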
The value of these templates is consistency. When every decorator in a codebase follows the same structure, developers can read any decorator and immediately identify where the pre-call logic is, where the post-call logic is, and where the original function is invoked. The structure becomes invisible, and the behavior becomes the focus. Copy the template that fits your use case, fill in the behavior, and the decorator will be transparent to callers, debuggers, documentation generators, and type checkers. When the decorator registers rather than wraps, return the original function unchanged. When it changes the return type, annotate it accurately. When it holds state in a multi-threaded environment, protect that state with a lock — and reach for RLock when recursion or shared decorator instances are involved. When translating exceptions, chain them explicitly with from exc. For library-level decorators that must survive descriptor introspection, use wrapt. For class-based decorators applied at scale, declare __slots__.