Standard List of Built-in and Core Module Decorators in Python

Python ships with decorators spread across the builtins and several standard library modules. Knowing what is available eliminates the need to reinvent patterns that the language already provides. This reference catalogs every decorator in the builtins and the core modules -- functools, dataclasses, contextlib, abc, typing, warnings, and atexit -- with a concise code example, version history, and a one-paragraph explanation for each. Coverage is current through Python 3.14.

Built-in Decorators (No Import Required)

Three decorators are available as builtins. They require no import and are used directly in class bodies.

@property

Converts a method into a managed attribute. The method becomes a getter, and the returned property object provides .setter and .deleter methods that can be used as decorators to define write and delete behavior:

class Temperature:
    def __init__(self, celsius):
        self._celsius = celsius

    @property
    def fahrenheit(self):
        return self._celsius * 9 / 5 + 32

    @property
    def celsius(self):
        return self._celsius

    @celsius.setter
    def celsius(self, value):
        if value < -273.15:
            raise ValueError("Below absolute zero")
        self._celsius = value

t = Temperature(100)
print(t.fahrenheit)  # 212.0
t.celsius = 0
print(t.fahrenheit)  # 32.0
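The .deleter hook mentioned above works the same way as .setter; a minimal sketch (the Session class and its token attribute are hypothetical examples, not part of the Temperature class above):

```python
class Session:
    def __init__(self):
        self._token = "abc123"

    @property
    def token(self):
        return self._token

    @token.deleter
    def token(self):
        # `del obj.token` invokes this method
        print("revoking token")
        self._token = None

s = Session()
del s.token     # revoking token
print(s.token)  # None
```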

@staticmethod

Removes the implicit first argument (self or cls) from a method. The method behaves like a plain function namespaced inside the class:

class MathHelper:
    @staticmethod
    def clamp(value, low, high):
        return max(low, min(value, high))

print(MathHelper.clamp(15, 0, 10))  # 10

@classmethod

Passes the class (cls) as the first argument instead of the instance. Commonly used for factory methods and alternative constructors:

import json

class Config:
    def __init__(self, settings):
        self.settings = settings

    @classmethod
    def from_json(cls, path):
        with open(path) as f:
            return cls(json.load(f))

    @classmethod
    def defaults(cls):
        return cls({"debug": False, "log_level": "INFO"})

functools Decorators

The functools module contains the highest concentration of decorators in the standard library. All require from functools import ....

@functools.wraps

Preserves the original function's metadata when writing a decorator. Copies __name__, __doc__, __module__, __qualname__, and __annotations__ from the wrapped function onto the wrapper, and also sets __wrapped__ to point to the original function. The __wrapped__ attribute allows introspection tools and the inspect module to unwrap decorator chains. Source: Python docs, functools.wraps.

from functools import wraps

def log(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log
def greet(name):
    """Return a greeting."""
    return f"Hello, {name}"

print(greet.__name__)  # greet (not 'wrapper')
print(greet.__doc__)   # Return a greeting.
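Because @wraps sets __wrapped__, introspection tools can recover the original function through the wrapper; a sketch reusing the same log decorator pattern:

```python
import inspect
from functools import wraps

def log(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@log
def greet(name):
    return f"Hello, {name}"

# __wrapped__ points at the undecorated function
print(greet.__wrapped__ is inspect.unwrap(greet))  # True

# inspect.signature follows __wrapped__ and reports the original signature
print(inspect.signature(greet))  # (name)
```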

@functools.lru_cache

Memoizes function results using a Least Recently Used cache with a configurable maxsize (default 128). Supports optional parentheses since Python 3.8:

from functools import lru_cache

@lru_cache(maxsize=256)
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(100))         # 354224848179261915075
print(fibonacci.cache_info()) # CacheInfo(hits=..., misses=101, maxsize=256, currsize=101)

@functools.cache

Added in Python 3.9. An unbounded version of @lru_cache with no size limit and no LRU eviction. Equivalent to @lru_cache(maxsize=None) but slightly faster because it skips the eviction bookkeeping:

from functools import cache

@cache
def factorial(n):
    if n < 0:
        raise ValueError("factorial is not defined for negative numbers")
    return n * factorial(n - 1) if n else 1

print(factorial(10))  # 3628800

@functools.cached_property

Added in Python 3.8. A property that is computed once and cached as an instance attribute. Subsequent reads return the cached value without re-executing the method. The cache can be cleared by deleting the attribute:

from functools import cached_property

class DataSet:
    def __init__(self, values):
        self._values = values

    @cached_property
    def stats(self):
        print("Computing stats...")
        return {
            "mean": sum(self._values) / len(self._values),
            "count": len(self._values),
        }

ds = DataSet([10, 20, 30])
print(ds.stats)  # Computing stats... {'mean': 20.0, 'count': 3}
print(ds.stats)  # {'mean': 20.0, 'count': 3}  (no recomputation)
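Clearing the cache is just deleting the instance attribute, since cached_property stores its result in the instance's __dict__; a sketch continuing the DataSet idea above:

```python
from functools import cached_property

class DataSet:
    def __init__(self, values):
        self._values = values

    @cached_property
    def stats(self):
        return {"mean": sum(self._values) / len(self._values)}

ds = DataSet([10, 20, 30])
first = ds.stats
ds._values.append(40)
print(ds.stats is first)   # True -- the stale cached dict is still returned

del ds.stats               # removes the cached value from ds.__dict__
print(ds.stats["mean"])    # 25.0 -- recomputed on next access
```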

@functools.singledispatch

Added in Python 3.4. Turns a function into a generic function that dispatches on the type of its first argument. Use .register to define type-specific implementations. The registered implementations can also be discovered externally via func.dispatch(type) and func.registry. Source: Python docs, functools.singledispatch.

from functools import singledispatch

@singledispatch
def serialize(obj):
    raise TypeError(f"Cannot serialize {type(obj)}")

@serialize.register(int)
def _(obj):
    return str(obj)

@serialize.register(list)
def _(obj):
    return "[" + ", ".join(serialize(item) for item in obj) + "]"

print(serialize(42))         # 42
print(serialize([1, 2, 3]))  # [1, 2, 3]
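The dispatch() and registry attributes mentioned above make the dispatch table inspectable from outside; a sketch continuing the serialize example:

```python
from functools import singledispatch

@singledispatch
def serialize(obj):
    raise TypeError(f"Cannot serialize {type(obj)}")

@serialize.register(int)
def _(obj):
    return str(obj)

# dispatch() returns the implementation that would handle a given type;
# bool subclasses int, so both resolve to the same registered function
print(serialize.dispatch(bool) is serialize.dispatch(int))  # True

# registry maps registered types to their implementations
print(object in serialize.registry)  # True -- the fallback implementation
print(int in serialize.registry)     # True
```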

@functools.singledispatchmethod

Added in Python 3.8. The method-compatible version of @singledispatch. Dispatches on the type of the first argument after self or cls:

from functools import singledispatchmethod

class Formatter:
    @singledispatchmethod
    def format(self, arg):
        return str(arg)

    @format.register
    def _(self, arg: int):
        return f"{arg:,}"

    @format.register
    def _(self, arg: float):
        return f"{arg:.2f}"

f = Formatter()
print(f.format(1000000))  # 1,000,000
print(f.format(3.14159))  # 3.14

@functools.total_ordering

Class decorator that generates missing comparison methods from a minimal set. You define __eq__ and one of __lt__, __le__, __gt__, or __ge__; the decorator fills in the rest:

from functools import total_ordering

@total_ordering
class Version:
    def __init__(self, major, minor):
        self.major = major
        self.minor = minor

    def __eq__(self, other):
        return (self.major, self.minor) == (other.major, other.minor)

    def __lt__(self, other):
        return (self.major, self.minor) < (other.major, other.minor)

v1, v2 = Version(1, 2), Version(2, 0)
print(v1 < v2)   # True
print(v1 >= v2)   # False (auto-generated)

dataclasses Decorators

@dataclasses.dataclass

Class decorator added in Python 3.7 that auto-generates __init__, __repr__, __eq__, and optionally __hash__, __lt__, and others based on class annotations. Supports optional parentheses. The slots=True parameter (Python 3.10+) generates __slots__ automatically, reducing per-instance memory overhead. The kw_only=True parameter (also Python 3.10+) forces all generated __init__ fields to be keyword-only, preventing ambiguous positional construction. For a complete guide to field options, inheritance behavior, and post-init processing, see Python dataclasses. Source: Python docs, dataclasses.

from dataclasses import dataclass

@dataclass(frozen=True, slots=True)
class Point:
    x: float
    y: float

p = Point(3.0, 4.0)
print(p)            # Point(x=3.0, y=4.0)
print(p == Point(3.0, 4.0))  # True

contextlib Decorators

@contextlib.contextmanager

Converts a generator function into a context manager. Code before yield runs on entry; code after yield runs on exit:

from contextlib import contextmanager
import time

@contextmanager
def timer(label):
    start = time.perf_counter()
    try:
        yield
    finally:
        print(f"{label}: {time.perf_counter() - start:.4f}s")

with timer("sort"):
    sorted(range(1_000_000, 0, -1))

@contextlib.asynccontextmanager

Added in Python 3.7. The async version of @contextmanager for use with async with statements:

from contextlib import asynccontextmanager
import asyncio

@asynccontextmanager
async def lifespan(app):
    print("startup: acquiring resources")
    try:
        yield app
    finally:
        print("shutdown: releasing resources")

async def main():
    async with lifespan("myapp") as app:
        print(f"running with {app}")

asyncio.run(main())

abc Decorators

@abc.abstractmethod

Marks a method as abstract, requiring subclasses to provide an implementation before they can be instantiated. Can be stacked with @classmethod, @staticmethod, or @property (always as the innermost decorator); the @property stacking is the documented replacement for the long-deprecated abc.abstractproperty helper. Note: the separate pattern of chaining @classmethod with @property was deprecated in Python 3.11 and removed in Python 3.13, so abstract class-level properties need a custom descriptor instead. Source: Python docs, abc.abstractmethod.

from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self):
        ...

    @classmethod
    @abstractmethod
    def from_string(cls, s):
        ...

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius

    def area(self):
        return 3.14159 * self.radius ** 2

    @classmethod
    def from_string(cls, s):
        return cls(float(s))

c = Circle.from_string("5")
print(c.area())  # 78.53975
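Stacking @abstractmethod under @property, as described above, yields an abstract read-only attribute; a minimal sketch (Vehicle and Car are hypothetical example classes):

```python
from abc import ABC, abstractmethod

class Vehicle(ABC):
    @property
    @abstractmethod  # innermost, as required
    def wheels(self):
        ...

class Car(Vehicle):
    @property
    def wheels(self):
        return 4

print(Car().wheels)  # 4

try:
    Vehicle()  # cannot instantiate: abstract property unimplemented
except TypeError as e:
    print("TypeError:", e)
```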

typing Decorators

Decorators in the typing module are primarily consumed by type checkers and have minimal or no runtime effect.

@typing.overload

Declares multiple type signatures for a single function. The overloaded signatures exist purely for type checkers; the actual implementation follows as a plain, un-decorated definition. At runtime, only the implementation is executed. Source: Python docs, typing.overload.

from typing import overload

@overload
def process(data: str) -> str: ...
@overload
def process(data: int) -> int: ...

def process(data):
    if isinstance(data, str):
        return data.upper()
    return data * 2

print(process("hello"))  # HELLO
print(process(5))        # 10

@typing.final

Added in Python 3.8. Signals to type checkers that a method cannot be overridden or a class cannot be subclassed:

from typing import final

class Base:
    @final
    def critical_method(self):
        return "do not override"

# Type checker error: Cannot override final method
# class Child(Base):
#     def critical_method(self):
#         return "overridden"

@typing.override

Added in Python 3.12. Marks a method as intentionally overriding a parent method. Type checkers will flag it as an error if the parent does not define that method, catching typos in method names:

from typing import override

class Animal:
    def speak(self):
        return "..."

class Dog(Animal):
    @override
    def speak(self):
        return "Woof"

    # Type checker error: no 'spek' in parent
    # @override
    # def spek(self):
    #     return "typo"

@typing.runtime_checkable

Added in Python 3.8. Makes a Protocol class usable with isinstance() checks at runtime:

from typing import Protocol, runtime_checkable

@runtime_checkable
class Closable(Protocol):
    def close(self) -> None: ...

class Connection:
    def close(self) -> None:
        print("closed")

conn = Connection()
print(isinstance(conn, Closable))  # True

@typing.no_type_check

Disables type checking for a function or class; type checkers skip the decorated object's annotations entirely. Note: the companion decorator @typing.no_type_check_decorator (which applied @no_type_check to everything produced by a decorator factory) was deprecated in Python 3.13 and is scheduled for removal in Python 3.15 -- the Python 3.13 release notes observe that it spent eight years in the standard library without ever gaining support from any major type checker. Apply @no_type_check directly instead:

from typing import no_type_check

@no_type_check
def legacy_function(x, y):
    # Type checker ignores this entirely
    return x + y

@typing.dataclass_transform

Added in Python 3.11. Signals to type checkers that a decorator, base class, or metaclass provides dataclass-like behavior. Used by libraries like attrs, pydantic, and SQLAlchemy to get type checker support without using @dataclass directly:

from typing import TypeVar, dataclass_transform

T = TypeVar("T")

@dataclass_transform()
def my_dataclass(cls: type[T]) -> type[T]:
    # At runtime this would invoke a library's own __init__/__eq__/etc.
    # synthesis logic. The decorator itself is a signal to type checkers.
    return cls


# Type checkers treat MyModel as if it were a @dataclass
@my_dataclass
class MyModel:
    id: int
    name: str

warnings Decorators

The warnings module gained its first decorator in Python 3.13 via PEP 702. Requires from warnings import deprecated.

@warnings.deprecated

Added in Python 3.13 (PEP 702). Marks a function, method, class, or @typing.overload signature as deprecated. At call time or instantiation, it emits a DeprecationWarning with the supplied message. Type checkers (mypy, pyright, pyrefly) consume the decorator to surface deprecation warnings at analysis time. The message is stored in a __deprecated__ attribute on the decorated object. An optional category parameter lets you change the warning class (set to None to suppress the runtime warning entirely, keeping only the type-checker signal). Source: Python docs, warnings.deprecated; PEP 702.

from warnings import deprecated

@deprecated("Use process_v2() instead. Will be removed in v4.0.")
def process(data):
    return data.upper()

@deprecated("LegacyClient is superseded by AsyncClient.")
class LegacyClient:
    pass

# Runtime: emits DeprecationWarning on call/instantiation
result = process("hello")   # DeprecationWarning: Use process_v2() instead...
client = LegacyClient()     # DeprecationWarning: LegacyClient is superseded...

# The message is accessible without calling
print(process.__deprecated__)  # Use process_v2() instead. Will be removed in v4.0.
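The category parameter mentioned above can swap or suppress the runtime warning; a sketch (the fetch and quiet functions are hypothetical; the typing_extensions fallback covers Python 3.12 and earlier, assuming that package is installed):

```python
import warnings

try:
    from warnings import deprecated          # Python 3.13+
except ImportError:
    from typing_extensions import deprecated  # earlier versions

@deprecated("Use fetch_v2().", category=FutureWarning)
def fetch():
    return "data"

@deprecated("Type-checker-only notice.", category=None)
def quiet():
    return "no runtime warning"

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    fetch()   # warns with FutureWarning instead of DeprecationWarning
    quiet()   # emits nothing at runtime; type checkers still flag it

print([w.category.__name__ for w in caught])  # ['FutureWarning']
print(quiet.__deprecated__)  # the message is still attached
```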

For codebases targeting Python 3.12 or earlier, the same decorator is available via the typing_extensions package as from typing_extensions import deprecated, with identical semantics.

atexit Decorators

The atexit module provides shutdown registration. Requires import atexit.

@atexit.register

Registers a function to run automatically when the Python interpreter shuts down normally (i.e., when sys.exit() is called or the main module finishes executing). The Python docs note that atexit.register returns func unchanged, which is what makes decorator use possible. Registered functions are called in last-in, first-out order. The decorator form is only suitable for functions that take no arguments; functions requiring arguments must use the functional form atexit.register(func, arg1, arg2). Shutdown handlers registered via @atexit.register are not called if the process receives an unhandled signal, if a Python fatal internal error occurs, or if os._exit() is called directly. Source: Python docs, atexit.

import atexit

@atexit.register
def cleanup():
    print("interpreter shutting down: flushing buffers")

# For handlers that need arguments, use the functional form:
def goodbye(name):
    print(f"Goodbye, {name}")

atexit.register(goodbye, "user")

# Output when interpreter exits (LIFO order):
# Goodbye, user
# interpreter shutting down: flushing buffers

Thread Safety Note

Starting in Python 3.12, attempting to start a new thread or call os.fork() inside an @atexit.register handler raises a RuntimeError. Earlier Python versions allowed it but produced race conditions during shutdown.

Quick Reference Table

Decorator              Module       Added  Purpose
@property              builtins     2.2    Managed attribute with getter/setter/deleter
@staticmethod          builtins     2.2    Method with no implicit first argument
@classmethod           builtins     2.2    Method receiving class as first argument
@wraps                 functools    2.5    Preserve wrapped function metadata
@lru_cache             functools    3.2    Memoization with LRU eviction
@total_ordering        functools    3.2    Auto-generate comparison methods
@singledispatch        functools    3.4    Type-based function overloading
@cached_property       functools    3.8    Lazily computed cached attribute
@singledispatchmethod  functools    3.8    Type-based method overloading
@cache                 functools    3.9    Unbounded memoization
@dataclass             dataclasses  3.7    Auto-generate class boilerplate
@contextmanager        contextlib   2.5    Generator-based context manager
@asynccontextmanager   contextlib   3.7    Async generator context manager
@abstractmethod        abc          2.6    Require subclass implementation
@overload              typing       3.5    Multiple type signatures (type checker only)
@final                 typing       3.8    Prevent override/subclass (type checker only)
@runtime_checkable     typing       3.8    Enable isinstance() on Protocol
@no_type_check         typing       3.5    Disable type checking
@dataclass_transform   typing       3.11   Signal dataclass-like behavior
@override              typing       3.12   Mark intentional method override
@deprecated            warnings     3.13   Emit deprecation warning at call / instantiation
@atexit.register       atexit       2.0    Register shutdown cleanup handler

When to use @warnings.deprecated vs comments

A code comment or docstring saying "this is deprecated" is invisible to tooling. @warnings.deprecated emits a live DeprecationWarning at runtime and is understood by mypy, pyright, and pyrefly. If you maintain a library or internal SDK, prefer the decorator over prose for any symbol you intend to remove in a future version. Callers who filter DeprecationWarning with -W default or warnings.simplefilter("always") will see the warning in their test output.
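A caller-side sketch of that filtering: promoting DeprecationWarning to a hard error, as a strict test suite might (old_api is a hypothetical stand-in for a function marked with @deprecated):

```python
import warnings

def old_api():
    # stand-in for a function decorated with @warnings.deprecated
    warnings.warn("old_api() is deprecated", DeprecationWarning, stacklevel=2)
    return 42

with warnings.catch_warnings():
    # "error" turns matching warnings into raised exceptions
    warnings.simplefilter("error", DeprecationWarning)
    try:
        old_api()
    except DeprecationWarning as e:
        print("caught:", e)
```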

  1. The three builtins are descriptors, not just syntax sugar. @property, @staticmethod, and @classmethod are implemented as descriptor objects that implement __get__ (and __set__ / __delete__ for property). When Python resolves an attribute on an instance, it walks the MRO and calls __get__ on any descriptor it finds. Stacking @classmethod with @property was deprecated in Python 3.11 and the support for it was removed entirely in Python 3.13 — the Python 3.14 docs explicitly state that class methods can no longer wrap other descriptors such as property() (Python docs, classmethod). Code that previously relied on this pattern to create class-level computed attributes should migrate to a custom descriptor class or a metaclass-level property. The stacking appeared to work in some Python versions before 3.11, but the outer descriptor did not see the inner descriptor's __get__ result correctly, producing inconsistent behavior across subclasses.
  2. Choose between @cache, @lru_cache, and @cached_property based on object lifetime, not just call frequency. @cache is unbounded and holds every unique argument set for the process lifetime -- a memory leak risk on functions with high-cardinality inputs. @lru_cache bounds growth but adds eviction bookkeeping, and its .cache_clear() call is not thread-safe in Python versions before 3.12. @cached_property is per-instance, so the cache lives and dies with the object -- the correct choice when the computed value depends on mutable instance state. It also requires instances to have a writable __dict__, so it does not work in classes that define __slots__: the first access raises TypeError because there is nowhere to store the cached value.
  3. Use @singledispatch to enforce open/closed boundaries instead of isinstance chains. The common pattern of if isinstance(x, int): ... elif isinstance(x, str): ... is closed to extension -- adding a new type requires modifying the function. With @singledispatch, external code can register implementations for new types via func.register(NewType) without touching the original function. This is the correct approach for serialization pipelines, visitor patterns, and any API that needs to be extensible by consumers. For class methods, @singledispatchmethod dispatches on the first argument after self or cls, which means the dispatch type must be resolvable at call time -- union types are accepted in .register annotations since Python 3.11, but typing.Protocol classes are not supported as dispatch targets.
  4. The typing decorators have runtime behavior that is more consequential than their documentation implies. @runtime_checkable enables isinstance() on Protocol classes, but the check only validates that the object has the required attributes -- it does not check method signatures or return types. A class with a close attribute set to an integer passes an isinstance(obj, Closable) check. @final has no enforcement at runtime; a subclass can still override a @final method and Python will not raise an error -- only type checkers (mypy, pyright, pyrefly) flag it. @override also relies entirely on the type checker to catch mismatches; calling it in a codebase without a checker configured to use it provides no protection at all.
  5. Combine @dataclass with @classmethod factory methods to solve the multiple-constructor problem cleanly. @dataclass generates a single __init__ from annotations, but real-world classes need multiple construction paths -- from a dict, from a database row, from an environment variable. The correct pattern is frozen=True, slots=True on the dataclass for immutability and memory efficiency, then @classmethod methods named from_* for each alternative constructor. This pattern keeps __init__ as a pure field assignment while making alternative construction paths explicit and type-checkable. Using __post_init__ for all construction logic is the anti-pattern that leads to boolean flags and fragile branching inside the initializer.
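The migration recommended in point 1 -- a custom descriptor in place of the removed @classmethod/@property chaining -- can be sketched as follows (classproperty and Registry are hypothetical names, not standard library objects):

```python
class classproperty:
    # minimal non-data descriptor: __get__ receives the owner class,
    # so the wrapped function can compute a class-level attribute
    def __init__(self, func):
        self.func = func

    def __get__(self, obj, owner=None):
        return self.func(owner if owner is not None else type(obj))

class Registry:
    _items = {"a": 1, "b": 2}

    @classproperty
    def size(cls):
        return len(cls._items)

print(Registry.size)    # 2 -- computed at class level, no instance needed
print(Registry().size)  # 2 -- also works through an instance
```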

This reference covers every decorator shipped with CPython's standard library as of Python 3.14 (released October 7, 2025). Third-party libraries add their own decorators (Flask's @app.route, pytest's @pytest.fixture, Django's @login_required), but all of them are built on the same descriptor and callable-wrapping mechanism described throughout this series. The standard library decorators are the foundation to understand before reaching for any external dependency. For the most current version of each decorator's API, consult the Python Standard Library reference.

How to Choose the Right Decorator

The following steps walk through the selection logic for each category of built-in and standard library decorator.

  1. Working inside a class body? Use @property for managed attribute access with getter, setter, and deleter control. Use @staticmethod when the method needs no access to the instance or class. Use @classmethod when the method needs the class itself — most often for factory constructors.
  2. Need memoization? Use @functools.cache for pure functions with a bounded input domain where unbounded growth is not a concern. Use @functools.lru_cache(maxsize=N) to cap memory usage with LRU eviction. Use @functools.cached_property when the result is per-instance and should be computed lazily on first access.
  3. Need type-based dispatch? Use @functools.singledispatch to dispatch a standalone function on the type of its first argument. Use @functools.singledispatchmethod for the same pattern inside a class.
  4. Need to reduce class boilerplate? Apply @dataclasses.dataclass to generate __init__, __repr__, and __eq__ from annotations. Add slots=True for reduced memory overhead (Python 3.10+) and frozen=True for immutable instances.
  5. Need a context manager without a full class? Use @contextlib.contextmanager for synchronous code and @contextlib.asynccontextmanager for async with blocks.
  6. Need type-checker signals? Use @typing.overload for multiple type signatures, @typing.final to block overrides or subclassing, @typing.override to assert an intentional override, and @typing.runtime_checkable to enable isinstance() on a Protocol.
  7. Marking an API for removal? Use @warnings.deprecated (Python 3.13+) with a message string. For earlier Python versions, use the same decorator from the typing_extensions package.
  8. Need a shutdown cleanup handler? Use @atexit.register for zero-argument functions. For functions that take arguments, use the functional form atexit.register(func, arg1, arg2).

Frequently Asked Questions

What are the three built-in decorators in Python?

The three decorators built into Python without any import are @property, @staticmethod, and @classmethod. @property converts a method into a managed attribute with getter, setter, and deleter hooks. @staticmethod strips the automatic self or cls binding from a method. @classmethod passes the class (cls) as the first argument instead of the instance.

What functools decorators does Python provide?

The functools module provides @wraps (preserves decorated function metadata including __wrapped__), @lru_cache and @cache (memoization), @cached_property (lazy computed attributes), @singledispatch and @singledispatchmethod (type-based function overloading), and @total_ordering (auto-generates comparison methods from a minimal set).

What is the difference between @functools.cache and @functools.lru_cache?

@cache (added in Python 3.9) is an unbounded cache that stores every unique call result indefinitely. @lru_cache uses a Least Recently Used eviction policy with a configurable maxsize (default 128), discarding the least recently used entries when the cache is full. Use @cache for pure functions with a bounded input domain. Use @lru_cache when memory is a concern or input domains are large.

What does @contextlib.contextmanager do?

@contextlib.contextmanager converts a generator function into a context manager usable with the with statement. Code before yield runs on entry; code after yield runs on exit. This eliminates the need to write a full class with __enter__ and __exit__ methods for simple resource management patterns.

What does @warnings.deprecated do in Python?

@warnings.deprecated (added in Python 3.13 via PEP 702) marks a function, method, class, or @typing.overload signature as deprecated. At runtime it emits a DeprecationWarning when the decorated object is called or instantiated. Type checkers such as mypy and pyright consume the decorator to flag usages at analysis time. The message is stored in a __deprecated__ attribute on the decorated object.

What does @atexit.register do in Python?

@atexit.register registers a function to be called automatically when the Python interpreter exits normally. Because atexit.register returns the original function unchanged, it can be used directly as a decorator with no arguments. Registered functions are called in last-in, first-out order. The decorator form is only suitable for functions that take no arguments; functions requiring arguments must use the functional form atexit.register(func, arg1, arg2).

What typing module decorators does Python provide?

The typing module provides @overload (declares multiple type signatures for a function), @final (prevents subclassing or method overriding), @override (marks a method as intentionally overriding a parent method, added in 3.12), @runtime_checkable (allows isinstance() checks against Protocol classes), @no_type_check (disables type checking), and @dataclass_transform (signals dataclass-like behavior for type checkers, added in 3.11). Note: @no_type_check_decorator was deprecated in Python 3.13 and is scheduled for removal in Python 3.15.

Can @classmethod and @property be stacked in Python?

Stacking @classmethod with @property was formally deprecated in Python 3.11 and the chaining support was removed in Python 3.13. Before 3.11, some code combined the two to create class-level computed attributes, but the behavior was inconsistent. In Python 3.13 and later, @classmethod can no longer wrap @property. The correct approach for a class-level computed attribute is a custom descriptor or a metaclass property.