The @ symbol in Python does two completely different things depending on where it appears. When placed on a line above a function or class definition, it applies a decorator. When placed between two values in an expression, it performs matrix multiplication. These are unrelated features that share a symbol, which is why the @ character can be confusing the first time you encounter it. This article explains both uses from scratch, with code examples that demonstrate exactly what Python does when it sees @.
Python introduced the @ symbol for function and method decorators in version 2.4, following the proposal in PEP 318. Class decorator support was extended later in Python 2.6 via PEP 3129. Over a decade after PEP 318, Python 3.5 added a second use for @ as the matrix multiplication operator via PEP 465, authored by Nathaniel J. Smith. The two uses are distinguished entirely by context: Python's parser knows which meaning to apply based on where the symbol appears in your code.
Use 1: Decorator Syntax
The primary use of @ in Python is decorator syntax. When you see @something on the line directly above a def or class statement, Python is applying a decorator to that function or class. A decorator is a function that takes another function as its argument and returns a modified version of it.
The @ decorator syntax was finalized at EuroPython 2004, where Guido van Rossum chose what PEP 318 calls the "Java-style @decorator syntax." Barry Warsaw named it the pie-decorator syntax — partly because the @ character looks like a pie, and partly as a nod to the Pie-thon Parrot shootout happening at the same time. The name "decorator" itself comes from the compiler world, where syntax trees are walked and annotated. In Python 2.4, @ only applied to functions and methods; class decorator support was added two years later in Python 2.6 via PEP 3129.
from functools import wraps

def shout(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        return result.upper()
    return wrapper

@shout
def greet(name):
    return f"hello, {name}"

print(greet("Kandi"))
# HELLO, KANDI
The @shout line tells Python to pass the greet function to the shout function and replace greet with whatever shout returns. In this case, shout returns a wrapper function that calls the original greet, takes its return value, and converts it to uppercase. After the decorator is applied, calling greet("Kandi") calls the wrapper function, which in turn calls the original greet and transforms the result.
Without the @ syntax, you would need to write the decoration manually after the function definition. The @ line is a shortcut that keeps the decoration visible at the top of the function, right where you declare it.
How Decorator @ Translates to Code
The @decorator syntax is syntactic sugar. It does not introduce any new behavior that was not already possible in Python. Every use of @ as a decorator can be rewritten as a manual function call. Understanding the translation helps demystify what @ does.
# WITH @ syntax:
@shout
def greet(name):
    return f"hello, {name}"

# WITHOUT @ syntax (identical behavior):
def greet(name):
    return f"hello, {name}"

greet = shout(greet)  # greet is now the wrapper returned by shout
These two forms are exactly equivalent. Python's interpreter transforms the first form into the second form before running it. The @shout line above the def is just a cleaner way to write greet = shout(greet) on the line below it.
When a decorator takes arguments, the translation has one more step. The decorator expression is called first to produce the decorator, and then the decorator is called with the function:
from functools import wraps

def repeat(num_times):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for _ in range(num_times):
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator

# WITH @ syntax:
@repeat(num_times=3)
def say_hi():
    print("Hi!")

# WITHOUT @ syntax (identical behavior):
def say_hi():
    print("Hi!")

say_hi = repeat(num_times=3)(say_hi)

say_hi()
# Hi!
# Hi!
# Hi!
@repeat(num_times=3) first calls repeat(num_times=3), which returns the decorator function. Python then calls decorator(say_hi), which returns the wrapper function. The name say_hi is then rebound to that wrapper.
The @ syntax only works on the line immediately above a def or class statement. You cannot use @ to decorate a variable assignment, a lambda, or any other construct. It is exclusively for function and class definitions.
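Class decoration follows the same translation as function decoration: the class object is passed to the decorator, and the name is rebound to whatever the decorator returns. Here is a minimal sketch of a registry-style class decorator; `PLUGINS` and `register_plugin` are hypothetical names invented for this example:

```python
# Sketch of a class decorator. The @ line below translates to:
#     Exporter = register_plugin(Exporter)
PLUGINS = {}

def register_plugin(cls):
    """Record the class in a registry, then return it unchanged."""
    PLUGINS[cls.__name__] = cls
    return cls

@register_plugin
class Exporter:
    def run(self):
        return "exported"

# The class itself is unmodified; the decorator only registered it
print(list(PLUGINS))     # ['Exporter']
print(Exporter().run())  # exported
```

Returning the class unchanged, as here, is the common pattern for registries; a class decorator may also return a replacement class, exactly as a function decorator may return a replacement function.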
Stacking Multiple @ Decorators
Multiple @ lines can be stacked on top of each other. Python applies them from bottom to top, meaning the decorator closest to the def wraps the function first:
from functools import wraps

def bold(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return f"<b>{func(*args, **kwargs)}</b>"
    return wrapper

def italic(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return f"<i>{func(*args, **kwargs)}</i>"
    return wrapper

@bold
@italic
def say_hello():
    return "hello"

print(say_hello())
# <b><i>hello</i></b>

# Equivalent to:
# say_hello = bold(italic(say_hello))
italic wraps say_hello first, then bold wraps the result. The output shows <b> on the outside and <i> on the inside, confirming that bold is the outermost wrapper.
Practical Decorator Patterns
The toy examples above illustrate the mechanics. In production Python code, decorators solve recurring structural problems: retrying unreliable operations, enforcing rate limits, validating arguments before a function body runs, and memoizing expensive computations. The following patterns appear in real codebases — not just in tutorials.
Retry with exponential backoff. Network calls and external API requests fail intermittently. A retry decorator handles that failure mode at the call site, keeping the function body clean:
import time
import random
from functools import wraps

def retry(max_attempts=3, delay=1.0, backoff=2.0, exceptions=(Exception,)):
    """Retry a function on failure with exponential backoff.

    Args:
        max_attempts: Total number of attempts before giving up.
        delay: Initial wait time in seconds between attempts.
        backoff: Multiplier applied to delay after each failure.
        exceptions: Tuple of exception types that trigger a retry.
    """
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions as exc:
                    if attempt == max_attempts:
                        raise
                    print(
                        f"{func.__name__}: attempt {attempt} failed "
                        f"({exc!r}), retrying in {wait:.1f}s"
                    )
                    time.sleep(wait)
                    wait *= backoff
        return wrapper
    return decorator

@retry(max_attempts=4, delay=0.5, backoff=2.0, exceptions=(OSError, TimeoutError))
def fetch_config(url: str) -> dict:
    """Fetch remote configuration. May fail transiently."""
    # Simulates a flaky network call for demonstration purposes
    if random.random() < 0.6:
        raise OSError("connection reset by peer")
    return {"timeout": 30, "retries": 3}

config = fetch_config("https://config.example.com/app.json")
# fetch_config: attempt 1 failed (OSError('connection reset by peer')), retrying in 0.5s
# fetch_config: attempt 2 failed (OSError('connection reset by peer')), retrying in 1.0s
print(config)
The retry decorator accepts only specific exception types. Non-matching exceptions propagate immediately. The backoff multiplier doubles the wait after each failure, which reduces pressure on a recovering service. The function body — fetch_config — contains only the request logic; the retry policy lives entirely in the decorator layer.
Argument validation before the function body runs. Rather than littering a function's body with guard clauses, a validator decorator enforces preconditions at decoration time using annotations:
import inspect
from functools import wraps

def validate_types(func):
    """Raise TypeError if arguments do not match annotated types."""
    sig = inspect.signature(func)
    hints = func.__annotations__

    @wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        for param_name, value in bound.arguments.items():
            if param_name in hints and not isinstance(value, hints[param_name]):
                expected = hints[param_name].__name__
                got = type(value).__name__
                raise TypeError(
                    f"{func.__name__}() argument '{param_name}' "
                    f"expected {expected}, got {got}"
                )
        return func(*args, **kwargs)
    return wrapper

@validate_types
def send_message(recipient: str, body: str, priority: int = 1) -> None:
    print(f"[P{priority}] To {recipient}: {body}")

send_message("ops-team", "deploy complete", priority=2)
# [P2] To ops-team: deploy complete

send_message("ops-team", "deploy complete", priority="high")
# TypeError: send_message() argument 'priority' expected int, got str
The decorator reads the function's type annotations at definition time using inspect.signature and checks each bound argument at call time. The function body is never reached when types are wrong. This pattern is used in lightweight validation layers where a full schema library would be too heavy.
Memoization with functools.lru_cache. Python ships a memoization decorator in the standard library. It caches return values keyed by the arguments, eliminating redundant computation for pure functions called repeatedly with the same inputs:
from functools import lru_cache
import time

@lru_cache(maxsize=256)
def levenshtein(s1: str, s2: str) -> int:
    """Edit distance between two strings (cached recursion)."""
    if not s1:
        return len(s2)
    if not s2:
        return len(s1)
    if s1[0] == s2[0]:
        return levenshtein(s1[1:], s2[1:])
    return 1 + min(
        levenshtein(s1[1:], s2),      # delete
        levenshtein(s1, s2[1:]),      # insert
        levenshtein(s1[1:], s2[1:]),  # replace
    )

start = time.perf_counter()
print(levenshtein("saturday", "sunday"))  # 3
print(levenshtein("saturday", "sunday"))  # 3 -- served from cache
elapsed = time.perf_counter() - start

# Cache statistics: hits, misses, maxsize, currsize
# (hits counts internal hits during the recursion, plus the repeated top-level call)
print(levenshtein.cache_info())
# CacheInfo(hits=..., misses=..., maxsize=256, currsize=...)
print(f"total: {elapsed:.6f}s")
Without @lru_cache, the naïve recursive edit-distance function recomputes the same subproblems exponentially. With it, each unique (s1, s2) pair is computed once and stored in a bounded cache. The maxsize=256 parameter caps memory usage; when the cache is full, the least recently used entry is evicted. Passing maxsize=None creates an unbounded cache (@cache in Python 3.9+ is shorthand for this). The .cache_info() and .cache_clear() methods are attached to the decorated function automatically by lru_cache.
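As a quick illustration of the `@cache` shorthand mentioned above (Python 3.9 and later), here is a minimal sketch applying it to a naively recursive Fibonacci function:

```python
from functools import cache

@cache  # Python 3.9+; equivalent to @lru_cache(maxsize=None)
def fib(n: int) -> int:
    """Naive recursion made linear-time by the unbounded cache."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(60))  # 1548008755920 -- instant; the uncached recursion is exponential
```

Because the cache is unbounded, nothing is ever evicted; for long-running processes with many distinct inputs, the bounded `lru_cache` form is the safer default.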
Why @wraps Matters Inside a Decorator
Every code example in this article uses @wraps(func) from the functools module inside the decorator's inner function. Without it, the decorator silently replaces the wrapped function's identity. Here is what gets overwritten when you omit @wraps:
from functools import wraps

def shout_bad(func):
    """A decorator that forgets to use @wraps."""
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

def shout_good(func):
    """A decorator that uses @wraps correctly."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout_bad
def greet_bad(name):
    """Returns a greeting."""
    return f"hello, {name}"

@shout_good
def greet_good(name):
    """Returns a greeting."""
    return f"hello, {name}"

# Without @wraps: the function's identity is lost
print(greet_bad.__name__)  # wrapper -- WRONG
print(greet_bad.__doc__)   # None -- WRONG

# With @wraps: identity is preserved
print(greet_good.__name__)  # greet_good -- correct
print(greet_good.__doc__)   # Returns a greeting. -- correct

# @wraps also sets __wrapped__, enabling introspection tools
# to unwrap the decorator chain and reach the original function
print(greet_good.__wrapped__)  # <function greet_good at 0x...>
@wraps copies __name__, __qualname__, __doc__, __dict__, __module__, and __annotations__ from the original function onto the wrapper. It also sets __wrapped__ to point back to the unwrapped original, which tools like inspect.unwrap() follow to reach the base function. Omitting @wraps is one of the silent errors in Python that does not raise an exception but breaks documentation generators, test introspectors, and any code that reads function metadata.
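To see `__wrapped__` in action, `inspect.unwrap()` can recover the original function from the decorated one. A minimal sketch, reusing the shout decorator pattern from earlier:

```python
import inspect
from functools import wraps

def shout(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return f"hello, {name}"

# @wraps sets __wrapped__, and inspect.unwrap follows that chain
original = inspect.unwrap(greet)
print(original("Kandi"))  # hello, Kandi -- the undecorated original
print(greet("Kandi"))     # HELLO, KANDI -- the wrapper

# inspect.signature also follows __wrapped__ when building signatures
print(inspect.signature(greet))  # (name)
```

This is exactly the mechanism documentation generators rely on: they introspect the decorated name but report the metadata and signature of the function underneath.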
Built-in Decorators
Python includes several built-in decorators that you will encounter in standard code. Each one uses the same @ syntax, but they serve very different purposes.
class Circle:
    def __init__(self, radius):
        self._radius = radius

    @property
    def radius(self):
        """Access radius like an attribute instead of a method call."""
        return self._radius

    @radius.setter
    def radius(self, value):
        if value < 0:
            raise ValueError("Radius cannot be negative")
        self._radius = value

    @staticmethod
    def unit_circle():
        """Create a circle with radius 1. No access to self needed."""
        return Circle(1)

    @classmethod
    def from_diameter(cls, diameter):
        """Create a circle from diameter. Receives the class, not an instance."""
        return cls(diameter / 2)

c = Circle(5)
print(c.radius)  # 5 (accessed like an attribute)
c.radius = 10    # setter validates the value
print(c.radius)  # 10

unit = Circle.unit_circle()  # called on the class, not an instance
print(unit.radius)           # 1

half = Circle.from_diameter(20)  # classmethod receives the class itself
print(half.radius)               # 10.0
@property turns a method into something that behaves like an attribute. @staticmethod removes the automatic self parameter, making the method callable without an instance. @classmethod replaces self with cls, giving the method access to the class itself rather than a specific instance. All three use the same @ syntax to modify how the method behaves.
The @dataclass decorator from the dataclasses module is another built-in you will see frequently. It automatically generates __init__, __repr__, and __eq__ methods for a class based on its annotated fields, saving significant boilerplate code.
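A minimal sketch of @dataclass in action, showing the three generated methods:

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    label: str = "unnamed"

# __init__, __repr__, and __eq__ are generated from the annotated fields
p = Point(1.0, 2.0)
print(p)                     # Point(x=1.0, y=2.0, label='unnamed')
print(p == Point(1.0, 2.0))  # True -- field-by-field equality
```

Without the decorator, the same class would need a hand-written constructor, a repr, and an equality method, each of which would have to be kept in sync with the field list by hand.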
Use 2: Matrix Multiplication Operator
Starting in Python 3.5, the @ symbol gained a second meaning: a binary operator for matrix multiplication. When @ appears between two values in an expression (rather than above a def statement), Python treats it as an operator that calls the __matmul__ method on the left operand.
This use was introduced by PEP 465 specifically to improve readability for scientific and mathematical code. The mnemonic from the PEP is: @ is * for mATrices.
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Matrix multiplication using @
C = A @ B
print(C)
# [[19 22]
#  [43 50]]

# This is identical to:
C_alt = np.matmul(A, B)
print(C_alt)
# [[19 22]
#  [43 50]]
A @ B performs matrix multiplication: each element in the result is the dot product of a row from A and a column from B. This is different from A * B, which performs element-wise multiplication (each element in A is multiplied by the corresponding element in B).
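The distinction is easy to verify side by side on the same 2x2 arrays:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(A * B)
# [[ 5 12]
#  [21 32]]   element-wise: each A[i][j] * B[i][j]

print(A @ B)
# [[19 22]
#  [43 50]]   matrix product: rows of A dotted with columns of B
```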
The @ operator significantly improves readability when mathematical formulas involve multiple matrix products. Consider the difference when translating a linear algebra formula into code:
import numpy as np

# Simulated data for linear regression
rng = np.random.default_rng(42)
X = rng.random((100, 3))
y = rng.random(100)

# Without @: nested function calls are hard to read
beta_old = np.dot(np.linalg.inv(np.dot(X.T, X)), np.dot(X.T, y))

# With @: reads like the math formula beta = (X^T X)^-1 X^T y
beta_new = np.linalg.inv(X.T @ X) @ X.T @ y

# Both produce the same result
print(np.allclose(beta_old, beta_new))  # True

# Note: for production code, np.linalg.lstsq avoids explicit inversion
# and is numerically more stable for ill-conditioned matrices:
# beta_lstsq, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
The version with @ reads left to right and closely mirrors the mathematical notation. The version with np.dot() nests function calls inside each other, making it harder to trace the order of operations.
Implementing @ on Custom Classes
The @ operator is not limited to NumPy arrays. Any class can support it by implementing the __matmul__ dunder method. Python also recognizes __rmatmul__ (right-hand matrix multiplication) and __imatmul__ (in-place @=):
from __future__ import annotations

class Vector:
    def __init__(self, components: list[float]) -> None:
        self.components = list(components)

    def __matmul__(self, other: Vector) -> float:
        """Dot product via the @ operator."""
        if len(self.components) != len(other.components):
            raise ValueError("Vectors must have the same length")
        return sum(a * b for a, b in zip(self.components, other.components))

    def __repr__(self) -> str:
        return f"Vector({self.components})"

v1 = Vector([1, 2, 3])
v2 = Vector([4, 5, 6])

# @ calls __matmul__, computing the dot product
dot_product = v1 @ v2
print(dot_product)  # 32 (1*4 + 2*5 + 3*6)
The @ operator is a general-purpose hook. While PEP 465 designed it with matrix multiplication in mind, the __matmul__ method can be given any behavior that makes sense for the class.
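For example, __rmatmul__ takes over when the left operand does not know how to handle @. Here is a sketch extending the Vector idea so that a plain list of rows on the left produces a matrix-vector product; treating nested lists as a matrix this way is an illustrative design choice for this example, not a standard convention:

```python
class Vector:
    def __init__(self, components):
        self.components = list(components)

    def __matmul__(self, other):
        """Vector @ Vector: dot product."""
        return sum(a * b for a, b in zip(self.components, other.components))

    def __rmatmul__(self, other):
        """Called for `left @ self` when `left` has no __matmul__.
        Here, a list of rows on the left gives a matrix-vector product."""
        return Vector([sum(a * b for a, b in zip(row, self.components))
                       for row in other])

    def __repr__(self):
        return f"Vector({self.components})"

v = Vector([1, 2])

# list has no __matmul__, so Python falls back to Vector.__rmatmul__
print([[1, 0], [0, 2]] @ v)  # Vector([1, 4])
print(v @ Vector([3, 4]))    # 11 (1*3 + 2*4)
```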
Operator Precedence and operator.matmul
PEP 465 specified that @ has the same precedence and associativity as *. This means a chain like A @ B @ C evaluates left to right as (A @ B) @ C, and A @ B + C evaluates as (A @ B) + C because @ binds more tightly than +. The Python standard library also exposes a function form of the operator through the operator module:
import operator
import functools
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# operator.matmul is the function equivalent of the @ operator
result = operator.matmul(A, B)
print(result)
# [[19 22]
#  [43 50]]

# Identical to:
print(A @ B)
# [[19 22]
#  [43 50]]

# operator.matmul is useful when you need to pass matrix multiplication
# as a callable to higher-order functions like functools.reduce
matrices = [np.eye(2), A, B]
product = functools.reduce(operator.matmul, matrices)
print(product)
# [[19. 22.]
#  [43. 50.]]
operator.matmul was added to the standard library at the same time the @ operator was introduced in Python 3.5. It is useful in any context where you need to pass matrix multiplication as a first-class callable, such as a key function or an argument to functools.reduce.
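The precedence and associativity rules from PEP 465 can be checked directly:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])
C = np.ones((2, 2))

# Left-associative: A @ B @ C groups as (A @ B) @ C
assert np.allclose(A @ B @ C, (A @ B) @ C)

# @ binds more tightly than +: A @ B + C groups as (A @ B) + C, not A @ (B + C)
assert np.allclose(A @ B + C, (A @ B) + C)
assert not np.allclose(A @ B + C, A @ (B + C))

print("precedence checks passed")
```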
How Python Distinguishes the Two Uses
Python's parser determines the meaning of @ based on its syntactic position. There is no ambiguity, because the two uses appear in different grammatical contexts.
The parser never confuses the two because they occupy different positions in the grammar. A decorator @ always appears at the start of a line, immediately followed by an expression and then a newline. An operator @ always appears between two expressions in the middle of a statement. There is no scenario where the same @ token could be interpreted as both.
Here is a snippet that uses both meanings in the same file to illustrate the distinction:
import time
import numpy as np
from functools import wraps

# @ as a decorator: wraps a function with timing logic
def timer(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__}: {elapsed:.4f}s")
        return result
    return wrapper

@timer                # <-- @ means "apply this decorator"
def multiply_matrices(a, b):
    return a @ b      # <-- @ means "matrix multiplication"

A = np.random.rand(500, 500)
B = np.random.rand(500, 500)
C = multiply_matrices(A, B)
# multiply_matrices: 0.0052s

print(C.shape)  # (500, 500)
The @timer line uses decorator syntax: it passes multiply_matrices to timer and replaces it with the wrapper. The a @ b expression inside the function body uses the matrix multiplication operator: it calls NumPy's __matmul__ to compute the product of the two arrays. The same symbol, two different meanings, determined entirely by position.
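The operator form of @ also has a well-defined dispatch order: Python first tries the left operand's __matmul__, and falls back to the right operand's __rmatmul__ if the left side does not implement it. A minimal sketch, with Left and Right as throwaway classes for illustration:

```python
class Left:
    def __matmul__(self, other):
        return "Left.__matmul__"

class Right:
    def __rmatmul__(self, other):
        return "Right.__rmatmul__"

# The left operand handles @ itself
print(Left() @ Right())    # Left.__matmul__

# object() has no __matmul__, so Python tries the right operand's __rmatmul__
print(object() @ Right())  # Right.__rmatmul__
```

This is the same reflected-operand protocol that + uses with __add__ and __radd__, applied to the @ operator.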
Key Takeaways
1. The @ symbol has two uses in Python. Above a def or class, it is decorator syntax. Between two values, it is the matrix multiplication operator. These are completely separate features that share a symbol.
2. Decorator @ is syntactic sugar. Writing @decorator above a function is equivalent to writing func = decorator(func) after the function definition. It does not add any capability that did not already exist.
3. Decorators solve real structural problems. Beyond toy examples, the pattern handles retry with exponential backoff, argument validation using inspect.signature, and memoization with functools.lru_cache. These are the decorator uses that appear most often in production codebases.
4. PEP 318 covered functions; class decorators came later. The @ decorator syntax for functions and methods was introduced in Python 2.4 (PEP 318, 2004). Class decorator support was added separately in Python 2.6 via PEP 3129. The two are often described together but were distinct language changes separated by two years.
5. Always use @wraps(func) when writing decorators. Without it, the wrapper silently replaces the original function's __name__, __doc__, and other metadata. @wraps also sets __wrapped__, which introspection tools use to unwrap the decorator chain.
6. Matrix multiplication @ calls __matmul__. Introduced in Python 3.5 via PEP 465 (authored by Nathaniel J. Smith), the operator has the same precedence as * and improves readability for scientific code. The standard library exposes operator.matmul() as its callable equivalent.
7. Python's parser resolves the ambiguity. A decorator @ appears at the start of a line above a def or class statement. An operator @ appears between two values inside an expression. There is no case where the meaning is unclear.
The @ symbol is one of the few characters in Python that carries more than one meaning, but the two uses never collide. If you see @ above a function or class, you are looking at a decorator, and once you understand those mechanics, the practical patterns follow naturally: retry logic, argument validation, memoization, and anything else that belongs in the wrapping layer rather than the function body. If you see @ between two values, you are looking at matrix multiplication, with __matmul__ and operator.matmul as the underlying machinery. Recognizing which context you are in is all it takes to read @ correctly, and knowing which real-world problems decorators solve is what lets you use it well.