Python Class Decorator: How to Make a Class Instance Behave Like a Callable

Function-based decorators use closures to wrap behavior around a target function. That works well until you need to track state between calls, expose methods on the decorator itself, or organize complex configuration logic. At that point, a class whose instances are callable through the __call__ method becomes the cleaner alternative. This article covers the full pattern for building class-based decorators in Python, from the basic __init__ / __call__ structure through preserving metadata, handling method decoration, and adding configuration arguments. It is part of the Python tutorials published on PythonCodeCrack.

Why __call__ Makes a Class Instance Callable

In Python, any object whose class defines a __call__ method is callable. The distinction matters: adding __call__ directly to an instance does not make it callable, because Python looks up special methods on the type, not the instance. You can verify this with the built-in callable() function:

class Greeter:
    def __init__(self, greeting):
        self.greeting = greeting

    def __call__(self, name):
        return f"{self.greeting}, {name}!"


hello = Greeter("Hello")
print(callable(hello))
print(hello("reader"))
Output:
True
Hello, reader!

The instance hello behaves like a function. You call it with parentheses and arguments, and Python routes the call to __call__. Because a decorator is any callable that takes a function and returns a callable, a class with __call__ can serve as a decorator. This same mechanic is why lambdas, functools.partial objects, and built-in functions are all valid decorators — their types all implement __call__. The Python data model documentation specifies that for implicit invocations like obj(args), Python always looks up __call__ on type(obj), bypassing the instance dictionary entirely.
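A quick demonstration of that type-level lookup: attaching __call__ directly to an instance does not make the instance callable, because both callable() and the call syntax consult the type, not the instance dict.

```python
class Plain:
    pass


p = Plain()
p.__call__ = lambda: "hi"   # stored in the instance dict, not on the class

# callable() and p() both look up __call__ on type(p), which has none
print(callable(p))          # False
try:
    p()
except TypeError as e:
    print(type(e).__name__)  # TypeError
```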

Execution Flow: What Happens at Each Stage
Definition time: @LogCalls above def add(a, b). Python calls LogCalls(add), instantiating the class with the function as the argument to __init__.

After decoration: add = LogCalls(add). The name add in the module namespace now points to a LogCalls instance, not the original function.

Call time: add(3, 5). Python calls LogCalls.__call__(self, 3, 5). __call__ runs your wrapper logic and then calls the original stored function.

The Basic Class Decorator Pattern

The simplest class-based decorator receives the function in __init__ and wraps it in __call__:

class LogCalls:
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        print(f"Calling {self.func.__name__}")
        result = self.func(*args, **kwargs)
        print(f"{self.func.__name__} returned {result}")
        return result


@LogCalls
def add(a, b):
    return a + b


print(add(3, 5))
Output:
Calling add
add returned 8
8

When Python encounters @LogCalls, it calls LogCalls(add), creating an instance. That instance replaces add in the module namespace. Every subsequent call to add(3, 5) actually calls LogCalls.__call__(self, 3, 5).

One consequence worth knowing: type(add) is now LogCalls, not function. Any code that tests isinstance(add, types.FunctionType) will return False. If you have utilities, test harnesses, or framework code that inspects whether something is a plain function, a class-based decorator will fail that check. Function-based decorators do not have this problem because their wrappers are ordinary functions. This is a real compatibility consideration when introducing class-based decorators into an existing codebase.
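The type change is easy to verify; a minimal check using the basic LogCalls pattern:

```python
import types


class LogCalls:
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)


@LogCalls
def add(a, b):
    return a + b


print(type(add).__name__)                   # LogCalls
print(isinstance(add, types.FunctionType))  # False
print(callable(add))                        # True: still usable as a function
```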

Preserving Metadata with update_wrapper

After decoration with the basic LogCalls above, accessing multiply.__name__ raises AttributeError: class instances do not carry a __name__ attribute unless one is explicitly set. Tools like help(), inspect.signature(), and testing frameworks depend on that attribute being present. Use functools.update_wrapper to copy the original function's metadata onto the instance:

import functools


class LogCalls:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        print(f"Calling {self.func.__name__}")
        return self.func(*args, **kwargs)


@LogCalls
def multiply(a, b):
    """Multiply two numbers."""
    return a * b


print(multiply.__name__)
print(multiply.__doc__)
print(multiply.__wrapped__)
Output:
multiply
Multiply two numbers.
<function multiply at 0x...>
Note

For function-based decorators, you use @functools.wraps(func) on the inner wrapper. For class-based decorators, you call functools.update_wrapper(self, func) in __init__ because the class instance itself is the wrapper. They do the same thing — wraps is a partial application of update_wrapper, so the two are mechanically identical. The attributes copied by default (defined in functools.WRAPPER_ASSIGNMENTS) are __module__, __name__, __qualname__, __annotations__, and __doc__. As of Python 3.12, __type_params__ was added to that list (per the official docs). Additionally, the wrapper's __dict__ is updated (merged, not replaced) with the original's, and a __wrapped__ attribute is set pointing back to the original function.

The __wrapped__ attribute set by update_wrapper does more than serve as documentation. inspect.signature() follows the __wrapped__ chain to return the original function's signature rather than the wrapper's. This means type checkers, IDEs, and help() all see the correct parameter names and annotations after decoration — as long as update_wrapper was called.
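That chain-following behaviour can be checked directly with the update_wrapper-enabled LogCalls from above:

```python
import functools
import inspect


class LogCalls:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)


@LogCalls
def multiply(a, b):
    """Multiply two numbers."""
    return a * b


# signature() follows __wrapped__ back to the original function
print(inspect.signature(multiply))            # (a, b)
print(multiply.__wrapped__ is multiply.func)  # True
```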


Stateful Decorators: Tracking Data Across Calls

A closure can track state too, but it requires nonlocal variables, which can only be mutated — not inspected or reset from the outside. A class instance holds state as ordinary attributes, which are readable and writable from anywhere that has a reference to the decorator. That distinction becomes important the moment you want a decorator to answer questions like "how many times has this been called?" or "clear the cache now." The primary advantage of class-based decorators over closures is precisely this: clean, readable state management. Instance attributes persist across calls naturally:

import functools
import time


class Timer:
    def __init__(self, func):
        self.func = func
        self.total_time = 0.0
        self.call_count = 0
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        start = time.perf_counter()
        result = self.func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        self.total_time += elapsed
        self.call_count += 1
        return result

    def stats(self):
        avg = self.total_time / self.call_count if self.call_count else 0
        return {
            "calls": self.call_count,
            "total_seconds": round(self.total_time, 6),
            "avg_seconds": round(avg, 6),
        }


@Timer
def compute(n):
    return sum(range(n))


compute(1_000_000)
compute(2_000_000)
compute(500_000)

print(compute.stats())
Output:
{'calls': 3, 'total_seconds': 0.089341, 'avg_seconds': 0.029780}

The stats() method is accessible directly on the decorated function because compute is a Timer instance. With a closure-based decorator, exposing extra methods requires attaching them to the wrapper function, which is less natural.

Thread Safety

Stateful class-based decorators are not thread-safe by default. If a decorated function is called concurrently from multiple threads, increments like self.count += 1 are read-modify-write operations that can interleave, producing incorrect counts. For production use in multithreaded code, protect mutable state with a threading.Lock:

import functools
import threading


class ThreadSafeCounter:
    def __init__(self, func):
        self.func = func
        self.count = 0
        self._lock = threading.Lock()
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        with self._lock:
            self.count += 1
        return self.func(*args, **kwargs)

    def reset(self):
        with self._lock:
            self.count = 0

The lock only guards the counter update, not the function call itself, so the decorated function still runs concurrently. Only the state mutation is serialised. If your decorator accumulates results into a list or dict, those operations also need lock protection.
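A quick check that the lock makes the count exact under contention, reusing the ThreadSafeCounter pattern above:

```python
import functools
import threading


class ThreadSafeCounter:
    def __init__(self, func):
        self.func = func
        self.count = 0
        self._lock = threading.Lock()
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        with self._lock:
            self.count += 1
        return self.func(*args, **kwargs)


@ThreadSafeCounter
def work():
    pass


def hammer():
    for _ in range(10_000):
        work()


threads = [threading.Thread(target=hammer) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(work.count)  # exactly 80000: the locked increment never interleaves
```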

Parameterized Class Decorators

So far, the class has received the function directly in __init__. But what if the decorator itself needs configuration — a retry count, a timeout, a log level? The trick is to shift the roles one step: __init__ receives the configuration and stores it, then __call__ plays the role that __init__ played before — it receives the function and returns a wrapper. The decorator syntax @Retry(max_attempts=3) creates the configured instance first, and Python then calls that instance with the function automatically.

When the decorator needs configuration arguments, the class structure changes. The __init__ receives the configuration, and __call__ receives the function. The __call__ method returns a wrapper function rather than calling the function directly:

import functools
import time


class Retry:
    def __init__(self, max_attempts=3, delay=1.0):
        self.max_attempts = max_attempts
        self.delay = delay

    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exception = None
            for attempt in range(1, self.max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:
                    last_exception = e
                    print(f"  Attempt {attempt} failed: {e}")
                    if attempt < self.max_attempts:
                        time.sleep(self.delay)
            raise last_exception
        return wrapper


@Retry(max_attempts=3, delay=0.1)
def fetch_data(url):
    """Fetch data from a remote endpoint."""
    raise ConnectionError(f"Cannot reach {url}")


print(fetch_data.__name__)

try:
    fetch_data("https://api.example.com/data")
except ConnectionError:
    print("All attempts exhausted")
Output:
fetch_data
  Attempt 1 failed: Cannot reach https://api.example.com/data
  Attempt 2 failed: Cannot reach https://api.example.com/data
  Attempt 3 failed: Cannot reach https://api.example.com/data
All attempts exhausted

The flow here is different from the non-parameterized version. @Retry(max_attempts=3, delay=0.1) first creates a Retry instance with the configuration. Python then calls that instance with fetch_data as the argument, which invokes __call__(self, func). The method returns the wrapper function, which replaces fetch_data in the namespace.
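The desugaring can be made explicit. A minimal sketch with a trivial configuration (Tag is a hypothetical class invented for illustration):

```python
import functools


class Tag:
    """Hypothetical parameterized decorator: prefixes the return value."""

    def __init__(self, label):      # step 1: store the configuration
        self.label = label

    def __call__(self, func):       # step 2: receive the function
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return f"[{self.label}] {func(*args, **kwargs)}"
        return wrapper


def status():
    return "ok"


# writing @Tag("info") above def status() is exactly this two-step sequence:
status = Tag("info")(status)
print(status())  # [info] ok
```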

The Method Problem and the Descriptor Protocol

When Python looks up obj.method, it doesn't just return whatever is stored at that attribute. If the stored object implements a __get__ method, Python calls it and uses the result instead. This is the descriptor protocol, and it's how ordinary functions become bound methods — functions implement __get__ natively in C. A class-based decorator instance does not implement __get__ by default, so the protocol never fires, and the instance argument never gets prepended. That's the entire method problem in one sentence.
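The native case can be observed directly: a plain function pulled out of the class __dict__ is a descriptor, and calling its __get__ reproduces the binding that attribute lookup performs implicitly.

```python
class C:
    def method(self):
        return "bound result"


raw = C.__dict__["method"]       # the plain function stored on the class
print(hasattr(raw, "__get__"))   # True: functions implement the protocol

c = C()
bound = raw.__get__(c, C)        # what c.method does behind the scenes
print(bound())                   # bound result
```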

A non-parameterized class-based decorator has a known limitation: it does not work correctly on class methods. When a class instance replaces a function, it loses the ability to bind to the object it is called on. The self argument of the method never gets passed:

import functools


class LogCalls:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        print(f"Calling {self.func.__name__}")
        return self.func(*args, **kwargs)


class Calculator:
    @LogCalls
    def add(self, a, b):
        return a + b


c = Calculator()
try:
    c.add(2, 3)
except TypeError as e:
    print(f"Error: {e}")
Output:
Calling add
Error: Calculator.add() missing 1 required positional argument: 'b'

The problem is that LogCalls instances are not descriptors. When Python looks up c.add, it finds a LogCalls instance but does not invoke the descriptor protocol to bind c as the first argument. You can fix this by implementing __get__:

import functools
from types import MethodType


class LogCalls:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        print(f"Calling {self.func.__name__}")
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return MethodType(self, obj)


class Calculator:
    @LogCalls
    def add(self, a, b):
        return a + b


c = Calculator()
print(c.add(2, 3))
Output:
Calling add
5

The __get__ method makes the LogCalls instance participate in the descriptor protocol. When c.add is accessed, Python calls LogCalls.__get__(self, c, Calculator), which returns a bound method. The bound method prepends c as the first argument when called, so self is correctly passed to Calculator.add.

A precise note on descriptor type: because LogCalls defines only __get__ and not __set__ or __delete__, it is a non-data descriptor. The Python data model (Python Descriptor HowTo Guide) distinguishes non-data descriptors from data descriptors specifically by this: instance __dict__ entries take precedence over non-data descriptors during attribute lookup. In practice this means that if something sets c.__dict__['add'], that value will shadow the decorated method. This is the same behaviour regular functions exhibit, so it is rarely a problem — but it is the technically precise reason the pattern works the way it does.
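That precedence rule can be demonstrated by writing into the instance dict; this sketch reuses the descriptor-enabled LogCalls from above with the logging print omitted for brevity:

```python
import functools
from types import MethodType


class LogCalls:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return MethodType(self, obj)


class Calculator:
    @LogCalls
    def add(self, a, b):
        return a + b


c = Calculator()
print(c.add(2, 3))  # 5, resolved through the non-data descriptor

# an instance __dict__ entry shadows a non-data descriptor
c.__dict__["add"] = lambda a, b: 99
print(c.add(2, 3))  # 99
```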

Critical

If your class-based decorator will ever be used on methods (not just standalone functions), you must implement __get__. Without it, the decorator silently receives the wrong number of arguments.


Delegating to a Closure Wrapper for the Call Logic

An alternative structure builds the per-call logic as a closure in __init__ and has __call__ delegate to it. The class still holds the state, but the wrapping logic reads like an ordinary function-based decorator. Note that the object replacing the function is still a class instance, so __get__ is still required for method compatibility:

import functools
from types import MethodType


class CountCalls:
    def __init__(self, func):
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        print(f"[{self.func.__name__}] call #{self.count}")
        return self.func(*args, **kwargs)


# Alternative: __init__ builds a closure-based wrapper; __call__ delegates to it
class CountCallsSafe:
    def __init__(self, func):
        self.count = 0
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            self.count += 1
            print(f"[{func.__name__}] call #{self.count}")
            return func(*args, **kwargs)
        self.wrapper = wrapper

    def __call__(self, *args, **kwargs):
        return self.wrapper(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return MethodType(self, obj)

Both approaches work. The first is more direct. The second stores a closure-based wrapper and delegates to it, which keeps the call logic compact while the class manages the state; because the decorated name is still a class instance in either case, both need __get__ for method compatibility.

Exposing Decorator Methods to Callers

One of the strongest reasons to use a class-based decorator is that callers can interact with the decorator instance through its methods. A Timer decorator can expose stats() and reset(). A Cache decorator can expose cache_clear() and cache_info():

import functools


class Memoize:
    def __init__(self, func):
        self.func = func
        self.cache = {}
        functools.update_wrapper(self, func)

    def __call__(self, *args):
        if args in self.cache:
            return self.cache[args]
        result = self.func(*args)
        self.cache[args] = result
        return result

    def cache_clear(self):
        self.cache.clear()

    def cache_info(self):
        return {"size": len(self.cache)}


@Memoize
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)


print(fibonacci(10))
print(fibonacci.cache_info())

fibonacci.cache_clear()
print(fibonacci.cache_info())
Output:
55
{'size': 11}
{'size': 0}

The fibonacci name points to a Memoize instance, so fibonacci.cache_clear() and fibonacci.cache_info() are natural method calls. With a closure-based decorator, you would need to attach these as attributes on the wrapper function, which feels less cohesive.

Stacking Class Decorators

Class-based decorators can be stacked with other decorators just like function-based ones. The rules are the same: decorators are applied bottom-up, meaning the one closest to the function definition runs first. What changes with class-based decorators is how the __wrapped__ chain behaves and what inspect.unwrap() can recover.

import functools
import inspect


class LogCalls:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        print(f"LogCalls: calling {self.func.__name__}")
        return self.func(*args, **kwargs)


def repeat(n):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for _ in range(n):
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator


# @LogCalls is outermost — it runs first on every call.
# @repeat(2) is innermost — it runs the original function twice.
@LogCalls
@repeat(2)
def greet(name):
    print(f"Hello, {name}")


greet("world")

# inspect.unwrap() follows __wrapped__ all the way to the original
print(inspect.unwrap(greet))
Output:
LogCalls: calling greet
Hello, world
Hello, world
<function greet at 0x...>

The outer decorator always intercepts the call first. In this example LogCalls logs the call, then delegates to the repeat(2)-wrapped version, which runs the original body twice. inspect.unwrap() follows the __wrapped__ chain created by both functools.update_wrapper and @functools.wraps to reach the bare original function — useful for testing, introspection, and tools like pytest that need to call the unwrapped version directly.

One practical tip: if you stack a class-based decorator on top of another class-based decorator and neither calls functools.update_wrapper, the __wrapped__ chain breaks after the first link. Always call update_wrapper in every __init__ that wraps a function to keep the chain intact across all layers.
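The broken-chain case is easy to reproduce. A sketch with one layer that skips update_wrapper (NoMeta and WithMeta are hypothetical names for illustration):

```python
import functools
import inspect


class NoMeta:
    """Wraps but never calls update_wrapper: the chain stops here."""

    def __init__(self, func):
        self.fn = func

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)


class WithMeta:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)   # sets __wrapped__ on self

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)


@WithMeta
@NoMeta
def greet():
    return "hi"


print(greet())  # hi: the call chain itself still works

# unwrap follows __wrapped__ one hop to the NoMeta instance, then stops,
# because NoMeta never set __wrapped__. The original function is unreachable.
print(type(inspect.unwrap(greet)).__name__)  # NoMeta
```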

Decorating Async Functions

A naive class-based decorator breaks silently when applied to an async def function. The __call__ method calls self.func(*args, **kwargs), which returns a coroutine object — but nothing awaits it. The coroutine is created and immediately discarded, and Python emits a RuntimeWarning: coroutine was never awaited:

import asyncio
import functools


class LogCalls:
    def __init__(self, func):
        self.func = func
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        print(f"Calling {self.func.__name__}")
        return self.func(*args, **kwargs)  # returns a coroutine, never awaited


@LogCalls
async def fetch(url):
    await asyncio.sleep(0)
    return f"data from {url}"


# This silently does nothing useful — the coroutine is returned but discarded.
result = fetch("https://api.example.com")
print(type(result))  # <class 'coroutine'>
Output:
Calling fetch
<class 'coroutine'>
RuntimeWarning: coroutine 'fetch' was never awaited

The fix is to check whether the wrapped function is a coroutine function at decoration time, and if so, define an async-compatible call path. The cleanest approach uses asyncio.iscoroutinefunction() in __init__ and returns the coroutine unchanged from __call__ — since the caller is responsible for awaiting it:

import asyncio
import functools


class LogCalls:
    def __init__(self, func):
        self.func = func
        self._is_async = asyncio.iscoroutinefunction(func)
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        print(f"Calling {self.func.__name__}")
        result = self.func(*args, **kwargs)
        # If func is async, result is a coroutine. Return it so the caller can await it.
        return result


@LogCalls
async def fetch(url):
    await asyncio.sleep(0)
    return f"data from {url}"


@LogCalls
def add(a, b):
    return a + b


async def main():
    print(add(2, 3))           # works synchronously
    data = await fetch("https://api.example.com")
    print(data)

asyncio.run(main())
Output:
Calling add
5
Calling fetch
data from https://api.example.com

This works because self.func(*args, **kwargs) on an async def function returns a coroutine object without executing it. The __call__ method passes that coroutine straight back to the caller, who then awaits it in the normal way. The _is_async flag on the instance is available if you want to branch logic inside __call__ based on whether the function is asynchronous.

Pro Tip

For decorators that need to run async logic around the call (timing, logging with await, etc.), define a separate async inner function inside __call__ and return it as a coroutine. This keeps the decorator fully async-aware without requiring a separate class per function type.
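One way to sketch that: branch in __call__ and route async functions through an async helper method, so the decorator can await around the call. Timed and last_elapsed are hypothetical names chosen for this example:

```python
import asyncio
import functools
import time


class Timed:
    def __init__(self, func):
        self.func = func
        self._is_async = asyncio.iscoroutinefunction(func)
        self.last_elapsed = None
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        if self._is_async:
            # calling an async method returns a coroutine for the caller to await
            return self._timed_async(*args, **kwargs)
        start = time.perf_counter()
        result = self.func(*args, **kwargs)
        self.last_elapsed = time.perf_counter() - start
        return result

    async def _timed_async(self, *args, **kwargs):
        start = time.perf_counter()
        result = await self.func(*args, **kwargs)  # awaited inside the wrapper
        self.last_elapsed = time.perf_counter() - start
        return result


@Timed
async def fetch(url):
    await asyncio.sleep(0)
    return f"data from {url}"


print(asyncio.run(fetch("https://api.example.com")))
print(fetch.last_elapsed is not None)  # True: timing ran around the await
```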

When to Choose Classes Over Closures

Pro Tip

If you are unsure, start with a function-based decorator. Refactor to a class when you find yourself using nonlocal to track mutable state across calls, or when you want the decorator to have its own public API (like reset() or stats()). For a related pattern, the @classmethod decorator demonstrates how Python uses the descriptor protocol to alter method binding at the class level.

Key Takeaways

  1. __call__ makes instances callable: Any object with a __call__ method can be used as a decorator because decorators are just callables that take a function and return a callable.
  2. Non-parameterized: __init__ gets the function, __call__ wraps it: The class instance replaces the function in the namespace. Every call to the decorated function routes through __call__.
  3. Parameterized: __init__ gets config, __call__ gets the function: The decorator expression @Class(args) creates an instance, then Python calls that instance with the function. __call__ returns a wrapper.
  4. Use functools.update_wrapper for metadata: Call functools.update_wrapper(self, func) in __init__ to copy __module__, __name__, __qualname__, __annotations__, and __doc__ to the instance (Python 3.12+ also copies __type_params__), merge __dict__, and set __wrapped__.
  5. Implement __get__ for method compatibility: Without the descriptor protocol, class-based decorators do not correctly bind the instance argument when used on methods. Implement __get__ to return a MethodType bound to the object — mirroring exactly how Python functions implement __get__ in C (Python Descriptor HowTo Guide). This makes the decorator a non-data descriptor, consistent with how all regular methods work.
  6. Class decorators excel at stateful behavior: Call counters, execution timers, caches, and rate limiters are all naturally expressed as class attributes. The decorator instance can also expose methods like stats(), reset(), and cache_clear() that callers can use directly.
  7. Protect mutable state in multithreaded code: Increments and list appends in __call__ are not atomic. Use threading.Lock to guard state mutations when the decorated function may be called from multiple threads simultaneously.
  8. Async functions need a pass-through, not an await: Calling an async def function returns a coroutine. A well-written __call__ returns that coroutine unchanged so the caller can await it. Check asyncio.iscoroutinefunction(func) in __init__ if you need to branch behaviour.
  9. Stacking preserves the __wrapped__ chain: When class-based and function-based decorators are stacked, inspect.unwrap() can follow the chain back to the original — but only if every layer calls functools.update_wrapper or @functools.wraps.
  10. type() changes after decoration: The decorated name is a class instance, not a types.FunctionType. Code that uses isinstance(fn, types.FunctionType) will return False. This is a known trade-off of the class-based approach.

Class-based decorators are not a replacement for function-based decorators — they are an alternative for cases where persistent state and a public method interface make the code clearer. The __call__ method is the bridge that lets an object behave like a function, and once you add functools.update_wrapper and __get__, the resulting decorator is production-ready. The natural next step from here is the wrapt library, which builds a transparent object proxy on top of these same principles, solving the full set of introspection edge cases that bare update_wrapper leaves open — including correct argument specification preservation and support for decorating descriptors that are themselves descriptors.

How to Build a Class-Based Decorator in Python

Step 1: Define __init__ to receive the decorated function

Write __init__(self, func) on your class and store the function as self.func. When Python processes the @ClassName syntax, it calls ClassName(your_function) — meaning __init__ runs immediately at decoration time, before the decorated function is ever called.

Step 2: Implement __call__ to wrap execution

Write __call__(self, *args, **kwargs) to define what happens each time the decorated function is invoked. Place your pre-call logic first, then call self.func(*args, **kwargs) and return its result. Every call to the decorated name routes through this method.

Step 3: Preserve function metadata with functools.update_wrapper

Call functools.update_wrapper(self, func) inside __init__ to copy __module__, __name__, __qualname__, __annotations__, and __doc__ from the original function to the class instance. Without this, tools like help(), inspect.signature(), and testing frameworks see the decorator class rather than the wrapped function.

Step 4: Add instance attributes to track state

Declare instance attributes in __init__ — a call counter, a results cache, an elapsed-time accumulator — to maintain data between calls. This is the primary advantage of the class-based approach over a closure, where equivalent state requires nonlocal and cannot be accessed through a clean method interface.

Step 5: Implement __get__ for method compatibility

Add def __get__(self, obj, objtype=None): return MethodType(self, obj) if obj is not None else self to make the decorator work correctly when applied to instance methods. Without this, the descriptor protocol is not triggered, and the method's implicit first argument (the instance of the class that owns the method) is never passed to the wrapped function.

Step 6: Handle async functions in __call__

Return self.func(*args, **kwargs) unchanged when the wrapped function is async — do not attempt to await it inside __call__. Calling an async def function produces a coroutine object; the caller is responsible for awaiting it. Use asyncio.iscoroutinefunction(func) in __init__ to detect async functions and branch internal logic if needed. See the full Python async guide for coroutine fundamentals.
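Putting the six steps together, a minimal sketch (Instrumented is a hypothetical name; the async branch is the simple pass-through, since calling an async func returns a coroutine without running it):

```python
import asyncio
import functools
import threading
from types import MethodType


class Instrumented:
    """Combines the six steps: metadata, state, locking, binding, async."""

    def __init__(self, func):                      # Step 1: receive the function
        self.func = func
        self.call_count = 0                        # Step 4: state across calls
        self._lock = threading.Lock()
        self._is_async = asyncio.iscoroutinefunction(func)
        functools.update_wrapper(self, func)       # Step 3: preserve metadata

    def __call__(self, *args, **kwargs):           # Step 2: wrap execution
        with self._lock:
            self.call_count += 1
        return self.func(*args, **kwargs)          # Step 6: coroutines pass through

    def __get__(self, obj, objtype=None):          # Step 5: method binding
        if obj is None:
            return self
        return MethodType(self, obj)


@Instrumented
def square(x):
    return x * x


class Box:
    @Instrumented
    def double(self, x):
        return 2 * x


print(square(4))                           # 16
print(Box().double(5))                     # 10
print(square.__name__, square.call_count)  # square 1
```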

Frequently Asked Questions

How do I make a class work as a decorator in Python?

Implement the __call__ method on the class. For a simple decorator, __init__ receives the function being decorated and stores it as an instance attribute. __call__ receives the call-time arguments, executes logic around the stored function, and returns its result. The class instance replaces the original function in the namespace.

When should I use a class-based decorator instead of a function?

Use a class-based decorator when you need to maintain state across multiple calls to the decorated function, when the decorator has complex configuration that benefits from instance attributes, or when you want the decorator itself to expose methods that callers can interact with (like resetting a counter or clearing a cache).

How do I preserve function metadata in a class-based decorator?

Use functools.update_wrapper(self, func) inside __init__, or if your __call__ returns a wrapper function, use @functools.wraps(func) on that wrapper. By default, update_wrapper copies __module__, __name__, __qualname__, __annotations__, and __doc__ from the original function to the class instance (these are defined in functools.WRAPPER_ASSIGNMENTS). It also merges the original's __dict__ into the wrapper's __dict__, and sets a __wrapped__ attribute pointing back to the unwrapped function. Python 3.12 added __type_params__ to the list. Source: Python docs: functools.update_wrapper.

Why does a class-based decorator fail when decorating a method?

Class instances are not descriptors by default, so they do not bind to the instance when accessed as method attributes. The self argument of the method never gets passed. To fix this, implement the __get__ method on the decorator class so it returns a bound version of itself using types.MethodType.

Can a class-based decorator wrap an async function?

Yes, but the __call__ method must not try to execute the coroutine itself. Calling an async def function returns a coroutine object. A correct __call__ returns that object unchanged so the caller can await it. Check asyncio.iscoroutinefunction(func) at decoration time if you need to differentiate behaviour between sync and async wrapped functions.

Are class-based stateful decorators thread-safe?

Not by default. Operations like self.count += 1 are read-modify-write sequences that can interleave across threads, producing incorrect results. Wrap state mutations in a threading.Lock to make them atomic. The lock should guard only the state update, not the wrapped function call, so the function itself still runs concurrently.