
Python Functions Learning Path

25 tutorials, from beginner to advanced, covering def, lambda, *args, **kwargs, closures, decorators, and generators

This Python functions learning path moves from writing your first def block through the advanced patterns that production Python relies on every day. Each section builds on the last: argument types, closures, decorators, generators, and functional programming tools like map(), filter(), and functools.

Every tutorial in this path includes runnable code examples. Start at the beginning or jump to the section that matches where you are.

What separates Python functions from those in most other languages is that they are first-class objects. This is not a figure of speech. When you write def greet(name): ..., CPython compiles your source into a code object — a PyCodeObject struct in C — and then wraps it in a function object stored on the heap. The function object holds a pointer to that code object, a reference to the global namespace it was defined in, and a tuple of default argument values. You can inspect all of this at runtime: greet.__code__.co_varnames, greet.__code__.co_argcount, greet.__defaults__. Nothing is hidden.
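
All of these attributes can be poked at in any REPL. A minimal sketch, using a hypothetical greet function:

```python
def greet(name, punctuation="!"):
    """Return a greeting for name."""
    message = f"Hello, {name}{punctuation}"
    return message

# The function object wraps a code object compiled from the body.
print(greet.__code__.co_varnames)  # ('name', 'punctuation', 'message') -- parameters first, then locals
print(greet.__code__.co_argcount)  # 2 -- number of positional parameters
print(greet.__defaults__)          # ('!',) -- defaults, aligned to the rightmost parameters
```

Because the defaults tuple lives on the function object, not the code object, it is evaluated once at definition time, which is exactly why mutable default arguments are shared across calls.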

The code object itself carries the bytecode instructions CPython executes, the constant pool (co_consts), the local variable name table (co_varnames), and the free variable table (co_freevars) that closures depend on. Since Python 3.11, those bytecode instructions include inline cache entries — CACHE slots inserted immediately after certain opcodes so the interpreter can store runtime type information directly in the bytecode stream. When the interpreter sees that a particular operation consistently involves the same types, it replaces the generic opcode with a specialized one: LOAD_FAST may become a fused superinstruction, and a binary multiply on two floats may become BINARY_OP_MULTIPLY_FLOAT. This is the Specializing Adaptive Interpreter introduced by PEP 659 — the mechanism behind the 10–25% real-world speedups Python 3.11 delivered over 3.10.
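
You can watch the specializer at work with the dis module. A quick sketch, assuming Python 3.11+ for the adaptive=True flag (the exact specialized opcode names vary by CPython version):

```python
import dis
import sys

def scale(x):
    return x * 2.0

# Warm the function up so the specializing interpreter (PEP 659) can
# observe that the multiply consistently sees float operands.
for _ in range(1000):
    scale(3.0)

# Generic disassembly -- the bytecode as compiled:
dis.dis(scale)

# Adaptive view (Python 3.11+): after warmup, the generic BINARY_OP may
# appear as a specialized form such as BINARY_OP_MULTIPLY_FLOAT.
if sys.version_info >= (3, 11):
    dis.dis(scale, adaptive=True)
```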

Python 3.13 extended this with a Tier 2 IR (micro-ops) and a copy-and-patch JIT compiler introduced by PEP 744. The CPython documentation describes it as hot Tier 1 bytecode being translated to a purely internal intermediate representation. Python 3.14 — released October 7, 2025 — advanced this further: the official Windows and macOS binary releases now ship with the experimental JIT compiler included, though it stays off unless you opt in by setting the PYTHON_JIT=1 environment variable. Python 3.14 also raised the JIT warmup threshold from 16 loop iterations to 4,096, meaning code must run longer before JIT compilation triggers, but the compiled traces are more reliably hot when they do. Understanding how your function definitions feed into this pipeline — how def becomes bytecode, how bytecode gets specialized, how closures embed frames inside generator objects — is what separates writing Python from understanding it.

Closures add another layer. When an inner function references a variable from an enclosing scope, CPython does not copy that variable. It wraps it in a cell object. Both the outer and inner functions hold a reference to the same cell. You can see this directly: my_closure.__code__.co_freevars names the captured variables, and my_closure.__closure__ returns the tuple of cell objects. This cell-based design is also why the classic lambda-in-a-loop bug exists: all the lambdas share the same cell, and the cell holds the loop variable's final value, not the value at the time the lambda was created.
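
Both the cell mechanism and the loop bug are easy to demonstrate. A sketch using a hypothetical make_counter factory:

```python
def make_counter():
    count = 0
    def increment():
        nonlocal count
        count += 1
        return count
    return increment

counter = make_counter()
counter()
counter()
print(counter.__code__.co_freevars)          # ('count',) -- names of captured variables
print(counter.__closure__[0].cell_contents)  # 2 -- the live value inside the shared cell

# The classic loop bug: all three lambdas close over the SAME cell,
# which holds the loop variable's final value.
fns = [lambda: i for i in range(3)]
print([f() for f in fns])  # [2, 2, 2]

# Fix: bind the current value as a default argument at creation time.
fns = [lambda i=i: i for i in range(3)]
print([f() for f in fns])  # [0, 1, 2]
```

The default-argument fix works because, as noted above, defaults are evaluated once at definition time and stored on each function object, so each lambda gets its own snapshot rather than a shared cell.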

Generators take this further. A generator function compiles to a code object with the CO_GENERATOR flag set. When you call it, CPython does not execute the body. It allocates a PyGenObject and embeds a full _PyInterpreterFrame — stored in the gi_iframe field of the struct — directly inside the generator object on the heap. The CPython internal documentation states that generators and coroutines have a _PyInterpreterFrame embedded in them so they can be created with a single memory allocation. That is why generators can suspend and resume: their frame is not on the C call stack. When next() is called, the interpreter links the generator's frame back into the current thread's frame stack, resumes execution from the yield point, and unlinks it again when the next yield is reached. This is also why coroutines and async generators compose cleanly with await — all three types share the same _PyGenObject_HEAD macro and the same embedded frame mechanism.
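
The suspend/resume cycle is observable from Python. A sketch with a hypothetical countdown generator, using the inspect module's generator-state helpers:

```python
import inspect

def countdown(n):
    while n > 0:
        yield n
        n -= 1

# The CO_GENERATOR flag is set on the code object at compile time:
print(bool(countdown.__code__.co_flags & inspect.CO_GENERATOR))  # True

gen = countdown(3)
# Calling the function allocated a generator object; the body has not run.
print(inspect.getgeneratorstate(gen))  # GEN_CREATED

print(next(gen))  # 3 -- frame linked in, body runs to the first yield
print(inspect.getgeneratorstate(gen))  # GEN_SUSPENDED

# The suspended frame's locals live inside the generator object:
print(gen.gi_frame.f_locals["n"])  # 3 -- the decrement runs only after resuming
```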

Decorators are syntactic sugar over what the object model already supports. @functools.lru_cache above a def is exactly equivalent to assigning the result of lru_cache(your_function) back to the same name. The decorator receives your function object, wraps it in a new callable — typically a closure or a class instance with __call__ — and returns that wrapper. The only thing the @ syntax adds is readability and the guarantee that the wrapping happens immediately at definition time, not later when the name might have been rebound.
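
The equivalence is mechanical, as this sketch shows (fib2 is just a manually wrapped copy of the same function for comparison):

```python
import functools

@functools.lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# The @ line above is exactly equivalent to decorating by hand:
def fib2(n):
    return n if n < 2 else fib2(n - 1) + fib2(n - 2)

fib2 = functools.lru_cache(maxsize=None)(fib2)

print(fib(30), fib2(30))           # 832040 832040
print(fib.cache_info().hits > 0)   # True -- the wrapper intercepted the recursive calls
```

Note that the recursive calls inside each body go through the global name, which now points at the wrapper, so even the recursion is cached.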

The tutorials in this path are sequenced so each concept has its prerequisites in place before you need them. The CPython internals notes above are woven throughout the individual tutorials in more depth, with disassembled bytecode examples you can run yourself.

Tutorials marked with the cert badge include a final exam that awards a certificate of completion you can download and share.

How to Use This Python Functions Learning Path

  1. Start with Defining and Calling Functions

    If you are new to Python functions, begin with Defining Functions in Python and Using def in Python. These two tutorials cover the syntax, function objects, return values, and docstrings you need before anything else makes sense.

  2. Work through Arguments and Parameters

    Read the Arguments and Parameters section in order. Start with keyword arguments and default values, then move to *args and **kwargs and the / and * parameter separators for positional-only and keyword-only control.

  3. Study Lambda, Closures, and Functional Patterns

    Once argument handling is solid, read Python Lambda vs def to understand when anonymous functions are appropriate. Follow that with the lambda loop closure tutorial to avoid a common bug, then work through the functional programming guide for map(), filter(), and composition patterns.

  4. Finish with Decorators, Generators, and Advanced Patterns

    Read Python Decorators Demystified before any generator tutorials — decorators rely on the first-class function concepts from earlier sections. Then work through Python yield, recursive generators, and Python Coroutines in sequence.

Frequently Asked Questions About Python Functions

What is a function in Python?

A function in Python is a named, reusable block of code defined with the def keyword. It takes zero or more parameters, executes a body of statements, and optionally returns a value with return. Functions are first-class objects in Python, which means they can be assigned to variables, passed as arguments, and returned from other functions.

What is the difference between a function and a method?

A method is a function that is bound to an object and defined inside a class. It receives the object itself as its first argument, conventionally named self. A standalone function is not bound to any object. In practice, len(my_list) is a function call and my_list.append(1) is a method call. Both are implemented as function objects internally.

What are *args and **kwargs?

*args collects any number of positional arguments into a tuple. **kwargs collects any number of keyword arguments into a dictionary. Both are used when you want a function to accept a variable number of inputs without specifying each parameter by name. The names args and kwargs are conventions — the * and ** syntax is what matters.
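
A minimal sketch using a hypothetical summarize function:

```python
def summarize(*args, **kwargs):
    # args arrives as a tuple, kwargs as a dict.
    return f"args={args}, kwargs={kwargs}"

print(summarize(1, 2, flag=True))  # args=(1, 2), kwargs={'flag': True}
```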

What is a closure in Python?

A closure is a function that remembers variables from its enclosing scope even after that scope has finished executing. When an inner function references a variable from an outer function, Python stores that variable in a cell object attached to the inner function. Closures are the mechanism that makes decorators and factory functions work.

What is a decorator in Python?

A decorator is a function that takes another function as its argument, wraps it in additional behavior, and returns the wrapper in its place. The @decorator_name syntax above a function definition is shorthand for func = decorator_name(func). Decorators are commonly used for logging, access control, caching, and timing.

What is a generator in Python?

A generator is a function that uses yield instead of return to produce a sequence of values lazily, one at a time. Each call to next() on the generator resumes execution from where it last paused at a yield. Generators are memory-efficient for large data sequences because they do not compute all values upfront.

What is a lambda function in Python?

A lambda function is an anonymous, single-expression function created with the lambda keyword. It is equivalent to a def function with one return statement. Lambdas are useful for short, throwaway functions passed as arguments — for example, as the key in sorted() or as a predicate in filter(). For anything longer than one expression, a named def function is clearer.
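
For example, sorting by length with a lambda key versus an equivalent named function:

```python
words = ["banana", "fig", "apple"]

# lambda as the key function: sort by length instead of alphabetically.
print(sorted(words, key=lambda w: len(w)))  # ['fig', 'apple', 'banana']

# The equivalent named function -- clearer once the logic grows past one expression:
def by_length(w):
    return len(w)

print(sorted(words, key=by_length))  # ['fig', 'apple', 'banana']
```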

In what order should I learn Python functions?

A practical order: (1) define and call functions with def and return, (2) understand positional, keyword, default, and variadic arguments including *args and **kwargs, (3) learn closures and scope, (4) study lambda functions and higher-order functions like map() and filter(), (5) learn decorators, and (6) finish with generators and coroutines. This path is structured in exactly that order.

What is the difference between return and yield?

return exits the function and sends one value back to the caller. The function's local state is destroyed. yield pauses the function, sends a value to the caller, and preserves the function's local state so it can be resumed later. A function that contains yield becomes a generator function and returns a generator object when called.

Can a Python function return multiple values?

Python functions can return multiple values by returning a tuple. Writing return x, y is equivalent to return (x, y). The caller can unpack these values with a, b = my_func(). Technically only one object is returned — the tuple — but tuple unpacking makes it feel like multiple values.
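
A sketch with a hypothetical min_max helper:

```python
def min_max(values):
    return min(values), max(values)  # one tuple, built implicitly

result = min_max([3, 1, 4])
print(result)  # (1, 4) -- a single tuple object

lo, hi = min_max([3, 1, 4])  # tuple unpacking at the call site
print(lo, hi)  # 1 4
```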