How Python's generator.send() Works


Many Python developers learn generators as one-way pipes: the function yields a value, the caller receives it, repeat until exhaustion. That mental model is incomplete. Since Python 2.5, generators have had a second channel — a way to push data back in while execution is paused. That channel is .send(), and understanding it changes how you think about what a generator actually is.

What's in this tutorial

The confusion around .send() is almost always the same confusion at its root: people treat yield as a statement that pauses and emits, full stop. But yield in Python is an expression — it both emits a value outward and resolves to a value inward, and .send() is what controls that inward resolution. This article explains exactly how that works, where it came from, what goes wrong when you skip the priming step, and where the pattern still belongs today.

Misconception Checker

Three beliefs about .send(). Check whether each one holds up before reading further.

T / F The return value of gen.send("hello") is "hello" — the generator echoes back what it received.
T / F Calling next(gen) and gen.send(None) behave differently — send(None) passes an explicit None while next() passes nothing at all.
T / F Calling a generator function — e.g. gen = my_gen() — does not execute any of the function body. The body only begins running when you first call next() or .send().


Generators Were Always One-Way — Until Python 2.5

Python introduced generators in version 2.2 via PEP 255. The design was clean and deliberate: a generator function suspends at a yield statement, returns the yielded value to the caller, and resumes from that exact point when next() is called again. All local state — variable bindings, the instruction pointer, the internal stack — is preserved across the suspension. This made generators excellent lazy iterators: you could produce sequences of values on demand without building them all in memory at once.

What generators could not do was receive input after they started. Every call to next() resumed execution, but brought nothing with it. The generator could only observe its own internal state and whatever was captured in its closure. If you wanted it to behave differently based on external input, you had to encode that into the arguments passed at construction time. The communication was strictly one direction: generator to caller.

That limitation became a real problem as developers tried to use generators for more sophisticated tasks — simulations, event-driven pipelines, cooperative multitasking. The PEP 342 authors, Guido van Rossum and Phillip J. Eby, diagnosed the gap precisely. As stated in PEP 342, Python's generators were "almost coroutines — but not quite" because they allowed pausing execution to produce a value but provided no mechanism for passing values or exceptions back in when execution resumed.

The fix required two linked changes: making yield an expression rather than a pure statement, and adding a .send() method to the generator object. Both shipped together in Python 2.5, released in September 2006. The cumulative effect, as PEP 342 described it, was to turn generators from one-way producers of information into both producers and consumers — and in doing so, to turn them into proper coroutines.

How yield Became an Expression

Before Python 2.5, yield was a statement. You wrote yield some_value and that was the whole story: the value was emitted and execution paused. The statement had no return value because there was nothing to return — no channel existed through which the caller could inject anything.

PEP 342 changed yield into an expression. This is a meaningful distinction. An expression in Python produces a value. Statements do not. When yield became an expression, the syntax received = yield emitted_value became legal and meaningful: emitted_value is sent out to the caller, and whatever the caller injects back becomes the value that yield evaluates to — stored in received.

The PEP captures the conceptual flip precisely. As written in PEP 342: a yield expression is "like an inverted function call — the argument to yield is in fact returned (yielded) from the currently executing function, and the return value of yield is the argument passed in via send()." That inversion framing is the clearest single-sentence description of what .send() does. The generator's yield is both an outgoing return and an incoming parameter slot, depending on which side of the channel you are reading from.

The Python Language Reference formalizes the receive side: the value argument of .send() "becomes the result of the current yield expression." If no value is sent — if next() is called instead — the yield expression evaluates to None. That asymmetry is important and is the source of most confusion about .send().
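
That asymmetry is easy to verify at the console. The sketch below (probe and log are illustrative names) records what each suspended yield evaluated to under next() versus .send():

```python
log = []   # records what each yield expression evaluated to

def probe():
    """Illustrative generator: always emits "tick", logs what comes back in."""
    while True:
        got = (yield "tick")
        log.append(got)

gen = probe()
next(gen)          # prime: run to the first yield

next(gen)          # no value supplied: the yield evaluates to None
gen.send("data")   # explicit value: the yield evaluates to "data"
gen.send(None)     # explicit None: indistinguishable from next()

print(log)         # [None, 'data', None]
```

Three resumptions, three recorded values: only the .send("data") call made the yield expression evaluate to something other than None.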

Note

The PEP 342 authors recommend always wrapping yield expressions in parentheses when using the result: received = (yield emitted_value). The parens are not always syntactically required — received = yield emitted_value is valid at the top level of an assignment — but they make the expression boundary explicit and prevent subtle parsing errors when the yield result is used inside a larger expression.

The practical consequence of yield-as-expression is that a single yield point in your generator now does double duty simultaneously. It emits a value on the way out and receives a value on the way in. Those two actions happen at the same suspension point, which is why the timing of .send() is so important: you can only send into a yield that is currently waiting. You cannot send into thin air.

The Mechanics of .send() Step by Step

When you call gen.send(value), the following sequence happens in order inside CPython:

First, the generator's execution is resumed from the point where it is currently suspended — at a yield expression. Second, the value argument you passed becomes the result of that yield expression inside the generator's frame. Third, execution continues forward inside the generator until the next yield expression is reached. Fourth, the value following that next yield is returned to the caller as the return value of .send(). If no further yield is reached and the generator function returns, StopIteration is raised.

Here is a minimal example that makes each of these steps visible:

python
def two_way():
    print("Generator started")
    received = (yield "first yield")   # emits "first yield", waits
    print(f"Received: {received}")
    received2 = (yield "second yield") # emits "second yield", waits
    print(f"Received again: {received2}")
    # no more yields — StopIteration will be raised next

gen = two_way()

# Prime: advance to the first yield
out1 = next(gen)         # prints "Generator started"
print(out1)              # prints "first yield"

# Send a value in; generator resumes, "Hello" is the value of the yield expression
out2 = gen.send("Hello") # prints "Received: Hello"
print(out2)              # prints "second yield"

# Send again
try:
    gen.send("World")    # prints "Received again: World"
except StopIteration:
    print("Generator exhausted")

Trace through this carefully. The first next(gen) call advances execution to the first yield "first yield". At that point the generator is suspended: it has emitted "first yield" and is waiting at the yield expression. When gen.send("Hello") is called, "Hello" is injected as the value of that waiting yield expression — so received gets the string "Hello". Execution continues until the second yield "second yield", which emits "second yield" outward to the caller (the return value of .send()). The generator is now suspended again at the second yield. The final .send("World") injects "World" into that expression, prints the message, then the function ends with no further yields, raising StopIteration.

Pro Tip

The return value of .send() is always the value produced by the next yield the generator hits after resuming — not the value you sent in. The value you send in goes to the current yield expression. Keeping those two directions distinct in your mental model is the key to understanding .send() without confusion.

Note: .send() is available only on generator objects

.send() is a method on generator objects specifically. It is not available on arbitrary iterables: a list, a range, or a custom class implementing __iter__ does not expose it. Generator expressions like (x*2 for x in range(5)) do carry a .send() method, because Python compiles them into an internal generator function, but there is no authored yield expression positioned to capture the sent value. Before priming, sending a non-None value raises TypeError like any just-started generator; after priming, the sent value is accepted but silently discarded. If you need two-way communication, write an explicit generator function with a yield expression designed to capture the sent value.
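
A quick console check makes the generator-expression behavior concrete:

```python
gen_exp = (x * 2 for x in range(3))

# Fresh (unprimed): a non-None send fails, as with any just-started generator
try:
    gen_exp.send(5)
except TypeError as e:
    print(e)    # can't send non-None value to a just-started generator

first = next(gen_exp)              # prime: produces 0
second = gen_exp.send("ignored")   # accepted once primed, but the value is discarded
print(first, second)               # 0 2
```

The sent string vanishes: there is no name on the left of the implicit yield to receive it, so iteration simply continues.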

Build It Yourself: Writing a Two-Way Generator from Scratch

The section above described what .send() does. This section makes you do it. Follow each step in order, typing the code yourself rather than copying the finished version. The goal is to feel the execution hand-off, not just read about it.

You will need Python 3.x open in a terminal, REPL, or notebook. Each step builds on the previous one.

Step 1 — Write the generator function shell

Start with the function definition only. Do not add any logic yet. A generator function is just a regular function that contains at least one yield expression.

python
def greeter():
    pass

Calling greeter() right now does not yet return a generator object. With only pass in the body there is no yield, so Python compiles this as an ordinary function, and greeter() simply returns None. What makes a function a generator function is the presence of at least one yield in its body, which you will add in the next step. From that point on, calling greeter() returns a generator object immediately, without running any of the body. Nothing runs until you advance the generator.

Step 2 — Add a yield expression that emits and receives

Replace pass with a single yield that emits a prompt and captures whatever the caller sends back. Wrap it in parentheses so the receive assignment is unambiguous.

python
def greeter():
    name = (yield "What is your name?")

Read this line carefully: "What is your name?" is the value emitted outward to the caller. name is the variable that will receive whatever the caller sends back. Both sides are wired through a single yield expression. Nothing has run yet — this is still just a definition.

Step 3 — Add a response yield

After the generator receives the name, it should emit a personalised greeting. Add a second yield that uses the received value:

python
def greeter():
    name = (yield "What is your name?")
    yield f"Hello, {name}!"

The function now has two suspension points. The first yields the prompt and waits. The second yields the greeting after the name has been received. After the second yield, the function ends — the next call to .send() will raise StopIteration.

Step 4 — Create the generator object

Call the function. Note that calling it does not run the body:

python
gen = greeter()
print(gen)   # <generator object greeter at 0x...>

You have a generator object. The body has not executed. The generator is in GEN_CREATED state — created, never started. If you tried to call gen.send("Alice") right now, you would get TypeError. Try it if you want to see it fail — then continue.

Step 5 — Prime the generator

Advance it to the first yield using next(). This runs the function body until it hits the first yield, suspends, and returns the yielded value to you:

python
prompt = next(gen)
print(prompt)   # What is your name?

What just happened: the generator ran from the top of the function body up to the first yield "What is your name?". It emitted that string, suspended, and is now waiting. The variable name inside the generator has no value yet — the yield expression has not resolved. The generator is now in GEN_SUSPENDED state.
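
You can observe these states directly: inspect.getgeneratorstate() from the standard library reports exactly the labels used above. The greeter definition is repeated here so the snippet stands alone:

```python
import inspect

def greeter():
    name = (yield "What is your name?")
    yield f"Hello, {name}!"

gen = greeter()
print(inspect.getgeneratorstate(gen))   # GEN_CREATED

next(gen)                               # prime: run to the first yield
print(inspect.getgeneratorstate(gen))   # GEN_SUSPENDED

gen.send("Alice")                       # advance to the second yield
print(inspect.getgeneratorstate(gen))   # GEN_SUSPENDED (still alive)
```

There is also a transient GEN_RUNNING state, visible only from inside the generator while it is executing, and GEN_CLOSED once it finishes.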

Step 6 — Send a value in

Now call .send() with the name. This resumes the generator, injects your value as the result of the waiting yield expression, and runs until the next yield:

python
greeting = gen.send("Alice")
print(greeting)   # Hello, Alice!

Inside the generator: "Alice" became the resolved value of the yield expression, so name was assigned "Alice". Execution continued to the second yield f"Hello, {name}!", which emitted the greeting string and suspended again. gen.send("Alice") returned that string to you.

Step 7 — Handle the end of the generator

The generator is still suspended at the second yield. One more .send() will resume it, find no further yield, and raise StopIteration. Wrap it:

python
try:
    gen.send(None)   # resume from second yield; no more yields → StopIteration
except StopIteration:
    print("Generator finished.")

Step 8 — Put it together and extend it

Here is the complete version as it now stands, with the full call sequence in one block so you can see the whole picture at once:

python
def greeter():
    name = (yield "What is your name?")
    yield f"Hello, {name}!"

gen = greeter()
prompt   = next(gen)            # prime → "What is your name?"
greeting = gen.send("Alice")    # send name → "Hello, Alice!"
try:
    gen.send(None)              # no more yields → StopIteration
except StopIteration:
    print("Done.")

Now extend it yourself: add a third yield, or wrap the body in a loop so the generator can greet more than once.

What to remember from this exercise

The number of .send() calls (excluding the priming call) equals the number of yield expressions that are designed to receive values. The number of values returned by .send() equals the number of yield expressions that emit values — which is all of them. Every yield both emits and receives simultaneously; the direction you care about depends on which side you are writing.

Why You Must Prime a Generator First

The rule is firm: you cannot call .send(non_None_value) on a generator that has not yet been advanced to its first yield. Attempting to do so raises TypeError: can't send non-None value to a just-started generator.

The reason is structural. When a generator object is first created, its code has not executed at all. There is no yield expression currently suspended and waiting to receive anything. The .send() mechanism works by injecting a value into the current yield expression — but if the generator has never run, there is no current yield expression. Sending a non-None value into that void is an error.

python
def my_gen():
    value = (yield "ready")
    yield value * 2

gen = my_gen()

# This raises TypeError — generator hasn't started yet
# gen.send(10)   # TypeError: can't send non-None value to a just-started generator

# Correct approach: prime first
first = next(gen)   # advances to first yield, returns "ready"
print(first)        # "ready"

result = gen.send(10)  # injects 10, generator yields 20
print(result)           # 20

Priming means advancing the generator to its first yield so that there is a suspended yield expression ready to receive input. The two standard ways to prime are next(gen) and gen.send(None). They are precisely equivalent: as confirmed by the CPython source and PEP 342, __next__() is implemented as send(None). Sending None is allowed on a fresh generator because None is the defined neutral value — it becomes the value of the yield expression, which is fine as long as your generator checks for it (or you use next() which implies the same thing).
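
A minimal check of that equivalence: both priming styles produce the same first yielded value (echo_prompt is an illustrative name):

```python
def echo_prompt():
    """Illustrative: yields a prompt, then yields back what it received."""
    value = (yield "ready")
    yield value

g1 = echo_prompt()
a = next(g1)         # prime with next(): returns "ready"

g2 = echo_prompt()
b = g2.send(None)    # prime with send(None): also returns "ready"

print(a == b)        # True
```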

Warning

A common pattern for coroutine-style generators is to wrap the generator function in a decorator that auto-primes it — calling next() immediately after construction so callers never have to think about the priming step. PEP 342 itself shows this pattern using a @consumer decorator. It is clean, but make sure any generator you wrap this way is actually designed to receive values from the start, or the first None injected by the priming call will cause a silent bug if the generator's code tries to use the value of the yield expression without checking for None.


What happens when you .send() into an already-exhausted generator?

The article has described what happens when a .send() call reaches the end of a generator: StopIteration is raised and the generator is closed. But readers often run into a different, subtler problem — calling .send() again on a generator that is already in the GEN_CLOSED state. The error is the same exception class but a different situation entirely:

python
def one_shot():
    yield (yield "first")

gen = one_shot()
next(gen)              # prime → yields "first", suspends at inner yield
gen.send("second")     # injects "second" into inner yield; outer yield emits "second"
                       # returns "second" — generator is still alive, suspended at outer yield

try:
    gen.send("third")  # injects "third" into outer yield; generator returns (no more yields)
except StopIteration:
    print("Generator exhausted")

try:
    gen.send("fourth") # StopIteration again, but for a different reason:
except StopIteration:  # the generator is already closed; the frame is gone
    print("Generator already closed")

Each call here means something different. The second call, gen.send("second"), does not raise StopIteration: the nested yield (yield "first") contains two yield points, not one, so the generator emits "second" from the outer yield and stays suspended. The third call, gen.send("third"), is the normal end-of-iteration signal: the generator ran out of yield expressions and returned. The fourth call raises StopIteration immediately because the generator is already in the GEN_CLOSED state; there is no frame left to resume. A well-written driver loop catches StopIteration on the call that exhausts the generator and stops there. If you see StopIteration on what feels like the wrong call, use inspect.getgeneratorstate() to confirm whether the generator was already closed before you called.

Pro Tip

Defensive driver loops should call inspect.getgeneratorstate(gen) before .send() in debug builds, or wrap every .send() in try/except StopIteration and check whether the generator is now closed before deciding whether to restart it.
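
One possible shape for such a defensive driver, sketched with hypothetical worker and safe_send helpers:

```python
import inspect

def worker():
    """Illustrative consumer: stops when sent the string "stop"."""
    while True:
        value = (yield)
        if value == "stop":
            return
        print(f"working on {value}")

def safe_send(gen, value):
    """Hypothetical helper: send only if the generator is still alive."""
    if inspect.getgeneratorstate(gen) == "GEN_CLOSED":
        print("generator already closed; not sending")
        return None
    try:
        return gen.send(value)
    except StopIteration:
        print("generator exhausted on this send")
        return None

w = worker()
next(w)                  # prime
safe_send(w, "task-1")   # working on task-1
safe_send(w, "stop")     # generator exhausted on this send
safe_send(w, "task-2")   # generator already closed; not sending
```

The state check distinguishes "this send exhausted the generator" from "the generator was already dead", which is exactly the ambiguity described above.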

next() vs .send() vs .send(None)

These three are closely related but not identical in intent, even when they produce the same mechanical outcome.

next(gen)
    Value injected into the yield expression: None
    Allowed on a fresh generator? Yes
    Typical use: advancing a generator used as a plain iterator; priming a coroutine-style generator

gen.send(None)
    Value injected into the yield expression: None
    Allowed on a fresh generator? Yes
    Typical use: equivalent to next(gen); used when you want "no input this cycle" to be explicit

gen.send(value)
    Value injected into the yield expression: value
    Allowed on a fresh generator? No: raises TypeError
    Typical use: passing real data into a suspended generator; driving a coroutine with meaningful input

From CPython's perspective, next(gen) and gen.send(None) are the same operation. The distinction is one of intent and readability. Use next() when you are treating the generator as a plain iterator and do not care about two-way communication. Use .send(None) when you are explicitly driving a coroutine-style generator and want to be clear that you are choosing not to send a value this cycle. Use .send(value) when you have actual data to inject.

The Other Two: .throw() and .close()

PEP 342 added three methods to generators, not one. .send() gets most of the attention, but .throw() and .close() complete the picture — and understanding them clarifies what .send() is actually doing at the frame level.

.throw() — injecting an exception at the yield point

gen.throw(exc) resumes the generator at the currently suspended yield expression, but instead of supplying a value, it raises the specified exception there. Pass an exception instance directly — the older three-argument form gen.throw(ExcType, value, traceback) is deprecated since Python 3.12 (see Python 3.12 deprecations) and will be removed in a future version. The generator can catch the thrown exception with a normal try/except block and continue running, or let it propagate — in which case the exception bubbles out to the caller of .throw().

python
def resilient():
    while True:
        try:
            value = (yield "waiting")
            print(f"Got: {value}")
        except ValueError as e:
            print(f"Caught inside generator: {e}")
            # generator continues — does not stop

gen = resilient()
next(gen)                        # prime

gen.send("hello")                # Got: hello
gen.throw(ValueError("bad input"))  # Caught inside generator: bad input
gen.send("back to normal")       # Got: back to normal

The generator catches ValueError internally, handles it, loops back to the yield, and resumes normally. If the generator does not catch the thrown exception, the exception propagates out to the caller and the generator is exhausted. This is the mechanism that asyncio uses internally to cancel tasks — it throws CancelledError into a coroutine's suspended yield point.
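
For contrast, here is a sketch of the uncaught case: throwing an exception type the generator has no handler for propagates to the caller and closes the generator (fragile is an illustrative name):

```python
import inspect

def fragile():
    """Illustrative: a generator with no handler for KeyError."""
    while True:
        value = (yield "waiting")
        print(f"Got: {value}")

g = fragile()
next(g)                        # prime

try:
    g.throw(KeyError("boom"))  # no matching except inside the generator
except KeyError as e:
    print(f"Propagated out: {e}")

print(inspect.getgeneratorstate(g))   # GEN_CLOSED: the throw finalized it
```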

.close() — shutting a generator down cleanly

gen.close() throws GeneratorExit into the generator at the current yield point. This gives the generator a chance to run any finally blocks and release resources before stopping. If the generator catches GeneratorExit and then yields again, Python raises RuntimeError — a generator that catches GeneratorExit must either return or re-raise it.

python
def with_cleanup():
    try:
        while True:
            value = (yield "running")
            print(f"Processing: {value}")
    except GeneratorExit:
        print("Generator is shutting down — cleaning up")
        # return here (or just fall off the end) — do NOT yield again

gen = with_cleanup()
next(gen)
gen.send("first")
gen.close()   # prints: Generator is shutting down — cleaning up
# gen is now exhausted — any further .send() raises StopIteration
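
The RuntimeError rule mentioned above can be demonstrated directly. This misbehaving generator (an illustrative name) yields again after catching GeneratorExit:

```python
def misbehaving():
    """Illustrative: yields again after catching GeneratorExit."""
    try:
        yield "running"
    except GeneratorExit:
        yield "refusing to stop"   # illegal: must return or re-raise

g = misbehaving()
next(g)        # prime: suspended at the first yield
try:
    g.close()
except RuntimeError as e:
    print(f"RuntimeError: {e}")   # generator ignored GeneratorExit
```
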
New in Python 3.13: .close() Can Now Return a Value

Before Python 3.13, gen.close() always returned None, even if the generator caught GeneratorExit and returned a meaningful result. Starting with Python 3.13 (CPython issue #104770), close() returns the generator's StopIteration.value when the generator handles GeneratorExit gracefully and uses a return statement. This makes pipeline-finalizer generators significantly cleaner — a generator can now accumulate results and hand them back at shutdown without any workaround:

python
# Python 3.13+ only
def string_collector():
    """Collects strings; returns CSV summary on close()."""
    items = []
    try:
        while True:
            s = (yield)
            if isinstance(s, str):
                items.append(s)
    except GeneratorExit:
        return ", ".join(items)   # returned by close() in Python 3.13+

gen = string_collector()
next(gen)           # prime
gen.send("alpha")
gen.send("beta")
gen.send("gamma")
summary = gen.close()   # Python 3.13+: returns "alpha, beta, gamma"
print(summary)          # alpha, beta, gamma

# Python 3.12 and earlier: gen.close() always returns None regardless

On Python 3.12 and earlier, .close() discards the return value and returns None. If you need the final value on older Python, design the generator to return on an explicit sentinel value instead of relying on close(), then catch StopIteration yourself and read its .value attribute.
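
A version-portable sketch of that workaround, using a hypothetical variant that returns its summary when sent an explicit None sentinel:

```python
def string_collector_portable():
    """Hypothetical variant: returns its summary when sent a None sentinel."""
    items = []
    while True:
        s = (yield)
        if s is None:
            return ", ".join(items)   # becomes StopIteration.value
        items.append(s)

gen = string_collector_portable()
next(gen)             # prime
gen.send("alpha")
gen.send("beta")
gen.send("gamma")

try:
    gen.send(None)    # sentinel triggers the return
except StopIteration as e:
    summary = e.value

print(summary)        # alpha, beta, gamma
```

This runs identically on every Python 3 version, at the cost of reserving None as an in-band "finish now" signal.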

Pro Tip

Think of the three PEP 342 methods as three ways to resume a suspended generator: .send(value) resumes with a value, .throw(exc) resumes with an exception, and .close() resumes with GeneratorExit. All three inject something into the same waiting yield expression — the difference is what they inject.

The send/yield Handshake: A Visual

The interaction between caller and generator through .send() is a back-and-forth handshake. Each party is suspended while the other runs. Read the two columns below side by side: control alternates between them, one operation at a time.

The .send() handshake, side by side
Caller
gen = two_way()
out1 = next(gen)
print(out1)
out2 = gen.send("Hello")
print(out2)
gen.send("World")
# → StopIteration
Generator
def two_way():
  print("started")
  r = (yield "first")
  print(f"got {r}")
  r2 = (yield "second")
  print(f"got {r2}")
  # return → StopIteration

Real Patterns: Accumulators, State Machines, Pipelines

The value of .send() becomes concrete when you look at patterns where two-way communication simplifies otherwise awkward code.

Running Accumulator

A generator that maintains a running total is a classic illustration. Without .send(), you would need a class with mutable state or a closure with a nonlocal variable. With .send(), the generator's own frame is the state container:

python
def accumulator():
    total = 0
    while True:
        value = (yield total)  # emit current total; receive next addend
        if value is None:
            break
        total += value

acc = accumulator()
next(acc)          # prime: advances to first yield, emits 0

print(acc.send(5))   # total = 5,  emits 5
print(acc.send(10))  # total = 15, emits 15
print(acc.send(3))   # total = 18, emits 18

Each call to .send() adds a number to the running total and immediately returns the updated total. The generator carries the state between calls without any external variable. This is the pattern that PEP 342 used in its own accumulator example, cited in the Python documentation.
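
The sample calls above never exercise the if value is None: break path. Here is the shutdown handshake, with the accumulator definition repeated so the snippet stands alone:

```python
def accumulator():
    total = 0
    while True:
        value = (yield total)   # emit current total; receive next addend
        if value is None:
            break
        total += value

acc = accumulator()
next(acc)             # prime: emits 0
acc.send(5)           # total = 5
acc.send(10)          # total = 15

try:
    acc.send(None)    # None trips the break; the function returns
except StopIteration:
    print("accumulator finished")
```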

Spot the Bug Challenge
This accumulator generator has a bug. A developer writes it, runs it, and gets TypeError: can't send non-None value to a just-started generator. Which line contains the actual mistake?
Read every line carefully before choosing.
1  def accumulator():
2      total = 0
3      while True:
4          value = (yield total)
5          total += value
6  
7  acc = accumulator()
8  acc.send(5)   # <-- called without priming first

Resettable State Machine

State machines are another natural fit. The generator's local variables are the machine's state; .send() is the event input; the yielded value is the output or the new state label:

python
def traffic_light():
    states = {"red": "green", "green": "yellow", "yellow": "red"}
    current = "red"
    while True:
        override = (yield current)
        if override and override in states:
            current = override          # forced transition
        else:
            current = states[current]   # automatic cycle

light = traffic_light()
print(next(light))           # "red"   (priming call: emits the initial state)
print(light.send(None))      # "green" (auto-advance)
print(light.send("red"))     # "red"   (forced override)
print(light.send(None))      # "green" (auto-advance from red)

Processing Pipeline

PEP 342 specifically motivated .send() with pipeline and consumer patterns. Multiple coroutine-style generators can be chained so that each one receives data from upstream and passes processed results downstream. The PEP's own examples include a thumbnail pipeline in which image frames are sent into a generator that pages them, all using (yield) as the intake point:

python
def printer():
    """A simple sink: receives values and prints them."""
    while True:
        item = (yield)
        print(f"Processed: {item}")

def uppercaser(downstream):
    """Transforms input and forwards to downstream consumer."""
    while True:
        item = (yield)
        downstream.send(item.upper())

# Wire up the pipeline
sink = printer()
next(sink)

pipe = uppercaser(sink)
next(pipe)

pipe.send("hello")    # prints: Processed: HELLO
pipe.send("world")    # prints: Processed: WORLD

Each stage in the pipeline uses (yield) with no emitted value — it only receives. The yield expression here evaluates to whatever was sent in; nothing is emitted outward (the caller's .send() would return None). This is a valid and common pattern for pure consumer generators.
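
The same wiring extends naturally to longer chains. Here is a hedged sketch of a three-stage pipeline with an added filtering stage; collector, uppercase_stage, and filter_stage are illustrative names, and the sink appends to a list so the result is easy to inspect:

```python
def collector(out):
    """Illustrative sink: appends every received item to a list."""
    while True:
        out.append((yield))

def uppercase_stage(downstream):
    """Illustrative transform stage: uppercases and forwards."""
    while True:
        downstream.send((yield).upper())

def filter_stage(downstream, min_len=4):
    """Illustrative filter stage: drops items shorter than min_len."""
    while True:
        item = (yield)
        if len(item) >= min_len:
            downstream.send(item)

# Wire and prime each stage, sink first
results = []
sink = collector(results)
next(sink)
upper = uppercase_stage(sink)
next(upper)
head = filter_stage(upper)
next(head)

for word in ["hi", "hello", "generators"]:
    head.send(word)

print(results)   # ['HELLO', 'GENERATORS']
```

Note the priming order: each stage must be advanced to its (yield) intake before anything upstream tries to send into it.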


The Auto-Prime Decorator

Any generator designed to receive values with .send() must be primed before use. When you have many such generators in a codebase, the priming call scattered at every construction site is noise. PEP 342 itself demonstrates a @consumer decorator that handles priming automatically:

python
import functools

def consumer(func):
    """Decorator that auto-primes a generator on construction."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)   # advance to the first yield
        return gen
    return wrapper

@consumer
def logger(prefix):
    while True:
        message = (yield)
        print(f"[{prefix}] {message}")

# No manual priming needed — the decorator handles it
log = logger("INFO")
log.send("Server started")   # [INFO] Server started
log.send("Request received") # [INFO] Request received

The decorator replaces the generator function with a wrapper that calls next() immediately after construction and returns the already-primed generator. Callers see a clean API: construct, then send. One caution applies: only use this decorator on generators that are genuinely designed to receive input from the first iteration. If the generator yields a meaningful value at the first yield — one the caller is supposed to see — the decorator silently discards it.

yield from and .send() Delegation

PEP 380 (Python 3.3) introduced yield from, which delegates to a subgenerator. A key part of that delegation is that .send() calls on the outer generator are automatically forwarded to the inner one. You do not need to manually wire them:

python
def inner():
    x = (yield "inner waiting")
    print(f"Inner received: {x}")
    y = (yield "inner waiting again")
    print(f"Inner received: {y}")

def outer():
    print("Outer: delegating to inner")
    yield from inner()
    print("Outer: inner exhausted, continuing")

gen = outer()
print(next(gen))          # Outer: delegating to inner
                          # inner waiting
print(gen.send("alpha"))  # Inner received: alpha
                          # inner waiting again
try:
    gen.send("beta")      # Inner received: beta
                          # Outer: inner exhausted, continuing
except StopIteration:
    pass

The gen.send("alpha") call on the outer generator is transparently forwarded by yield from into the inner generator's suspended yield expression. The outer generator never sees the values; it acts as a transparent conduit. The same forwarding applies to .throw() and .close(). This delegation mechanism is the foundation of Python's asyncio coroutine chain: await builds on the same protocol of forwarding .send() calls down a chain of awaitables to the event loop, with PEP 492 adding a dedicated GET_AWAITABLE opcode to restrict what await accepts. The forwarding pattern itself is the one that yield from established.

What a Generator's return Value Actually Does

The article has shown that when a generator runs out of yield expressions, StopIteration is raised. What it has not yet covered is that a return value statement inside a generator sets StopIteration.value — and that yield from captures this value and makes it available to the outer generator as the result of the entire delegation.

This is the missing link between .send(), subgenerator delegation, and bidirectional communication across multiple generator layers:

python
def inner_worker():
    total = 0
    while True:
        value = (yield)
        if value is None:
            return total    # return value becomes StopIteration.value
        total += value

def outer():
    # yield from captures the return value of inner_worker
    # when inner_worker returns, result gets that value
    result = yield from inner_worker()
    yield f"Final total: {result}"

gen = outer()
next(gen)          # prime — advances into inner_worker's first yield

gen.send(10)
gen.send(25)
gen.send(7)

# Sending None signals inner_worker to return its total
# yield from captures the return value and assigns it to `result`
# outer then hits its own yield
print(gen.send(None))  # Final total: 42

Three things happen here that are easy to miss. First, return total inside a generator does not immediately propagate as an unhandled exception — it sets StopIteration.value to total and exits the generator normally. Second, yield from specifically intercepts that StopIteration, extracts its .value, and makes it the result of the yield from expression on the left-hand side — so result = yield from inner_worker() gives outer the computed total without any extra plumbing. Third, if you were driving this without yield from — calling .send() on the inner generator directly — you would need to catch StopIteration yourself and read e.value to retrieve the return value:

python
# Driving inner_worker manually — no yield from
worker = inner_worker()
next(worker)

worker.send(10)
worker.send(25)

try:
    worker.send(None)   # triggers return inside generator
except StopIteration as e:
    print(e.value)      # 35 — the return value is on StopIteration.value

This pattern matters in practice whenever you write generator-based protocols or implement a custom scheduler. The return value of a subgenerator is how it communicates its final result back to its caller — and yield from is what makes that communication automatic rather than requiring manual exception handling at every delegation boundary.

CPython Internals: What Actually Happens When You Call .send()

The previous sections described the behavior of .send() from the outside. This section goes to the machine level. None of what follows is required to use generators correctly, but it resolves questions that every experienced Python developer eventually runs into — why certain errors have the wording they do, what changed across Python versions, and what those undocumented attributes on the generator object are actually tracking.

RETURN_GENERATOR: How a Generator Object Is Created (Python 3.11+)

When you call a generator function, none of its body runs. What actually happens is that CPython's compiler emits RETURN_GENERATOR as the very first opcode of every generator function's bytecode. When that opcode executes, the interpreter allocates a PyGenObject, copies the current interpreter frame into it as an embedded _PyInterpreterFrame, immediately returns that generator object to the caller, and discards the outer frame. The body of the function — every statement you wrote — sits in the bytecode immediately after RETURN_GENERATOR, and none of it runs until the first next() or .send(None).

You can observe this directly:

python
import dis

def my_gen():
    x = (yield 1)
    yield x + 1

dis.dis(my_gen)
# Python 3.11+ (abridged; exact offsets and opargs vary by minor version):
#   RETURN_GENERATOR       ← creates the generator object and returns it
#   RESUME          0
#   LOAD_CONST      1 (1)
#   YIELD_VALUE            ← suspends; on resume the sent value is on the stack
#   RESUME          1
#   STORE_FAST      0 (x)  ← the sent value becomes x
#   LOAD_FAST       0 (x)
#   LOAD_CONST      1 (1)
#   BINARY_OP       0 (+)
#   YIELD_VALUE
#   RESUME          1
#   LOAD_CONST      0 (None)
#   RETURN_VALUE

RETURN_GENERATOR was added in Python 3.11 (replacing the older GEN_START opcode from Python 3.10 and the even older mechanism before that). The shift matters for one subtle reason: because the generator's frame is now an embedded _PyInterpreterFrame inside the PyGenObject struct rather than a heap-allocated PyFrameObject, Python 3.11 significantly reduced the allocation cost of creating a generator. Old-style PyFrameObject heap objects are now only created on demand — when debuggers or introspection functions such as sys._getframe() request them — rather than for every generator creation.

The SEND Opcode: How yield from and await Delegate .send() at the Bytecode Level

Before Python 3.11, yield from compiled to a YIELD_FROM opcode. In Python 3.11 that opcode was removed and replaced with SEND. The SEND opcode is defined as:

STACK[-1] = STACK[-2].send(STACK[-1]) — if the call raises StopIteration, pop the top value, push StopIteration.value, and advance the instruction pointer past the delegation block.

SEND is used for both yield from and await. This means every await expression in an async def function is, at the bytecode level, a SEND that forwards the event loop's injected values down through a chain of awaitable objects. The distinction between await and yield from is enforced at the type level by a separate opcode (GET_AWAITABLE) that restricts what objects can be awaited, not at the SEND level — both compile to the same delegation primitive.
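The delegation loop that SEND implements can be sketched in pure Python. The following is a simplified rendering of the PEP 380 expansion covering only the .send() path — the real expansion also forwards .throw() and .close() — so treat it as an illustration of the routing, not the full semantics:

```python
def delegate(subgen):
    """Sketch of `yield from subgen`: the send-forwarding path only.

    Simplified from the PEP 380 expansion; .throw()/.close() forwarding
    is omitted. Shows how sent values are routed into the subiterator
    and how its return value becomes the delegation's result.
    """
    try:
        yielded = next(subgen)              # prime the subiterator
    except StopIteration as e:
        return e.value
    while True:
        sent = yield yielded                # suspend on behalf of subgen
        try:
            if sent is None:
                yielded = next(subgen)      # plain-iterator fallback
            else:
                yielded = subgen.send(sent) # forward the sent value
        except StopIteration as e:
            return e.value                  # becomes the delegation's result
```

Driving delegate(some_generator()) behaves, for .send() purposes, like driving a generator that contains yield from some_generator().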

gi_yieldfrom: Seeing the Active Subgenerator

When a generator is suspended inside a yield from delegation, it exposes the subgenerator it is currently waiting on through the gi_yieldfrom attribute. This attribute was added in Python 3.5 (issue #24450, contributed by Benno Leslie and Yury Selivanov). It is None whenever the generator is not currently delegating to another generator. Its coroutine equivalent is cr_await.

python
def inner():
    yield "inner waiting"

def outer():
    yield from inner()

gen = outer()
next(gen)   # prime — outer is now suspended inside yield from

print(gen.gi_yieldfrom)   # <generator object inner at 0x...>
# gi_yieldfrom is the live inner() generator object
# You can call .send() or .throw() on it directly if needed for debugging

This attribute is practically useful when debugging a suspended generator chain: if .send() behaves unexpectedly, inspecting gi_yieldfrom tells you exactly which subgenerator is currently holding control, without disrupting execution.

gi_suspended: The Cheaper State Check (Python 3.11+)

Python 3.11 added a gi_suspended attribute directly to generator objects. It is backed by a C-level frame-state field and reads as a boolean: True if the generator is currently suspended at a yield expression, False otherwise. Unlike inspect.getgeneratorstate(), reading gi_suspended requires no Python function call and no import. The inspect module's own implementation of getgeneratorstate() now uses gi_suspended internally as its fast path.

python
def my_gen():
    yield 1

gen = my_gen()
print(gen.gi_suspended)   # False — not yet suspended (GEN_CREATED)

next(gen)
print(gen.gi_suspended)   # True — suspended at the yield

try:
    next(gen)
except StopIteration:
    pass
print(gen.gi_suspended)   # False — closed (GEN_CLOSED)

In tight scheduler loops that check generator state frequently, reading gen.gi_suspended directly is measurably faster than calling inspect.getgeneratorstate(gen). The coroutine equivalent is cr_suspended (also added in Python 3.11); async generators gained ag_suspended in Python 3.12.

gi_exc_state: Per-Generator Exception State (Python 3.7+)

Each PyGenObject in CPython carries its own gi_exc_state field — a private exception state stack independent of the calling thread's exception state. When the interpreter enters a generator's frame (on any .send() call), it swaps the thread's active exception state for the generator's saved one. When the generator suspends, the generator's exception state is saved back into gi_exc_state and the caller's exception state is restored.

This swap behavior was introduced in Python 3.7 (commit ae3087c, "Move exc state to generator", fixing bpo-25612). Before 3.7, exception state lived in the frame object, which caused obscure bugs where an active exception caught inside a generator could be visible in the caller's sys.exc_info() after the generator yielded. The fix — moving gi_exc_state to the generator object itself — is why the following code behaves correctly today but would have produced surprising results in Python 3.6 and earlier:

python
import sys

def gen_with_caught_exception():
    try:
        raise ValueError("inside generator")
    except ValueError:
        yield "caught it"   # yields while an exception is active inside gen
    yield "done"

g = gen_with_caught_exception()
print(next(g))               # "caught it"
print(sys.exc_info())        # (None, None, None) — caller's exception state is clean
                             # In Python 3.6, this would have shown the ValueError

Why the .throw() Method Is Named "throw" and Not "raise"

The name throw was chosen deliberately and specifically because raise is a Python keyword and cannot be used as a method name. The name first appeared in PEP 288 (Raymond Hettinger, 2002), which proposed generator exception injection before PEP 342 existed. PEP 288 noted that throw was also already associated with exception injection in other languages, that it was suggestive of placing the exception in a different location, and that alternatives considered — resolve(), signal(), genraise(), raiseinto(), flush() — were all judged inferior. PEP 342 adopted the name from PEP 288 without change.

yield from with Plain Iterables: When .send() Degrades to next()

yield from works on any iterable, not only on generators. When given a plain iterable such as a list or range, it calls iter() on it to produce a plain iterator. That plain iterator has no .send() method. The PEP 380 delegation specification handles this explicitly: when the delegating generator receives a .send(value) call and the subiterator is a plain iterator, if value is None the delegation falls back to calling next() on the subiterator. If value is non-None, Python calls subiterator.send(value), which raises AttributeError on a plain iterator — the caller of the delegating generator sees that AttributeError propagated out. The practical consequence: a generator using yield from some_list can be primed and advanced with next() safely, but calling .send("data") on it will raise AttributeError because the list iterator has no send channel.
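Both halves of that behavior fit in a few lines:

```python
def over_list():
    yield from [1, 2, 3]   # delegates to a plain list iterator

g = over_list()
first = next(g)            # priming with next() works normally
second = g.send(None)      # send(None) falls back to next() on the list iterator
try:
    g.send("data")         # non-None: delegation attempts list_iterator.send("data")
    err = None
except AttributeError as e:
    err = str(e)           # the list iterator has no .send method
```

The first two calls succeed because PEP 380's expansion only calls .send() on the subiterator when the sent value is non-None.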

Where .send() Fits Now That async/await Exists

Python 3.5 introduced native coroutines via PEP 492, with the async def and await syntax. One of PEP 492's explicit motivations was the confusion between generator-based coroutines and plain generators: they shared syntax, which made it difficult to tell at a glance whether a function was intended to be driven with next() or as a coroutine. Native coroutines removed that ambiguity by creating a distinct type with its own syntax.

PEP 492 (Yury Selivanov) noted that distinguishing coroutines from regular generators was a persistent source of confusion, particularly for developers newer to Python, given that both shared the same syntax.

As Luciano Ramalho explains in Fluent Python (O'Reilly), the .send() infrastructure arrived with PEP 342 in Python 2.5, which was when yield became a proper expression and generators gained the ability to function as coroutines. Native async/await coroutines build on the same underlying mechanism — coroutine objects still expose .send(), .throw(), and .close() — but the event loop calls those methods internally, and application code never calls .send() directly.

This means .send() on generators remains relevant in several specific situations today:

It is appropriate when you are writing a stateful generator that needs two-way communication but does not need to participate in an async event loop. The accumulator and state machine patterns above are examples where async/await would be unnecessary overhead. It is also appropriate when you are working at a low level with coroutine objects — writing a custom event loop, implementing a trampoline scheduler, or building a testing harness that drives coroutines directly. And it appears in the implementation of yield from (PEP 380, Python 3.3), which internally forwards .send() calls down to subgenerators through the delegation chain.

For ordinary asynchronous I/O and task coordination, async/await is the right tool. For synchronous stateful computation with bidirectional communication, .send() on a plain generator is still clean and precise.

Note: PEP 479 and StopIteration

Since Python 3.7 (via PEP 479), any StopIteration raised inside a generator — including one triggered accidentally by an unguarded next() call several frames deep — is converted to a RuntimeError instead of silently terminating the iteration. This change affects code that uses .send() in pipelines: if a downstream generator is exhausted and your pipeline does not handle it explicitly, you will now get a visible RuntimeError rather than a silent stop. Always wrap pipeline termination in explicit try/except StopIteration blocks or use return inside the generator to signal exhaustion.
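A minimal reproduction of the conversion — the original StopIteration is preserved as the RuntimeError's __cause__:

```python
def naive_pipeline(source):
    while True:
        item = next(source)   # unguarded next(): raises StopIteration at exhaustion
        yield item * 2

g = naive_pipeline(iter([1, 2]))
results = [next(g), next(g)]  # the first two items double normally
try:
    next(g)                   # the inner next() raises StopIteration...
    cause_name = None
except RuntimeError as e:     # ...which PEP 479 converts to RuntimeError
    cause_name = type(e.__cause__).__name__
```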

Inspecting Generator State

When a .send() call misbehaves — wrong value comes back, unexpected TypeError, premature StopIteration — the first question is usually: where is this generator right now? Python exposes that through the inspect module and a few attributes on the generator object itself.

python
import inspect

def my_gen():
    yield 1
    yield 2

gen = my_gen()

print(inspect.getgeneratorstate(gen))  # GEN_CREATED  — never started

next(gen)
print(inspect.getgeneratorstate(gen))  # GEN_SUSPENDED — at a yield

# exhaust it
list(gen)
print(inspect.getgeneratorstate(gen))  # GEN_CLOSED — no more yields

gen2 = my_gen()
gen2.close()
print(inspect.getgeneratorstate(gen2)) # GEN_CLOSED — explicitly closed

The four states are GEN_CREATED (constructed, never advanced — this is why .send(non_None) fails here), GEN_RUNNING (currently executing, only visible from inside the generator itself), GEN_SUSPENDED (paused at a yield, ready to receive .send()), and GEN_CLOSED (exhausted or closed).

The generator's current local variables are accessible via gen.gi_frame.f_locals when the state is GEN_SUSPENDED. Once the generator is closed, gi_frame is None. This is useful when debugging a long-running stateful generator to verify what the accumulated state looks like without disrupting execution:

python
def accumulator():
    total = 0
    while True:
        value = (yield total)
        if value is None:
            break
        total += value

acc = accumulator()
next(acc)
acc.send(10)
acc.send(25)

# Peek at internal state without consuming the generator
print(acc.gi_frame.f_locals)  # {'total': 35, 'value': 25}

Warning

Accessing gi_frame.f_locals is useful for debugging but is considered an implementation detail of CPython. It is not guaranteed to be available or accurate in other Python implementations (PyPy, Jython, MicroPython). Do not rely on it in production logic — only use it in debugging and diagnostic tools.

Python Notes to Remember

1 yield is an expression, not just a statement.
Since Python 2.5 (PEP 342), yield both emits a value outward and evaluates to a value inward. The inward value is what .send() supplies.
2 .send() is specific to generator objects.
It is not available on arbitrary iterables or on custom iterator classes implementing __iter__. Generator expressions do expose .send(), but any sent value is discarded because their implicit yield has no assignment target. Meaningful two-way communication requires an explicitly authored generator function with a yield expression designed to capture the sent value.
3 .send(value) resumes the generator and injects value into the currently suspended yield expression.
The return value of .send() is the value produced by the next yield the generator reaches after resuming — not the value you sent in.
4 You must prime a generator before sending a non-None value.
Call next(gen) or gen.send(None) first. Calling .send(value) on a fresh generator raises TypeError; calling it on an already-exhausted generator raises StopIteration immediately — a different error for a different situation.
5 next(gen) and gen.send(None) are identical at the CPython level.
Use next() when treating a generator as an iterator; use .send(None) when explicitly driving a coroutine and signalling “no input this cycle.”
6 PEP 342 added three methods, not one.
.send(value) resumes with a value, .throw(exc) resumes by raising an exception at the yield point, and .close() resumes by throwing GeneratorExit. All three inject something into the same waiting yield expression. Since Python 3.12, the three-argument form of .throw() is deprecated — pass an exception instance directly.
7 Python 3.13 added return-value support to .close().
If a generator catches GeneratorExit and returns a value, gen.close() now returns that value on Python 3.13+. On Python 3.12 and earlier, .close() always returned None. This change makes pipeline-finalizer generators significantly cleaner.
8 A generator's return value becomes StopIteration.value.
yield from captures this automatically and assigns it as the result of the delegation expression. Without yield from, callers must catch StopIteration and read e.value manually.
9 yield from forwards .send() calls transparently.
Any .send(), .throw(), or .close() call on an outer generator is automatically delegated to the inner subgenerator. This is the mechanism that makes await work in async functions.
10 .send() is still the right tool for synchronous stateful generators.
For async I/O, prefer async/await. For in-process two-way state machines, accumulators, and pipelines that do not need an event loop, .send() on a plain generator remains a clean, precise solution. Use inspect.getgeneratorstate() and gi_frame.f_locals to debug when needed.

The generator protocol in Python has always been richer than the basic for loop version of it suggests. .send() is the part that turns a generator from a one-directional sequence producer into something closer to a function that can be paused, handed a value, and then continued — a pattern that underpins everything from simple stateful counters to the coroutine machinery that powers Python's entire async ecosystem.

Frequently Asked Questions

Q What does generator.send() do in Python?

generator.send(value) resumes the generator from where it is suspended and injects value into the currently waiting yield expression. The yield expression inside the generator evaluates to that value. The return value of .send() is whatever the generator yields next after resuming. If the generator reaches its end without another yield, StopIteration is raised.

Q Why do you have to prime a generator before calling .send()?

When a generator object is first created, its code has not executed at all — there is no yield expression currently suspended and waiting to receive a value. Calling .send(non_None_value) on a fresh generator raises TypeError because there is no active yield expression to inject into. You must first call next(gen) or gen.send(None) to advance execution to the first yield, creating a suspended yield expression that can then receive input.

Q What is the difference between next() and gen.send(None)?

At the CPython level, next(gen) and gen.send(None) are identical operations — __next__() is implemented as send(None). Both inject None as the value of the current yield expression and advance the generator to the next yield. The distinction is one of intent: use next() when treating a generator as a plain iterator, and use .send(None) when explicitly driving a coroutine-style generator to signal that no input is being provided this cycle.

Q When should you use .send() instead of async/await?

Use .send() on a plain generator when you need synchronous, stateful two-way communication without an async event loop — for example, running accumulators, state machines, or in-process pipelines. Use async/await for asynchronous I/O and task coordination. Native coroutines (async def) build on the same underlying .send() mechanism, but the event loop calls .send() internally so application code never needs to call it directly.

Q What PEP introduced generator.send()?

PEP 342, authored by Guido van Rossum and Phillip J. Eby, introduced generator.send() in Python 2.5 (released 2006). The PEP also made yield an expression rather than a pure statement, and added .throw() and .close() to the generator API. These changes transformed Python generators from one-way value producers into proper coroutines capable of both producing and consuming values.

Q What does generator.throw() do?

gen.throw(exc) resumes the generator at the currently suspended yield expression but raises the specified exception there instead of supplying a value. Pass an exception instance — e.g. gen.throw(ValueError("bad input")). The older two-argument form gen.throw(ExcType, value) is deprecated since Python 3.12. The generator can catch the exception with a normal try/except block and continue running, or let it propagate — in which case the exception bubbles out to the caller of .throw(). This is the same mechanism asyncio uses internally to cancel tasks by throwing CancelledError into a suspended coroutine.
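A small catch-and-continue example — the generator absorbs the injected exception and keeps running:

```python
def resilient():
    errors = []
    while True:
        try:
            yield errors              # suspend; report the error log so far
        except ValueError as e:
            errors.append(str(e))     # catch the injected exception, keep running

g = resilient()
next(g)                               # prime; the error log starts out empty
log = g.throw(ValueError("bad input"))
# the exception was caught inside the generator, which then yielded again,
# so .throw() returns the next yielded value rather than raising
```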

Q Can generator.close() return a value?

Starting in Python 3.13 (CPython issue #104770), gen.close() can return a final value. If the generator catches GeneratorExit and uses a return statement, close() returns the StopIteration.value produced by that return. In Python 3.12 and earlier, close() discards the value and always returns None. This new behavior is particularly useful for pipeline accumulator generators that collect data throughout their lifetime and need to hand back a final result at shutdown — without requiring the caller to catch StopIteration.
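A version-dependent sketch: the same generator hands its total back from close() on Python 3.13+, while 3.12 and earlier discard it and return None:

```python
import sys

def sink():
    total = 0
    try:
        while True:
            total += (yield)   # accumulate whatever the caller sends in
    except GeneratorExit:
        return total           # on 3.13+, close() returns this value

g = sink()
next(g)                        # prime
g.send(1)
g.send(2)
result = g.close()             # 3 on Python 3.13+, None on 3.12 and earlier
```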

Q Does yield from forward .send() calls to the subgenerator?

Yes. When a generator uses yield from to delegate to a subgenerator, any .send(), .throw(), or .close() call on the outer generator is automatically forwarded to the inner one. The outer generator acts as a transparent conduit. This delegation is also how the await keyword works in async functions: await uses the same delegation machinery as yield from (the SEND opcode in Python 3.11+), so every await is effectively a transparent .send() chain down to the event loop.

Q What does a generator's return statement do, and how does it relate to StopIteration?

A return value statement inside a generator sets StopIteration.value when the generator finishes. If you drive the generator manually with .send(), you must catch StopIteration and read e.value to retrieve it. If you use yield from, Python captures that StopIteration.value automatically and assigns it as the result of the yield from expression — so result = yield from subgen() gives the outer generator the subgenerator's return value without any manual exception handling.

Q What error do you get when you call .send() on an already-exhausted generator?

Calling .send() on a generator that is already in the GEN_CLOSED state raises StopIteration immediately, before any code runs. This looks identical to the StopIteration raised when a generator naturally runs out of yield expressions, but it means something different: the generator has no frame left to resume. The fix is to catch StopIteration on the call that exhausts the generator and stop sending — or check inspect.getgeneratorstate(gen) before sending to confirm the generator is in GEN_SUSPENDED.
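The two situations can be seen back to back — a natural StopIteration as the generator runs off the end, then an immediate one from the closed state:

```python
import inspect

def one_shot():
    yield 1

g = one_shot()
next(g)                 # advance to the only yield
try:
    g.send("x")         # resumes, falls off the end: natural StopIteration
except StopIteration:
    pass
state = inspect.getgeneratorstate(g)   # generator is now closed
try:
    g.send("y")         # closed generator: StopIteration raised immediately
    second_raised = False
except StopIteration:
    second_raised = True
```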

Q Does .send() work if the yield expression result is not assigned to a variable?

Yes. If a generator contains a bare yield value with no assignment — for example, yield total rather than received = (yield total) — the generator still runs correctly and the caller can still call .send() on it. The sent value is simply discarded: the yield evaluates to it internally, but since nothing captures it, it is immediately lost. This is not an error; it is a valid pattern when the generator only needs to emit values outward and has no use for incoming data. If you later need to use the sent value, add the assignment.

Q Does .send() work on generator expressions?

Not usefully. Generator expressions — written as (x for x in iterable) — produce ordinary generator objects, so .send() exists on them. Calling .send(non_None_value) on an unprimed generator expression raises TypeError, just as it does on any fresh generator. Once primed, a generator expression will accept .send(value), but the sent value is simply discarded: the expression's implicit yield has no assignment target to capture it. Two-way communication requires an explicitly written generator function whose yield expression is positioned to capture the sent value, typically on the right-hand side of an assignment.
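Both behaviors in a few lines:

```python
squares = (x * x for x in range(3))

try:
    squares.send("hi")         # unprimed: no suspended yield to receive a value
    unprimed_error = None
except TypeError as e:
    unprimed_error = type(e).__name__

first = next(squares)          # prime the generator expression
second = squares.send("hi")    # accepted, but "hi" is silently discarded
```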

Q What are the parentheses rules for yield expressions?

A yield expression must be parenthesized whenever it appears inside a larger expression — for example, val = (yield x) + 1 requires the parentheses, otherwise Python parses it as val = yield (x + 1), which is different. The one case where parentheses are not required is when the yield expression is the entire right-hand side of a simple assignment: received = yield emitted is legal without them. The Python 2.5 release notes for PEP 342 recommend always using parentheses anyway — received = (yield emitted) — because it makes the boundary explicit and eliminates any ambiguity when the expression is extended later.
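The parenthesized form in a larger expression, exercised end to end:

```python
def demo():
    val = (yield 1) + 100   # parens required here; `yield 1 + 100` would yield 101
    yield val

g = demo()
first = next(g)        # primes; the generator yields 1
second = g.send(10)    # (yield 1) evaluates to 10, so val is 110
```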

Q What is GEN_RUNNING and when would you encounter it?

The GEN_RUNNING state means the generator is currently executing — its frame is on the call stack right now. You would only encounter it if you called inspect.getgeneratorstate(gen) from inside the generator itself, or if you tried to call .send() on a generator from within its own execution. In practice, trying to re-enter a running generator raises ValueError: generator already executing. CPython uses the GEN_RUNNING flag internally to enforce this re-entrancy guard — it is set when a generator's frame starts executing and cleared when the frame suspends or returns.
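The re-entrancy guard can be triggered deliberately in a few lines — the generator body below holds a (contrived) self-reference to its own generator object:

```python
def reenter():
    # Attempts to .send() into its own generator object mid-execution
    yield gen.send(None)

gen = reenter()
try:
    next(gen)           # starts the body, which immediately re-enters itself
    message = None
except ValueError as e:
    message = str(e)    # CPython's re-entrancy guard fires here
```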

Q Does .send() work on two generators communicating with each other?

Yes, and this is precisely the pipeline pattern described earlier in this article. One generator can hold a reference to another and call .send() on it directly — for example, a transformer generator that receives a value, processes it, and then calls downstream.send(result). Each generator in the chain is independently primed and driven. The key requirement is that every generator in the pipeline must be primed before the pipeline is used, either manually or via the auto-prime decorator pattern.
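A two-stage sketch of that pattern (the stage names are illustrative):

```python
def collector(out):
    while True:
        out.append((yield))           # terminal stage: store whatever arrives

def doubler(downstream):
    while True:
        downstream.send((yield) * 2)  # middle stage: transform, then forward

results = []
sink = collector(results)
next(sink)                            # prime each stage before the pipeline runs
stage = doubler(sink)
next(stage)

for n in [1, 2, 3]:
    stage.send(n)                     # each send flows through both stages
```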

Q Is generator.send() thread-safe?

No. A Python generator object is not thread-safe. If two threads attempt to call .send() on the same generator concurrently, CPython's re-entrancy guard will raise ValueError: generator already executing on the second caller. The GIL in CPython means two threads cannot execute bytecode simultaneously, but that does not make a generator safe to share across threads — a context switch between a .send() call and the generator's next suspension point can leave internal state partially updated. If multiple threads need to communicate through a shared stateful object, use a threading.Lock and a regular class with explicit state management rather than a shared generator.

Q How does .send() interact with try/finally inside a generator?

A finally block inside a generator is guaranteed to run when the generator is closed — whether via .close(), garbage collection, or running off the end of the function. When you call gen.close(), Python throws GeneratorExit into the generator at the current suspension point; the generator's finally block then executes before the generator terminates. This guarantee has been in place since Python 2.5, when PEP 342 removed the restriction that prevented yield from appearing inside a try/finally block. Generators used with .send() can safely hold resources — open files, locks, database connections — in finally blocks, and those resources will be released reliably even if the caller abandons the generator mid-iteration.
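A small demonstration that the finally block runs when close() injects GeneratorExit:

```python
released = []

def holds_resource():
    resource = "lock-1"              # stand-in for a real file or lock
    try:
        while True:
            yield resource
    finally:
        released.append(resource)    # guaranteed to run when the generator ends

g = holds_resource()
next(g)        # prime; the generator now "holds" the resource
g.close()      # throws GeneratorExit at the yield; finally runs before teardown
```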

Q Can you use .send() inside a for loop?

No — a for loop only calls __next__() on each iteration, which is equivalent to .send(None). There is no mechanism in the for statement to inject a value into the current yield expression. If you need to send values into a generator, drive it manually with .send(), and catch StopIteration wherever the generator can become exhausted:

python
def doubler():
    while True:
        x = (yield)
        if x is None:
            break
        yield x * 2

gen = doubler()
next(gen)  # prime

inputs = [3, 7, 11]
for value in inputs:
    result = gen.send(value)   # send input in
    print(result)              # 6, 14, 22
    next(gen)                  # advance past the second yield before sending again

Use a for loop when treating a generator as a read-only sequence. Use a while True loop with try/except StopIteration when you need to push data back in on each cycle.

Q Can you send complex objects — lists, dicts, class instances — into a generator?

Yes. .send() accepts any Python object as its argument. The sent value becomes the result of the currently suspended yield expression inside the generator — the generator receives a reference to the same object the caller passed in, with no serialisation, copying, or type restriction. This means that if you send a mutable object such as a list or a dictionary and the generator modifies it, those modifications are visible to the caller through the same reference. If that is not what you want, send a copy instead: gen.send(my_list.copy()) or gen.send(dict(my_dict)) for shallow copies, or use copy.deepcopy() for nested structures.
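The shared-reference behavior in a few lines:

```python
def tagger():
    while True:
        record = (yield)
        record["seen"] = True   # mutates the caller's dict through the same reference

g = tagger()
next(g)                          # prime
rec = {"id": 1}
g.send(rec)                      # the generator receives this exact dict object
# rec now carries the mutation made inside the generator
```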

Q What happens when .throw() is not caught inside the generator?

If the generator does not have a try/except block that matches the thrown exception — or if it has one but re-raises — the exception propagates out of the generator's frame and becomes the exception raised by the .throw() call on the caller's side. The generator is also closed: it moves to GEN_CLOSED state, and any subsequent .send() or .throw() calls will raise StopIteration immediately. The key point is that .throw() does not automatically terminate the generator silently — the generator has the opportunity to catch, handle, and continue. But if it does not, the caller receives the exception just as they would from any other failed call, and the generator is gone.

Q Can the same generator be shared between multiple callers?

Technically yes — a generator object is just a Python object and can be referenced from multiple places. In practice, sharing a single generator between multiple callers is almost always a mistake. A generator has a single position: one currently suspended yield expression. Whichever caller calls .send() next advances that position. If two callers interleave their .send() calls, each will consume different yield points and receive the values intended for the other, with no error raised. If multiple parts of your code need to exchange values with a generator-based component, give each an independent generator instance by calling the generator function separately. For genuine fan-out, use a queue or an explicit broadcast wrapper rather than shared generator state.

Q How do you test a generator that uses .send()?

Prime the generator first, then alternate between .send() calls and assertions on the return values. Use pytest.raises(StopIteration) to confirm exhaustion at the right point, and read exc_info.value.value to assert the generator's return value. Always create a fresh generator instance per test — never reuse a generator across test cases, since its state carries over.

python
import pytest

def accumulator():
    total = 0
    while True:
        value = (yield total)
        if value is None:
            return total
        total += value

def test_accumulator_basic():
    gen = accumulator()
    assert next(gen) == 0        # primed; yields initial total
    assert gen.send(10) == 10    # total is now 10
    assert gen.send(25) == 35    # total is now 35
    assert gen.send(5)  == 40    # total is now 40

def test_accumulator_termination():
    gen = accumulator()
    next(gen)
    gen.send(7)
    with pytest.raises(StopIteration) as exc_info:
        gen.send(None)           # signals the generator to return
    assert exc_info.value.value == 7   # StopIteration.value holds the return value

Q What is the RETURN_GENERATOR opcode and why does it matter?

RETURN_GENERATOR is the first opcode in every generator function's bytecode in Python 3.11 and later. When it executes, CPython allocates a PyGenObject, copies the current interpreter frame into it as an embedded _PyInterpreterFrame, immediately returns the generator object to the caller, and discards the outer frame — none of the function body executes yet. This replaced the older GEN_START opcode (Python 3.10). In Python 3.11+, the generator's frame is embedded directly inside the PyGenObject struct, so no separate heap PyFrameObject is allocated at creation time — one is only created on demand when a debugger or introspection call requests it.
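You can observe this with the standard dis module. The output is version-dependent: RETURN_GENERATOR only appears on CPython 3.11 and later, and 3.10 emitted GEN_START instead.

```python
import dis

def gen():
    yield 1

# Collect the opcode names of the generator function's bytecode.
ops = [ins.opname for ins in dis.get_instructions(gen)]
print(ops[0])   # 'RETURN_GENERATOR' on CPython 3.11+
```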

Q What opcode does yield from (and await) compile to in Python 3.11+?

In Python 3.11, the old YIELD_FROM opcode was removed and replaced by SEND. The SEND opcode implements STACK[-1] = STACK[-2].send(STACK[-1]) and handles StopIteration by extracting StopIteration.value and jumping past the delegation block. Both yield from and await compile to SEND; the distinction between the two is enforced at the type level by a separate GET_AWAITABLE opcode that validates what can be awaited, not at the SEND level itself.
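Whatever the opcode-level details, the observable behavior is that a value passed to the outer generator's .send() is routed to the subgenerator's suspended yield. A minimal sketch with hypothetical inner/outer generators:

```python
def inner():
    received = yield "inner ready"
    yield f"inner got {received}"

def outer():
    yield "outer start"
    yield from inner()   # on 3.11+ this delegation runs via the SEND opcode

gen = outer()
print(next(gen))           # "outer start"
print(next(gen))           # "inner ready" -- delegation has begun
print(gen.send("hello"))   # "inner got hello" -- value routed to inner's yield
```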

Q What is gen.gi_yieldfrom?

gi_yieldfrom is an attribute on generator objects (added in Python 3.5, issue #24450) that holds a reference to the subgenerator currently being delegated to via yield from. It is None whenever the generator is not currently inside a yield from delegation. It is useful when debugging a suspended generator chain — inspecting gen.gi_yieldfrom tells you exactly which subgenerator currently holds control without disrupting execution. The coroutine equivalent is cr_await.
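A quick illustration with hypothetical inner/outer generators (requires Python 3.5+ for the attribute):

```python
def inner():
    yield 1
    yield 2

def outer():
    yield from inner()

gen = outer()
next(gen)                     # suspend inside the yield from delegation
sub = gen.gi_yieldfrom        # the subgenerator currently holding control
print(sub.gi_code.co_name)    # 'inner'
```

Once the delegation finishes and outer resumes on its own, gen.gi_yieldfrom reverts to None.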

Q What is gen.gi_suspended and how is it different from inspect.getgeneratorstate()?

gi_suspended is a C-level integer attribute added to generator objects in Python 3.11. It is 1 if the generator is currently suspended at a yield expression and 0 otherwise. Unlike inspect.getgeneratorstate(), reading it requires no Python function call and no import — it is a direct attribute lookup on the object. The inspect module's own implementation of getgeneratorstate() uses gi_suspended internally as its fast path. Coroutines gained cr_suspended in the same Python 3.11 release; async generators gained ag_suspended in Python 3.12.
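The two can be compared directly. The sketch below uses getattr with a fallback so it also runs on versions before 3.11, where gi_suspended does not exist:

```python
import inspect

def gen_fn():
    yield

gen = gen_fn()
print(inspect.getgeneratorstate(gen))        # GEN_CREATED
print(getattr(gen, "gi_suspended", "n/a"))   # falsy; "n/a" before 3.11
next(gen)
print(inspect.getgeneratorstate(gen))        # GEN_SUSPENDED
print(getattr(gen, "gi_suspended", "n/a"))   # truthy on 3.11+
```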

Q Why don't exceptions caught inside a generator affect the caller's sys.exc_info()?

Each PyGenObject carries its own gi_exc_state field — a private exception state independent of the calling thread's exception state. When CPython enters a generator's frame on any .send() call, it swaps the thread's active exception state for the generator's saved one, and swaps back when the generator suspends. This per-generator exception state was moved from the frame object to the generator object itself in Python 3.7 (commit ae3087c, fixing bpo-25612). Before that change, an active exception inside a generator could bleed into the caller's sys.exc_info() after a yield, causing obscure bugs.
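The isolation is easy to demonstrate: suspend a generator while it is inside an active except block, and check the caller's sys.exc_info() afterward.

```python
import sys

def gen_fn():
    try:
        raise ValueError("raised inside the generator")
    except ValueError:
        # Suspend while the except block is still active.
        yield "suspended in except"

g = gen_fn()
print(next(g))          # "suspended in except"
# The generator's active exception does not leak into the caller:
print(sys.exc_info())   # (None, None, None) on Python 3.7+
```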

Q What happens when you call .send(non_None_value) on a generator delegating via yield from to a plain list?

PEP 380 defines the delegation behavior explicitly: when yield from is given a plain iterable (not a generator), it wraps it in a plain iterator via iter(). If a non-None value is sent to the delegating generator, Python tries to call subiterator.send(value) — but a plain iterator has no .send() method, so AttributeError is raised and propagates to the caller. If the sent value is None, the delegation correctly falls back to calling next(subiterator). Priming and plain iteration through a yield from some_list therefore work fine, but sending any non-None value into it raises AttributeError.
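A minimal demonstration with a hypothetical delegate generator:

```python
def delegate():
    yield from [1, 2, 3]

g = delegate()
print(next(g))        # 1 -- priming works normally
print(g.send(None))   # 2 -- send(None) falls back to next(subiterator)
try:
    g.send("x")       # list_iterator has no .send() method
except AttributeError as e:
    print("AttributeError:", e)
```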
