Show the memory benefit: "Reading a 2GB CSV with file() loads everything into memory. A generator that yields one line at a time uses a few kilobytes regardless of file size. For anything large, generators are the right tool."
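A minimal sketch of the line-by-line pattern described above. The function name `readLines` and the log path are illustrative, not from the original; the key point is that only one line is in memory at a time.

```php
<?php

// Hypothetical helper: stream a file's lines without loading the whole file.
function readLines(string $path): \Generator
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        throw new RuntimeException("Cannot open $path");
    }
    try {
        while (($line = fgets($handle)) !== false) {
            yield rtrim($line, "\r\n");
        }
    } finally {
        fclose($handle); // runs even if the caller stops iterating early
    }
}

// Usage: memory stays constant no matter how large the file is.
foreach (readLines('/var/log/app.log') as $line) {
    // process one line at a time
}
```

The `try`/`finally` matters: if the consuming loop `break`s early, PHP destroys the generator and the `finally` block still closes the handle.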
Generators use the yield keyword to produce values one at a time without building the entire result set in memory. A function returning 1 million rows as an array allocates memory for all of them. A generator yields one row at a time, using constant memory regardless of dataset size. Use generators for: iterating over large files line by line, streaming database results, and any pipeline where you process items sequentially. Generators implement the Iterator interface and work with foreach. Strong candidates discuss: yield from for delegating to sub-generators, send() for two-way communication, the return value of a generator (getReturn()), and when generators are overkill (small datasets where an array is simpler).
Tests understanding of PHP memory management. Candidates who always build arrays will hit memory limits on large datasets. Those who reach for generators when processing large collections write scalable code.