I have some use cases in which I need to run generator functions without caring about the yielded items.
I cannot make them non-generator functions, because in other use cases I certainly need the yielded values.
I am currently using a trivial self-made function to exhaust the generators.
def exhaust(generator):
    for _ in generator:
        pass
I wondered whether there is a simpler way to do this that I'm missing.
Here is a use case:
def create_tables(fail_silently=True):
    """Create the respective tables."""
    for model in MODELS:
        try:
            model.create_table(fail_silently=fail_silently)
        except Exception:
            yield (False, model)
        else:
            yield (True, model)
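For the snippet to run on its own, MODELS must be defined; in the original code it is presumably an iterable of ORM model classes. A hypothetical stand-in, with both the class and its create_table behaviour made up purely for illustration:

class DummyModel:
    """Hypothetical stand-in for a real ORM model class."""

    @classmethod
    def create_table(cls, fail_silently=True):
        # Pretend to create a table; a real model would hit the database.
        pass

MODELS = (DummyModel,)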
In some contexts, I care about the success and error values…
from sys import stderr

for success, table in create_tables():
    if success:
        print('Creation of table {} succeeded.'.format(table))
    else:
        print('Creation of table {} failed.'.format(table), file=stderr)
… and in others I just want to run the function "blindly":
exhaust(create_tables())
Setting up a for loop just for this can be relatively expensive: a for loop in Python is fundamentally a succession of simple assignment statements, so you would be executing n assignments (n being the number of items the generator yields) only to discard the assignment targets afterwards.
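As a rough illustration of that cost, here is a minimal timing sketch comparing a plain for loop against the zero-length-deque approach described next; the helper names are made up for this comparison, and absolute numbers will vary by machine:

from collections import deque
from timeit import timeit

def exhaust_for(gen):
    # Exhaust via a plain for loop: one assignment to _ per item.
    for _ in gen:
        pass

def exhaust_deque(gen):
    # Exhaust via a zero-length deque: items are consumed at C speed.
    deque(gen, maxlen=0)

# Use a fresh iterator per call so every run actually consumes all items.
print(timeit('exhaust_for(iter(range(10000)))', globals=globals(), number=1000))
print(timeit('exhaust_deque(iter(range(10000)))', globals=globals(), number=1000))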
You can instead feed the generator to a zero-length deque, which consumes it at C speed and, unlike list() and other callables that materialise iterators/generators, does not use up memory:
from collections import deque

def exhaust(generator):
    deque(generator, maxlen=0)
Taken from the consume itertools recipe.
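For reference, the full recipe from the itertools documentation also supports advancing the iterator by a fixed number of steps; when n is None it falls back to the zero-length deque (imports added here so the snippet is self-contained):

import collections
from itertools import islice

def consume(iterator, n=None):
    "Advance the iterator n-steps ahead. If n is None, consume entirely."
    # Use functions that consume iterators at C speed.
    if n is None:
        # feed the entire iterator into a zero-length deque
        collections.deque(iterator, maxlen=0)
    else:
        # advance to the empty slice starting at position n
        next(islice(iterator, n, n), None)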