How do I get a line count of a large file in the most memory- and time-efficient manner?
def file_len(filename):
    # Iterate over the file once; enumerate tracks the current line index.
    i = -1  # so an empty file returns 0 instead of raising UnboundLocalError
    with open(filename) as f:
        for i, _ in enumerate(f):
            pass
    return i + 1
You can't get any better than that.
After all, any solution will have to read the entire file, figure out how many \n characters it contains, and return that count.
Do you have a better way of doing that without reading the entire file? Not sure... The best solution will always be I/O-bound; the best you can do is make sure you don't use unnecessary memory, and it looks like you have that covered.
[Edit May 2023]
As commented in many other answers, in Python 3 there are better alternatives. The plain for loop is not the most efficient approach; reading the file through mmap or in fixed-size buffers, for example, is faster.
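A rough sketch of both ideas (the function names and the 1 MiB buffer size are my own choices for illustration, not taken from any particular answer):

import mmap
import os

def buf_count_newlines(filename, bufsize=1024 * 1024):
    # Read the file in fixed-size binary chunks and count b"\n" in each,
    # so memory use stays bounded by bufsize.
    count = 0
    with open(filename, "rb") as f:
        for chunk in iter(lambda: f.read(bufsize), b""):
            count += chunk.count(b"\n")
    return count

def mmap_count_newlines(filename):
    # Memory-map the file and scan for newline bytes, letting the OS
    # page data in instead of copying it chunk by chunk.
    with open(filename, "rb") as f:
        if os.fstat(f.fileno()).st_size == 0:
            return 0  # mmap refuses to map an empty file
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            count = 0
            pos = mm.find(b"\n")
            while pos != -1:
                count += 1
                pos = mm.find(b"\n", pos + 1)
            return count

Note that both of these count newline bytes, so a final line without a trailing \n is not counted, whereas the enumerate version above does count it.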