Tags: python, os.path, fnmatch, nested-for-loop

Slow file trawler -- python


I've written a short script that searches a directory tree for the latest files matching "Data*.txt", but it is painfully slow. I suspect this is because I've had to nest the for loops.

Example directory tree:

ROOT
   |-- <directoryNameFoo1>
   |     |-- from  # This stays the same in each subdir...
   |            |-- <directoryNameBar1>
   |                  |-- Data*.txt
   |
   |-- <directoryNameFoo2>
   |     |-- from  # This stays the same in each subdir...
   |            |-- <directoryNameBar2>
   |                  |-- Data*.txt
   |
   |-- <directoryNameFoo3>
   |     |-- from  # This stays the same in each subdir...
   |            |-- <directoryNameBar3>
   |                  |-- Data*.txt

My question is: Is there a better/faster way to search a directory structure in order to find the latest files matching "Data*.txt" in each subdir?

Code:

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import os
import fnmatch
__basedir = os.path.abspath(os.path.dirname(__file__))

last_ctime = None
vehicle_root = None
file_list = []

for root, dirnames, filenames in os.walk(__basedir):
    for filename in fnmatch.filter(filenames, 'Data*.txt'):
        _file = os.path.join(root, filename)
        if vehicle_root == root:
            # Same directory as the previous match: keep whichever is newer.
            if os.path.getctime(_file) > last_ctime[1]:
                last_ctime = [_file, os.path.getctime(_file)]
        else:
            # Entered a new directory: record the previous directory's
            # newest file before starting over (skip the initial None).
            if last_ctime is not None:
                file_list.append(last_ctime)
            vehicle_root = root
            last_ctime = [_file, os.path.getctime(_file)]

# The newest file of the final directory is never appended inside the
# loop, so add it here.
if last_ctime is not None:
    file_list.append(last_ctime)

print(file_list)

Solution

  • You can use glob to search for a specific filename pattern without writing any explicit loop. For example,

    import glob
    glob.glob('yourdir/Data*.txt')
    

    and use glob.glob('yourdir/**/Data*.txt', recursive=True) when you want to search all subdirectories of your chosen directory (recursive=True only takes effect when the pattern contains **).
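
  • Putting this together: below is a minimal sketch (not from the original code) that exploits the fixed ROOT/<foo>/from/<bar>/ layout shown in the question, so a single non-recursive pattern finds every candidate without walking the whole tree. Here basedir stands in for your ROOT, and grouping the matches by directory with a defaultdict is one way to pick the newest file per subdirectory.

    import glob
    import os
    from collections import defaultdict

    basedir = os.path.abspath(os.path.dirname(__file__))  # assumed stand-in for ROOT

    # One glob call matches every Data*.txt at the known fixed depth.
    pattern = os.path.join(basedir, '*', 'from', '*', 'Data*.txt')

    # Group matches by their containing directory...
    by_dir = defaultdict(list)
    for path in glob.glob(pattern):
        by_dir[os.path.dirname(path)].append(path)

    # ...then keep only the newest file (by ctime) in each directory.
    latest = [max(paths, key=os.path.getctime) for paths in by_dir.values()]
    print(latest)

    Note that os.path.getctime is creation time on Windows but metadata-change time on Unix; if you actually want the most recently modified file, swap in os.path.getmtime.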