Is it possible (in Python under Linux) to determine whether a file is still being written and hasn't been closed yet?
I'm writing data to a cache (file), but other processes start reading it before it is complete. The file/cache then appears corrupted to the processes reading it.
Inspired by @Hipo's answer, I went with a solution that creates a temporary file which marks the writer process as busy. If this file exists, the readers don't read. Once the writer process is finished, it deletes this temporary file.
I found this easier than setting fcntl.LOCK_EX on the written file, as I couldn't find a way to read this flag in the reader processes.
import os

def lock(fname):
    # Create a '<fname>.lock' marker file to signal that fname is being written.
    fname = '%s.lock' % fname
    assert not os.path.exists(fname), '%s already exists.' % fname
    open(fname, 'w').close()

def is_locked(fname):
    # The file counts as locked as long as its marker file exists.
    fname = '%s.lock' % fname
    return os.path.exists(fname)

def unlock(fname):
    # Remove the marker file once the writer has finished.
    fname = '%s.lock' % fname
    os.remove(fname)
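Here is a rough sketch of how these helpers can be used; the cache path, the write/read bodies and the polling interval are illustrative assumptions rather than part of my actual code:

import time

CACHE = '/tmp/my_cache.dat'  # hypothetical cache path, just for illustration

# Writer process: hold the lock for the whole write.
lock(CACHE)
try:
    with open(CACHE, 'w') as f:
        f.write('... cache contents ...')
finally:
    unlock(CACHE)  # remove the marker even if the write fails

# Reader process: wait until the writer has removed the marker.
while is_locked(CACHE):
    time.sleep(0.1)
with open(CACHE) as f:
    data = f.read()

The try/finally matters: without it, a writer that crashes between lock() and unlock() would leave a stale .lock file and block readers forever. Also note that lock() isn't atomic (the existence check and the file creation are separate steps), so this is only safe when there is a single writer.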