What's your recommended best practice for handling file I/O on a LAMP server under a fairly high simultaneous load, without getting hung up on file locks?
I mean, let's say I have a SUBSCRIBERS.CSV file that holds a bunch of names and email addresses, and I want people to be able to fill out a form to unsubscribe. The unsubscribe action would scan that file and delete the matching line, if one exists, for the given email address. This seems like a simple task in PHP, but what happens when 10 people try to unsubscribe at once while 10 new subscribers are being added? That's where I think PHP might run into trouble and generate an error due to a file lock, unless Linux or PHP is more capable than I think.
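For concreteness, the naive version I'd write looks something like this -- just a sketch, where the file path, the column layout, and the flock() approach are all my own guesses:

<?php
// Naive unsubscribe: take an exclusive lock, then rewrite the file
// minus the matching line. Assumes email is the second CSV column.
$email = $_POST['email'];
$fp = fopen('SUBSCRIBERS.CSV', 'r+');
if (flock($fp, LOCK_EX)) {               // blocks until the lock is free
    $rows = array();
    while (($row = fgetcsv($fp)) !== false) {
        if ($row[1] !== $email) {
            $rows[] = $row;              // keep every non-matching row
        }
    }
    ftruncate($fp, 0);                   // wipe and rewrite in place
    rewind($fp);
    foreach ($rows as $row) {
        fputcsv($fp, $row);
    }
    fflush($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);

Is flock() enough here, or do concurrent requests start failing at some point?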
Note that my client wants a CSV file, not a database table. In a database table this wouldn't be a problem, but with plain file I/O I might run into concurrency issues, right?
(BTW, to prevent identity theft, I use an .htaccess trick so that no one can download the CSV over the web by guessing its name -- it can only be accessed by my PHP script or over FTP.)
If the requirement is only that the client can interface with a CSV file, you don't actually need to use the CSV file as the datastore. Instead, do all your work in a database and let PHP generate the CSV file on demand.
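This makes the concurrency problem disappear: the unsubscribe action becomes a single query, and the database serializes concurrent writers for you. A minimal sketch using PDO, where the DSN, credentials, and the subscribers table/column names are assumptions:

<?php
// Minimal sketch: delete the matching subscriber row. The database
// handles concurrent writes, so no manual file locking is needed.
// Connection details and table/column names below are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=mailing', 'user', 'pass');
$stmt = $pdo->prepare('DELETE FROM subscribers WHERE email = ?');
$stmt->execute(array($_POST['email']));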
So, if the client needs to access http://example.com/SUBSCRIBERS.CSV, just have PHP handle requests for SUBSCRIBERS.CSV and emit something like:
header("Content-type: text/csv");
$data = get_subscriber_data();
foreach ($data as $row) {
// $row is an array of columns
print implode(',', $row);
}