I was writing a question, but finally came up with a solution. As it might be useful for others (my future self, at least), here it is.
To run a single command in parallel in several detached screens that automatically close themselves, this works nicely:
timeslots='00_XX 01_XX 02_XX 03_XX 04_XX 05_XX 06_XX'
for timeslot in $timeslots
do
    screen -dmS "$timeslot" bash -c "echo '$timeslot' >> DUMP"
done
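A side note on why the quoting works: the outer shell expands $timeslot before screen ever starts the inner bash, so the inner shell just receives a literal string. A quick sketch (no screen needed) to see exactly what the inner bash gets:

```shell
timeslot='00_XX'
# This is the command string handed to screen's inner bash;
# $timeslot is already expanded by the outer shell at this point:
cmd="echo '$timeslot' >> DUMP"
echo "$cmd"
# → echo '00_XX' >> DUMP
```

The single quotes inside the double-quoted string are only there to protect the value inside the inner shell; for simple tokens like these timeslot names they are not strictly required.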
But what if, for each timeslot, we want to execute in screen not one but several (RAM-heavy) commands, one after the other?
We can define a function in our bash script (inside which everything runs sequentially) that takes the timeslot as an argument.
test_function () {
    # Commands to be executed sequentially, one at a time:
    echo "$1" >> DUMP # technically we'd put heavy things that shouldn't be executed in parallel
    echo "$1" "$1" >> DUMP # these are just dummy MWE commands
    # ETC
}
But how do we create detached screens that run this function with the $timeslot argument?
There are lots of discussions on Stack Overflow about running a separate executable script file, or about using screen's "stuff" command, but that's not what I want to do. The idea here is to avoid unnecessary files and keep everything in the same small bash script, simple and clean.
test_function () {
    # Commands to be executed sequentially, one at a time:
    echo "$1" >> DUMP # technically we'd put heavy things that shouldn't be executed in parallel
    echo "$1" "$1" >> DUMP # these are just dummy MWE commands
    # ETC
}
export -f test_function # < absolutely crucial bit to enable using this with screen
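To see why export -f is crucial: bash -c starts a fresh bash process, and a child shell only sees functions that were exported into the environment. A minimal sketch with a hypothetical greet function (not part of the script above):

```shell
greet () { echo "hello $1"; }

# Without export, the child bash doesn't know the function:
bash -c 'greet world' 2>/dev/null || echo "not found in child"

# After export -f, the child inherits it via the environment:
export -f greet
bash -c 'greet world'
# → hello world
```

Note this is bash-specific: export -f puts the function definition into the environment, and only a bash child will pick it up again, which is why the screens must run bash -c and not sh -c.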
Now we can do
timeslots='00_XX 01_XX 02_XX 03_XX 04_XX 05_XX 06_XX'
for timeslot in $timeslots
do
    screen -dmS "$timeslot" bash -c "test_function $timeslot"
done
And it works.
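To sanity-check the logic without screen, you can run the same bash -c invocations directly (sequentially, here with just two timeslots); DUMP should end up with two lines per timeslot, exactly what each detached screen appends:

```shell
test_function () {
    echo "$1" >> DUMP
    echo "$1" "$1" >> DUMP
}
export -f test_function

rm -f DUMP
for timeslot in 00_XX 01_XX; do
    bash -c "test_function $timeslot"
done
cat DUMP
# → 00_XX
# → 00_XX 00_XX
# → 01_XX
# → 01_XX 01_XX
```

When run through screen the sessions execute in parallel, so the per-timeslot line pairs may be interleaved in DUMP; within one session the commands still run strictly in order.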