I have a simple Perl script that calls a shell script, and it always "hangs" when called from the browser. I simply want to force a timeout after 20 seconds. When I run it from the command line there are no issues, but when I run it from the browser the script is executed and the page never finishes loading, so I don't get the output on the page. If I kill -9 the process from the command line, the browser finishes loading and the content is displayed in the browser.
I did a lot of research, and it seems the web server is waiting for the shell script to finish because the shell script still has an open file handle to standard output.
Here is my code:
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;
print $q->header;

my $timeout = 10;

my $pid = fork;
if ( defined $pid ) {
    if ( $pid ) {
        # this is the parent process
        local $SIG{ALRM} = sub { die "TIMEOUT\n" };
        alarm $timeout;
        # wait until the child returns or the timeout occurs
        eval {
            waitpid( $pid, 0 );
        };
        alarm 0;
        if ( $@ && $@ =~ m/TIMEOUT/ ) {
            # timeout: kill the child process and reap it
            kill 9, $pid;
            waitpid( $pid, 0 );
        }
    }
    else {
        # this is the child process
        # exec replaces this process with the shell script, so on
        # success this call never returns (note exec, not system)
        exec "/opt/bea/domains/fsa/scripts/start.sh"
            or die "Could not exec start.sh: $!";
    }
}
else {
    die "Could not fork: $!";
}
How do I force the page to finish loading after a certain amount of time, regardless of the status of the shell script?
The problem is that after your script finishes, the web server is still waiting for all the children it created (even the ones created by you) to finish. The best way to handle it is to create a separate process group with setsid. Or you can try the "double fork trick": fork, and then fork again from the child. After that the first child exits and the second child becomes a child of init. Also make sure that (in both cases) you do the following in the child:
close STDIN;
close STDOUT;
close STDERR;
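Here is a minimal sketch of the setsid approach, assuming the same start.sh path as in the question; POSIX::setsid puts the child into its own session, so once the standard handles are closed the web server has nothing left to wait on:
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

my $pid = fork;
die "Could not fork: $!" unless defined $pid;

if ( $pid == 0 ) {
    # child: detach into a new session so the web server stops
    # waiting on the handles it shares with this process
    setsid() != -1 or die "Could not setsid: $!";
    close STDIN;
    close STDOUT;
    close STDERR;
    exec "/opt/bea/domains/fsa/scripts/start.sh"
        or die "Could not exec start.sh: $!";
}
# parent: the alarm/waitpid timeout logic from the question still works here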
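And a sketch of the double fork variant: the intermediate child exits immediately, so the grandchild that runs the script is reparented to init and the web server no longer has a child to wait for:
#!/usr/bin/perl
use strict;
use warnings;

my $pid = fork;
die "Could not fork: $!" unless defined $pid;

if ( $pid == 0 ) {
    # first child: fork again and exit right away
    my $grandchild = fork;
    die "Could not fork: $!" unless defined $grandchild;
    exit 0 if $grandchild;

    # grandchild: now owned by init; release the handles and exec
    close STDIN;
    close STDOUT;
    close STDERR;
    exec "/opt/bea/domains/fsa/scripts/start.sh"
        or die "Could not exec start.sh: $!";
}
waitpid( $pid, 0 );    # reap the short-lived first child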