perl, web-scraping, www-mechanize, mozrepl, www-mechanize-firefox

How can we tell WWW::Mechanize::Firefox to not wait for a response or a postback?


Hi, I am using a loop to get various pages.

The code:

    while ( $stm->fetch() ) {
        $mech->get($url);    # $url is a placeholder for the real URL

        $mech->select( 'this', 'that' );
        $mech->tick( 'this' => undef );
        $mech->tick( 'this' => undef );
        $mech->tick( 'this' => undef );
        $mech->tick( 'this' => undef );

        my $csvbutton = $mech->selector( 'input.button', single => 1 );
        $mech->click($csvbutton);
    }

The code above downloads the file after clicking the button. However, it stops after the first iteration.

I have changed the configuration of Firefox to disable the download manager popup.
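Roughly, the preferences involved look like this (a user.js / about:config sketch with illustrative values, not my exact settings):

    // Auto-save CSV downloads without prompting (illustrative values)
    user_pref("browser.download.folderList", 2);                      // save to a fixed directory
    user_pref("browser.download.dir", "/tmp/downloads");              // hypothetical path
    user_pref("browser.download.useDownloadDir", true);               // never ask where to save
    user_pref("browser.download.manager.showWhenStarting", false);    // no download-manager popup
    user_pref("browser.helperApps.neverAsk.saveToDisk", "text/csv,application/csv");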

I received this error once:

MozRepl::RemoteObject::Object has no function addProgressListener at `$mech->get($url);`

Any assistance would be appreciated. Thanks.

UPDATE:

I have found that it gets stuck on the first download because WWW::Mechanize::Firefox is waiting for a response or postback that it never gets; only a file is downloaded.

    $mech->click($csvbutton);

This is where it stops. If I manually manipulate the page (i.e. go back, go to the homepage, or load a new URL), the loop continues, so it is waiting for a new page load.

How do I tell WWW::Mechanize::Firefox not to wait for a response or a new page load?

Thanks


Solution

  • The click call must be exactly of the form:

    $mech->click({ selector => 'input.button', single => 1, synchronize => 0 });


    You must pass the target as selector => or xpath =>.

    Signature:

    $mech->click({ xpath/selector => 'name', synchronize => 0 })


    The click will then not wait for a response from the webserver.

    As stated in WWW::Mechanize::Firefox::Troubleshooting on CPAN. A sketch of the loop with this fix applied is shown below.
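Putting it together, here is a minimal sketch of the loop from the question with the non-blocking click applied. The field names, the 'input.button' selector, and the @urls list are placeholders standing in for the real data (the original loop is driven by $stm->fetch()), so treat this as an illustration rather than a verified script:

    use strict;
    use warnings;
    use WWW::Mechanize::Firefox;

    my $mech = WWW::Mechanize::Firefox->new();

    # @urls stands in for whatever $stm->fetch() produces in the question.
    my @urls = ('http://example.com/report?id=1');

    for my $url (@urls) {
        $mech->get($url);

        $mech->select( 'this', 'that' );
        $mech->tick( 'this' => undef );

        # synchronize => 0: click() returns immediately instead of blocking
        # on a page load that never arrives (the server only sends a file).
        $mech->click( { selector => 'input.button', synchronize => 0 } );

        sleep 2;    # optionally give Firefox a moment to start the download
    }

Because nothing is waited on, you may still want a short pause (or a check of the download directory) before moving to the next URL, as in the sleep above.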