java, selenium, htmlunit, ghostdriver

Scraping our site to check for JavaScript errors


I am kind of desperate here. For a couple of days I have been trying to create a web scraper that can go through our website and check for JavaScript errors.

The big problem is that I only know Java, and it seems that GhostDriver isn't maintained anymore. The guy behind GhostDriver points to JBrowserDriver instead, but JBrowserDriver has no option to collect the JavaScript errors from the console.

Then I tried HtmlUnit, which is way too eager to throw errors that aren't even JavaScript-related. After fiddling with it for half a day, I threw in the towel on HtmlUnit.

I could revert to plain old WebDriver but that would involve too much boilerplate.

Do any of you have suggestions?


Solution

  • You could inject some JavaScript into the head section of each page to log all errors into a hidden div. Selenium could then read the text from that div and parse it into a report of all the errors that occurred on the page (see the Java sketch at the end of this answer for one way to wire this up).

    For example, given the following page layout:

    <html>
        <head>
            <script>
                window.onerror = function(e) {
                    document.getElementById("hidden-selenium-log").innerText += e.toString() + ";";
                }
            </script>
        </head>
        <body>
    
            <div id="hidden-selenium-log" style="display: none;">
            </div>
    
            <div id="broken-button" onclick="unknownFunction()">broken</div>
    
        </body>
    </html>
    

    The script in the head tag writes every JavaScript error into the hidden-selenium-log div. Clicking the broken-button div calls the undefined unknownFunction(), which fires the window.onerror handler and appends the error to the hidden log.

    After interacting with the page, you could then do something simple like this in Java:

    driver.findElement(By.id("hidden-selenium-log")).getAttribute("textContent").split(";");
    

    This reads the text in the hidden selenium log (using textContent rather than getText(), since getText() only returns visible text and the log div is display: none) and splits it on the semicolon that the handler appends after each logged error.
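
    If you cannot add the logging script to every page template, you could also inject it with Selenium itself after each navigation. Below is a rough sketch in Java; ChromeDriver, the example URL, and the helper method names (installErrorLog, readErrorLog) are just placeholders for illustration, not part of the original answer.

    import java.util.Arrays;
    import java.util.List;

    import org.openqa.selenium.By;
    import org.openqa.selenium.JavascriptExecutor;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class JsErrorScraper {

        // Adds a hidden log div to the current page and hooks window.onerror so that
        // every uncaught error is appended to it. Errors thrown before this script
        // runs (e.g. during the initial page load) will not be captured.
        static void installErrorLog(WebDriver driver) {
            ((JavascriptExecutor) driver).executeScript(
                "var log = document.createElement('div');" +
                "log.id = 'hidden-selenium-log';" +
                "log.style.display = 'none';" +
                "document.body.appendChild(log);" +
                "window.onerror = function(msg, src, line) {" +
                "  log.innerText += msg + ' (' + src + ':' + line + ');';" +
                "};");
        }

        // Reads the log back. textContent is used because getText() only returns
        // visible text and the log div is display: none.
        static List<String> readErrorLog(WebDriver driver) {
            String raw = driver.findElement(By.id("hidden-selenium-log"))
                               .getAttribute("textContent");
            return Arrays.asList(raw.split(";"));
        }

        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();     // any WebDriver implementation works
            try {
                driver.get("https://example.com");     // placeholder URL
                installErrorLog(driver);

                // ... click around / exercise the page here ...

                for (String error : readErrorLog(driver)) {
                    if (!error.isEmpty()) {
                        System.out.println("JS error: " + error);
                    }
                }
            } finally {
                driver.quit();
            }
        }
    }

    Note that the injection has to be repeated after every page load (or wrapped in a small navigation helper), since a fresh document wipes both the handler and the log div.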