Tags: java, web-crawler, web-content

Java: fetch web content while skipping intermediate pages to reach the desired response


I have a situation where I have to fetch the web content returned after a form submit, but it is a little tricky because the flow is not a simple request and response. It is as follows:

Submit button pressed -> Display page processing wait timer -> Display quick advertisement page -> Display page result.

Starting from "Submit button pressed", I want to get the "Display page result" contents and skip the pages in between.

I have this sample code, but it only works one way: send a request and receive a response.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;

URL url;
InputStream is = null;
BufferedReader reader;
String line;

try {
    url = new URL("http://stackoverflow.com/");
    is = url.openStream();  // throws an IOException
    // DataInputStream.readLine() is deprecated; BufferedReader handles
    // character decoding and line reading correctly
    reader = new BufferedReader(new InputStreamReader(is));

    while ((line = reader.readLine()) != null) {
        System.out.println(line);
    }
} catch (MalformedURLException mue) {
    mue.printStackTrace();
} catch (IOException ioe) {
    ioe.printStackTrace();
} finally {
    try {
        if (is != null) {
            is.close();
        }
    } catch (IOException ioe) {
        // nothing to see here
    }
}

Is there any Java library that can do this for me? Thanks in advance.


Solution

  • Consider giving Selenium WebDriver a try. It may have what you are trying to implement.
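Whether a plain HTTP client can skip the intermediate pages depends on how they are implemented. If the wait-timer and advertisement pages are ordinary HTTP 3xx redirects, the standard library can follow them automatically; a minimal sketch under that assumption (the `FinalPageFetcher` class and `fetchFinalPage` helper are illustrative names, not from the question):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class FinalPageFetcher {

    // Opens the URL and lets HttpURLConnection follow HTTP 301/302/303
    // redirect hops automatically, returning the body of the final page.
    public static String fetchFinalPage(String startUrl) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(startUrl).openConnection();
        conn.setInstanceFollowRedirects(true); // follow redirect chain

        StringBuilder body = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(),
                                      StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line).append('\n');
            }
        }
        return body.toString();
    }
}
```

If, however, the interstitial pages advance via JavaScript or meta-refresh (common for ad and "please wait" pages), no plain HTTP client can skip them, which is why a browser-driving tool such as Selenium WebDriver, as suggested above, is the more robust option.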