excel vba web-scraping internet-explorer

Importing/scraping a website into Excel


I am trying to scrape some data from a database, and I have it pretty much set up. I look in IE for a tab that is already logged in to the database and paste the query link there through VBA. But how do I extract the data it returns from the IE tab and put it into an Excel cell or array?

This is the code I have for opening my query:

Sub import()
Dim row As Integer
Dim strTargetFile As String
Dim wb As Workbook
Dim test As String
Dim ie As Object

    Call Fill_Array_Cultivar

    For row = 3 To 4

        'One new sheet per cultivar, named after it
        Sheets.Add.Name = Cultivar_Array(row, 1)

        'Build the query URL for the current cultivar
        strTargetFile = "https://www3.wipo.int/pluto/user/jsp/select.jsp?fl=app_date%2Cden_info%2Cden_final&hl=false&json.nl=map&wt=json&type=upov&start=0&qi=3-nNCXQ6etEVv184O9nnd5yg%3D%3D&q=cc%3AIT%20AND%20latin_name%3A(zea%20mays)%20AND%20den_info%3A" & Trim(Cultivar_Array(row, 1)) & "&facet=false"

        'Find an IE window that is already logged in to the database
        Set ie = GetIE("https://www3.wipo.int" & "*")
        If Not ie Is Nothing Then
            ie.navigate strTargetFile
        Else
            MsgBox "IE not found!"
        End If
    Next row

End Sub

And this is the GetIE function it uses:

'Find an IE window with a matching (partial) URL
'Assumes no frames.
Function GetIE(sAddress As String) As Object

Dim objShell As Object, objShellWindows As Object, o As Object
Dim retVal As Object, sURL As String

    Set retVal = Nothing
    Set objShell = CreateObject("Shell.Application")
    Set objShellWindows = objShell.Windows

    'See if IE is already open
    For Each o In objShellWindows
        sURL = ""
        On Error Resume Next
        sURL = o.document.Location
        On Error GoTo 0
        If sURL <> "" Then
            If sURL Like sAddress & "*" Then
                Set retVal = o
                Exit For
            End If
        End If
    Next o

    Set GetIE = retVal
End Function

What the website returns to me is a white page with a line of text. Here is an example:

 {"response":{"start":0,"docs":[{"den_final":"Abacus","app_date":"1998-01-13T22:59:59Z"}],"numFound":1},"qi":"3-nNCXQ6etEVv184O9nnd5yg==","sv":"bswa2.wipo.int","lastUpdated":1436333633993}

PS: I also tried using the importxml function; it will import the website, but only an error page, as it does not recognize me as logged in.


Solution

  • I found the solution, which was fairly simple but hard to find: I can just grab ie.Document.body.innerText, which is all the text I need. See the updated code below; a sketch for pulling individual fields out of the returned JSON follows it.

    Sub import()
    Dim row As Integer
    Dim strTargetFile As String
    Dim wb As Workbook
    Dim test As String
    Dim ie As Object
    Dim pageText As String

    Call Fill_Array_Cultivar

    For row = 3 To 4

        Sheets.Add.Name = Cultivar_Array(row, 1)
        strTargetFile = "https://www3.wipo.int/pluto/user/jsp/select.jsp?fl=app_date%2Cden_info%2Cden_final&hl=false&json.nl=map&wt=json&type=upov&start=0&qi=3-nNCXQ6etEVv184O9nnd5yg%3D%3D&q=cc%3AIT%20AND%20latin_name%3A(zea%20mays)%20AND%20den_info%3A" & Trim(Cultivar_Array(row, 1)) & "&facet=false"

        'Reuse an IE window that is already logged in to the database
        Set ie = GetIE("https://www3.wipo.int" & "*")
        If Not ie Is Nothing Then
            ie.navigate strTargetFile

            'Wait until IE has finished loading the results page
            Do While ie.Busy Or ie.ReadyState <> 4: DoEvents: Loop

            'The whole JSON response is the body text of the page
            pageText = ie.Document.body.innerText
            ActiveSheet.Cells(1, 1) = pageText
            pageText = Empty
        Else
            MsgBox "IE not found!"
        End If
    Next row
    End Sub
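
    A minimal follow-up sketch, not part of the original answer: since the goal is to get values into cells or an array rather than the raw JSON line, a small helper using plain string functions could pull out a single field. GetJsonField is a hypothetical name, and it assumes the field appears only once in the response and that its value contains no escaped quotes.

    'Hypothetical helper (assumption): extract the value of a string field
    'such as "den_final" from the single-line JSON returned by the query.
    Function GetJsonField(jsonText As String, fieldName As String) As String
        Dim keyPos As Long, startPos As Long, endPos As Long
        'Look for the pattern "fieldName":" in the response text
        keyPos = InStr(jsonText, """" & fieldName & """:""")
        If keyPos = 0 Then Exit Function            'field not found
        startPos = keyPos + Len(fieldName) + 4      'first character of the value
        endPos = InStr(startPos, jsonText, """")    'closing quote of the value
        If endPos > startPos Then GetJsonField = Mid$(jsonText, startPos, endPos - startPos)
    End Function

    With the sample response above, GetJsonField(pageText, "den_final") would return Abacus, so the loop could write that to a cell (for example ActiveSheet.Cells(1, 2)) instead of, or next to, the raw text.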