VBA Loop through row and return value - excel

Really would appreciate some help on this. I have a VBA script that scrapes data: it opens the URL contained in column L (L4 in this example), and then later in the script it writes a given value into column E, row 4.
Sub ImportData()
    ...
    With CreateObject("msxml2.xmlhttp")
        .Open "GET", Range("L4"), False 'Cell that contains hyperlink
        .send
        HTML_Content.body.innerHTML = .responseText
    End With
    ...
    'Then I want to return a value
    Sheets(1).Range("E4").Value = rng1.Offset(0, 1)
End Sub
I am trying to make a loop so that the script runs automatically and cycles through column L and runs the script for every row that contains a hyperlink in Col L, and then inputs the value to its respective row in Col E.
I have tried changing the code below, which another user suggested, without success:
Sub ImportData(urlToOpen As String)
    ...
    .Open "GET", urlToOpen, False 'Cell that contains hyperlink
    ...
    'Then I want to return a value
    Sheets(1).Range(E, i).Value = rng1.Offset(0, 1) ' I know that's wrong
and adding a calling procedure:
Sub CallRangeL_Urls()
    For Each i In Sheet1.Range("L4:L200")
        Call ImportData(i)
    Next i
End Sub
I keep getting a "ByRef argument type mismatch" error on Call ImportData(i).
Also I am not sure whatsoever how to write the returned value back to the specific row that is being processed in the loop. Any help would be greatly appreciated. Thanks

Try the following:
Public Sub ImportData(ByVal urlToOpen As String)
And
Public Sub CallRangeL_Urls()
    Dim i As Range
    For Each i In Sheet1.Range("L4:L200")
        ImportData i.Value
    Next i
End Sub
Personally, I would qualify the workbook as well, and I usually use Worksheets("SheetName"), but I know a lot of people like to use the sheet code name.
You only want to pass the value of the link in the cell, so ByVal is the appropriate choice.
As touching the sheet is expensive, I would probably dump the urls into an array and loop over that, adding a basic test that I am working with a url:
Public Sub CallRangeL_Urls()
    Dim arr(), i As Long
    arr = Application.Transpose(Sheet1.Range("L4:L200").Value)
    For i = LBound(arr) To UBound(arr)
        If InStr(arr(i), "http") > 0 Then ImportData arr(i)
    Next i
End Sub
To write the extracted value out to column E, in the same row as the url, I think you need to convert your ImportData sub into a function that returns the extracted value. Or better still, create a class to hold the xmlhttp object, which then has a method to return the value (that way you don't keep creating and destroying the object, which you do if you create the object inside the function). You could also create the xmlhttp object in the first sub and pass it to the function as an argument - I show pseudo code for that below.
Public Sub CallRangeL_Urls()
    Dim arr(), i As Long
    'code to create xmlhttp object
    arr = Application.Transpose(Sheet1.Range("L4:L200").Value)
    For i = LBound(arr) To UBound(arr)
        If InStr(arr(i), "http") > 0 Then
            Sheet1.Cells(i + 3, "E") = ImportData(arr(i), xmlhttpObject)
        End If
    Next i
End Sub

Public Function ImportData(ByVal urlToOpen As String, ByVal xmlhttpObject As Object) As String
    ''Any declarations
    'Dim extractedValue As String
    'Dim html As HTMLDocument
    'Set html = New HTMLDocument
    With xmlhttpObject
        .Open "GET", urlToOpen, False
        .send
        html.body.innerHTML = .responseText
    End With
    ''code to extract value
    'extractedValue = html.querySelector("someSelector").innerText
    ImportData = extractedValue
End Function
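If you go the class route mentioned above, a minimal sketch could look like the following (the class name clsXmlHttp, the method name GetValue and the selector are illustrative, and it assumes a reference to Microsoft HTML Object Library):
'Class module: clsXmlHttp (illustrative sketch only)
Option Explicit
Private http As Object

Private Sub Class_Initialize()
    Set http = CreateObject("MSXML2.XMLHTTP") 'created once and reused for every request
End Sub

Public Function GetValue(ByVal urlToOpen As String) As String
    Dim html As HTMLDocument
    Set html = New HTMLDocument
    With http
        .Open "GET", urlToOpen, False
        .send
        html.body.innerHTML = .responseText
    End With
    'GetValue = html.querySelector("someSelector").innerText 'adjust the selector to the page you scrape
End Function
The calling sub then creates the object once with Set scraper = New clsXmlHttp and calls Sheet1.Cells(i + 3, "E") = scraper.GetValue(arr(i)) inside the loop.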

Related

Open series of links in cells, search website content and return a value

I'm looking to carry out a task in VBA of searching a number of websites for a specified phrase.
I've got a spreadsheet with dynamic data that changes, creating a link in column F.
I am looking for a macro to open each individual link, search the website content for a specific phrase, and, if the phrase is present on the website, return a value in the cell to the right (Offset(0, 1)); otherwise leave the cell blank and move to the next row.
Is such a task possible to carry out through VBA?
I've tried researching similar requests but the result is not what I require.
The code I found online is kind of what I need, but it produces the same result whether the phrase is present or absent on the website.
Option Explicit

Sub SearchForString()
    Dim rngURL As Range
    Dim cll As Range
    Dim stCheck As String
    Dim xmlHttp As Object
    On Error Resume Next
    Set xmlHttp = CreateObject("Microsoft.XMLHTTP")
    If xmlHttp Is Nothing Then
        MsgBox "Unable to create XMLHTTP object, it's probably not installed on this machine", vbCritical
        Exit Sub
    End If
    Set rngURL = Application.InputBox("Select the range of URLs to check", "Select Range", Selection, Type:=8)
    On Error GoTo 0
    If rngURL Is Nothing Then Exit Sub
    stCheck = InputBox("Enter the text to search", "", "")
    If Len(stCheck) = 0 Then Exit Sub
    For Each cll In rngURL.Cells
        If CheckURL(xmlHttp, cll.Value, stCheck) Then
            cll.Offset(, 1).Value = 1
        End If
    Next cll
End Sub

Private Function CheckURL(ByRef xmlHttp As Object, ByVal URL As String, ByVal stCheck As String) As Boolean
    Dim stResult As String
    If Not LCase$(URL) Like "http://*" Then
        URL = "http://" & URL
    End If
    xmlHttp.Open "GET", URL, False
    xmlHttp.Send ""
    If xmlHttp.readyState = 4 Then
        If xmlHttp.Status = 200 Then
            stResult = xmlHttp.responseText
            If InStr(1, stResult, stCheck, vbBinaryCompare) > 0 Then
                CheckURL = True
            End If
        End If
    End If
End Function
Many thanks in advance.

VBA in Excel 365: XMLHTTP - Can't search for presence of multiple strings for URLs?

So I have a long list of URLs in an Excel sheet. Using a VBA macro I'm checking for the presence of a number of strings in each URL's HTML. I need to know which URL has which combination of strings in them.
I'm outputting the presence or absence of the strings in the main table as essentially: the URL being looked at in column 1, the presence of the 1st string in that URL's HTML in column 2, and the presence of the 2nd string in column 3.
(And if I can work out how to get this code working, I'll be checking for a few more strings and outputting their presence or absence in further columns.)
The strings I'm searching for are hard-coded into the VBA as string literals.
My code works fine if I'm just searching for the 1st string.
When I adjust the code to search for a 2nd string as well, it doesn't output as it should when my 2nd string is present too.
I'm using a PC with MS Excel (the MS 365 suite on a Lenovo bought this year, with Windows 10). I think I'm messing up something easy in the code here, but I can't quite work out how to search for multiple strings rather than just one, and output which ones are present in separate columns next to the URL in question.
Here's the whole code:
Option Explicit

Sub SearchForString()
    Dim rngURL As Range
    Dim cll As Range
    Dim stCheck As String
    Dim xmlHttp As Object
    On Error Resume Next
    Set xmlHttp = CreateObject("Microsoft.XMLHTTP")
    If xmlHttp Is Nothing Then
        MsgBox "Unable to create XMLHTTP object, it's probably not installed on this machine", vbCritical
        Exit Sub
    End If
    Set rngURL = Application.InputBox("Select the range of URLs to check", "Select Range", Selection, Type:=8)
    On Error GoTo 0
    If rngURL Is Nothing Then Exit Sub
    stCheck = "Stringtofind1"
    If Len(stCheck) = 0 Then Exit Sub
    For Each cll In rngURL.Cells
        If CheckURL(xmlHttp, cll.Value, stCheck) Then
            cll.Offset(, 1).Value = "1st string found" 'This all works fine
        End If
        ' ** Start of Bit I can't get to work **
        ' ** I want to check for a second string, and see if that's here. But I can't seem to get this bit to work? **
        If CheckURL(xmlHttp, cll.Value, "2ndStringtofind") Then
            cll.Offset(, 2).Value = "2nd string also found"
        End If
        ' ** End of Bit I can't get to work **
    Next cll
End Sub

Private Function CheckURL(ByRef xmlHttp As Object, ByVal URL As String, ByVal stCheck As String) As Boolean
    Dim stResult As String
    If Not LCase$(URL) Like "https://*" Then
        URL = "https://" & URL
    End If
    xmlHttp.Open "GET", URL, False
    xmlHttp.Send ""
    If xmlHttp.readyState = 4 Then
        If xmlHttp.Status = 200 Then
            stResult = xmlHttp.responseText
            If InStr(1, stResult, stCheck, vbBinaryCompare) > 0 Then
                CheckURL = True 'Returns position where string is found!
            End If
        End If
    End If
End Function
To make it easier to read, here's the bit I think I'm struggling with when I add it to the script:
If CheckURL(xmlHttp, cll.Value, "2ndStringtofind") Then
    cll.Offset(, 2).Value = "2nd string also found"
End If
2ndStringtofind isn't a variable, but the actual text I'm looking for.
All help gratefully received in debugging this code. Thanks guys.

Is there a way to slow down a Web Scraper so it will pick up the code?

I wrote a macro to go to WU to get historical data and for the most part, it works. However, I believe that the macro is running too fast for it to pick up the data from the website.
https://www.wunderground.com/history/daily/us/tx/el-paso/KELP/date/2017-1-3
is the website, and the table I want to get has the class tablesaw-sortable.
I have tried the following: DoEvents and Application.Wait (Now + TimeValue("00:00:01")) to try to slow down the process.
Sub BrowseToWU()
    Dim IE As New SHDocVw.InternetExplorer
    Dim HTMLDoc As MSHTML.HTMLDocument
    Dim RowAddress As Integer
    Dim WebAddress As String
    Dim DateSheet As Date
    Dim WkDay As Integer
    Dim DateSheetName As String
    'Application.ScreenUpdating = False
    'Application.StatusBar = True
    RowAddress = 2
    IE.Visible = True
    Do Until RowAddress = 60
        WebAddress = Range("A" & RowAddress)
        DateSheet = Right(WebAddress, 8)
        DateSheetName = Right(WebAddress, 8)
        WkDay = Weekday(DateSheet, vbSunday)
        If WkDay < 3 Then
            RowAddress = RowAddress + 1
        ElseIf WkDay > 6 Then
            RowAddress = RowAddress + 1
        Else
            IE.Navigate WebAddress
            Do While IE.ReadyState <> READYSTATE_COMPLETE
            Loop
            Set HTMLDoc = IE.Document
            DoEvents
            Application.Wait (Now + TimeValue("00:00:05"))
            DoEvents
            ProcessHTMLPage HTMLDoc
            DateSheet = Right(WebAddress, 8)
            DoEvents
            Application.Wait (Now + TimeValue("00:00:01"))
            ActiveSheet.Name = DateSheetName
            DoEvents
            RowAddress = RowAddress + 1
            'IE.Quit
            Worksheets("Sheet1").Activate
        End If
    Loop
End Sub
Option Explicit

Sub ProcessHTMLPage(HTMLPage As MSHTML.HTMLDocument)
    Dim HTMLTable As MSHTML.IHTMLElement
    Dim HTMLTables As MSHTML.IHTMLElementCollection
    Dim HTMLRow As MSHTML.IHTMLElement
    Dim HTMLCell As MSHTML.IHTMLElement
    Dim RowNum As Long, ColNum As Integer
    'Dim IE As New SHDocVw.InternetExplorer
    'Dim Ws As Worksheet
    Set HTMLTables = HTMLPage.getElementsByClassName("tablesaw-sortable")
    'DoEvents
    For Each HTMLTable In HTMLTables
        Worksheets.Add
        DoEvents
        Range("A1").Value = HTMLTable.className
        Range("B1").Value = Now
        RowNum = 2
        For Each HTMLRow In HTMLTable.getElementsByTagName("tr")
            'Debug.Print vbTab & HTMLRow.innerText
            ColNum = 1
            For Each HTMLCell In HTMLRow.Children
                Cells(RowNum, ColNum) = HTMLCell.innerText
                ColNum = ColNum + 1
            Next HTMLCell
            RowNum = RowNum + 1
        Next HTMLRow
    Next HTMLTable
    DoEvents
    'IE.Quit
End Sub
The macro is supposed to run through sheet1 picking up the web address to the historical data if it satisfies the criteria of being a certain day of the week.
IE will open and then it will kick over to the next module that will take in the data.
A new worksheet is created and the data pasted into the new worksheet.
The worksheet is renamed to the date of the data.
The web address sheet is activated again and the process starts over again.
The problem is that the data isn't taken from the website, so the For Each loop over the tables finds nothing to process; the web-address sheet then gets renamed instead of a new data sheet, and an error occurs.
One way around this is to call the API that the page is using to get that info.
The API returns json which you can parse with a json parser. I use jsonconverter.bas. After installing the code from that link in a standard module called JsonConverter, go to VBE > Tools > References > Add a reference to Microsoft Scripting Runtime.
Finding the API:
If you press F12 to open developer tools and go to the Network tab and then press F5 to refresh any url of interest you will see the recorded web traffic. You can find the API call there.
See my answer here on how to search the network traffic using a specific observation value you expect to see in the response - this will filter the list of network traffic to those items containing the value of interest. Be sensible in selecting the value - you want something unlikely to occur elsewhere. You can also filter the network traffic to XHR only.
The API response:
The API returns json. More specifically, it returns a dictionary containing 2 keys. The second key, "observations", can be used to return a collection (denoted by []) of dictionaries (denoted by {}).
Each dictionary represents a row of the table (daily observations). You can loop this collection, and then loop the inner dictionaries, to access the table row values and reconstruct the table by populating an array. Explore example json response here.
Explanation of json structure:
(diagram of the json response structure not reproduced here)
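As a rough orientation of how that structure is walked with jsonconverter.bas (a sketch only; apart from the two timestamp keys used in the code below, field names will vary):
Dim url As String, parsed As Object, observations As Object, obs As Object
url = "https://api.weather.com/v1/geocode/31.76/-106.49/observations/historical.json?apiKey=..." 'built as in GetTables below
With CreateObject("MSXML2.XMLHTTP")
    .Open "GET", url, False
    .send
    Set parsed = JsonConverter.ParseJson(.responseText) 'dictionary with 2 keys
End With
Set observations = parsed("observations") 'collection [] of dictionaries {}
Set obs = observations(1)                 'first dictionary = one table row (daily observation)
Debug.Print obs("valid_time_gmt")         'unix timestamp field, converted with Epoch2Date in the code below
Debug.Print Join(obs.keys, ", ")          'the keys of a dictionary double as the table headers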
Explanation of code:
The code is broken down into a number of helper subs and functions, allocating certain tasks to each, to make the code easier to debug and follow, as well as to better align with object-oriented programming principles.
Overall the process is:
Gather urls from Worksheet("Sheet1"). Helper function GetAllUrls.
Process those urls and only retain the dates which correspond with Tue-Thur. These are kept as strings formatted as "yyyymmdd" so they can be passed to the API later. This is handled by helper functions GetOnlyQualifyingUrlsDates and IncludeThisDate. IncludeThisDate performs the check for whether to include; GetOnlyQualifyingUrlsDates handles the looping and formatting of results.
Issue xmlhttp requests by looping over qualifying url dates and concatenating those into the url for the API call, then issuing the request. This is performed by the main sub GetTables.
Sheet creation, for output, is handled by helper function CreateWorksheet. This function calls another helper function, SheetExists, to ensure sheets are only created if they don't already exist, otherwise, the existing sheet by that name is used.
The resultant json response, from step 3, is passed to a helper sub WriteOutResults which accepts the json variable and the output sheet object as arguments. It extracts all the info from the json response; essentially reconstructing the table. It adds the table and headers to the appropriate sheet.
It calls helper function Epoch2Date, which handles the unix timestamp to datetime conversion for the two unix fields in the json object.
TODO:
The API key may be time limited. Add a helper function which returns the current valid key.
The API accepts startDate and endDate parameters in the url. It would be far better to issue one request for the entire range if possible, or for chunked ranges e.g. months, to reduce the number of requests made; this would also reduce the likelihood of being blocked. That would mean writing some additional code, before writing out results, to ensure only the dates of interest are written to sheets; though you could write everything out and then simply loop the sheets and delete those that aren't wanted (perfectly doable if we are talking about 365 dates in total). Personally, I would handle the "include this date" check while constructing the table from a single request (if possible) that passes the min and max dates of the listed urls as the startDate and endDate parameters, and then write a single flat table out to one sheet, as this will be much easier for later data analysis. A sketch of this single-request idea follows below.
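A minimal sketch of that single-request idea (assuming qualifyingUrlsDates is sorted ascending and that the API accepts the whole span in one call - both are assumptions; API_KEY stands in for the key used below):
'inside GetTables, replacing the per-date loop (illustrative only)
startDate = qualifyingUrlsDates(LBound(qualifyingUrlsDates))
endDate = qualifyingUrlsDates(UBound(qualifyingUrlsDates))
url = "https://api.weather.com/v1/geocode/31.76/-106.49/observations/historical.json?apiKey=" & API_KEY & _
      "&startDate=" & startDate & "&endDate=" & endDate & "&units=e"
.Open "GET", url, False
.send
Set json = JsonConverter.ParseJson(.responseText)("observations")
'then filter the observations down to the qualifying dates before writing one flat table to a single sheet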
VBA:
Option Explicit

Public Sub GetTables()
    'VBE > Tools > References > Microsoft Scripting Runtime
    Dim json As Object, qualifyingUrlsDates(), urls(), url As String
    Dim ws As Worksheet, wsOutput As Worksheet, i As Long, startDate As String, endDate As String
    Set ws = ThisWorkbook.Worksheets("Sheet1")
    urls = GetAllUrls(2, ws, "A")
    qualifyingUrlsDates = GetOnlyQualifyingUrlsDates(urls)
    'API key may not be valid over time so look at obtaining it by a prior request
    'Issue an xmlhttp request for each valid date (this would be better done using startDate and endDate
    'to specify the entire range, or batches e.g. months within the total range, to cut down on requests)
    With CreateObject("MSXML2.XMLHTTP")
        For i = LBound(qualifyingUrlsDates) To UBound(qualifyingUrlsDates)
            startDate = qualifyingUrlsDates(i)
            endDate = startDate ' a little verbose but useful for explaining
            url = "https://api.weather.com/v1/geocode/31.76/-106.49/observations/historical.json?apiKey=6532d6454b8aa370768e63d6ba5a832e&startDate=" & startDate & "&endDate=" & endDate & "&units=e"
            .Open "GET", url, False
            .send
            Set json = JsonConverter.ParseJson(.responseText)("observations")
            Set wsOutput = CreateWorksheet(qualifyingUrlsDates(i))
            WriteOutResults wsOutput, json
        Next
    End With
End Sub

Public Sub WriteOutResults(ByVal wsOutput As Worksheet, ByVal json As Object)
    'json is a collection of dictionaries. Each dictionary is a time period reading from the day i.e. one row in output
    Dim results(), item As Object, headers(), r As Long, c As Long, key As Variant
    headers = json.item(1).keys 'get the headers which are the keys of each dictionary
    ReDim results(1 To json.Count, 1 To UBound(headers) + 1)
    For Each item In json
        r = r + 1: c = 0 'increase row in results array to store results for table row
        For Each key In item.keys
            c = c + 1 'increase column number in results array for writing out results
            Select Case key
                Case "valid_time_gmt", "expire_time_gmt" 'convert unix timestamp fields to datetime
                    results(r, c) = Epoch2Date(item(key))
                Case Else
                    results(r, c) = item(key)
            End Select
        Next
    Next
    With wsOutput
        .Cells(1, 1).Resize(1, UBound(headers) + 1) = headers
        .Cells(2, 1).Resize(UBound(results, 1), UBound(results, 2)) = results
    End With
End Sub

Public Function GetOnlyQualifyingUrlsDates(ByVal urls As Variant) As Variant
    Dim i As Long, output(), counter As Long
    ReDim output(1 To UBound(urls))
    For i = LBound(urls) To UBound(urls)
        If IncludeThisDate(urls(i)) Then 'check if weekday is to be included
            counter = counter + 1
            output(counter) = Format$(Right$(urls(i), 8), "yyyymmdd") 'if to include then add to output array of urls of interest
        End If
    Next
    ReDim Preserve output(1 To counter)
    GetOnlyQualifyingUrlsDates = output
End Function

Public Function IncludeThisDate(ByVal url As String) As Boolean
    'tue, wed, thurs are valid
    IncludeThisDate = Not IsError(Application.Match(Weekday(Right$(url, 8), vbSunday), Array(3, 4, 5), 0))
End Function

Public Function SheetExists(ByVal sheetName As String) As Boolean '<== function by @Rory
    SheetExists = Evaluate("ISREF('" & sheetName & "'!A1)")
End Function

Public Function GetAllUrls(ByVal startRow As Long, ByVal ws As Worksheet, ByVal columnName As String) As Variant
    'transpose used based on premise no more than a couple of years of dates
    'startRow is start row for urls, ws is sheet where urls found, columnName is string representation of column for urls e.g. "A"
    With ws
        GetAllUrls = Application.Transpose(.Range(columnName & startRow & ":" & columnName & .Cells(.Rows.Count, columnName).End(xlUp).Row).Value)
    End With
End Function

Public Function CreateWorksheet(ByVal sheetName As String) As Worksheet
    Dim ws As Worksheet
    If SheetExists(sheetName) Then
        Set ws = ThisWorkbook.Worksheets(sheetName)
        'do something.... clear it? Then add new data to it?
    Else
        Set ws = ThisWorkbook.Worksheets.Add
        ws.Name = sheetName
    End If
    Set CreateWorksheet = ws
End Function

Public Function Epoch2Date(ByVal E As Currency, Optional msFrac) As Date '# Schmidt http://www.vbforums.com/showthread.php?805245-EPOCH-to-Date-and-vice-versa
    Const Estart As Double = #1/1/1970#
    msFrac = 0
    If E > 10000000000# Then E = E * 0.001: msFrac = E - Int(E)
    Epoch2Date = Estart + (E - msFrac) / 86400
End Function

Search a website with Excel data to extract results and then loop

I have 8000 values in an Excel spreadsheet.
I need to search a website and then record a specific line of data from the website to in the Excel spreadsheet.
I found code which searches for the data, from "excel macro to search a website and extract results":
Sub URL_Get_ABN_Query()
    strSearch = Range("a1")
    With ActiveSheet.QueryTables.Add( _
        Connection:="URL;http://www.abr.business.gov.au/SearchByABN.aspx?SearchText=" & _
            strSearch & "&safe=active", _
        Destination:=Range("a5"))
        .BackgroundQuery = True
        .TablesOnlyFromHTML = True
        .Refresh BackgroundQuery:=False
        .SaveData = True
    End With
    'enter code here
End Sub
It collects the data from the website as a block of rows in the sheet (screenshot of the imported results not reproduced here).
I only want the 'entity type' data line.
I cannot find how to extend the code to grab only this line and put it into the corresponding cell, i.e. take the ABN in B2, search, find the 'entity type', and paste it into Company Type (C2).
Alternatively, I tried to find how to fill the information vertically instead of horizontally. I could delete the columns that are not needed. I thought this may be simpler.
I tried to record the macro with developer.
I also need to loop to the next ABN and populate the corresponding field and so on (B3>C3, B4>C4, etc.).
This is absolutely possible. You've got what I often find the hardest part, sourcing the information from another platform. To make this work I would separate it out a little bit and for simplicity use 2 sheets (Sheet1 with your known data and Sheet2 for the web data).
Loop through your table of ~8000 businesses. We can identify this from the UsedRange number of Rows. We know that the ABN is in column 2 (also known as B) so we copy that into the variable to pass to the function. The function will return the "Entity type:" to column 3 (C) of the same row.
Sub LoopThroughBusinesses()
    Dim i As Integer
    Dim ABN As String
    For i = 2 To Sheet1.UsedRange.Rows.Count
        ABN = Sheet1.Cells(i, 2)
        Sheet1.Cells(i, 3) = URL_Get_ABN_Query(ABN)
    Next i
End Sub
Change the subroutine you created to a Function so it returns the entity type you are after. The function will save the data into Sheet2 and then return just the Entity data that we are after.
Function URL_Get_ABN_Query(strSearch As String) As String ' Change it from a Sub to a Function that returns the desired string
    ' strSearch = Range("a1") ' This is now passed as a parameter into the Function
    Dim entityRange As Range
    With Sheet2.QueryTables.Add( _
        Connection:="URL;http://www.abr.business.gov.au/SearchByABN.aspx?SearchText=" & strSearch & "&safe=active", _
        Destination:=Sheet2.Range("A1")) ' Change this destination to Sheet2
        .BackgroundQuery = True
        .TablesOnlyFromHTML = True
        .Refresh BackgroundQuery:=False
        .SaveData = True
    End With
    ' Find the Range that has "Entity Type:"
    Set entityRange = Sheet2.UsedRange.Find("Entity type:")
    ' Then return the value of the cell to its right
    URL_Get_ABN_Query = entityRange.Offset(0, 1).Value2
    ' Clear Sheet2 for the next run
    Sheet2.UsedRange.Delete
End Function
You do not want a load of connections (QueryTables) set up in this way. It will be very slow, if it works at all. At 8000 requests, provided xmlhttp is not blocked or throttled, the method below will be significantly faster. If there does appear to be slowing/blocking, then add in a small wait every x requests.
If possible, use xmlhttp to gather data. Use CSS selectors to specifically target the entity type. Store values in an array and write them out with a loop at the end. Use a class to hold the xmlhttp object for greater efficiency. Provide your class with methods, including how to handle "not found" (an example is given). Add some further optimizations, e.g. screen updating is switched off in the example given. This assumes your search numbers are in column B starting from B2. The code below also does some basic checks that there is something present in column B and handles the case of there being 1 or more numbers.
Good code is modular: you want a function to return something and a sub to perform actions. A single sub/function shouldn't complete lots of tasks. You want to be able to debug easily, with code that follows the principle of single responsibility (or close to it).
Class module clsHTTP:
Option Explicit

Private http As Object

Private Sub Class_Initialize()
    Set http = CreateObject("MSXML2.XMLHTTP")
End Sub

Public Function GetHTML(ByVal URL As String) As String
    Dim sResponse As String
    With http
        .Open "GET", URL, False
        .send
        GetHTML = StrConv(.responseBody, vbUnicode)
    End With
End Function

Public Function GetEntityType(ByVal html As HTMLDocument) As String
    On Error GoTo errhand
    GetEntityType = html.querySelector("a[href*='EntityTypeDescription']").innerText
    Exit Function
errhand:
    GetEntityType = "Not Found"
End Function
Standard module:
Option Explicit

Public Sub GetInfo()
    Dim http As clsHTTP, sResponse As String, lastRow As Long, arr(), groupResults(), i As Long, html As HTMLDocument
    Set html = New HTMLDocument
    Set http = New clsHTTP
    Const BASE_URL As String = "http://www.abr.business.gov.au/ABN/View/"
    Application.ScreenUpdating = False
    With ThisWorkbook.Worksheets("Sheet1")
        lastRow = .Cells(.Rows.Count, "B").End(xlUp).Row
        Select Case lastRow
            Case 1
                Exit Sub
            Case 2
                ReDim arr(1, 1): arr(1, 1) = .Range("B2").Value
            Case Else
                arr = .Range("B2:B" & lastRow).Value
        End Select
        ReDim groupResults(1 To lastRow - 1)
        With http
            For i = LBound(arr, 1) To UBound(arr, 1)
                If Len(BASE_URL & arr(i, 1)) > Len(BASE_URL) Then
                    sResponse = .GetHTML(BASE_URL & arr(i, 1))
                    html.body.innerHTML = sResponse
                    groupResults(i) = .GetEntityType(html)
                    sResponse = vbNullString: html.body.innerHTML = vbNullString
                End If
            Next
        End With
        For i = LBound(groupResults) To UBound(groupResults)
            .Cells(i + 1, "C") = groupResults(i)
        Next
    End With
    Application.ScreenUpdating = True
End Sub
References (VBE > Tools > References):
Microsoft HTML Object Library
CSS selectors:
I use the fact that the entity description is a hyperlink (an a tag) and that its href value contains the string EntityTypeDescription, so a CSS attribute = value selector with the contains (*) operator can target it.
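For illustration, the markup being targeted is roughly of this shape (hypothetical HTML), and the attribute-contains selector picks the anchor out so .innerText gives the entity type:
'hypothetical markup on the results page:
'   <a href="...EntityTypeDescription...">Australian Private Company</a>
'a[href*='EntityTypeDescription'] matches any <a> whose href contains that string, so:
Debug.Print html.querySelector("a[href*='EntityTypeDescription']").innerText 'prints the entity type text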

Sort dead hyperlinks in Excel with VBA?

The title says it:
I have an Excel sheet with a column full of hyperlinks. Now I want a VBA script to check which hyperlinks are dead and which work, and make an entry in the next column with either the text "404 Error" or "active".
Hopefully someone can help me because I am not really good at VB.
EDIT:
I found at http://www.utteraccess.com/forums/printthread.php?Cat=&Board=84&main=1037294&type=thread
a solution which is made for Word, but the problem is that I need this solution for Excel. Can someone translate it to an Excel solution?
Private Sub testHyperlinks()
    Dim thisHyperlink As Hyperlink
    For Each thisHyperlink In ActiveDocument.Hyperlinks
        If thisHyperlink.Address <> "" And Left(thisHyperlink.Address, 6) <> "mailto" Then
            If Not IsURLGood(thisHyperlink.Address) Then
                Debug.Print thisHyperlink.Address
            End If
        End If
    Next
End Sub

Private Function IsURLGood(url As String) As Boolean
    ' Test the URL to see if it is good
    Dim request As New WinHttpRequest
    On Error GoTo IsURLGoodError
    request.Open "GET", url
    request.Send
    If request.Status = 200 Then
        IsURLGood = True
    Else
        IsURLGood = False
    End If
    Exit Function
IsURLGoodError:
    IsURLGood = False
End Function
First add a reference to Microsoft XML V3 (or above), using Tools->References. Then paste this code:
Option Explicit

Sub CheckHyperlinks()
    Dim oColumn As Range
    Set oColumn = GetColumn() ' replace this with code to get the relevant column
    Dim oCell As Range
    For Each oCell In oColumn.Cells
        If oCell.Hyperlinks.Count > 0 Then
            Dim oHyperlink As Hyperlink
            Set oHyperlink = oCell.Hyperlinks(1) ' I assume only 1 hyperlink per cell
            Dim strResult As String
            strResult = GetResult(oHyperlink.Address)
            oCell.Offset(0, 1).Value = strResult
        End If
    Next oCell
End Sub

Private Function GetResult(ByVal strUrl As String) As String
    On Error GoTo ErrorHandler
    Dim oHttp As New MSXML2.XMLHTTP30
    oHttp.Open "HEAD", strUrl, False
    oHttp.send
    GetResult = oHttp.Status & " " & oHttp.statusText
    Exit Function
ErrorHandler:
    GetResult = "Error: " & Err.Description
End Function

Private Function GetColumn() As Range
    Set GetColumn = ActiveWorkbook.Worksheets(1).Range("A:A")
End Function
Gary's code is perfect, but I would rather use a public function in a module and use it in a cell as a function. The advantage is that you can use it in a cell of your choice or inside any other more complex formula.
In the code below I have adjusted Gary's code to return a Boolean, and you can then use this output in an =IF(CHECKHYPERLINK(A1);"OK";"FAILED"). Alternatively you could return an Integer and return the status itself (e.g.: =IF(CHECKHYPERLINK(A1)=200;"OK";"FAILED")); a sketch of that variant follows the code below.
A1: http://www.whatever.com
A2: =IF(CHECKHYPERLINK(A1);"OK";"FAILED")
To use this code please follow Gary's instructions and additionally add a module to the workbook (right click on the VBAProject --> Insert --> Module) and paste the code into the module.
Option Explicit

Public Function CheckHyperlink(ByVal strUrl As String) As Boolean
    Dim oHttp As New MSXML2.XMLHTTP30
    On Error GoTo ErrorHandler
    oHttp.Open "HEAD", strUrl, False
    oHttp.send
    If Not oHttp.Status = 200 Then CheckHyperlink = False Else CheckHyperlink = True
    Exit Function
ErrorHandler:
    CheckHyperlink = False
End Function
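If you prefer the Integer/status variant mentioned above, a minimal sketch could be (the name CheckHyperlinkStatus is illustrative):
Public Function CheckHyperlinkStatus(ByVal strUrl As String) As Long
    Dim oHttp As New MSXML2.XMLHTTP30
    On Error GoTo ErrorHandler
    oHttp.Open "HEAD", strUrl, False
    oHttp.send
    CheckHyperlinkStatus = oHttp.Status 'e.g. 200, 404
    Exit Function
ErrorHandler:
    CheckHyperlinkStatus = 0 '0 signals that the request itself failed
End Function
Used in a cell as =IF(CHECKHYPERLINKSTATUS(A1)=200;"OK";"FAILED").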
Please also be aware that, if the page is down, the timeout can be long.
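If that becomes a problem, one option (my suggestion, not part of the answers above) is to use MSXML2.ServerXMLHTTP, which exposes setTimeouts, so a hung request gives up after a few seconds:
Public Function CheckHyperlinkQuick(ByVal strUrl As String) As Boolean 'name is illustrative
    Dim oHttp As Object
    On Error GoTo ErrorHandler
    Set oHttp = CreateObject("MSXML2.ServerXMLHTTP.6.0")
    oHttp.setTimeouts 5000, 5000, 5000, 5000 'resolve, connect, send, receive (milliseconds)
    oHttp.Open "HEAD", strUrl, False
    oHttp.send
    CheckHyperlinkQuick = (oHttp.Status = 200)
    Exit Function
ErrorHandler:
    CheckHyperlinkQuick = False 'also covers the case where a timeout was hit
End Function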

Resources