I am a beginner at VBA. I wrote the code below by referring to a lot of articles I found online.
I am trying to fetch API data from a website. The first fetch works, but I need the data to be fetched every 5 minutes and it is not refreshing at all. What can I do? Can anyone have a look at the code and advise?
I am using the code below to get the JSON data; later I extract it with a JSON parser.
Sub FetchOptionChain()
Dim Json As Object
Dim webURL, webURL2 As String, mainString, subString
Dim i As Integer
Dim j As Integer
Dim k As Integer
Dim l As Integer
Dim dtArr() As String
Dim request, request2 As Object
Dim HTML_Content As Object
Dim requestString As String
webURL2 = "https://www.nseindia.com/"
webURL = "https://www.nseindia.com/api/option-chain-indices?symbol=BANKNIFTY"
subString = "Resource not found"
Set HTML_Content = CreateObject("htmlfile")
'Get the WebPage Content to HTMLFile Object
With CreateObject("msxml2.xmlhttp")
.Open "GET", webURL2, False
.send
End With
FetchAgain:
With CreateObject("msxml2.xmlhttp")
.Open "GET", webURL, False
'Found online that I have to add the below to remove the cached results. Adding this is hanging the excel and it never comes out of it. Excel is hanging here
.setRequestHeader "Content-Type", "application/json"
.setRequestHeader "If-Modified-Since", "Sat, 1 Jan 2000 00:00:00 GMT"
.setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.141 Safari/537.36"
.send
mainString = .ResponseText
If InStr(mainString, subString) <> 0 Then
' Data has not been fetched properly. Will wait two seconds and try again.
Application.Wait (Now + TimeValue("0:00:2"))
GoTo FetchAgain
I added End With, End If and End Sub, and fixed the indenting to make the code easier to read.
Sub FetchOptionChain()
    Dim Json As Object
    Dim webURL, webURL2 As String, mainString, subString
    Dim i As Integer
    Dim j As Integer
    Dim k As Integer
    Dim l As Integer
    Dim dtArr() As String
    Dim request, request2 As Object
    Dim HTML_Content As Object
    Dim requestString As String

    webURL2 = "https://www.nseindia.com/"
    webURL = "https://www.nseindia.com/api/option-chain-indices?symbol=BANKNIFTY"
    subString = "Resource not found"

    '''''''''''''''''''''''''''''''''''''''''''''
    '''  I don't understand this part though  '''
    '''''''''''''''''''''''''''''''''''''''''''''
    Set HTML_Content = CreateObject("htmlfile")
    'Get the WebPage Content to HTMLFile Object
    With CreateObject("msxml2.xmlhttp")
        .Open "GET", webURL2, False
        .send
    End With
    '''''''''''''''''''''''''''''''''''''''''''''
    ''''''''''''''''  To here  ''''''''''''''''''
    '''''''''''''''''''''''''''''''''''''''''''''

FetchAgain:
    With CreateObject("msxml2.xmlhttp")
        .Open "GET", webURL, False
        'Found online that I have to add the below to remove the cached results. Adding this is hanging the excel and it never comes out of it. Excel is hanging here
        .setRequestHeader "Content-Type", "application/json"
        .setRequestHeader "If-Modified-Since", "Sat, 1 Jan 2000 00:00:00 GMT"
        .setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.141 Safari/537.36"
        .send
        mainString = .responseText
    End With

    If InStr(mainString, subString) <> 0 Then
        ' Data has not been fetched properly. Will wait two seconds and try again.
        Application.Wait (Now + TimeValue("00:00:02"))
        GoTo FetchAgain
    End If
End Sub
But it runs and works as expected for me.
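As for the refresh-every-5-minutes part of the question, which the code above does not implement: one common option is Excel's `Application.OnTime` scheduler rather than a blocking wait loop. A minimal sketch, assuming the `FetchOptionChain` sub shown above:

```vba
Public NextRun As Date

Sub StartAutoRefresh()
    FetchOptionChain                        ' fetch immediately
    NextRun = Now + TimeValue("00:05:00")   ' schedule the next fetch in 5 minutes
    Application.OnTime NextRun, "StartAutoRefresh"
End Sub

Sub StopAutoRefresh()
    ' Cancel the pending schedule; ignore the error raised if none is pending
    On Error Resume Next
    Application.OnTime NextRun, "StartAutoRefresh", , False
    On Error GoTo 0
End Sub
```

Unlike `Application.Wait` plus `GoTo`, this hands control back to Excel between fetches, so the workbook stays responsive.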
Help gratefully received on this one. I have some VBA running in Excel that inspects a series of webpages displaying betting odds for football matches and puts the odds into my spreadsheet. It has been working perfectly for months and has stopped working in the last few weeks. Here is a simplified version of the code I'm using:
Sub TestImport()
    Dim http As New MSXML2.XMLHTTP60
    Dim html As New MSHTML.HTMLDocument
    Dim htmlEle1 As MSHTML.IHTMLElement
    Dim columncounter As Integer
    Dim rowccounter As Integer
    Dim targetwebpage As String
    Dim ColumnHeader As Variant

    On Error GoTo ErrorStop
    trowNum = 1
    targetwebpage = "https://www.oddschecker.com/football/english/premier-league"

    With http
        .Open "get", targetwebpage, False
        .send
    End With

    Set table_data = html.getElementsByTagName("tr")
    If table_data.Length = 0 Then GoTo SkipLeague
    For Each trow In table_data
        For Each tcell In trow.Children
            If tcell.innerText <> "TIP" Then 'Ignore this
                tcellNum = tcellNum + 1
                Cells(trowNum, tcellNum) = tcell.innerText
            End If
        Next tcell
        Cells(trowNum, 1) = Worksheets("Leagues").Cells(j, 1)
        trowNum = trowNum + 1
        tcellNum = 1
    Next trow
SkipLeague:
ErrorStop:
End Sub
No data gets returned because `table_data` is always null. It's always null because there are no `tr` tags in my variable, `html`. Instead, `html` seems to be simply this:
"<HEAD></HEAD>
<BODY>
<P> </P></BODY>"
Why would `html` return this value when the actual webpage (https://www.oddschecker.com/football/english/premier-league) is much more complex when displayed in my browser? And why has this problem only started in the last few weeks?
I'd be grateful for any help on this.
I did a quick test and had no issue. Some pages, like Google, require a User-Agent header to be sent, but the oddschecker page does not.
Sub TestURL()
    Debug.Print GetResult("https://www.oddschecker.com/football/english/premier-league")
End Sub

Function GetResult(url As String) As String
    Dim XMLHTTP As Object, ret As String
    Set XMLHTTP = CreateObject("MSXML2.ServerXMLHTTP")
    XMLHTTP.Open "GET", url, False
    XMLHTTP.setRequestHeader "Content-Type", "text/xml"
    XMLHTTP.setRequestHeader "Cache-Control", "no-cache"
    XMLHTTP.setRequestHeader "Pragma", "no-cache"
    XMLHTTP.setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 6.1; rv:25.0) Gecko/20100101 Firefox/25.0"
    XMLHTTP.send
    ret = XMLHTTP.responseText
    GetResult = ret
End Function
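To feed this back into the table-scraping approach, the returned markup has to be loaded into the `MSHTML.HTMLDocument` before querying it. A sketch, assuming the `GetResult` function above:

```vba
Sub ImportOddsTable()
    Dim html As New MSHTML.HTMLDocument
    ' Load the fetched markup so getElementsByTagName has content to search
    html.body.innerHTML = GetResult("https://www.oddschecker.com/football/english/premier-league")

    Dim table_data As Object
    Set table_data = html.getElementsByTagName("tr")
    Debug.Print table_data.Length ' 0 here would suggest the page layout changed or the request was blocked
End Sub
```

Note that the simplified `TestImport` in the question never copies `.responseText` into `html` at all; a freshly created `HTMLDocument` with nothing assigned contains exactly the empty `<HEAD></HEAD><BODY>` shown in the question.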
I am trying to get data from Yahoo Finance on Excel on Mac.
As far as I know, the usual approach to get web data on Mac is WebQuery. However, sometimes it works without issues, and sometimes it throws error 1004 for the same set of tickers that previously worked fine. Text of the error:
"Microsoft Excel cannot access the file %link%. There are several possible reasons:"
I have no clue why that happens. My only guess is that the URL does not contain a cookie / crumb Yahoo needs.
For testing purposes, I used WinHttpRequest on Windows. It works - Excel successfully gets the data.
There's an alternative on Mac - Tim Hall's WebHelpers. I was able to get the cookie and the crumb on Mac with this great set of tools.
But when I try downloading the CSV from Yahoo the response.Content has this string: {"finance":{"result":null,"error":{"code":"Not Acceptable","description":"HTTP 406 Not Acceptable"}}}.
Generally, I have several questions:
Is there a way to add a cookie to the WebQuery approach? Still, I am not sure if that works and helps to evade the error.
Why does the response return error 406? In particular, from this code snippet:
client.BaseUrl = tickerURL
request.Method = HttpGet
request.Format = PlainText
request.AddCookie "Cookie", cookie
request.AddHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0"
Set response = client.Execute(request)
resultFromYahoo = response.Content
Here's a code to receive Yahoo Finance data using either WinHttpRequest on Windows or Tim Hall's package on Mac:
Sub getYahooFinanceData(stockTicker As String, StartDate As String, EndDate As String, frequency As String, cookie As String, crumb As String)
' forked from:
' http://investexcel.net/multiple-stock-quote-downloader-for-excel/
Dim resultFromYahoo As String
Dim objRequest
Dim csv_rows() As String
Dim tickerURL As String
'Make URL
tickerURL = "https://query1.finance.yahoo.com/v7/finance/download/" & stockTicker & _
"?period1=" & StartDate & _
"&period2=" & EndDate & _
"&interval=" & frequency & "&events=history" & "&crumb=" & crumb
'***************************************************
'Get data from Yahoo
#If Mac Then
Dim client As New WebClient
Dim response As WebResponse
Dim request As New WebRequest
client.BaseUrl = tickerURL
request.Method = HttpGet
request.Format = PlainText
request.AddCookie "Cookie", cookie
request.AddHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0"
Set response = client.Execute(request)
DoEvents
'' ERROR 406 on MAC ''
If response.StatusCode = Ok Then
resultFromYahoo = response.Content
Else
MsgBox "An error occured while getting data for " & stockTicker & "'", vbInformation
Exit Sub
End If
#Else
Set objRequest = CreateObject("WinHttp.WinHttpRequest.5.1")
With objRequest
.Open "GET", tickerURL, False
.SetRequestHeader "Cookie", cookie
.Send
.WaitForResponse
resultFromYahoo = .ResponseText
End With
#End If
'***************************************************
csv_rows() = Split(resultFromYahoo, Chr(10))
End Sub
Finally came to a solution! I found the answer in a similar topic related to Python: https://stackoverflow.com/a/68259438/8524164
In short, we need to modify the User-Agent and other request parameters to emulate a real browser. Instead of this one line:
request.AddHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0"
We need to add 5 lines:
request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36"
request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"
request.AddHeader "Accept-Language", "en-US,en;q=0.5"
request.AddHeader "DNT", "1"
request.AddHeader "Connection", "close"
The final sub:
Sub getYahooFinanceData(stockTicker As String, StartDate As String, EndDate As String, frequency As String, cookie As String, crumb As String)
    ' forked from:
    ' http://investexcel.net/multiple-stock-quote-downloader-for-excel/
    Dim resultFromYahoo As String
    Dim objRequest
    Dim csv_rows() As String
    Dim tickerURL As String

    'Make URL
    tickerURL = "https://query1.finance.yahoo.com/v7/finance/download/" & stockTicker & _
                "?period1=" & StartDate & _
                "&period2=" & EndDate & _
                "&interval=" & frequency & "&events=history" & "&crumb=" & crumb

    '***************************************************
    'Get data from Yahoo
    #If Mac Then
        Dim client As New WebClient
        Dim response As WebResponse
        Dim request As New WebRequest

        client.BaseUrl = tickerURL
        request.Method = HttpGet
        request.Format = PlainText
        request.AddCookie "Cookie", cookie
        request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36"
        request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"
        request.AddHeader "Accept-Language", "en-US,en;q=0.5"
        request.AddHeader "DNT", "1"
        request.AddHeader "Connection", "close"

        Set response = client.Execute(request)
        DoEvents

        If response.StatusCode = Ok Then
            resultFromYahoo = response.Content
        Else
            MsgBox "An error occurred while getting data for '" & stockTicker & "'", vbInformation
            Exit Sub
        End If
    #Else
        Set objRequest = CreateObject("WinHttp.WinHttpRequest.5.1")
        With objRequest
            .Open "GET", tickerURL, False
            .SetRequestHeader "Cookie", cookie
            .Send
            .WaitForResponse
            resultFromYahoo = .ResponseText
        End With
    #End If
    '***************************************************
    csv_rows() = Split(resultFromYahoo, Chr(10))
End Sub
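The sub above leaves the downloaded CSV in `csv_rows` but never writes it anywhere. As a sketch of that final step, here is a hypothetical `WriteCsvRows` helper (not part of the original code) that splits each row on commas and writes it to a worksheet, assuming plain comma-separated values with no quoted fields:

```vba
Sub WriteCsvRows(csv_rows() As String, ws As Worksheet)
    ' Write each CSV row into a worksheet row, one field per column
    Dim r As Long, c As Long
    Dim fields() As String
    For r = LBound(csv_rows) To UBound(csv_rows)
        If Len(Trim$(csv_rows(r))) > 0 Then
            fields = Split(csv_rows(r), ",")
            For c = LBound(fields) To UBound(fields)
                ws.Cells(r + 1, c + 1).Value = fields(c)
            Next c
        End If
    Next r
End Sub
```

Yahoo's historical-data CSV does not normally contain quoted commas, but if it ever did, a proper CSV parser would be needed instead of `Split`.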
I'm trying to create a macro to fetch some content from a webpage and write it to an Excel file in a customized manner. I've used two identical links from the same website; here is one of them. I'm interested in three fields: Name, Recipe and Ingredients.
The script that I've created can parse the data accordingly. However, I want to arrange it in an Excel file like this.
This is what I've written so far (working flawlessly):
Sub GetAndArrangeData()
    Dim HTML As New HTMLDocument, oPost As Object
    Dim HTMLDoc As New HTMLDocument, ws As Worksheet
    Dim oTitle As Object, oPosts As Object
    Dim linklist As Variant, url As Variant
    Dim I As Long

    linklist = Array( _
        "https://www.chelseasmessyapron.com/avocado-chicken-salad-2/", _
        "https://www.chelseasmessyapron.com/caprese-quinoa-salad/" _
    )
    Set ws = ThisWorkbook.Worksheets("Sheet1")

    For Each url In linklist
        With CreateObject("MSXML2.XMLHTTP")
            .Open "GET", url, False
            .setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36"
            .send
            HTML.body.innerHTML = .responseText
        End With

        Set oTitle = HTML.querySelector("h1.entry-title")
        Debug.Print oTitle.innerText

        Set oPost = HTML.querySelectorAll(".cma-recipe-nutrition > .wprm-nutrition-label-container > span[class*='nutrition-container']")
        For I = 0 To oPost.Length - 1
            HTMLDoc.body.innerHTML = oPost(I).outerHTML
            Debug.Print HTMLDoc.querySelector("span.wprm-nutrition-label-text-nutrition-label").innerText
            Debug.Print HTMLDoc.querySelector("span[class*='nutrition-value']").innerText
        Next I

        Set oPosts = HTML.querySelectorAll(".wprm-recipe-block-container")
        For I = 0 To oPosts.Length - 1
            HTMLDoc.body.innerHTML = oPosts(I).outerHTML
            On Error Resume Next
            Debug.Print HTMLDoc.querySelector("span.wprm-recipe-details-label").innerText
            Debug.Print HTMLDoc.querySelector("span.wprm-recipe-details").innerText
            On Error GoTo 0
        Next I
    Next url
End Sub
How can I write the data in an excel file the way I've shown in the image above?
Btw, this is the result I got in the immediate window:
Avocado Chicken Salad
Calories:
542
Carbohydrates:
30
Protein:
11
Fat:
45
Saturated Fat:
7
Cholesterol:
16
Sodium:
285
Potassium:
687
Fiber:
8
Sugar:
9
Vitamin A:
945
Vitamin C:
19
Calcium:
36
Iron:
1
Course
Cuisine
Keyword
Prep Time
20
Cook Time
15
Total Time
35
Servings
2
Calories
542
Cost
$6.82
Caprese Quinoa Salad
Calories:
375
Carbohydrates:
30
Protein:
11
Fat:
26
Saturated Fat:
4
Cholesterol:
7
Sodium:
73
Potassium:
996
Fiber:
9
Sugar:
7
Vitamin A:
17616
Vitamin C:
32
Calcium:
168
Iron:
4
Course
Cuisine
Keyword
Prep Time
35
Cook Time
25
Chilling Time (Quinoa)
1
Total Time
2
Servings
6
Calories
375
Cost
$6.84
Basically, you just need to keep track of where to write the data. I define a variable `row` that is set to the first row where you want to put data. After every recipe, the number of rows written is added to `row`. To keep track of the number of rows, I use two separate variables, `oPostsNut` and `oPostsRecipe` (instead of only one `oPosts`), and add the number of entries of the larger list; that's basically all.
(...)
Dim row As Long
row = 1 ' Change to whatever row you want to start

For Each url In linklist
    With CreateObject("MSXML2.XMLHTTP")
        .Open "GET", url, False
        .setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36"
        .send
        HTML.body.innerHTML = .responseText
    End With

    Set oTitle = HTML.querySelector("h1.entry-title")
    ws.Cells(row, 1) = oTitle.innerText

    Dim i As Long
    Dim oPostsNut As Object
    Set oPostsNut = HTML.querySelectorAll(".cma-recipe-nutrition > .wprm-nutrition-label-container > span[class*='nutrition-container']")
    For i = 0 To oPostsNut.Length - 1
        HTMLDoc.body.innerHTML = oPostsNut(i).outerHTML
        ws.Cells(row + i, 2) = HTMLDoc.querySelector("span.wprm-nutrition-label-text-nutrition-label").innerText
        ws.Cells(row + i, 3) = HTMLDoc.querySelector("span[class*='nutrition-value']").innerText
    Next i

    Dim oPostsRecipe As Object
    Set oPostsRecipe = HTML.querySelectorAll(".wprm-recipe-block-container")
    For i = 0 To oPostsRecipe.Length - 1
        HTMLDoc.body.innerHTML = oPostsRecipe(i).outerHTML
        On Error Resume Next
        ws.Cells(row + i, 4) = HTMLDoc.querySelector("span.wprm-recipe-details-label").innerText
        ws.Cells(row + i, 5) = HTMLDoc.querySelector("span.wprm-recipe-details").innerText
        On Error GoTo 0
    Next i

    row = row + IIf(oPostsNut.Length > oPostsRecipe.Length, oPostsNut.Length, oPostsRecipe.Length)
Next url
I think we can do better. If we use a more selective CSS selector, we can get rid of the additional info that appears in the other answer (12/02/21) and in your original attempt; with the selector below, only the desired info is returned. I work with an array, as it is faster than writing to the sheet all the time, and I remove the need for re-creating the xmlhttp object and the additional HTMLDocument.
Option Explicit

Public Sub GetAndArrangeData()
    Dim html As New MSHTML.HTMLDocument, xhr As Object, ws As Worksheet
    Dim linklist As Variant, url As Variant, totalRows

    linklist = Array( _
        "https://www.chelseasmessyapron.com/avocado-chicken-salad-2/", _
        "https://www.chelseasmessyapron.com/caprese-quinoa-salad/" _
    )
    Set ws = ThisWorkbook.Worksheets("Sheet1")
    Set xhr = CreateObject("MSXML2.XMLHTTP")
    totalRows = 1

    For Each url In linklist
        With xhr
            .Open "GET", url, False
            .setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36"
            .send
            html.body.innerHTML = .responseText
        End With

        Dim title As String
        title = html.querySelector("h1.entry-title").innerText

        Dim nutritionRows As Object, timesOtherRows As Object, maxRows As Long
        Set nutritionRows = html.querySelectorAll(".wprm-nutrition-label-container .wprm-nutrition-label-text-nutrition-container")
        Set timesOtherRows = html.querySelectorAll(".cma-recipe-mobile .wprm-recipe-times-container .wprm-recipe-block-container-columns, .wprm-recipe-meta-container ~ .wprm-recipe-block-container-columns")
        maxRows = IIf(nutritionRows.Length > timesOtherRows.Length, nutritionRows.Length, timesOtherRows.Length) - 1

        Dim recipeInfo(), i As Long
        ReDim recipeInfo(1 To maxRows, 1 To 5)

        On Error Resume Next
        For i = 0 To maxRows
            recipeInfo(i + 1, 1) = IIf(i = 0, title, vbNullString)
            recipeInfo(i + 1, 2) = nutritionRows.Item(i).Children(0).innerText
            recipeInfo(i + 1, 3) = nutritionRows.Item(i).Children(1).innerText
            recipeInfo(i + 1, 4) = timesOtherRows.Item(i).Children(1).innerText
            recipeInfo(i + 1, 5) = timesOtherRows.Item(i).Children(2).innerText
        Next
        On Error GoTo 0

        ws.Cells(totalRows, 1).Resize(UBound(recipeInfo, 1), UBound(recipeInfo, 2)) = recipeInfo
        totalRows = totalRows + maxRows
    Next url
End Sub
JSON:
Perhaps easier, though, is to grab all the info as JSON from a script tag in the HEAD part of the response. You will need to wrap the response in body tags to prevent the HTML parser from stripping this content out when you add it into the MSHTML.HTMLDocument object's body.innerHTML.
I am not going to show the JSON parsing, as there are plenty of examples, but I will show extracting it.
Option Explicit

Public Sub GetAndArrangeData()
    Dim html As New MSHTML.HTMLDocument, xhr As Object
    Dim linklist As Variant, url As Variant

    linklist = Array( _
        "https://www.chelseasmessyapron.com/avocado-chicken-salad-2/", _
        "https://www.chelseasmessyapron.com/caprese-quinoa-salad/" _
    )
    Set xhr = CreateObject("MSXML2.XMLHTTP")

    For Each url In linklist
        With xhr
            .Open "GET", url, False
            .setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36"
            .send
            html.body.innerHTML = "<body>" & .responseText & "</body>"
        End With
        Debug.Print html.querySelector(".yoast-schema-graph").innerHTML
    Next url
End Sub
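As a hedged sketch of the parsing step this answer deliberately skips: it could look something like the following, assuming Tim Hall's VBA-JSON (`JsonConverter.bas`) is imported, a reference to Microsoft Scripting Runtime is set, and the extracted script tag holds a schema.org graph under an `@graph` key with the node's `@type` as a plain string (both assumptions about Yoast's output; inspect the printed JSON first):

```vba
Sub ParseRecipeJson(ByVal jsonText As String)
    ' Requires JsonConverter.bas (VBA-JSON) and Microsoft Scripting Runtime
    Dim parsed As Object
    Set parsed = JsonConverter.ParseJson(jsonText)

    ' Walk the schema.org graph looking for the Recipe node
    Dim node As Object
    For Each node In parsed("@graph")
        If node("@type") = "Recipe" Then
            Debug.Print node("name")
        End If
    Next node
End Sub
```

If `@type` turns out to be an array rather than a string on this site, the comparison would need to iterate that array instead.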
I'm trying to parse the text within the yellow colored area of a website, which is visible when you fill in the input box next to Parcel ID and hit the search button. Here is an example parcel id, 01-01-350000, for your test.
I've created a macro using xmlhttp requests to scrape that very content. It seems I've done everything the right way, but for some reason the macro is not working: it is still on the landing page even after making a POST request.
I've tried with:
Sub GetStatus()
    Const Url$ = "https://obftax.baltimorecountymd.gov/(S(m15cp5mubgqql1yzzjrxez45))/Default.aspx"
    Dim Html As New HTMLDocument
    Dim elem As Object, sVal$, payload As Variant

    sVal = "01-01-350000"
    payload = "RetryCounter=0&Action=MainMenu&ParcelType=RE&ParcelID=" & sVal & "&ParcelAddress=&PageNumber=1&SearchType=ParcelID&SearchParcel=" & sVal & "&SearchTaxNumber=&SearchStreetNumber=&SearchStreetName="

    With CreateObject("MSXML2.XMLHTTP")
        .Open "POST", Url, False
        .setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.121 Safari/537.36"
        .setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
        .Send payload
        Html.body.innerHTML = .responseText
    End With

    Set elem = Html.querySelector("#tvrMessage")
    If Not elem Is Nothing Then
        MsgBox elem.innerText
    Else
        MsgBox "failed to parse"
    End If
End Sub
How can I scrape the text from the yellow colored area using vba making use of xmlhttp requests?
When executing this code, it fills the required text box on the website, but pressing the Find button gives the output 'not found' in the message box.
However, if I manually click in the text box on the filled value and then click the Find button, it shows the desired result.
How can I make this work?
Public Sub experiment()
    Dim ie As InternetExplorer
    Set ie = New InternetExplorer
    ie.navigate "https://www.latlong.net/"
    Do
        DoEvents
    Loop Until ie.readyState = READYSTATE_COMPLETE

    Dim doc As HTMLDocument
    Set doc = ie.document

    Dim inputElement As HTMLInputElement
    Set inputElement = doc.getElementsByClassName("width70")(0)
    inputElement.Value = "Delhi Airport, India"

    ie.Visible = True
    doc.getElementById("btnfind").Click
    Do
        DoEvents
    Loop Until ie.readyState = READYSTATE_COMPLETE
End Sub
Rather than IE automation and interacting with the page as a user would, the code below should emulate the request triggered by the 'Find' button on the page, but you need to assign a value to placeName in the code (currently it is "Delhi Airport, India").
If you are interested in only the co-ordinates (and no other information on the rest of the page), then this approach might be okay for you.
You'll need to add a reference (Tools > References > Scroll down and tick Microsoft XML, v6.0 > OK) before trying to run the code.
Option Explicit

Private Sub Experiment()
    Dim placeName As String
    placeName = "Delhi Airport, India"

    Dim WebClient As MSXML2.ServerXMLHTTP60
    Set WebClient = New MSXML2.ServerXMLHTTP60

    With WebClient
        .Open "POST", "https://www.latlong.net/_spm4.php", True
        .setRequestHeader ":authority", "www.latlong.net"
        .setRequestHeader ":method", "POST"
        .setRequestHeader ":path", "/_spm4.php"
        .setRequestHeader ":scheme", "https"
        .setRequestHeader "accept", "*/*"
        .setRequestHeader "content-type", "application/x-www-form-urlencoded"
        .setRequestHeader "origin", "https://www.latlong.net"
        .setRequestHeader "referer", "https://www.latlong.net/"
        .setRequestHeader "user-agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.102 Safari/537.36"
        .setRequestHeader "x-requested-with", "XMLHttpRequest"

        Dim bodyToSend As String
        bodyToSend = "c1=" & Application.EncodeURL(placeName) & "&action=gpcm&cp="
        .send bodyToSend
        .waitForResponse

        MsgBox ("Server's response to the request for Place Name '" & placeName & "' is " & _
                vbNewLine & vbNewLine & .responseText)
    End With
End Sub
You can access the server's response (which will contain the co-ordinates if the request was successful) with WebClient.responseText (or just .responseText inside the With statement) -- and then do what you need to with it.
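If the response turns out to be a simple comma-separated `lat,lng` pair, splitting it into the two co-ordinates is straightforward. That format is an assumption on my part (latlong.net's internal endpoint is undocumented, so inspect `.responseText` first); the hypothetical helper below sketches the idea:

```vba
Function ExtractCoordinates(ByVal responseText As String) As Variant
    ' Assumes a "lat,lng" shaped response; returns a 2-element array of Doubles
    Dim parts() As String
    parts = Split(responseText, ",")
    If UBound(parts) >= 1 Then
        ExtractCoordinates = Array(CDbl(Trim$(parts(0))), CDbl(Trim$(parts(1))))
    End If
End Function
```

If the endpoint actually returns HTML or JSON instead, the extraction would need an HTML or JSON parser rather than `Split`.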