So, I'm programming a Python AI and I'm trying to make it as similar to JARVIS from Iron Man as possible. It's going great, but I want to do the following:
If you search right now for "How did George Harrison die?", Google will show "Lung Cancer", followed by other websites. I want the AI, when asked this question, to go to Google and store the answer in a variable.
I want to store it in a variable because I am using the os module to make the AI talk (os.system("spd-say 'hello'") or os.system("spd-say '" + variable + "'")).
Use the requests module to construct an HTTP request, then parse the JSON that comes back.
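For example, here's a minimal sketch of that pattern. Google's results page is HTML rather than JSON, so this swaps in DuckDuckGo's Instant Answer API as the JSON source (whether it returns a useful abstract for any given question varies), and it uses subprocess instead of os.system to avoid quoting problems when speaking the answer:

import subprocess
import requests

def ask(question):
    resp = requests.get(
        "https://api.duckduckgo.com/",
        params={"q": question, "format": "json", "no_html": 1},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    # AbstractText holds the short summary when one is available.
    return data.get("AbstractText") or "Sorry, I found no answer."

answer = ask("How did George Harrison die?")
# Passing the text as an argument list avoids shell-quoting issues.
subprocess.run(["spd-say", answer])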
I feel silly asking this... but it's doing my head in.
If I use 'https://maps.googleapis.com/maps/api/place/autocomplete/json' and set the input parameter to, say, 'Palazzo Cast', I get about 5 suggestions, none of which is the one I'm looking for. If I set input to 'Palazzo Castellania', I get zero results, even though there is a place called this (see below). I've set the region parameter to 'mt'.
If I use 'https://maps.googleapis.com/maps/api/place/findplacefromtext' and set the input parameter to 'Palazzo Castellania', I get 'the Ministry of Health', which is correct. However, if I put in a partial string, I get only a single candidate, which is something different. There doesn't seem to be a way to get multiple place candidates.
I'm guessing that on the API side I have to do a multi-step process, but it would be good to get some input.
My thoughts:
I start with 'https://maps.googleapis.com/maps/api/place/autocomplete/json'; if I get an empty result, I try 'https://maps.googleapis.com/maps/api/place/findplacefromtext'.
If I get a single result from either, I can pass the place ID to the Places API to get more detailed data.
Make sense? It feels ugly...
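Roughly what I have in mind, as an untested sketch (API_KEY is a placeholder, and the region biasing mirrors what I described above):

import requests

API_KEY = "YOUR_API_KEY"  # placeholder
BASE = "https://maps.googleapis.com/maps/api/place"

def find_place(text):
    # Step 1: try autocomplete first.
    r = requests.get(BASE + "/autocomplete/json",
                     params={"input": text, "region": "mt", "key": API_KEY})
    predictions = r.json().get("predictions", [])
    if predictions:
        return [p["place_id"] for p in predictions]
    # Step 2: fall back to findplacefromtext (returns place_id by default).
    r = requests.get(BASE + "/findplacefromtext/json",
                     params={"input": text, "inputtype": "textquery",
                             "key": API_KEY})
    return [c["place_id"] for c in r.json().get("candidates", [])]

ids = find_place("Palazzo Castellania")
if len(ids) == 1:
    # A single hit: pass the place_id to Place Details for the full record.
    details = requests.get(BASE + "/details/json",
                           params={"place_id": ids[0], "key": API_KEY}).json()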
Edit
So, watching how https://www.google.com.mt/ does it: while typing, it uses suggest (and never gives the right answer, just like the API), and then when I hit Enter it uses search and gives the correct answer, leading me to the conclusion that there are actually two databases at work!
Basically, "it's by design". There is no fix as of Feb 2023. My thought is to cache results and do a first search against that; otherwise I'll probably use Bing or HERE.
I am trying to write a Python script with Tweepy that will track all the tweets from a specific country since a given date that contain any of my chosen keywords. I have chosen a lot of keywords, around 24-25.
My keywords are: vigilance, anticipation, interesting, ecstasy, joy, serenity, admiration, trust, acceptance, terror, fear, apprehensive, amazement, surprise, distraction, grief, sadness, pensiveness, loathing, disgust, boredom, rage, anger, annoyance.
For reference, my code so far is:
places = api.geo_search(query="Canada", granularity="country")
place_id = places[0].id
public_tweets = tweepy.Cursor(
    api.search,
    q="place:" + place_id + " since:2020-03-01",
    lang="en",
).items(num_tweets)
Please help me with this question as soon as possible.
Thank You
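A hedged sketch of how the keywords might be folded into that query with Twitter's OR operator (untested; it assumes api, place_id, and num_tweets are defined as in the snippet above). Note that the standard search API only returns tweets from roughly the last 7 days, so since:2020-03-01 will not reach back that far:

# Shortened list for illustration; substitute the full set of keywords.
keywords = ["joy", "fear", "anger", "sadness"]
# Parenthesize the OR group so it combines correctly with the place filter.
query = "(" + " OR ".join(keywords) + ") place:" + place_id

public_tweets = tweepy.Cursor(
    api.search,
    q=query,
    lang="en",
).items(num_tweets)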
I am using abenassi/Google-Search-API (https://github.com/abenassi/Google-Search-API) to make multiple Google queries in a small Python script. I typically only need the first result (link), but the library is built to collect whole pages of results. So far I have been limiting the results like this:
results = google.search(query)
for result in results[:1]:
    loc = result.link
The problem is that the script is slow, I think as a result of having to wade through the whole page before I get my one link. Does anyone see something obvious I'm missing, or alternatively, a simple way to modify the standard_search module (https://github.com/abenassi/Google-Search-API/blob/master/google/modules/standard_search.py) to limit results to the first link only? Thanks!
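One alternative, swapping the library out entirely: fetch a single result page yourself and stop at the first link. Google's markup changes often, so the div.g selector below is an assumption that may need updating, and scraped hrefs sometimes come wrapped in /url?q=... redirects:

import requests
from bs4 import BeautifulSoup

def first_link(query):
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query, "num": 1},  # ask for one result
        headers={"User-Agent": "Mozilla/5.0"},  # default UA tends to be blocked
        timeout=10,
    )
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # First organic result, assuming the selector still matches.
    anchor = soup.select_one("div.g a")
    return anchor["href"] if anchor else None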
I'm currently working on a behavioral targeting application and I need a considerably large keyword database/tool/provider that enables applications to reach similar keywords for a given keyword. I recently found out about Freebase, which provided a similar service before Google acquired it and integrated it into the Knowledge Graph. I was wondering if it's possible to get a list of related topics/keywords for a given entity.
import json
import urllib.parse
import urllib.request

api_key = 'API_KEY_HERE'
query = 'Yoga'
service_url = 'https://kgsearch.googleapis.com/v1/entities:search'
params = {
    'query': query,
    'limit': 10,
    'indent': True,
    'key': api_key,
}
url = service_url + '?' + urllib.parse.urlencode(params)
response = json.loads(urllib.request.urlopen(url).read())
for element in response['itemListElement']:
    print(element['result']['name'] + ' (' + str(element['resultScore']) + ')')
The script above returns the results below, though I'd like to receive topics related to yoga, such as health, fitness, gym and so on, rather than things that have the word "Yoga" in their name.
Yoga Sutras of Patanjali (71.245544)
Yōga, Tokyo (28.808222)
Sri Aurobindo (28.727333)
Yoga Vasistha (28.637642)
Yoga Hosers (28.253984)
Yoga Lin (27.524054)
Patanjali (27.061115)
Yoga Journal (26.635073)
Kripalu Center (26.074436)
Yōga Station (25.10318)
I'd really appreciate any suggestions, and I'm also open to using any other API if there is any that I could make use of. Cheers.
See your point :) So here's the script I use for that, built on Serpstat's API. Here's how it works:
The script collects keywords from Serpstat's database
Then it collects search suggestions from Serpstat's database
Finally, it collects search suggestions from Google (a minimal sketch of this step appears below)
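For the Google step, a minimal sketch; the suggestqueries endpoint is unofficial but widely used, and its response shape may change:

import requests

def google_suggestions(keyword):
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": keyword},  # "firefox" yields plain JSON
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape: [query, [suggestion, suggestion, ...]]
    return resp.json()[1]

print(google_suggestions("plastic bottle"))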
Note that for the script to work correctly, it's preferable to fill in all the input boxes, though not all of them are required.
Keyword — required keyword
Search Engine — the search engine for which the analysis will be carried out. For example, for US Google you need to set g_us. The entire list of available search engines can be found here.
Limit — the maximum number of phrases from the organic results that will participate in the analysis. You cannot set more than 1000 here.
Default keys — a list of one- or two-word keywords. You should give each of them some "weight" to receive some kind of result if something goes wrong.
Format: type; keyword; weight. Each keyword should be written on a new line.
Types:
w — one word
p — two words
Examples:
"w; bottle; 50" — initial weight of word bottle is 50.
"p; plastic bottle; 30" — initial weight of phrase plastic bottle is 30.
"w; plastic bottle; 20" — incorrect. You cannot use a two-word phrase for the "w" type.
Bad words — comma-separated list of words you want the script to exclude from the results.
Token — here you need to enter your token for API access. It can be found on your profile page.
You can download the source code for the script here.
I was just curious whether anyone has had success installing and importing the geopy geocoding module into Python. I believe I have done this successfully, but the results of my queries are atypical.
For instance, whenever I enter an address with just a street name and number, it gives me a Query Error like this:
geopy.geocoders.googlev3.GQueryError: The geocode was successful but returned no results. This may occur if the geocode was passed a non-existent address or a latlng in a remote location.
However, if I include a city and/or state in the address, the query runs but returns the place as only the city and state; it basically disregards the street number and street name and gives me the lat and lng of the center of that city.
Example:
place, (lat, lng) = g.geocode("4224 Evans to Locks Road Augusta Georgia")
I was just wondering whether there is perhaps a problem with my installation or something else that is preventing the geocode from working. I honestly have no idea how this kind of problem occurs and am very new to geocoding. I have moderate experience working with Python, however. Any help is appreciated.
I'm not too familiar with geopy; however, if you use Geocoder (on GitHub/PyPI) you should be able to accomplish everything you need, and there is lots of documentation to help you get started.
Here's how you would use it following your example:
>>> import geocoder
>>> g = geocoder.google("4224 Evans to Locks Road Augusta Georgia")
>>> g.latlng
(33.5411157, -82.11359039999999)
>>> g.json
...
How to install:
pip install geocoder