Hi, I am using Python/BeautifulSoup to scrape information from Foursquare's XML.
It seems that after a certain number of requests, I cannot access Foursquare at all.
Does anyone know if I need an access token even if I'm not trying to develop anything?
Scraping Foursquare content is prohibited ("You shall not (directly or indirectly)... harvest or scrape any Content from the Service;"), and you have probably triggered the protection mechanism that guards against developers doing this.
See the Foursquare Terms of Service for more details.
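If all you need is venue data, the sanctioned route is the official API, which works with client credentials rather than scraping. Below is a minimal sketch, assuming you register an app to obtain a client ID and secret (the credential values, location, and query are placeholders):

```python
import requests

# Placeholder credentials obtained by registering an app with Foursquare
CLIENT_ID = "YOUR_CLIENT_ID"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"

# Userless venue search via the documented v2 API instead of scraping pages
resp = requests.get(
    "https://api.foursquare.com/v2/venues/search",
    params={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "v": "20130101",          # API version date required by v2
        "near": "New York, NY",   # placeholder location
        "query": "coffee",        # placeholder search term
    },
)
resp.raise_for_status()
for venue in resp.json()["response"]["venues"]:
    print(venue["name"], venue["location"].get("address"))
```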
I want to know what keywords bring users to our website. The result should be such that every time a user clicks through to the company's website, the page URL, timestamp, and the keywords entered in the search are recorded.
I'm not really much of a coder, but I do understand the basics of Google Tag Manager. So I'd appreciate some solutions that can allow me to implement this in GTM's interface itself.
Thanks!
You don't track them. Well, that is unless you can deploy your GTM on Google's search result pages. Which you're extremely unlikely to be able to do.
HTTPS prevents the search query parameters from being passed along in the referrer, which is the core reason the keywords aren't available.
You still can, technically, track Google search keywords for the extremely rare users who manage to reach Google over plain HTTP, but again, there's no need to do anything in GTM: GA will pick those up automatically with its legacy keyword tracking.
Finally, you can use Google Search Console, where Google reports which keywords were used to reach your site. That data, however, is heavily sampled and can't be joined to individual GA hits. You can link GSC with GA, but that only gives GA a separate GSC report; there is no real data join.
My primary question is: Can connected apps add relevant information to venue pages?
I am a coder and avid Foursquare user. The basic information about venues is cool (location, photos, tips, etc.), but while I have my meal (in the case of a restaurant) I'd like to have more to read about the venue, such as the back-story: the history of the place, when it was founded and by whom, and other interesting facts about it.
I thought connected apps would be the answer and that perhaps I could write a simple wiki to integrate with the venue page, letting users contribute their knowledge about the venue. But from what I've read, that doesn't seem to be the intent of a connected app or the API. Am I correct in this assumption? And if so, can this idea be dropped into the Foursquare suggestion box? I think it would make a great value-added feature, especially for those of us nerds who like to read.
This is a great use case for connected apps. Connected apps can reply to check-ins with up to 200 characters of text, and a link to more content. This can be used to provide additional information about the venue. Take a look at https://foursquare.com/apps/ to see examples of connected apps, and the kinds of responses they give to check-ins.
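As a rough illustration of that reply flow, a connected app that receives a check-in push could post back a short blurb plus a link to its own wiki page for that venue. This is only a sketch, assuming the v2 checkins/CHECKIN_ID/reply endpoint and the OAuth token granted when the user connected the app; the reply text and wiki URL are placeholders:

```python
import requests

def reply_to_checkin(checkin_id, oauth_token, venue_wiki_url):
    """Reply to a check-in with a short blurb and a link to more content."""
    resp = requests.post(
        f"https://api.foursquare.com/v2/checkins/{checkin_id}/reply",
        params={"oauth_token": oauth_token, "v": "20130101"},
        data={
            # Replies are limited to roughly 200 characters of text plus a link
            "text": "Founded in 1923. Tap for the full back-story of this place.",
            "url": venue_wiki_url,  # placeholder: your wiki page for this venue
        },
    )
    resp.raise_for_status()
    return resp.json()
```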
Is anyone aware of a way to get historical location info from Foursquare Venues? The problem is that if I look at a user's check-in history and one of the venues has changed addresses, there is no way to get the old address/location via the API. For apps built on that data, it means that when a venue moves, all the check-ins prior to the move will have the wrong venue info.
Has anyone figured out a way to get that info? (Obviously this isn't a problem if the user has authed into our service before the change, because we have it stored, but afterwards we have no way to access it.)
There's no way to access the old location information, but information should only get strictly better with time. If you have a situation where location information for a venue has become less accurate, you should flag the venue and the super user community will try to fix it.
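Flagging can also be done programmatically rather than through the app. Here is a minimal sketch, assuming the v2 venues/VENUE_ID/flag endpoint and a user OAuth token; the problem value is an assumption based on the documented flag reasons:

```python
import requests

def flag_mislocated_venue(venue_id, oauth_token):
    """Flag a venue whose location data looks wrong so super users can review it."""
    resp = requests.post(
        f"https://api.foursquare.com/v2/venues/{venue_id}/flag",
        params={"oauth_token": oauth_token, "v": "20130101"},
        data={"problem": "mislocated"},  # assumed flag reason; check the venues/flag docs
    )
    resp.raise_for_status()
    return resp.json()
```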
I'm looking to pull some information about the people who check in at my location, to learn a bit about them. The plan is to offer them a special through Foursquare once they've completed the form. Has anyone done this? Is it even possible?
See https://foursquare.com/business/merchants/claiming for information about claiming your venue and https://developer.foursquare.com/overview/merchants for the relevant API endpoints.
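As a rough sketch of what that could look like once the venue is claimed, you could poll who is checked in right now using the venue manager's OAuth token and then point those users at your form before unlocking the special. The endpoint and response shape below are assumptions based on the public v2 venues/herenow documentation:

```python
import requests

def users_here_now(venue_id, manager_oauth_token):
    """List users currently checked in at a claimed venue (requires manager auth)."""
    resp = requests.get(
        f"https://api.foursquare.com/v2/venues/{venue_id}/herenow",
        params={"oauth_token": manager_oauth_token, "v": "20130101"},
    )
    resp.raise_for_status()
    items = resp.json()["response"]["hereNow"]["items"]
    # Each item is assumed to be a check-in object with the person under "user"
    return [item.get("user", {}) for item in items]
```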
I'm looking for a way to verify whether a site is categorized in any URL filtering databases.
Does anyone know of such a database with a free API?
Check out the Google Safe Browsing API. It allows you to check URLs against Google's blacklists of suspected phishing and malware pages. Here is the developer guide.
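For reference, here is a minimal sketch of a lookup against the v4 threatMatches:find endpoint, assuming you have created an API key in the Google developer console (the key and client ID below are placeholders):

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; create one in the Google developer console

def check_url(url):
    """Return any Safe Browsing threat matches for a single URL (empty dict if clean)."""
    body = {
        "client": {"clientId": "my-url-checker", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }
    resp = requests.post(
        "https://safebrowsing.googleapis.com/v4/threatMatches:find",
        params={"key": API_KEY},
        json=body,
    )
    resp.raise_for_status()
    return resp.json()  # an empty object means no matches were found
```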
BrightCloud just released an API for URL classification that is more comprehensive than the Google Safe Browsing API - it does cost money, but it's cheap. In addition to categories like "Adult" and "Gambling", it also has security-oriented categories like phishing, malware, and spam.
The full list of categories is here: http://brightcloud.com/masterdburllist.asp
Check out the SimilarWeb API. The API performs domain-level classification and is based on SimilarSites' content analysis and machine learning.
Link to the API: https://developer.similarweb.com/
Disclosure: I'm affiliated with SimilarGroup, the creator of SimilarWeb.