Hey, I was wondering if there is a specific query I could enter into Google that would give me a list of the computer science home pages of all universities with a '.edu' address.
I require this because I have to crawl these websites to accumulate lecture notes for courses.
Hence any help would be much appreciated.
Or, if anyone knows of a maintained index with a list of URLs of all such university computer science websites, that would do as well.
Thanks in advance.
You could try:
site:.edu computer science
Example: https://www.google.com/search?q=site:.edu%20computer%20science
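If you need the results programmatically (for example, to seed your crawler), one option is Google's Custom Search JSON API. Here is a minimal sketch, assuming you have created your own API key and search engine ID; the placeholders below are not real credentials:

import requests  # pip install requests

# Hypothetical credentials: create these yourself in the Google Cloud
# console and the Programmable Search Engine control panel.
params = {
    "key": "YOUR_API_KEY",
    "cx": "YOUR_CSE_ID",
    "q": "site:.edu computer science",  # same query as above
}
resp = requests.get("https://www.googleapis.com/customsearch/v1", params=params)
for item in resp.json().get("items", []):
    print(item["link"], "-", item["title"])

Note that scraping the normal Google results page is against its terms of service, which is why an official API is the safer route for a crawler's seed list.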
Related
I have an e-commerce website and I need to display products on the homepage based on user interest, similar to the advertisements Facebook and Google show based on the searches we have done on the internet.
Is there any API from Facebook, Google, or any other website for fetching user interests?
I have wondered how Facebook, Google, and Booking.com collaboratively track us even though they are different companies.
Is there any common gateway or interface where these big companies share user info, i.e. a shared cookie-based tracking system acting as a centralized big-data model for behavioral advertising?
I am looking for answers from experts on this. I really need to build a system for tracking user interest based on their searches on Google and other websites.
So you need to use something called a recommender system: using a machine learning algorithm, you recommend products to people based on ratings and interest, drawing on previous data from the same user or from other users (just like when you get recommended videos on YouTube).
This topic is too big for me to explain step by step here, and you first need a good understanding of basic machine learning techniques such as classification, regression, etc.
So if you're interested, check out the Coursera course called Machine Learning (Stanford University). It's taught by machine learning rockstar Andrew Ng, and it doesn't just teach you machine learning; it takes you from somebody with no idea about the subject to (technically) an expert. The course uses MATLAB/Octave and has an entire section on recommender systems, which is what you need. After you've finished, just implement what you learned in the language of your choice!
PS:
You can always look up tutorials online for implementing recommender systems, but you would waste a lot of time, because without understanding the theory (which you will master in the course I've recommended) you'd have no idea what you're doing. The course can also easily be found on YouTube, but taking it for free on Coursera will help you more, because you'll get hands-on programming experience with the different subjects.
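To make the idea concrete, here is a minimal item-based collaborative-filtering sketch in Python. The ratings matrix and item names are invented for illustration; a real system would be trained on actual interaction data:

import numpy as np

# Rows = users, columns = items; 0 means "not rated yet".
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)
items = ["book_a", "book_b", "book_c", "book_d"]

def cosine_sim(a, b):
    """Cosine similarity between two rating vectors."""
    norm = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / norm if norm else 0.0

# Item-item similarity matrix.
n_items = ratings.shape[1]
sim = np.array([[cosine_sim(ratings[:, i], ratings[:, j])
                 for j in range(n_items)] for i in range(n_items)])

def predict(user, item):
    """Predict a rating as a similarity-weighted average over the
    items this user has already rated."""
    rated = np.nonzero(ratings[user])[0]
    weights = sim[item, rated]
    if weights.sum() == 0:
        return 0.0
    return ratings[user, rated] @ weights / weights.sum()

# Recommend the unrated item with the highest predicted rating for user 0.
user = 0
unrated = [i for i in range(n_items) if ratings[user, i] == 0]
best = max(unrated, key=lambda i: predict(user, i))
print(f"Recommend {items[best]} (predicted rating {predict(user, best):.2f})")

Item-based filtering is shown here because it is the simplest to read; the Coursera course mentioned above covers the matrix-factorization variant, which scales better.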
Google and Facebook have their own algorithms to track user activity, and they use them for showing ads on their websites.
I don't think that is available for general use.
I think you won't be able to track interests live the way Facebook, Google, Amazon, and Twitter do; they collect your interests from their own platforms.
They also manage large ad networks, so once you click an ad, it is tracked. Google Play and iTunes also have access to your phone.
I am trying to develop an application where users will post content. It is a user-generated-content application, so every post will have a location attached to it, so that it can later be filtered for other users in that area or city.
For example, say users can list books for sale on my website. While listing, I want to give them a text box where they can enter a location. The entered location should be valid, so how do I verify that?
Also, after the book is posted, when someone else searches for a book in their location, they should get results not only for their own location but for other nearby locations too.
These are a few of my questions. If someone can answer them and guide me, I'd really appreciate it. Thanks.
To verify people's location, you'll want to use the HTML5 geolocation capabilities. Take a look here for a demo: http://merged.ca/iphone/html5-geolocation
Searching nearby is a bit trickier, but there are a few options. You could use a geocoding service (Google and Bing for example both offer geocoding REST APIs) to determine if people are in the same city, zip code, etc. Perhaps a better solution is to use database queries to search for nearby posts. Many databases now offer built-in geospatial data types to support exactly this kind of scenario. MySQL for example: http://dev.mysql.com/doc/refman/5.0/en/mysql-spatial-datatypes.html
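If you want to prototype the "nearby" part before committing to a database, here is a minimal Python sketch of a radius search using the haversine formula. It assumes each post has already been geocoded to latitude/longitude (the posts and coordinates below are made up):

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical posts: (title, lat, lon).
posts = [
    ("Intro to Algorithms", 40.7128, -74.0060),   # New York
    ("Calculus textbook",   40.0583, -74.4057),   # New Jersey
    ("SICP",                34.0522, -118.2437),  # Los Angeles
]

def nearby(lat, lon, radius_km=100):
    """Return posts within radius_km of the searcher's location."""
    return [(title, round(haversine_km(lat, lon, plat, plon), 1))
            for title, plat, plon in posts
            if haversine_km(lat, lon, plat, plon) <= radius_km]

print(nearby(40.7306, -73.9352))  # searcher in NYC -> NY and NJ posts

For anything beyond a prototype, the database-side geospatial types mentioned above will be faster, since they can use spatial indexes instead of scanning every post.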
I'm going to write a web crawler (an application that crawls the web from one site to another).
How can I find a list of available domains/IPs on the internet (as complete as possible)?
How do search engines find websites (what do they use as a reliable list of registered IPs/domains as a starting point)?
Thanks
As Michael P's comment indicates, it depends on what your objective is.
My company recently wanted to answer a question about third-party tools used on leading websites. I used Alexa as a starting point to find the top (by traffic) websites, and created a parser that can answer the specific question my company asked. If you start from such a list, you can program your web crawler to follow the links it encounters to broaden your knowledge of sites on the web.
Hopefully that helps you think about the problem.
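To make the "follow the links it encounters" part concrete, here is a rough breadth-first crawler sketch in Python. The seed URLs are placeholders, and a real crawler should also respect robots.txt and rate-limit its requests:

import urllib.parse
from collections import deque

import requests                   # pip install requests
from bs4 import BeautifulSoup     # pip install beautifulsoup4

seeds = ["https://example.edu/", "https://example.org/"]  # hypothetical seeds

def crawl(seeds, max_pages=20):
    """Breadth-first crawl: fetch a page, queue its links, repeat."""
    seen = set(seeds)
    queue = deque(seeds)
    crawled = 0
    while queue and crawled < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages
        crawled += 1
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            # Resolve relative links and drop URL fragments.
            link = urllib.parse.urljoin(url, a["href"]).split("#")[0]
            if link.startswith("http") and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

print(len(crawl(seeds)), "URLs discovered")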
I am working on a website and I would like to know the number of people who have visited it. Can someone tell me what to do?
Use Google Analytics: http://www.google.com/analytics/
I would give you code to insert, but to be honest the best option is to use something like Google Analytics. It gives you a very good analysis of your website visits and has many features that would take you a very long time to develop yourself.
Since you've tagged this with asp.net, I presume you're running on IIS. Make sure logging is enabled for the site you're working with and then you can determine from the log files how many users are coming to your site by IP addresses.
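As a rough illustration of the log-file approach, here is a sketch that counts distinct client IPs in an IIS W3C extended log. It assumes the default format, where a "#Fields:" directive names the columns and "c-ip" is the client IP column; the log path is a placeholder:

from pathlib import Path

def unique_ips(log_path):
    """Collect distinct c-ip values from a W3C extended log file."""
    ips, fields = set(), []
    for line in Path(log_path).read_text().splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]      # column names for data rows
        elif line.startswith("#") or not line:
            continue                       # other directives / blanks
        elif fields:
            row = dict(zip(fields, line.split()))
            if "c-ip" in row:
                ips.add(row["c-ip"])
    return ips

ips = unique_ips(r"C:\inetpub\logs\LogFiles\W3SVC1\u_ex231001.log")
print(f"{len(ips)} unique visitor IPs")

Keep in mind that unique IPs undercount users behind shared NATs and overcount users whose IP changes, which is part of why analytics tools use cookies instead.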
Since it hasn't been mentioned here in years, let me add that AWStats is very different from Google Analytics, but it may still be a good web server traffic analysis tool for network administrators.
I'm working on a project that compares the load times of websites on different servers around the world.
For this project I need to buy hosting for a website on several servers, but I don't know in which country each server is actually located.
I found a Firefox add-on called "Flagfox" (I'm not affiliated with it; to check it out, visit http://flagfox.net/) which indicates in which country a site is located.
I want to know:
a) How does Flagfox know in which country a site is located? (It is not by the extension, e.g. .com, .uk.)
b) How can I know where within the country the site is located? Knowing it's in the U.S.A. is not very helpful, because the server could be in New York or in Los Angeles, which are several time zones apart.
c) If I don't know the answer to (a), how can I verify that the data from Flagfox, or other software for that matter, is reliable?
Thanks
From the Flagfox site:
It works by accessing an internal IP address location database, basically a rough map of the physical layout of the Internet, based on data provided by Maxmind.
You can get access to the maxmind dataset at http://www.maxmind.com/app/ip-location.
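For a concrete example of such a lookup, here is a hedged sketch using the geoip2 Python package (pip install geoip2) against MaxMind's free GeoLite2 City database. It assumes you have already downloaded a GeoLite2-City.mmdb file from MaxMind; the file path and IP address below are just examples:

import geoip2.database

# GeoLite2-City.mmdb must be downloaded from MaxMind first (free signup).
with geoip2.database.Reader("GeoLite2-City.mmdb") as reader:
    response = reader.city("128.101.101.101")   # example IP address
    print(response.country.iso_code)            # country, e.g. "US"
    print(response.city.name)                   # city level, answering (b)
    print(response.location.latitude, response.location.longitude)

The city-level fields address question (b), and cross-checking the same IP against a second geolocation provider is a practical way to approach question (c).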
There's a good article about this here, complete with sample code. It references a database that is available here.
The database can also be obtained here.