I'm just beginning a project that involves working with a few of Google's APIs (for .NET), specifically the Contacts List, Calendar and Gmail. While Google does provide a wealth of information through their code.google.com network, finding what I need to get started has thus far proven to be a monumental task. What I'm hoping to find is a "big picture" look at what Google makes available to developers as well as a few sample pieces of code to ease me off the starting block.
Does anyone know where I can find a handful of simple, useful examples of developing basic applications with Google's APIs? (I've come across a few examples within code.google.com, but they're so rudimentary that they're not helpful.) Are there any resources (in print or online) that can spoon-feed a Google novice without burying them? Is there a special, hidden nook within code.google.com for Google beginners?
Any information anyone could share would be very helpful.
http://code.google.com/apis/ajax/playground/
I'm a 24yo Web Developer trying to improve my knowledge in this field.
I've been working on web since I was 12 and feel like I lack some fundamentals.
Many times I've been rejected in interviews not because of a lack of talent, programming knowledge, or a small portfolio (in fact, my portfolio is pretty big for a 24 yo dev), but because I can't answer many fundamental questions, such as questions about the differences between and terminology around CRUD, REST, and SOAP, OOP-related questions, and such.
Going to university right now is impossible for many reasons, so I've been trying to get my hands on some books about dev fundamentals (mainly oriented toward web dev). What are the best ones, and why? Which resources (they needn't necessarily be books) should I look deeply into? And in the end, what suggestions could you give to become a better developer?
I can only share from my own experience. During technical interviews, I had a cheat sheet printed and ready. That helped a lot in the telephone interview, but it also served as a study guide. I can recommend the "PHP Zend Certification Study Guide" and php-fig.org to freshen up on design patterns and other things.
When the interviewer thinks you are qualified, you'll need to write code anyway. During the coding exercise you will probably write OOP PHP with no framework. Prepare a simple MVC app with some basic CRUD functionality, sessions, and a user login.
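The shape of that exercise is the same in any language. Here is a minimal, hypothetical sketch of the MVC-plus-login structure in Python (all class and method names are illustrative, not from any real framework; swap in PHP and a real database for the actual interview):

```python
class UserModel:
    """Model: stores users in memory instead of a database table."""
    def __init__(self):
        self._rows = {}
        self._next_id = 1

    def create(self, name, password):
        row = {"id": self._next_id, "name": name, "password": password}
        self._rows[self._next_id] = row
        self._next_id += 1
        return row

    def read(self, user_id):
        return self._rows.get(user_id)

    def update(self, user_id, **fields):
        row = self._rows.get(user_id)
        if row:
            row.update(fields)
        return row

    def delete(self, user_id):
        return self._rows.pop(user_id, None)


class AuthController:
    """Controller: mediates between the 'view' (the caller) and the model."""
    def __init__(self, model):
        self.model = model
        self.session = {}  # stand-in for PHP's $_SESSION

    def register(self, name, password):
        return self.model.create(name, password)

    def login(self, user_id, password):
        row = self.model.read(user_id)
        if row and row["password"] == password:
            self.session["user_id"] = user_id
            return True
        return False

    def logout(self):
        self.session.pop("user_id", None)


if __name__ == "__main__":
    ctrl = AuthController(UserModel())
    user = ctrl.register("alice", "s3cret")
    print(ctrl.login(user["id"], "s3cret"))  # True
```

In a real answer you'd also hash the password rather than compare it in plain text, but the model/controller/session split is what interviewers usually want to see.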
I have an e-commerce website, and I need to display products on the homepage based on user interest, like the advertisements Facebook and Google show based on the searches we've done on the internet.
Is there any API from Facebook, Google, or any other website for fetching user interests?
I have wondered how Facebook, Google, and booking.com collaboratively track us even though they are different companies.
Is there any common gateway or interface where these big companies share user info, using cookies to track users across all of them as a centralized big-data model for behavioral advertising?
I'm looking for answers from experts on this. I really need to build a system for tracking user interest based on their searches on Google and other websites.
So you need to use something called a recommender system: using a machine learning algorithm, you recommend products to people based on ratings and interest, drawing on previous data from the same user or from other users (just like when you get recommended videos on YouTube).
This topic is too big for me to explain step by step here, and you first need a good understanding of basic machine learning, such as classification, regression, etc.
So if you're interested, make sure to check out a Coursera course called Machine Learning (Stanford University), taught by machine learning rockstar Andrew Ng. It doesn't just teach you machine learning; it takes you from somebody with no idea about the subject to an expert (technically). The course uses MATLAB/Octave and has an entire section on recommender systems, which is exactly what you need. After you've finished, just implement what you learned in the language of your choice!
PS:
You can always look up tutorials online for implementing recommender systems, but you'll waste a lot of time, because without understanding the theory (which you'll master in the course I've recommended) you'd have no idea what you're doing. The course can also be found easily on YouTube, but taking it for free on Coursera will help you more, because you'll get hands-on programming experience with the different subjects.
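To make the idea concrete, here is a toy collaborative-filtering sketch in pure Python (no ML library): it scores the items a user hasn't rated by the similarity-weighted ratings of other users. The ratings matrix is invented example data, and a real recommender would be far more sophisticated:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two rating vectors (0 = unrated)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(ratings, user, top_n=2):
    """Rank unrated items by similarity-weighted votes from other users."""
    scores = {}
    for other, other_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], other_ratings)
        for item, r in enumerate(other_ratings):
            if ratings[user][item] == 0 and r > 0:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Rows are users, columns are products; 0 means "not rated yet".
ratings = {
    "alice": [5, 3, 0, 0],
    "bob":   [4, 0, 4, 1],
    "carol": [1, 1, 5, 5],
}
print(recommend(ratings, "alice"))  # item 2 ranks above item 3 for alice
```

Alice's tastes resemble Bob's more than Carol's, so the item Bob rated highly (item 2) comes out on top. The course's recommender-systems section covers the proper matrix-factorization version of this idea.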
Google and Facebook have their own algorithms to track user movement, and they use them for showing ads on their websites.
I don't think that is available for common use.
I think you won't be able to track their interests live on Facebook, Google, Amazon, or Twitter. They collect your interests from their own platforms.
They also manage large ad networks, so once you click an ad, it's tracked. Google Play and iTunes also have access to your phone.
I understand that the same work shouldn't be repeated when Google CSE is already there, so what are the reasons to consider implementing a dedicated search engine for a public-facing website similar to SO (and why did Stack Overflow probably do that)? The paid version of CSE (Google Site Search) already eliminates several of the drawbacks that forced dedicated implementations. Cost may be one reason not to choose Google CSE, but what are the other reasons?
Another thing I want to ask: my site is of a similar kind to Stack Overflow, so when Google indexes its content every now and then, won't that overload my database servers with lots of queries, perhaps at peak traffic times?
I'm looking forward to using the Google Custom Search API, but I need to clarify whether the 1,000 paid queries I get for $5 are valid for only one day, or whether they carry over as extra queries (beyond the free ones) to the next day and so on. Can anyone clarify this too?
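For reference, the Custom Search JSON API is billed per request, so it helps to see what one query looks like. This sketch only builds the request URL (no network call); `YOUR_API_KEY` and `YOUR_CSE_ID` are placeholders for the credentials you get from the Google API console:

```python
from urllib.parse import urlencode

API_KEY = "YOUR_API_KEY"    # placeholder, from the Google API console
ENGINE_ID = "YOUR_CSE_ID"   # placeholder, your search engine ID

def build_cse_url(query, start=1, num=10):
    """Build a Custom Search JSON API request URL (no network call)."""
    params = {
        "key": API_KEY,
        "cx": ENGINE_ID,
        "q": query,
        "start": start,  # 1-based index of the first result
        "num": num,      # up to 10 results per request
    }
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)

print(build_cse_url("stackoverflow clone"))
```

Each such request counts as one query against your quota regardless of how many results it returns, so paging through results (by bumping `start`) consumes queries quickly; the quota-rollover question itself is best confirmed against Google's current billing terms.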
This depends on the content of your site, the frequency of the updates, and the kind of search you want to provide.
For example, with StackOverflow, there'd probably be no way to search for questions of an individual user through Google, but it can be done with an internal search engine easily.
Similarly, Google can deprecate their API at any time; in fact, if past experience is any indication, Google has already done so with their Google Web Search API, where a lot of non-profits with projects built on that API were left on the street with no Google option for continuing their services. (Paying 100 USD/year for only 20,000 search queries per year may be fine for a posh blog, but it greatly limits what you can actually use the search API for.)
On the other hand, you probably already want to have Google index all of your pages, to get the organic search traffic, so Google CSE would probably use rather minimal resources of your server, compared to having a complete in-house search engine.
Now that Google Site Search is gone, the best search tool alternative for all the loyal Google fans is Google Custom Search (CSE).
Some of the features of Google Custom Search that I loved the most were:
It's free (with ads)
Ability to monetize those ads with your AdSense account
Tons of customization options, including removing the Google branding
Ability to link it with a Google Analytics account for highly comprehensive analytical reports
Powerful auto-correct feature to understand the real intention behind typos
Cons: lacks customer support.
Read More: https://www.techrbun.com/2019/05/google-custom-search-features.html
I am trying to build a search engine comparison tool between Bing and Google that will analyze which of the top n results match. Since I don't have much web-development experience (most of my experience lies in Windows application development and lower-level stuff), I was wondering if somebody could point me in the right direction. I'm guessing that one way of doing this would be to download the search results, somehow find all of the links that are results, and then compare them.
What language can I use to do this?
You could use a language of your choice and build upon their APIs. Bing already has one.
Although Google doesn't have a direct search API (at least none that I know of), if you are a student planning to do research, you can sign up for their university program and they'll give you API access. Trying to download the results page and parse it would be difficult, since Google uses security measures to prevent direct crawls.
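Once you have the two result lists from the APIs, the comparison itself is straightforward. A minimal sketch (the URL lists are made-up stand-ins for what the APIs would return):

```python
def result_overlap(results_a, results_b, n=10):
    """Fraction of the top-n results the two engines share.

    URLs are normalized crudely (scheme and trailing slash stripped)
    so trivially different forms of the same link still match.
    """
    def norm(url):
        return url.rstrip("/").replace("https://", "").replace("http://", "")

    top_a = {norm(u) for u in results_a[:n]}
    top_b = {norm(u) for u in results_b[:n]}
    return len(top_a & top_b) / n

# Made-up example result lists standing in for real API responses.
google_results = ["http://example.com/a", "http://example.com/b",
                  "http://other.org"]
bing_results = ["https://example.com/a/", "http://example.com/c",
                "http://other.org"]
print(result_overlap(google_results, bing_results, n=3))  # 2 of 3 match
```

Real result URLs need more careful normalization (query strings, www prefixes, redirects), but set intersection over the top n is the core of the tool.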
I am a newbie when it comes to information extraction. For the past several days, I have read a lot of academic papers and ordered a book on NLP. I want to figure out how I can build a FlipDog.com like system (hopefully not from scratch). They extract job openings from more than 60,000 company web sites. How do I get started?
I am open to learning any programming language. Has anybody used Mallet/GATE/MinorThird or RoadRunner? Ideally, I want to be able to train a system with the data set particular to my domain and have it extract information based on that. Which platform would you recommend for this purpose?
Thanks!
The faster way to extract job offerings is to use dapper.net (a service for scraping data from websites). You can very easily teach Dapper to extract data using its visual editor. It works very well when your target websites have tables.
To learn information extraction, I suggest starting with LingPipe. It is a Java framework for information extraction, so you do not need to learn the architecture-specific features of a framework such as GATE or Apache UIMA. On the LingPipe website you will find a lot of tutorials which will help you learn various information extraction approaches. After that, I suggest learning GATE and UIMA.
If you want to build such a website, you also need to learn how to use web crawler frameworks (e.g., Nutch), web search engines (Yahoo, Google, Bing), and information retrieval engines (such as Apache Lucene) to provide a search service on top of the extracted data.
Update:
For Python, the best place to start is http://www.nltk.org/
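Before investing in a full IE framework, it can help to see how far a naive pattern-based extractor gets, and why trained models are needed. A hypothetical sketch (the patterns and the sample page text are invented, not from FlipDog):

```python
import re

# A deliberately naive baseline: real IE systems (LingPipe, GATE, UIMA)
# replace hand-written patterns like this with trained models.
JOB_PATTERN = re.compile(
    r"(?:hiring|seeking|looking for)\s+(?:an?\s+)?"
    r"([A-Z][\w+#-]*(?:\s+[A-Z][\w+#-]*){0,3})"
)

def extract_job_titles(text):
    """Return capitalized phrases that follow a hiring-related trigger word."""
    return [m.group(1) for m in JOB_PATTERN.finditer(text)]

page = ("Acme Corp is hiring a Senior Java Developer in Boston. "
        "We are also seeking an Information Extraction Engineer.")
print(extract_job_titles(page))
# ['Senior Java Developer', 'Information Extraction Engineer']
```

This breaks as soon as a page phrases the opening differently, which is exactly the gap that trainable extractors (the kind you'd build with LingPipe or Mallet on your own domain data) are designed to fill.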