Which languages are used to make Google Docs and Box.net? - google-docs

I want to know how Google Docs and Box.net are built, and which technologies are used to make them.

Most of the UI functionality comes from using JavaScript and the HTML DOM together with AJAX, a technique for using JavaScript to make additional requests to the server without reloading the page.
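For illustration, a minimal AJAX request in the browser (the endpoint path here is made up) looks something like this:

    // Minimal AJAX sketch: fetch extra data from the server without a page reload.
    // The '/api/document/42' endpoint and 'content' element id are made up for illustration.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/document/42');
    xhr.onload = function () {
      if (xhr.status === 200) {
        // Update part of the page in place instead of reloading it.
        document.getElementById('content').textContent = xhr.responseText;
      }
    };
    xhr.send();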
In terms of the back-end languages (the ones that provide the dynamic content), Box.net returns a PHPSESSID cookie as part of its Set-Cookie HTTP response header, and they're also running nginx, so I would suspect one of the many PHP frameworks is in use.
As for Google Docs, Google is known to use Python quite extensively. Google's App Engine supports Python and Java (I believe Python was added first), so I suspect they use some customised form of Python running on their own internal instance of App Engine. Their HTTP headers give nothing away, except for the Server: GSE line.

According to HowStuffWorks, Google Docs uses Java for the backend and JavaScript for the front end. Of course, HTML is in the mix there as well.
As for the database it uses, Google won't say. It will use the cloud though, we can be sure of that.

Related

Google apps script in HTML

Is it possible to use Google Apps Script in my HTML? I want to be able to write to a spreadsheet from a form, purely in JavaScript, from an external framework such as Node.js.
https://developers.google.com/apps-script/
Google Apps Script's syntax is JavaScript; however, it is a unique server-side framework that does not behave as a library to applications outside of the Apps Script servers. (No, you won't be able to use Google Apps Script in your node.js app.)
However, that doesn't mean that your node.js app (or any other app on the web) can't interact with your spreadsheet. For instance, your app could authenticate as you using the OAuth API, then access the spreadsheet through the Google Drive API. For an example of this, see Accessing Google Spreadsheets from Node.js
Alternatively, you could roll your own spreadsheet API in Google Apps Script, to support read / write of your sheet via HTTP requests from your node.js app. There are plenty of examples of that, for example Insert new rows into Google Spreadsheet via cURL/PHP - HOW?.
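As a rough sketch of that approach (the spreadsheet ID, sheet name and form fields are placeholders), an Apps Script project deployed as a web app can accept an HTTP POST and append a row:

    // Google Apps Script: deploy as a web app so an external client can POST rows.
    // 'SPREADSHEET_ID', 'Sheet1' and the parameter names are placeholders.
    function doPost(e) {
      var sheet = SpreadsheetApp.openById('SPREADSHEET_ID').getSheetByName('Sheet1');
      sheet.appendRow([new Date(), e.parameter.name, e.parameter.email]);
      return ContentService.createTextOutput(JSON.stringify({status: 'ok'}))
        .setMimeType(ContentService.MimeType.JSON);
    }

Your node.js app (or cURL, as in the linked question) would then POST form-encoded data to the web app's published URL.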
Sure you can. You can use HtmlService to create your web form, then send the submission data to your Spreadsheet with server functions.
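For example, a minimal sketch (file, sheet and function names are made up for illustration): doGet serves the form, and the form calls a server function through google.script.run:

    // Code.gs (server side): serve the form and receive submissions.
    function doGet() {
      return HtmlService.createHtmlOutputFromFile('form');
    }

    function saveEntry(entry) {
      // 'Responses' is a placeholder sheet name for this sketch.
      SpreadsheetApp.getActiveSpreadsheet()
        .getSheetByName('Responses')
        .appendRow([new Date(), entry.name, entry.comment]);
    }

    // form.html (client side): call the server function via google.script.run.
    // <form onsubmit="google.script.run.saveEntry({
    //     name: this.name.value, comment: this.comment.value}); return false;">
    //   <input name="name"><input name="comment"><button>Send</button>
    // </form>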
Nowadays you could use the Google Apps Script API to call your Google Apps Script code from other platforms like Node.js, actually the official docs include a quickstart for Node.js.
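A rough Node.js sketch in the spirit of that quickstart, assuming you already have an authorized OAuth2 client and a deployed Apps Script project ('SCRIPT_ID' and the function name 'appendRowToSheet' are placeholders):

    // Sketch using the googleapis Node.js client to run an Apps Script function.
    // Auth setup (an OAuth2 client with the right scopes) is omitted; the request
    // shape follows the official quickstart but may vary between client versions.
    const {google} = require('googleapis');

    async function callAppsScript(auth) {
      const script = google.script({version: 'v1', auth});
      const res = await script.scripts.run({
        scriptId: 'SCRIPT_ID',
        requestBody: {function: 'appendRowToSheet', parameters: [['hello', 'world']]},
      });
      console.log(res.data);
    }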
You can use HtmlService, but it may also be helpful to read about the Google Hosted Libraries: https://developers.google.com/speed/libraries/
To use a JavaScript library inside GAS, I recommend jQuery.
Alternatively, you can use Node.js inside your external website and make an AJAX request (GET or POST) to a GAS web app, and return something like this from GAS:
ContentService.createTextOutput(e.parameter.callback + "("+Utilities.jsonStringify(JSONDATA)+")").setMimeType(ContentService.MimeType.JSON);
After that, process it inside your AJAX request...
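Put together, that line would typically live inside a doGet handler published as a web app; a sketch (JSONDATA is a placeholder, and JSON.stringify replaces the older Utilities.jsonStringify used above):

    // Apps Script sketch: doGet returns JSONP so a page on another domain can load it
    // via a <script> tag or jQuery's JSONP support.
    function doGet(e) {
      var JSONDATA = {rows: [['a', 1], ['b', 2]]};  // placeholder payload
      return ContentService
        .createTextOutput(e.parameter.callback + '(' + JSON.stringify(JSONDATA) + ')')
        .setMimeType(ContentService.MimeType.JAVASCRIPT);  // script content type for JSONP
    }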
Mogsad is right that you might be better off with the Google Drive API to interact with your Spreadsheet!
...But depending on your exact needs, you might get some interaction between an external service and Google Apps Script using the Content Service (see the Google Developers link).
The Content Service can send back several formats in response to a GET request (ATOM, CSV, ICAL, JAVASCRIPT, JSON, RSS, TEXT, VCARD, XML). By playing around with URL parameters you can get information out of and into a spreadsheet, send an email, trigger some action, etc.
But that is far from a real external library and direct interaction with server-side functions!

Emitting node.js views and scripts as snippets

I have built a node.js app for which I would like to create "snippets" that can be included in external web applications. That means I must create some JavaScript scripts that external apps can include and call, and which load a node.js view and its scripts/CSS.
Does node.js provide a way to do this natively, or do I have to create the script that embeds the view and the related client libraries myself?
Enable cross-origin resource sharing (CORS):
Cross-Origin Resource Sharing (CORS) is a specification that enables truly open access across domain boundaries. If you serve public content, please consider using CORS to open it up for universal JavaScript/browser access.
Must read: http://enable-cors.org/#how-expressJS
Important stuff:
Access-Control-Allow-Origin
Access-Control-Allow-Headers
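In an Express app, for instance, a sketch of setting those headers (allowing any origin here purely for illustration) could look like this:

    // Express middleware sketch: sets the CORS headers listed above.
    // '*' is just for illustration; restrict it to the embedding sites you trust.
    var express = require('express');
    var app = express();

    app.use(function (req, res, next) {
      res.header('Access-Control-Allow-Origin', '*');
      res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept');
      next();
    });

    // Placeholder route: serve the embeddable snippet script to external pages.
    app.get('/snippet.js', function (req, res) {
      res.type('application/javascript');
      res.send('document.write("<div>snippet rendered by the node app</div>");');
    });

    app.listen(3000);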
Sounds like components might be your answer:
https://github.com/component/component
http://tjholowaychuk.com/post/27984551477/components
I hope I understand your question: you want to display an HTML-like snippet on a different site.
One way of doing it is to provide an API, but it will probably be a JSON API, and the other site will have to render the data on its own (somebody already noted CORS is needed for this). You could also just serve JSON with the HTML already in it, though you need to make sure the other app doesn't escape it (see the sketch after this list).
You could have your server serve an image (like Travis CI does), but then the other site will show it as an image, and copy-pasting the text won't be possible.
You could use an iframe, serving HTML to the other site.
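A sketch of the JSON-with-HTML option (the route path, domain and element id are made up): the node app returns a pre-rendered fragment, and the embedding page injects it.

    // Server (Express): return a pre-rendered HTML fragment wrapped in JSON.
    var express = require('express');
    var app = express();

    app.get('/api/snippet', function (req, res) {
      res.header('Access-Control-Allow-Origin', '*');  // CORS, as noted above
      res.json({html: '<div class="widget">Hello from the node app</div>'});
    });
    app.listen(3000);

    // Client (on the external site): fetch the fragment and insert it as-is.
    fetch('https://your-node-app.example.com/api/snippet')
      .then(function (r) { return r.json(); })
      .then(function (data) {
        document.getElementById('snippet-container').innerHTML = data.html;
      });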
There's the possibility you meant something totally different, like reusing your server and client code - in that case I recommend http://browserify.org, or the already mentioned component.js.

How to control web browser using some programming language?

I am looking for a way to control a web browser such as Firefox or Chrome. I need something like Selenium WebDriver, but one that will allow me to open many instances, load URLs, get HTTP headers, response codes, response content, load times, etc.
Is there any library, framework, or API that I could use to do this? I couldn't find one that does it all; Selenium opens the browser and goes to the URL, but I can't get the HTTP headers.
Selenium and Jellyfish are strong options in general. Jellyfish is an option that uses Node.js - although I have no experience with it, I've heard good things from my colleagues.
If you just want to get headers and such, you could use the cURL library or wget. I've used cURL with NuSOAP to query XML web services in PHP, for example. The downside is that these are not functional browsers, and merely perform the HTTP requests and consume the response.
http://seleniumhq.org/
https://github.com/admc/jellyfish
http://curl.haxx.se/
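If you only need headers, status codes and rough timing (and not JavaScript execution), a plain HTTP request from Node.js does the same job as the cURL/wget suggestion above; a minimal sketch, with an example URL:

    // Node.js sketch: fetch a URL and report status, headers and elapsed time.
    // No browser is involved, so pages that rely on JavaScript won't render.
    var https = require('https');

    var start = Date.now();
    https.get('https://example.com/', function (res) {
      console.log('Status:', res.statusCode);
      console.log('Headers:', res.headers);
      var body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () {
        console.log('Load time (ms):', Date.now() - start);
        console.log('Body length:', body.length);
      });
    });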

Using Google AppEngine Urlfetch instead of urllib2

What is the difference between Google's urlfetch and the Python library urllib2?
When I came across Google's urlfetch I thought maybe there were security reasons. Perhaps Google is safer in terms of malicious URLs or something?
Is there any reason why I should choose Google's urlfetch over urllib2?
Note that in GAE, urllib, urllib2 and httplib are just wrappers around URL Fetch (see Fetching URLs in Python).
One difference of the urlfetch module is that it provides you with an interface for making asynchronous requests.
I don't work for Google, so this is just a guess from various GAE posts I've read. App Engine instances don't face the internet directly, but are buried behind layers of Google infrastructure. When a browser makes an HTTP request, it doesn't go straight to your instance, but rather it hits a Google edge server which eventually routes the request to a GAE instance.
Likewise, when making an HTTP request out, your instance doesn't just open a socket (which urllib2 would normally do); rather, it sends the HTTP request to some other Google edge server, which then makes that HTTP request on your behalf. Using urllib2 on GAE will use a GAE-specific version which runs on top of urlfetch.
There is no problem with using the standard libraries in App Engine. The URL Fetch API is just a service to make HTTP requests more "easily" than urllib2. It is more understandable for a novice in Python, and you can easily make a non-blocking request, for example.
I suggest you read some complementary information here: https://developers.google.com/appengine/docs/python/urlfetch/overview
If Google found a security problem in one of the Python standard libraries, I guess it would ship a fix ;)
The difference is: urlfetch only has a functional interface, while urllib and httplib have an OO interface. An OO interface can be very useful. I have seen a good example in the oauth2 client lib, where a request instance is passed to the client lib to check whether the token is valid and authorized.

Ember.js on the server

I'm developing a very dynamic web application via ember.js. The client-side communicates with a server-side JSON API. A user can make various choices and see diced & filtered data from all kinds of perspectives, where all of this data is brought from said API.
Thing is, I also need to generate static pages (that Google can understand) from the same data. These static pages represent pre-defined views and don't allow much interaction; they are meant to serve as landing pages for users arriving from search engines.
Naturally, I'd like to reuse as much as I can from my dynamic web application to generate these static pages, so the natural direction I thought of going for is implementing a server-side module to render these pages which would reuse as much as possible of my Ember.js views & code.
However - I can't find any material on that. Ember's docs say "Although it is possible to use Ember.js on the server side, that is beyond the scope of this guide."
Can anyone point out what would be possible to reuse on the server-end, and best practices for designing the app in a way to enable maximal such reuse?
Of course, if you think my thinking here doesn't make sense, I'd be glad to hear this (and why) too :-)
Thanks!
C.
Handlebars - Ember's templating engine - does run on the server (at least under Node.js). I've used it in my own projects.
When serving an HTTP request for a page, you could quite possibly use your existing templates: pull the relevant data from the DB, massage it into a JSON object, feed it to handlebars along with the right template, then send the result to the client.
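For instance, a minimal sketch of that (the template string and data are stand-ins for your Ember templates and the JSON your API already produces):

    // Node.js sketch: render a Handlebars template to static HTML on the server.
    var Handlebars = require('handlebars');

    var template = Handlebars.compile(
      '<h1>{{title}}</h1><ul>{{#each items}}<li>{{this}}</li>{{/each}}</ul>'
    );

    var html = template({title: 'Landing page', items: ['alpha', 'beta', 'gamma']});
    console.log(html);  // send this string as the response body to the crawler/user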
Have a look at http://phantomjs.org/
You could use it to render the pages on the server and return a plain HTML version.
You have to make it follow Google's AJAX crawling guide: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
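A minimal PhantomJS sketch of that (the URL is a placeholder): load the page, let the client-side JavaScript run, and print the resulting HTML:

    // render.js -- run with: phantomjs render.js
    // Loads a page in headless WebKit and dumps the fully rendered DOM as HTML.
    var page = require('webpage').create();

    page.open('http://your-ember-app.example.com/landing', function (status) {
      if (status !== 'success') {
        console.log('Failed to load page');
        phantom.exit(1);
      }
      // Give the Ember app a moment to finish rendering before grabbing the markup.
      setTimeout(function () {
        console.log(page.content);
        phantom.exit();
      }, 1000);
    });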
