Chrome Extension Database Storage - google-chrome-extension

I am working on a page action extension and would like to store information that all users of the extension can access. The information will be key:value pairs, where the key is a web URL and the value is an array of links.
I need to be able to update the database without redeploying the extension to the Chrome Web Store. What should I look into using? The storage APIs seem oriented towards user data rather than data stored by the app and updated by the developer.

If you want something to be updated without deploying an updated version through CWS, you'll need to host the data yourself somewhere and have the extension query it.
Using chrome.storage.local as a cache for said data would be totally appropriate.
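A minimal sketch of that pattern, assuming Manifest V3 (the URL and the linkMap key name are placeholders for your own setup):

// Fetch the developer-hosted data and cache it in chrome.storage.local.
// DATA_URL is a placeholder; host the JSON wherever you like.
const DATA_URL = 'https://example.com/extension-data.json';

async function refreshData() {
  const response = await fetch(DATA_URL);
  const linkMap = await response.json(); // e.g. { "https://a.com/page": ["link1", "link2"] }
  await chrome.storage.local.set({ linkMap });
}

async function getLinksFor(url) {
  const { linkMap } = await chrome.storage.local.get('linkMap');
  return (linkMap && linkMap[url]) || [];
}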

The question is pretty broad, so I'll give you some ideas from things I've done before.
Since you say you don't want to republish when the DB changes, you need to host the data for the DB yourself. That doesn't mean you need to host an actual database, just some way for the extension to get the data.
Ideally, you are only ever adding new pairs. If so, an easy way is to store your pairs in a public Google spreadsheet. The extension then remembers the last row synced and uses the row feed to fetch data incrementally.
There are a few tricks to getting the spreadsheet sync right; take a look at my GitHub project "Plus for Trello" for a full implementation.
That is a good way to sync incrementally, though if the DB isn't huge you could just host a CSV file and fetch it periodically from the extension.
Now that you can get the data into the extension, decide how to store it. chrome.storage.local or IndexedDB should both be fine, though IndexedDB is usually best if you later need to query anything more complex than a hash table.
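A rough sketch of the hosted-CSV variant, assuming Manifest V3 with the "alarms" permission (the CSV URL, refresh interval, and one url,link,link... row per line are my own assumptions):

// Re-fetch the hosted CSV periodically; CSV_URL is a placeholder.
const CSV_URL = 'https://example.com/pairs.csv';

chrome.alarms.create('syncPairs', { periodInMinutes: 60 });
chrome.alarms.onAlarm.addListener(async (alarm) => {
  if (alarm.name !== 'syncPairs') return;
  const text = await (await fetch(CSV_URL)).text();
  const pairs = {};
  for (const row of text.trim().split('\n')) {
    const [url, ...links] = row.split(','); // naive CSV: no quoted fields
    pairs[url] = links;
  }
  await chrome.storage.local.set({ pairs });
});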

Related

Sync two Shopify stores using API

I'm currently working on two Shopify stores and I want to synchronize the customer accounts between them.
I saw in the Shopify dev docs that there is an API to retrieve all the customers, and I managed to make it work.
My problem is how can I use the JSON data returned to update my 2nd store database?
It is very easy. I did it like this:
Download all the customers from the store you consider the source. Bulk download or using cursors, it does not matter.
For each customer encountered, search the other store for that customer, using the customer email for example. You either get back a record or you don't. If you do, you can update it; if you don't, you can create it.
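A rough sketch of that loop against the REST Admin API (the API version, shop domain, token, and copied fields are placeholders, and error handling is omitted):

// For each source customer, find-or-create it in the target store.
// TARGET_SHOP and ACCESS_TOKEN are placeholders.
const base = 'https://TARGET_SHOP.myshopify.com/admin/api/2023-10';
const headers = {
  'X-Shopify-Access-Token': 'ACCESS_TOKEN',
  'Content-Type': 'application/json',
};

async function syncCustomer(source) {
  const query = encodeURIComponent('email:' + source.email);
  const res = await fetch(`${base}/customers/search.json?query=${query}`, { headers });
  const { customers } = await res.json();
  const payload = {
    customer: { email: source.email, first_name: source.first_name, last_name: source.last_name },
  };
  if (customers.length > 0) {
    // Matched by email: update the existing record.
    payload.customer.id = customers[0].id;
    await fetch(`${base}/customers/${customers[0].id}.json`,
      { method: 'PUT', headers, body: JSON.stringify(payload) });
  } else {
    // No match: create the customer.
    await fetch(`${base}/customers.json`,
      { method: 'POST', headers, body: JSON.stringify(payload) });
  }
}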
Unfortunately, as an exercise in programming there are 1001 ways to do this, and we have no idea what your skills or choices are there.

chrome.storage.sync limits vs. Google Keep

I understand the limitations of QUOTA_BYTES_PER_ITEM and QUOTA_BYTES when using chrome.storage.sync. I'm finding them quite limiting for an annotated-history extension I am writing. I understand that local storage could avoid this problem, but I need users to be able to keep their data as they move to other devices or someday replace their machine. My question is: are there other storage methods to get around this? What about Google Keep? It is an extension, but it appears capable of "unlimited" storage of notes, or at least far more than the limits of chrome.storage.sync. Is it simply not playing by the same rules, or are there other methods I could be using? Currently I'm concatenating information into large strings in JavaScript, storing these with chrome.storage.sync, and parsing that information later as my database.
Thanks for any help!
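For reference, the concatenate-and-store workaround described above is usually implemented by chunking under the per-item quota. A sketch (the 'db_N' key scheme and slack size are made up; the quota counts key plus JSON-serialized value bytes, and the overall QUOTA_BYTES cap still applies, which is the real constraint here):

// Split one large string across several sync items.
const CHUNK = chrome.storage.sync.QUOTA_BYTES_PER_ITEM - 512; // slack for key + encoding

function saveChunked(bigString) {
  const items = {};
  let count = 0;
  for (let pos = 0; pos < bigString.length; pos += CHUNK) {
    items['db_' + count++] = bigString.slice(pos, pos + CHUNK);
  }
  items.db_count = count; // so the reader knows how many pieces to fetch
  chrome.storage.sync.set(items);
}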

Saving user data from Chrome Extension to global variable, then shared for all users

Wondering if this is at all possible. I'm working on a Chrome extension where, as users browse a particular site, certain elements on the page are saved to chrome.storage.local (or chrome.storage.sync). Those elements are then retrieved again later on a different page. However, it would be useful to allow all users to save this data to one global variable/source, and for all users to be able to read from that variable/source. Do Chrome extensions have any method of accomplishing this?
The data in question isn't anything sensitive, it's not authentication info or anything. The reason I'm hoping to do this and not just save static variables or JSON objects within a content script is that the website I'm building this for changes fairly frequently, and I would rather that data not be completely static.
Thank you!
Not possible natively, but there are lots of ways to do it for free (given you have few users and little load, and assuming you don't surpass the free quotas or rate limits), like a Google App Engine backend or a public Google spreadsheet used as the sync point. For the spreadsheet case, you can store the data as rows or put everything in a single cell. For App Engine, the datastore has free read/write quotas and a free storage quota (with limits and rate limits, of course).
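For the spreadsheet case, the read side might look like this (the sheet ID is a placeholder, and the sheet is assumed to be shared publicly so its CSV export needs no auth):

// Read shared rows from a public Google spreadsheet's CSV export.
const SHEET_URL = 'https://docs.google.com/spreadsheets/d/SHEET_ID/export?format=csv';

async function loadSharedRows() {
  const text = await (await fetch(SHEET_URL)).text();
  return text.trim().split('\n').map((row) => row.split(',')); // naive CSV parse
}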

storing quick analytics using redis and node.js

I am new to Redis and would like to store web analytics for a site, both globally and per user activity.
Below is what I have so far.
const redis = require('redis'); // assumes the node_redis package
const client = redis.createClient();
// to get all unique ips (a set)
client.sadd('visitors', ip);
// to record hits per ip (a hash counter)
client.hincrby('hits', ip, 1);
The above works fine so far, and I do get the number of distinct IPs and a hit counter per IP.
The problem comes when storing the activities made by each IP, i.e. storing the links clicked and the searches made, with a datetime.
Can someone please shed light on how best to manage this?
Thanks
The problem comes when storing the activities made by each IP.
You will need a separate structure for storing these.
The simplest rational structure is a "list of actions per session". Take a look at the sorted set commands, which provide a basic framework for building a list of actions within a session.
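A minimal sketch of that structure, reusing the client from the question (the key scheme and JSON payload are just conventions):

// One sorted set per IP, scored by timestamp so range reads
// come back in chronological order.
const entry = JSON.stringify({ action: 'click', target: link, at: Date.now() });
client.zadd('activity:' + ip, Date.now(), entry);

// Replay everything this IP did, oldest first.
client.zrange('activity:' + ip, 0, -1, (err, actions) => {
  if (err) throw err;
  console.log(actions.map((a) => JSON.parse(a)));
});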
This will get you something quickly. However, it is probably not what you really want; in fact, Redis is probably not useful for this at all.
If you want to re-trace an entire site visit you really want to connect to some sort of true analytics framework. There are dozens of website tracking tools that provide this type of functionality, so it's not really clear that building one is very efficient.

Automate the export of Facebook Insights data

I'm looking for a way of programmatically exporting Facebook insights data for my pages, in a way that I can automate it. Specifically, I'd like to create a scheduled task that runs daily, and that can save a CSV or Excel file of a page's insights data using a Facebook API. I would then have an ETL job that puts that data into a database.
I checked out the OData service for Excel, which appears to be broken. Does anyone know of a way to programmatically automate the export of Insights data for Facebook pages?
It's possible and not too complicated once you know how to access the insights.
Here is how I proceed:
Log the user in with the offline_access and read_insights permissions.
read_insights allows me to access the insights for all the pages and applications the user is an admin of.
offline_access gives me a permanent token that I can use to update the insights without having to wait for the user to log in.
Retrieve the list of pages and applications the user is an admin of, and store those in the database.
When I want to get the insights for a page or application, I don't query FQL, I query the Graph API: first I calculate how many queries to graph.facebook.com/[object_id]/insights are necessary, according to the date range chosen. Then I generate a query to use with the Batch API (http://developers.facebook.com/docs/reference/api/batch/). That lets me get all the data for all the available insights, for all the days in the date range, in a single request (see the sketch below).
I parse the rather huge JSON object obtained (which can weigh a few MB, be aware of that) and store everything in the database.
Now that you have all the insights parsed and stored in the database, you're just a few SQL queries away from manipulating the data the way you want: displaying charts, or exporting in CSV or Excel format.
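A trimmed sketch of that batch call (PAGE_ID, the token, and the since/until values are placeholders, and the Graph API details may have changed since this was written):

// One POST to the Batch API covers several insights date windows.
async function fetchInsightsBatch() {
  const batch = [
    { method: 'GET', relative_url: 'PAGE_ID/insights?since=START_1&until=END_1' },
    { method: 'GET', relative_url: 'PAGE_ID/insights?since=START_2&until=END_2' },
  ];
  const res = await fetch('https://graph.facebook.com', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      access_token: 'ACCESS_TOKEN',
      batch: JSON.stringify(batch),
    }),
  });
  return res.json(); // one response object per batch entry
}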
I have the code already written (and published as a temporarily free tool on www.social-insights.net), so exporting to Excel would be quite fast and easy.
Let me know if I can help you with that.
It can be done before the weekend.
You would need to write something that uses the Insights part of the Facebook Graph API. I haven't seen something already written for this.
Check out http://megalytic.com. This is a service that exports FB Insights (along with Google Analytics, Twitter, and some others) to Excel.
A new tool is available: the Analytics Edge add-ins now have a Facebook connector that makes downloads a snap.
http://www.analyticsedge.com/facebook-connector/
There are a number of ways that you could do this. I would suggest your choice depends on two factors:
What is your level of coding skill?
How much data are you looking to move?
I can't answer the first for you, but in your case you aren't moving that much data (in relative terms). I will still share three options of the many available.
HARD CODE IT
This would require a script that accesses Facebook's Graph API, and a computer/server to run that request automatically.
I primarily use AWS and would suggest launching an EC2 instance scheduled to run your script at set times. I haven't used AWS Data Pipeline, but I do know that it is designed so you can have it run a script automatically as well... supposedly with a little less server know-how.
USE THIRD PARTY ADD-ON
There are a lot of people with similar data needs, which has led to a number of easy-to-use tools. I use Supermetrics Free to run occasional audits and make sure that our tools are running properly. Supermetrics is fast and has a really easy interface for accessing Facebook's APIs and several others. I believe you can also schedule refreshes and updates with it.
USE THIRD PARTY FULL-SERVICE ETL
There are also several services and freelancers that can set this up for you with little to no work on your own, depending on where you want the data. Stitch is a service I have worked with for FB ads. There might be better services, but it has fulfilled our needs so far.
MY SUGGESTION
You would probably be best served by using a third-party add-on like Supermetrics. It's fast and easy to use. The other methods might be more worth looking into if you had a lot more data to move, or needed it to be refreshed more often than daily.
