Dynamically update stored chrome extension config/global data, not in prefs? - google-chrome-extension

I'd like my extension to store "config" or "meta/global" constant data, and have the extension update this config data daily from my web server: things like class names, patterns to match, identifiers, etc.
I can store this data in prefs, but they are async and won't be available when the extension starts up. I need the data to be available immediately so the extension can act on it.
Right now I store this data in the extension source itself, so it cannot be updated, but it is available immediately rather than asynchronously.
Is there a better approach that would allow my extension to update its internal data?

In extensions, you can still use localStorage in non-content scripts, and it is synchronous.
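For example, a background or options page could read and update such data synchronously (a minimal sketch; the "config" key name is just an illustration):
// Synchronous read at startup (works in extension pages, not content scripts)
var config = JSON.parse(localStorage.getItem("config") || "{}");
// Later, after fetching fresh data from the server into `freshConfig`:
localStorage.setItem("config", JSON.stringify(freshConfig));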
However, what's wrong with async chrome.storage? It's not going to take long to read; you just need to restructure your program to wait for this quite fast initialization.
If you do not want massive restructuring, you could make a synchronous "cache" for it in the script.
// Old:
var settings = { /* hard-coded values */ };
doSomething(settings[key]);
settings[key] = value;

// New:
var settings = { /* set defaults here */ };

// Keep the in-memory cache in sync with changes made from other pages
function updateCache(changes, area) {
    if (area != "local") return;
    for (var key in changes) {
        settings[key] = changes[key].newValue;
    }
}

function setOption(key, value) {
    settings[key] = value; // Synchronous cache update
    var update = {};
    update[key] = value;
    chrome.storage.local.set(update); // Asynchronous persistent update
}

chrome.storage.local.get(settings, function(data) {
    for (var key in data) settings[key] = data[key];
    chrome.storage.onChanged.addListener(updateCache);
    /* Old top-level code here */
});

doSomething(settings[key]); // Still synchronous
setOption(key, value);

Related

Meteor Client calling findOne in Server Method

I have a client-side form that can create a document upon submission. I want to check whether the value of one of the input fields already exists on a Document in the DB, though. If it does, the user is alerted and asked whether they want to continue creating the record.
Client-side event
Template.createDoc.events({
    'click button[type=submit]'(e, template) {
        // This example checks whether a Doc with its `name` property set to `value` already exists
        const value = $('#name').val(); // read the field's string value
        const fieldName = 'name';
        // NOTE: without a callback, Meteor.call returns undefined on the client (see below)
        const exists = Meteor.call('checkIfFieldExistsOnDoc', fieldName, value);
        if (exists) {
            if (confirm(`Doc with ${value} as its ${fieldName} already exists. Are you sure you want to continue creating Doc?`)) {
                // db.Docs.insert....
            }
        }
    }
});
Server-side Meteor Method
Meteor.methods({
    'checkIfFieldExistsOnDoc'(field, val) {
        if (this.isServer) {
            this.unblock();
            check(field, String);
            check(val, String);
            if (!this.userId) {
                throw new Meteor.Error('not-authorized', 'You are not authorized.');
            }
            const findObj = {};
            findObj[field] = val;
            const fieldsObj = {};
            fieldsObj[field] = 1; // was `fieldsObj[fieldsObj] = 1`, a typo
            const doc = Docs.findOne(findObj, {fields: fieldsObj});
            return doc;
        }
    },
});
My issue is that the client-side code always gets undefined back when calling the server method. I now understand why; however, I'm not keen on wrapping all of my subsequent client code in a callback yet.
So - any other ideas on how I can attempt to do this simple feature?
Also - I was thinking of having the client-side page's onCreated do a one-time server call to get ALL names for all Docs, storing this in memory, and then doing the check upon form submission using that. Obviously, this is inefficient and not scalable, although it would work.
Meteor.call on the client side is always asynchronous, so you need to implement a callback.
See docs: https://docs.meteor.com/api/methods.html#Meteor-call
Meteor.call('checkIfFieldExistsOnDoc', fieldName, value, function(error, result) {
    if (result) {
        if (confirm(`Doc with ${value} as its ${fieldName} already exists. Are you sure you want to continue creating Doc?`)) {
            // db.Docs.insert....
        }
    }
});
On the client, you can wrap any Meteor.call in a Promise and then use it with async/await. There are some packages on Atmosphere that do this for you, too.
I've used this package for years: https://atmospherejs.com/deanius/promise
On the client I often just use await Meteor.callPromise() which returns a response nicely.
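If you would rather not pull in a package, a minimal hand-rolled wrapper is enough (a sketch; `callAsync` is just an illustrative name, not an existing API):
// Promisify a Meteor method call so it can be awaited on the client
function callAsync(name, ...args) {
    return new Promise((resolve, reject) => {
        Meteor.call(name, ...args, (error, result) => {
            if (error) {
                reject(error);
            } else {
                resolve(result);
            }
        });
    });
}
// Usage inside an async event handler:
// const exists = await callAsync('checkIfFieldExistsOnDoc', fieldName, value);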
Here are a couple of the best write-ups on the many options available to you:
https://blog.meteor.com/using-promises-on-the-client-in-meteor-fb4f1c155f84
https://forums.meteor.com/t/meteor-methods-return-values-via-promise-async/42060
https://dev.to/jankapunkt/async-meteor-method-calls-24f9

Avoid triggering Firebase functions by real-time database on special cases

Sometimes we use Firebase Functions triggered by the Realtime Database (onCreate/onDelete/onUpdate, ...) to do some logic (like counting, etc.).
My question: would it be possible to avoid this trigger in some cases, mainly when I would like to allow a user to import a huge JSON into Firebase?
Example:
A function E is triggered on the creation of a new child in /examples. Normally, users add examples one by one to /examples and function E runs to do some logic. However, I would like to allow a user (from the front-end) to import 2000 children into /examples, and the logic that function E performs can be done at import time, so E is not needed. In such a case I do not want E to be triggered, since a huge number of function executions would result. (Note: I am aware of the 1000 limit.)
Update:
Based on the accepted answer, I have submitted my own answer below.
As far as I know, there is no way to disable a Cloud Function programmatically without just deleting it, and deleting it introduces an edge case where data is added to the database while the import is taking place.
A compromise would be to signal that the data you are uploading should be post-processed. Say you were uploading to /examples/{pushId}: instead of attaching the database trigger to /examples/{pushId}, attach it to /examples/{pushId}/needsProcessing (or something similar). Unfortunately this has the trade-off of not being able to make use of change objects for onUpdate() and onWrite().
// Client side, inside an async function
const result = await firebase.database().ref('/examples').push({
    title: "Example 1A",
    desc: "This is an example",
    attachments: { /* ... */ },
    class: "-MTjzAKMcJzhhtxwUbFw",
    author: "johndoe1970",
    needsProcessing: true
});

// Cloud Function
async function handleExampleProcessing(snapshot, context) {
    // Do post-processing only if needsProcessing is truthy
    if (!snapshot.exists() || !snapshot.val()) {
        console.log('No processing needed, exiting.');
        return;
    }
    const exampleRef = snapshot.ref.parent; // /examples/{pushId}, as admin
    const data = (await exampleRef.once('value')).val();
    // do something with data, like mutate it
    // commit changes
    return exampleRef.update({
        ...data,
        needsProcessing: null /* delete the needsProcessing value */
    });
}

const functionsExampleProcessingRef = functions.database.ref("examples/{pushId}/needsProcessing");
export const handleExampleNeedingProcessingOnCreate = functionsExampleProcessingRef.onCreate(handleExampleProcessing);
// This is only needed if you ever intend on writing `needsProcessing = /* some falsy value */`;
// I recommend just creating and deleting it, then you can use just the above trigger.
export const handleExampleNeedingProcessingOnUpdate = functionsExampleProcessingRef.onUpdate((change, context) => handleExampleProcessing(change.after, context));
An alternative to Sam's approach is to use feature flags to determine if a Cloud Function performs its main function. I often have this in my code:
exports.onUpload = functions.database
    .ref("/uploads/{uploadId}")
    .onWrite((event) => {
        return ifEnabled("transcribe").then(() => {
            console.log("transcription is enabled: calling Cloud Speech");
            // ...
        });
    });
ifEnabled is a simple helper function that checks (also in the Realtime Database) whether the feature is enabled:
function ifEnabled(feature) {
    console.log("Checking if feature '" + feature + "' is enabled");
    return new Promise((resolve, reject) => {
        admin.database().ref("/config/features")
            .child(feature)
            .once('value')
            .then(snapshot => {
                if (snapshot.val()) {
                    resolve(snapshot.val());
                }
                else {
                    reject("No value or 'falsy' value found");
                }
            });
    });
}
Most of my usage of this is during talks at conferences, to enable the Cloud Functions at the right time (as a deploy takes a bit longer than we'd like for a demo). But the same approach should work to temporarily disable features during, for example, a data import.
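Toggling the flag around the import might look like this (a sketch; the "transcribe" feature name and the /config/features path come from the example above, and this assumes it runs in an admin/privileged async context):
// Disable the feature before a bulk import, then re-enable it afterwards
await admin.database().ref("/config/features/transcribe").set(false);
// ... perform the import ...
await admin.database().ref("/config/features/transcribe").set(true);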
Okay, another solution would be:
A: Add a new node in Firebase, like /triggers-queue, where all CRUD operations that should fire a background function are recorded. In this node we add a key for each table that should have triggers (in our example, the /examples table). Any key that represents a table should also have /created, /updated, and /deleted keys, as follows:
/examples
.../example-id-1
/triggers-queue
.../examples
....../created
........./example-id
....../updated
........./example-id
............old-value
....../deleted
........./example-id
............old-value
Note that the old value should be written by the app (front-end, etc.).
We always set onCreate triggers on:
/triggers-queue/examples/created/{exampleID} (simulates onCreate)
/triggers-queue/examples/updated/{exampleID} (simulates onUpdate)
/triggers-queue/examples/deleted/{exampleID} (simulates onDelete)
The fired function can derive all the information it needs to handle the logic as follows (see the sketch after the points below):
Operation type: from the path (either created, updated, or deleted)
Key of the object: from the path
Current data: by reading the corresponding table (i.e., /examples/id)
Old data: from the triggers table
Good points:
You can import huge amounts of data into the /examples table without firing any functions, since we do not add anything to /triggers-queue.
You can fan out functions to get past the 1000/sec limit, by setting triggers on (as an example, to fan out on-create):
/triggers-queue/examples/created0/{exampleID} and
/triggers-queue/examples/created1/{exampleID}
Bad points:
More difficult to implement.
Need to write more data to Firebase (like the old data) from the app.
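A rough sketch of what one of these queue-driven triggers could look like (the function name onExampleCreatedQueued and the cleanup step are my own illustration, not part of the original answer):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Fired only when the app explicitly adds a queue entry; bulk imports
// that write straight to /examples never touch /triggers-queue.
exports.onExampleCreatedQueued = functions.database
    .ref('/triggers-queue/examples/created/{exampleID}')
    .onCreate(async (snapshot, context) => {
        const exampleID = context.params.exampleID;
        // Current data: read from the real location
        const current = (await admin.database()
            .ref(`/examples/${exampleID}`)
            .once('value')).val();
        // ... counting or other logic on `current` goes here ...
        // Remove the queue entry once it has been handled
        return snapshot.ref.remove();
    });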
B: Another way (although not an answer to this question) is to move the logic in the background function to an HTTP function and call it on every CRUD operation.

Why doesn't Meteor.user() work inside publish functions?

Mainly out of curiosity, but also for a better understanding of Meteor security, what is the reason(ing) behind Meteor.user() not working inside publish functions?
The reason is in this piece of code (from the Meteor source code):
Meteor.user = function () {
    var userId = Meteor.userId();
    if (!userId)
        return null;
    return Meteor.users.findOne(userId);
};

Meteor.userId = function () {
    // This function only works if called inside a method. In theory, it
    // could also be called from publish statements, since they also
    // have a userId associated with them. However, given that publish
    // functions aren't reactive, using any of the information from
    // Meteor.user() in a publish function will always use the value
    // from when the function first runs. This is likely not what the
    // user expects. The way to make this work in a publish is to do
    // Meteor.find(this.userId()).observe and recompute when the user
    // record changes.
    var currentInvocation = DDP._CurrentInvocation.get();
    if (!currentInvocation)
        throw new Error("Meteor.userId can only be invoked in method calls. Use this.userId in publish functions.");
    return currentInvocation.userId;
};
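As the error message suggests, inside a publish function you use this.userId instead. A minimal sketch (the 'myDocs' publication name and the Docs collection with an `owner` field are illustrative):
Meteor.publish('myDocs', function () {
    // `this.userId` is available here; Meteor.user()/Meteor.userId() are not
    if (!this.userId) {
        return this.ready();
    }
    return Docs.find({ owner: this.userId });
});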

Get all defined variables in scope in node.js

I am trying to get all the variables that have been defined. I tried using the global object,
but it seems to be missing the ones defined as var token='44'; and only includes the ones defined as token='44';. What I am looking for, ideally, is something like the get_defined_vars() function of PHP. I need to access the variables because I need to stop the node process and then restart at the same point without having to recalculate all the variables, so I want to dump them somewhere and access them later.
It's impossible within the language itself.
However:
1. If you have access to the entire source code, you can use a library such as uglify-js to get a list of global variables like this:
// `source` is the program text, e.g. read with fs.readFileSync(file, 'utf8')
var ast = require('uglify-js').parse(source);
ast.figure_out_scope();
console.log(ast.globals.map(function (node, name) {
    return name;
}));
2. If you can connect to the node.js/V8 debugger, you can get a list of local variables as well; see the _debugger.js source code in the node.js project.
As you stated
I want to dump them somewhere and access them later.
It seems like you should work towards a database (as Jonathan mentioned in the comments), but if this is a one-off thing you can use JSON files to store values. You can then require the JSON file back into your script and Node will handle the rest.
I wouldn't recommend this, but basically create a variable that will hold all the data/variables that you define. Some might call this a God Object. Just make sure that before you exit the script, you export the values to a JSON file. If you're worried about your application crashing, perform backups to that file more frequently.
Here is a demo you can play around with:
var fs = require('fs');

var globalData = loadData();

function loadData() {
    try { return require('./globals.json'); } catch(e) {}
    return {};
}

function dumpGlobalData(callback) {
    fs.writeFile(
        __dirname + '/globals.json', JSON.stringify(globalData), callback);
}

function randomToken() {
    globalData.token = parseInt(Math.random() * 1000, 10);
}

console.log('token was', globalData.token);
randomToken();
console.log('token is now', globalData.token);

dumpGlobalData(function(error) {
    process.exit(error ? 1 : 0);
});

RequireJS module for SignalR

Rather than copying and pasting my code onto here, I have uploaded it to GitHub. The RequireJS module has a dependency on jquery.signalr, which in turn has a dependency on jquery, but it also depends on the JavaScript served at /signalr/hubs. There is a bit of config to do with Require.Config.
Basically what is happening is that the first time you load the page, the connection is made to the hubs within SignalR and the "server side" code is executed and does the desired thing. When you refresh the page it does not. All client side code is called, so for example:
var myViewModel = new MyViewMode();
myViewModel.init();
and within your init method you have
var connection = $.connection.myHub;
this.init = function() {
    connection.server.myMethod();
};
this would then go off to
public class MyHub : Hub
{
    public void MyMethod()
    {
        Clients.Caller.populateSomeInformation(); // I think it's Caller but I'm doing this from memory!
    }
}
and then call
connection.client.populateSomeInformation = function () { ... };
but this never gets called :(
It looks like a connection has been made (using the good old console.log() to see what it outputs), and indeed, when debugging the project, it executes the code within the hub, but there is no response made back to the JavaScript.
So wonderful people of the internet, where am I going wrong? Do I need to check the state of $.connection.hub.start(); before attempting to start it again?
Time for beer :)
I believe it should be
connection.client.populateSomeInformation = function () { ... };
(not connection.server)
http://www.asp.net/signalr/overview/hubs-api/hubs-api-guide-javascript-client#callclient
(observations on the code you have on github right now)
var isLoaded = false;
// ... some code that doesn't change isLoaded ...
if (isLoaded == false) {
    scrollIntervalId = window.setInterval(function () {
        signalRLoaded();
    }, 30);
}
I think isLoaded will always be false at this point. Not sure what you intended this to accomplish.
var connection = $.connection.hub.start();
I don't think you're supposed to open the connection before defining any client functions. I don't see any client functions being defined here, so maybe you're doing that somewhere else? I don't know if it really matters other than if the server attempts to call a client function that hasn't yet been defined...
function SignalRReady(callback) {
    if (isLoaded) {
        callback(connection);
    } else {
        readyCalls = callback;
    }
    return SignalRReady;
}

SignalRReady.version = "1.0.0";

SignalRReady.load = function(name, request, onLoad, config) {
    if (config.isBuild) {
        onLoad();
    } else {
        SignalRReady(onLoad);
    }
};

return SignalRReady;
I'm confused by this bit of code, probably because I don't see how it's being used. Is this an attempt at a kind of singleton? I see that SignalRReady is the "class" being returned for the module. You're not really returning an object, you're returning a constructor which implies that you're instantiating it in other places, something like
define(['SignalRReady'], function(sigR)
{
    var srr = new sigR();
});
But then you have that load function defined that calls the constructor and makes this look all weird. How are you using this?
Anyways, I'm thinking you might be hitting some kind of race condition where the client function may not always be available at the time the server is trying to call it.
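For reference, the usual ordering with hub proxies (a sketch, reusing the hub and method names from the question) is to register client callbacks before starting the connection:
// Register client-side callbacks first...
var hub = $.connection.myHub;
hub.client.populateSomeInformation = function (info) {
    // update the view model with `info` here
};

// ...then start the connection and call server methods once connected
$.connection.hub.start().done(function () {
    hub.server.myMethod();
});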
(additional comments/code 2013-09-06)
Your connection object is actually a jQuery promise (http://api.jquery.com/category/deferred-object/).
If you're unfamiliar with promises, think of them generically as a queue of callbacks to be executed later. In this case, when connected, all the callbacks will be executed (in the order they were added). If a callback is added after the connection has been made, it is executed immediately. This is how your code is working now: you add the callback to the .done queue after the connection is made, and it is executed immediately.
If you insist on creating the connection object yourself, then you do not need to use the stateChanged event. You just add the callback to the .done queue:
define(function()
{
    function signalRReady(callback)
    {
        if (window.connection == undefined) {
            window.connection = $.connection.hub.start();
        }
        window.connection.done(callback);
    }

    signalRReady.version = "1.0.0";

    return signalRReady;
});
However, I believe it's not a good idea to initiate the connection yourself. Because your module isn't a complete wrapper around SignalR such that people would only use your module to do SignalR stuff, you are not guaranteed (and cannot expect) other code will not initiate the connection. Especially if someone is adding your module to an existing codebase.
Your module is simply adding a new event, so keep it simple. Take the callback and execute it yourself when appropriate:
define(function()
{
    function signalRReady(callback)
    {
        $.connection.hub.stateChanged(function (state)
        {
            if (state.newState === $.signalR.connectionState.connected)
            {
                callback();
            }
        });
    }

    signalRReady.version = "1.0.0";

    return signalRReady;
});
Nowadays, promises are pretty popular. You might want to implement a promise-based module like:
define(function()
{
    var deferred = $.Deferred();

    $.connection.hub.stateChanged(function (state)
    {
        if (state.newState === $.signalR.connectionState.connected)
        {
            // executes all callbacks attached by the "ready" function below
            deferred.resolve();
        }
    });

    return {
        ready: function(callback)
        {
            deferred.done(callback);
        },
        version: "1.0.0"
    };
});
If callbacks are attached after the connection has been made, they are executed immediately.
Also, notice that this example module's factory function returns an object instead of a function. Since RequireJS will pass the same instance around to any module that requires it, state is maintained, so we can use local variables instead of globals.
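A hypothetical consumer of the promise-based module above might look like this (the module name 'signalr-ready' is assumed for illustration):
define(['signalr-ready'], function (signalRReady)
{
    signalRReady.ready(function ()
    {
        // Safe to call hub methods here: the connection is established
        $.connection.myHub.server.myMethod();
    });
});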
