I have an admin component with a number field which holds data of a model. Loading and saving work fine with the default Shopware components. But if I am typing a value into this field while the admin worker runs, my input gets reset to its original value. Even if I just wait, without clicking out of the input field or into the next one, the same thing happens. The change event needs to be triggered to save my new input value, otherwise "the admin worker kills it". How could this be connected? How can I avoid this effect without deactivating the admin worker?
<sw-field
v-model="values[value.id]"
:label="value.name"
type="number"
/>
I also tried to store the value in the object by registering an event subscription on @keydown.up or v-on:keydown="storeMaterial", but neither got triggered.
The predefined values of this array are preloaded from the database (which I left out here) or already predefined as 0:
getValues() {
let storedValue = 0;
this.values[value.id] = storedValue;
},
I also tried this.$set, with and without a predefined null:
getValues() {
this.$set(this.values, value.id, null);
let storedValue = 0;
this.$set(this.values, value.id, storedValue);
},
That is odd, but it sounds like it could be related to non-reactive mutation of values. You have to set the keys of values initially, otherwise their respective values will not be stored reactively and will be lost on the next tick (which I assume is triggered by the admin worker).
As an example inside your component:
data() {
return {
originalValues: [{ id: 'foo', name: 'bar' }],
values: {},
};
},
created() {
this.originalValues.forEach((value) => {
if (this.values.hasOwnProperty(value.id)) {
return;
}
// set to null initially, will make `this.values[value.id]` reactive
this.$set(this.values, value.id, null);
});
},
I am trying to use the deleteConfirmation function option, but I find that the default confirmation box pops up before I even get into the deleteConfirmation function - what am I missing?
In the code below I can set break points and watch the data object being set up correctly with its new deleteConfirmMessage, but the basic jTable default delete confirmation box has already appeared and I never see an altered one.
$(container).jtable({
title: tablename,
paging: true,
pageSize: 100,
sorting: true,
defaultSorting: sortvar + ' ASC',
selecting: false,
deleteConfirmation: function(data) {
var defaultMessage = 'This record will be deleted - along with all its assignments!<br>Are you sure?';
if(data.record.Item) { // deleting an item
// Check whether item is in any preset lists
var url = 'CampingTablesData.php?action=CheckPresets&Table=items';
$.when(
ReturnAjax(url, {'ID':data.record.ID}, MyError)
).done(
function(retdata, status) {
if(status=='success') {
if(retdata.PresetList) {
data.deleteConfirmMessage = 'Item is in the following lists: ' + retdata.PresetList + 'Do you still want to delete it?';
}
} else {
data.cancel = true;
data.cancelMessage = retdata.Message;
}
}
);
} else {
data.deleteConfirmMessage = defaultMessage;
}
},
messages: {
addNewRecord: 'Add new',
deleteText: deleteTxt
},
actions: {
listAction: function(postData, jtParams) {
<list action code>
},
createAction: function(postData) {
<create action code>
},
updateAction: 'CampingTablesData.php?action=update&Table=' + tablename,
deleteAction: 'CampingTablesData.php?action=delete&Table=' + tablename
},
fields: tableFields // preset variable
});
==========
After further testing, the problem occurs only when deleting an item and it goes through the $.when().done() section of code. The Ajax call to the deletion URL does not wait for this to complete - how do I overcome this?
I don't think you can get your design to work. What does the A in Ajax stand for? Asynchronous! Synchronous Ajax has been deprecated for all sorts of good design and performance reasons.
You need to design your application to function asynchronously. Looking at your code, it feels like you are misusing the deleteConfirmation event.
Consider changing the default deleteConfirmation message to inform the user that the delete might not succeed if certain conditions are met. Say:
messages: {
deleteConfirmation: "This record will be deleted - along with all its assignments, unless in a preset list. Do you wish to try to delete this record?"
},
Then on the server, check the preset lists, and if not deletable, return an error message for jTable to display.
Depending on how dynamic your preset lists are, another approach might be to let the list function return an additional flag or code indicating which, if any, preset lists the item is already in, then your confirmation function can check this flag / indicator without further access to the server.
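As a rough sketch of that last idea, assuming (hypothetically) that your list action returns a PresetList field on each record, the confirmation can then be built synchronously, before jTable shows its prompt:
deleteConfirmation: function (data) {
    // PresetList is a hypothetical field returned by the list action
    if (data.record.Item && data.record.PresetList) {
        data.deleteConfirmMessage = 'Item is in the following lists: ' + data.record.PresetList + '. Do you still want to delete it?';
    } else {
        data.deleteConfirmMessage = 'This record will be deleted - along with all its assignments!<br>Are you sure?';
    }
},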
Thanks to MisterP for his observation and suggestions. I also considered his last approach, but ended up setting deleteConfirmation to false (so as not to generate a system prompt), then writing a delete function that did not actually delete but returned the information I needed to construct my own deleteConfirmMessage. Then, with a simple if confirm(myMessage), I go ahead and delete with another Ajax call.
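A minimal sketch of that final flow, reusing the CheckPresets endpoint and ReturnAjax helper from the question (deleteItemWithCheck is a hypothetical wrapper you would wire to whatever UI action triggers the delete; deleteRecord is jTable's API for deleting without a built-in prompt):
// suppress jTable's own prompt
$(container).jtable({
    // ...options as before...
    deleteConfirmation: false
});

function deleteItemWithCheck(record) {
    var url = 'CampingTablesData.php?action=CheckPresets&Table=items';
    $.when(ReturnAjax(url, { 'ID': record.ID }, MyError)).done(function (retdata) {
        var msg = retdata.PresetList
            ? 'Item is in the following lists: ' + retdata.PresetList + '. Do you still want to delete it?'
            : 'This record will be deleted - along with all its assignments! Are you sure?';
        if (confirm(msg)) {
            $(container).jtable('deleteRecord', { key: record.ID });
        }
    });
}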
Here's my problem: I'm doing some background work, where I parse some JSON and write some objects into my Realm, and on the main thread I try to update the UI (reloading the table view, which is linked to an array of objects). But when I reload the UI, my table view doesn't update, as if my Realm hadn't been updated. I have to reload my view to see the updates. Here's my code:
if (Realm().objects(Objects).filter("...").count > 0)
{
var results = Realm().objects(Objects) // I get the existing objects but it's empty
tableView.reloadData()
}
request(.GET, url).responseJSON() {
(request, response, data, error) in
let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
dispatch_async(dispatch_get_global_queue(priority, 0)) {
// Parsing my JSON
Realm().write {
Realm().add(object)
}
dispatch_sync(dispatch_get_main_queue()) {
// Updating the UI
if (Realm().objects(Objects).filter("...").count > 0)
{
results = Realm().objects(Objects) // I get the existing objects but it's empty
tableView.reloadData()
}
}
}
}
I must be doing something wrong with my threads, but I couldn't find what. Does someone know what's wrong?
Thank you for your answer!
Such a workflow makes more sense to me for your case:
let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
dispatch_async(dispatch_get_global_queue(priority, 0)) {
// Parsing my JSON
Realm().write {
Realm().add(object)
dispatch_sync(dispatch_get_main_queue()) {
// Updating the UI
if (Realm().objects(Objects).filter("...").count > 0)
{
results = Realm().objects(Objects) // I get the existing objects but it's empty
tableView.reloadData()
}
}
}
}
NOTE: you have a timing problem in your original workflow: the UI might be updated before the write block has executed, which is why your UI looks stale; the approach above keeps the tasks synchronised with each other, in the order they actually complete.
You are getting some new objects and storing them into "results".
How is tableView.reloadData() supposed to access that variable? You must change something that your tableView delegate will access.
PS. Every dispatch_sync() is a potential deadlock. You are using one that is absolutely pointless. Avoid dispatch_sync unless you have a very, very good reason to use it.
I am having difficulties looping over an object of constituency data, finding existing entries in a MongoDB and doing something with them. It always ends up being the same entry being looked up in the DB over and over again.
I am assuming this is a problem of scope and timing.
My code:
for (key in jsonObj) {
var newConstituent = new Constituent({
name : jsonObj[key]["Name"],
email : jsonObj[key]["Email"],
social : {
twitter: {
twitter_handle : jsonObj[key]["Twitter handle"],
twitter_id : jsonObj[key]["User id"],
timestamp : jsonObj[key]["Timestamp"]
}
}
});
console.log(jsonObj[key]["Email"]); // this is fine here!
Constituent.findOne({ email : jsonObj[key]["Email"] }, function(err, constitutents){
console.log(jsonObj[key]["Email"]); // here it's always the same record
if (err) {
console.log(err)
}
if (constitutents === 'null') {
console.log("Constituent not found. Create new entry .. ");
// console.log(newConstituent);
newConstituent.save(function (err) {
if (err) {
console.log('db save error');
}
});
} else {
console.log("Constituent already exists .. ");
}
});
}
I have a suspicion that the for loop finishes sooner than .findOne() executes and therefore always and only gets the last item of the object passed to find.
Could someone point me into the right direction?
A couple of things.
Don't use for ... in, especially in Node. You can use Object.keys() and any of the array methods at that point. for ... in can include values you don't wish to loop over unless you're using hasOwnProperty, since it'll include values from the prototype chain.
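For example, a minimal sketch of the Object.keys() approach, using the jsonObj from your question:
// Iterates only the object's own keys, so no hasOwnProperty guard is needed.
Object.keys(jsonObj).forEach(function (key) {
    console.log(jsonObj[key]["Email"]);
});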
The reason the email is the same is that you're just printing out your query again. jsonObj is included in the scope of your callback to findOne since you're not re-declaring it inside the findOne callback. So whatever the value of key happens to be when the callback is invoked (my guess is that it's the last one in your list) is the email you're getting. Since, in JavaScript, inner function scope always implicitly includes the scope of the surrounding context, you're just accessing the jsonObj from your enclosing scope.
To clarify this point: your for ... in loop is synchronous -- that is, the interpreter finishes running all the instructions in it before it will process any new instructions. findOne, however, is asynchronous. Very simply, when you call it in this loop, it's not actually doing ANYTHING immediately -- the interpreter is still running your for ... in loop. It is, however, queuing up more tasks to run after it has finished your loop. So the loop finishes, AND THEN your callbacks start to execute. Since the for ... in loop is totally finished, key is set to whatever its final value was. So, for example, if its last value was foo, that means EVERY TIME your callback is invoked, you will be printing out jsonObj.foo, since the for ... in loop is already complete.
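A tiny illustration of that ordering, with setTimeout standing in for findOne:
var obj = { a: 1, b: 2, c: 3 };
for (var key in obj) {
    setTimeout(function () {
        console.log(key); // prints "c" three times: the loop finished before any callback ran
    }, 0);
}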
So it's like you asked your friend to say the letters from A to J, and you left the room to do 10 things. He totally finished going to J, since that is much faster than any one of the 10 things you're doing. Now every time you're done with one of your things, you come back and ask "what's the latest letter you said?". The answer will ALWAYS be J. If you need to know which letter he was on for each of your tasks, you either need to get him to pause while you're doing them, or somehow record which letter corresponds to each task you're performing.
Having them wait is not a good idea -- it's a waste of their time. However, if you wrap your findOne in a new function where you pass in the value of key, this would work. See the updated code below.
I'm not sure about your data, but findOne will return one record. You're putting it into a variable with a plural name (constitutents). From reading your code I would expect a single value back here. (It might still be wrapped in an array, however.)
Since you're calling findOne and assigning the result of the find operation to constitutents, you should be examining that object in the console.log.
e.g.
console.log(constitutents.email); // or console.log(constitutents[0].email)
rather than
console.log(jsonObj[key]["Email"]);
(Assuming email is a property on constitutents.)
You might just try logging constitutents in its entirety to verify what you're getting back.
The reason the following code will work is that you're passing the current value of key to the function for each invocation. This means there is a local copy of that variable created each time you call findConstituent, rather than using the closure value of the variable. (newConstituent is passed in the same way, for the same reason.)
function findConstituent(key, newConstituent){
Constituent.findOne({ email : jsonObj[key]["Email"] }, function(err, constitutents){
console.log(jsonObj[key]["Email"]); // logs the email for this invocation's key
if (err) {
console.log(err)
}
if (!constitutents) { // findOne passes null (not the string 'null') when nothing matches
console.log("Constituent not found. Create new entry .. ");
// console.log(newConstituent);
newConstituent.save(function (err) {
if (err) {
console.log('db save error');
}
});
} else {
console.log("Constituent already exists .. ");
}
});
}
for (key in jsonObj) {
var newConstituent = new Constituent({
name : jsonObj[key]["Name"],
email : jsonObj[key]["Email"],
social : {
twitter: {
twitter_handle : jsonObj[key]["Twitter handle"],
twitter_id : jsonObj[key]["User id"],
timestamp : jsonObj[key]["Timestamp"]
}
}
});
findConstituent(key, newConstituent);
}
This may be a very bad idea, or a possible solution that we have to a database concurrency problem.
We have a method that is called to do an update of a Mongo record. We are seeing some concurrency problems - process A reads the record, process B reads the record, process A makes mods and saves the record, process B makes mods and saves the record. Because B reads after A but before A writes, it doesn't know about the changes A made, and we lose the data from A.
I'm wondering if we could use a database semaphore, basically a boolean field on the collection. If we read the record at the start of the method and the field is true, the record is being edited. At that point, re-call the method using process.nextTick() with the same data. Otherwise, set the semaphore and carry on.
There would still be a bit of time between the read and the save, but it should be/could be faster than what we are doing now.
It would be something like this. Any thoughts? Has anyone done anything like this? Will it even work?
function remove_source(service_id,session, next)
{
var User = Mongoose.model("User");
/* get the user, based on the session user id */
User.findById(session.me,function(err,user_info)
{
if (user_info.semaphore === true)
{
// pass a function to nextTick; calling remove_source(...) directly here would invoke it immediately
process.nextTick(function() {
    remove_source(service_id, session, next);
});
}
else
{
user_info.semaphore = true;
user_info.save(function(err,user_new)
{
if (err) next(err,user_new);
else continue_on(null,user_new);
});
}
function continue_on(err, user_new)
{
    // etc.......
}
});
}
Edit: New Code:
The function now looks as follows. I'm doing individual updates to the arrays. This of course means that if the transaction fails between the first and second updates, I have the possibility of data being out of sync. I'm thinking that I could simply re-save the user object that I retrieved on entry into the function, overwriting my changes. I don't know whether Mongoose/Mongo will skip the save if I have not changed that object; I will have to try and see. Any more thoughts?
var User = Mongoose.model("User");
/* get the user, based on the session user id */
User.findById(session.me,function(err,user_info)
{
if (err)
{
next(err,user_info,null);
return;
}
if (!user_info || user_info.length === 0)
{
next(_e("ACCOUNT_NOT_FOUND"),"user_id: " + session.me);
return;
}
var source_service_info = _.where(user_info.credentials, {"source_service_id": service_id});
var source_service = source_service_info[0].source_service; // _.where returns an array of matches
User.findByIdAndUpdate(session.me,{$pull: {"credentials": {"source_service_id": service_id}}},{},function(err,user_credential_removed)
{
if (err)
{
next(err,user_info,null);
return;
}
User.findByIdAndUpdate(session.me,{$pull: {"criteria": {"source_service": source_service}}},{},function(err,user_criteria_removed)
{
if (err)
{
next(err,user_info,null);
return;
}
else
{
next(null,user_criteria_removed);
}
});
});
});
};
The problem with your approach is that it just shortens the time during which the data could be read by a second process, it doesn't eliminate the problem.
The solution to this would be to set your semaphore in the same action as the read. I haven't used Mongoose, but in MongoDB you can use findAndModify to only return a User record if the semaphore is false, and if it is false, in one atomic operation, set the semaphore to true.
If you don't want to use findAndModify, you could first do an update that sets the semaphore true (or to some specific ID value so you know that it is YOUR semaphore) only if the semaphore is not set. Then, if that process succeeds, you could do the find (perhaps passing your semaphore ID as a criterion in the find). However, findAndModify, if it is available in Mongoose, would do that in one step.
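For illustration, a minimal sketch of that idea with Mongoose's findOneAndUpdate, built on the names from your code and your boolean semaphore field:
var User = Mongoose.model("User");
// Atomically claim the record: the query only matches if the semaphore is not already set.
User.findOneAndUpdate(
    { _id: session.me, semaphore: { $ne: true } },
    { $set: { semaphore: true } },
    function (err, user_info) {
        if (err) { return next(err); }
        if (!user_info) {
            // someone else holds the semaphore; retry on the next tick
            return process.nextTick(function () {
                remove_source(service_id, session, next);
            });
        }
        continue_on(null, user_info); // remember to clear the semaphore when finished
    }
);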
A variation of that is described here: http://docs.mongodb.org/manual/tutorial/isolate-sequence-of-operations/ where you do a form of optimistic locking that checks that the old values are unchanged before changing them to the new values.
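And a rough sketch of that "update if current" idea, assuming a hypothetical numeric version field on the user document:
// The update only applies if the document still carries the version we read earlier;
// a null result means another writer got there first, so re-read and retry.
User.findOneAndUpdate(
    { _id: session.me, version: user_info.version },
    {
        $pull: { credentials: { source_service_id: service_id } },
        $inc: { version: 1 }
    },
    function (err, doc) {
        if (err) { return next(err); }
        if (!doc) {
            // lost the race: fetch the user again and repeat the update
        }
    }
);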
There is a variation on this that uses a separate table to simulate a two-phase commit: http://docs.mongodb.org/manual/tutorial/perform-two-phase-commits/
Edited: Upon the interchange below, this seems to be a schema and updating issue. The question may become something like: I have some entries in an array, and the ordinal index of those entries relates to some other arrays as well. How do I perform deletes without having mismatches?
Three off-the-top possibilities occur, depending on frequency in the real world vs QA test scenarios.
Consider adding a deleted flag but keeping the records in the same order. If someone toggles a record back, reuse the same record, fixing it up however you want.
Use an associative array (JS object) for each element (not a feature of the relational world). If you need an order, add an array that lists the keys in order. Both have syntax to update without touching anything other than what has changed, and will not overwrite changes to different fields.
Use an associative array where the keys are numbers. Actual deletion won't hurt retrieval.
stuff = {}
stuff[1] = {some:'details'}
stuff[2] = {some:'details2'}
Was
1) Are you making changes to the same field? Make that into an array, and push changes, and pop the latest to read the current value.
2) Are you changing different fields, but data is getting trounced? Then there is better syntax to use for the updating: you can update field by field (see the sketch after this list).
$set: { 'fielda': 'valuea' }
won't lose edits on previous fields
3) Change your schema.
4) Change the timing of the processes so they don't overlap, or have them work on smaller subsets that you can manage to keep from overlapping.
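As a sketch of point 2, using findByIdAndUpdate as elsewhere in your code (the field names are placeholders):
// Only the named field is written; concurrent edits to other fields are left alone.
User.findByIdAndUpdate(session.me, { $set: { fielda: 'valuea' } }, function (err, user_info) {
    if (err) { /* handle error */ }
});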
I'd like to know, just out of interest, what multiple processes are needed to make updates on the same record? I don't work with anything that looks like that.
I am developing an extension for all the major browsers. How do I store tab-specific values for the session? I solved this problem in Firefox with nsISessionStore. In Safari and Google Chrome, I used sessionStorage, but that object stores values for a specific tab with a specific domain. I want a way to store values for a specific tab regardless of the domain.
If you're asking how to manage data throughout the life of a tab you can simply create an object for the tab when it's created and delete it when it is closed.
// Create data store
var tabDataStore = {};
// Create listeners
chrome.tabs.onCreated.addListener(function (tab) {
tabDataStore['tab_' + tab.id] = {
urls: []
};
});
chrome.tabs.onRemoved.addListener(function (tabId) {
delete tabDataStore['tab_' + tabId];
});
// Save something against that tab's data
function saveUrl(tab) {
tabDataStore['tab_' + tab.id].urls.push(tab.url);
}
// Load something from tab's data
function loadOriginalUrl(tab) {
tabDataStore['tab_' + tab.id].urls[0];
}
However, this is all an assumption and you may want something completely different. Also, it depends on when and what exactly you want to store.
Further information on tabs can be found in the official documentation.
If you want to persist anything, you can use localStorage.
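For example, a minimal sketch keyed by tab id (assuming a context, such as a Chrome extension background page, where both chrome.tabs and localStorage are available; function names are just examples):
// Persist per-tab data; tab ids are not stable across browser restarts,
// so clean up when the tab closes.
function saveTabData(tabId, data) {
    localStorage.setItem('tab_' + tabId, JSON.stringify(data));
}

function loadTabData(tabId) {
    var raw = localStorage.getItem('tab_' + tabId);
    return raw ? JSON.parse(raw) : null;
}

chrome.tabs.onRemoved.addListener(function (tabId) {
    localStorage.removeItem('tab_' + tabId);
});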
A simple way to do it, though not ideal as it would look messy, would be to store the values in the URL hash.
Say the URL of the tab was http://whatever.com/. You could store the values in the hash like so:
http://whatever.com/#value1=12&value2=10&value3=15212
There will also be a problem if the website uses the hash for anything itself, such as in-page anchors or Ajax-style navigation.
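A rough sketch of reading and writing such values, using the URLSearchParams API where available (the key names are just examples):
// e.g. turns http://whatever.com/ into http://whatever.com/#value1=12
function setHashValue(key, value) {
    var params = new URLSearchParams(window.location.hash.slice(1));
    params.set(key, value);
    window.location.hash = params.toString();
}

function getHashValue(key) {
    var params = new URLSearchParams(window.location.hash.slice(1));
    return params.get(key);
}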
Safari Answer
In your global page, save directly to the tab. So, for instance, on a message from the injected script:
// global page
safari.application.addEventListener("message", function(event){
switch(event.name){
case "saveData":
event.target.page.tabData = event.message; // stash the payload on this tab's page proxy
break;
case "getData":
event.target.page.dispatchMessage("tabData", event.target.page.tabData);
break;
}
}, false);
-
// injected page
// first save data
safari.self.tab.dispatchMessage("saveData", {firstname:"mike", age: 25} );
// set up listener to receive data
safari.self.addEventListener("message", function(event){
switch(event.name){
case "tabData":
// get data for page
console.debug(event.message);
// { firstname: "mike", age: 25 }
break;
}
}, false);
// send message to trigger response
safari.self.tab.dispatchMessage("getData", {} );