I have an object in localStorage:
products = { apples: 2, tomatoes: 3, potatoes: 1}
How do I change tomatoes to 5?
What did you try? If you already have that in localStorage, you must have serialized it to JSON first; you didn't say as much in your post, but since localStorage can only store strings, you must have. So I assume you did something like this to read the value back as data:
let products = JSON.parse(localStorage.getItem('products'));
At this point, products is a regular old JavaScript object, so you could do:
products.tomatoes = 5;
To persist it back, you convert it back into JSON and use setItem.
localStorage.setItem('products', JSON.stringify(products));
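Putting the whole read-modify-write cycle together (a minimal sketch; the empty-object fallback is an assumption for when the key has never been set):
// read: JSON.parse(null) yields null, so fall back to an empty object
let products = JSON.parse(localStorage.getItem('products')) || {};
// modify the plain JS object
products.tomatoes = 5;
// persist: localStorage only stores strings, so serialize back to JSON
localStorage.setItem('products', JSON.stringify(products));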
I need to transform a large array of JSON (that can have over 100k positions) into a CSV.
This array is created directly in the application; it's not the result of an uploaded file.
Looking at the documentation, I thought of using the parser, but it says that:
For that reason, it is rarely a good idea to use it unless your data is very small or your application doesn't do anything else.
Because the data is not small and my app will do other things besides creating the CSV, I don't think it's the best approach, but I may be misunderstanding the documentation.
Is it possible to use the other options (async parser or transform) with already created data (and not a stream of data)?
FYI: it's a Nest application, but I'm using this Node.js lib.
Update: I've tried it with an array with over 300k positions, and it went smoothly.
Why do you need any external modules?
Converting JSON into a JavaScript array of JavaScript objects is a piece of cake with the native JSON.parse() function.
const fs = require('fs/promises'); // in an ES module: import fs from 'fs/promises'
let jsontxt = await fs.readFile('mythings.json', 'utf8'); // note: 'utf8', and await requires an async context
let mythings = JSON.parse(jsontxt);
if (!Array.isArray(mythings)) throw new Error("Oooops, stranger things happen!");
And then, converting a JavaScript array into a CSV is very straightforward.
The most obvious and absurd case is just mapping every element of the array into a string that is the JSON representation of the element, and then joining the resulting strings into a single string separated by newlines (\n). You end up with a useless CSV with a single column containing every element of your original array. It's good for nothing but, heck, it's a CSV!
let csvtxt = mythings.map(e => JSON.stringify(e)).join("\n"); // wrap in an arrow so map's extra (index, array) args don't reach JSON.stringify
await fs.writeFile("mythings.csv",csvtxt,"utf8");
Now you can feel that you are almost there. Replace the useless mapping function with your own:
let csvtxt = mythings.map(mapElementToColumns).join("\n");
and choose a good mapping between the fields of the objects of your array and the columns of your CSV.
function mapElementToColumns(element) {
    return `${JSON.stringify(element.id)},${JSON.stringify(element.name)},${JSON.stringify(element.value)}`;
}
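For instance, for an element like { id: 1, name: "Saras", value: 23 } (made-up data), this produces the row 1,"Saras",23; JSON.stringify adds the quotes around strings, which doubles as basic CSV quoting.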
or, in a more thorough way
function mapElementToColumns(fieldNames) {
    return function (element) {
        // explicit undefined check, so falsy values like 0 or "" are still emitted
        let fields = fieldNames.map(n => element[n] !== undefined ? JSON.stringify(element[n]) : '""');
        return fields.join(',');
    };
}
which you may invoke in your map:
mythings.map(mapElementToColumns(["id","name","value"])).join("\n");
Finally, you might decide to use an automated "all fields in all objects" approach, which requires that all the objects in the original array share the same field schema.
You extract all the fields of the first object of the array, and use them as the header row of the CSV and as the template for extracting the rest of the elements.
let fieldnames = Object.keys(mythings[0]);
and then use this field-names array as the parameter of your map function
let csvtxt= mythings.map(mapElementToColumns(fieldnames)).join("\n");
and, also, prepending them as the CSV header. Note that csvtxt is already a single string at this point, so the header row is added by string concatenation (unshift only exists on arrays):
csvtxt = fieldnames.join(',') + "\n" + csvtxt;
Putting all the pieces together...
const fs = require('fs/promises'); // in an ES module: import fs from 'fs/promises'

function mapElementToColumns(fieldNames) {
    return function (element) {
        let fields = fieldNames.map(n => element[n] !== undefined ? JSON.stringify(element[n]) : '""');
        return fields.join(',');
    };
}

let jsontxt = await fs.readFile('mythings.json', 'utf8');
let mythings = JSON.parse(jsontxt);
if (!Array.isArray(mythings)) throw new Error("Oooops, stranger things happen!");
let fieldnames = Object.keys(mythings[0]);
let csvtxt = mythings.map(mapElementToColumns(fieldnames)).join("\n");
csvtxt = fieldnames.join(',') + "\n" + csvtxt; // prepend the header row (csvtxt is a string, so concatenate)
await fs.writeFile("mythings.csv", csvtxt, "utf8");
And that's it. Pretty neat, uh?
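As a quick sanity check, here's the same pipeline run against made-up sample data (reusing mapElementToColumns from above):
let mythings = [
    { id: 1, name: "first", value: 10 },
    { id: 2, name: "second", value: 20 }
];
let fieldnames = Object.keys(mythings[0]);
let csvtxt = mythings.map(mapElementToColumns(fieldnames)).join("\n");
csvtxt = fieldnames.join(',') + "\n" + csvtxt;
console.log(csvtxt);
// id,name,value
// 1,"first",10
// 2,"second",20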
I have a list of valid values that I am storing in a data store. This list is about 20 items long now and will likely grow to around 100, maybe more.
I feel there are a variety of reasons it makes sense to store this in a data store rather than just storing in code. I want to be able to maintain the list and its metadata and make it accessible to other services, so it seems like a micro-service data store.
But in code, we want to make sure only values from the list are passed, and they can typically be hardcoded. So we would like to create an enum that can be used in code to ensure that valid values are passed.
I have created a simple Node.js script that can generate a JS file with the enum right from the data store. This could be regenerated whenever the data changes, or maybe on a schedule. But sharing the enum file with the Node.js applications that use it would not be trivial.
Has anyone done anything like this? Any reason why this would be a bad approach? Any feedback is welcome.
Piggy-backing off of this answer, which describes a way of creating an "enum" in JavaScript: you can grab the list of constants from your server (via an HTTP call) and then generate the enum in code, without the need for creating and loading a JavaScript source file.
Given that you have loaded your enumConstants from the back-end (here I hard-coded them):
const enumConstants = [
    'FIRST',
    'SECOND',
    'THIRD'
];

const temp = {};
for (const constant of enumConstants) {
    temp[constant] = constant;
}

const PlaceEnum = Object.freeze(temp);
console.log(PlaceEnum.FIRST);
// Or, in one line
const PlaceEnum2 = Object.freeze(enumConstants.reduce((o, c) => { o[c] = c; return o; }, {}));
console.log(PlaceEnum2.FIRST);
It is not ideal for code analysis or when using a smart editor, because the object is not explicitly defined and the editor will complain, but it will work.
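For completeness, here is a minimal sketch of the fetch-then-build flow; the /api/enums/places endpoint is hypothetical, and the response is assumed to be a JSON array of strings:
async function loadPlaceEnum() {
    const response = await fetch('/api/enums/places'); // hypothetical endpoint
    const enumConstants = await response.json();       // assumed shape: ["FIRST", "SECOND", "THIRD"]
    return Object.freeze(enumConstants.reduce((o, c) => { o[c] = c; return o; }, {}));
}

// usage, from an async context
loadPlaceEnum().then(PlaceEnum => console.log(PlaceEnum.FIRST)); // "FIRST"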
Another approach is just to use an array and look for its members.
const members = ['first', 'second', 'third'...]
// then test for the members
members.indexOf('first') // 0
members.indexOf('third') // 2
members.indexOf('zero') // -1
members.indexOf('your_variable_to_test') // does it exist in the "enum"?
Any result that is >= 0 means the value is a member of the list; -1 means it is not. This doesn't "lock" the object like freeze (above), but I find it suffices for most of my similar scenarios.
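On modern runtimes you can also use Array.prototype.includes for a direct boolean test instead of comparing indexOf against -1:
members.includes('first') // true
members.includes('zero') // false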
I have an array of objects that I want to store in Redis. I can break up the array part and store them as objects, but I am not getting how I can get something like
{0} : {"foo": "bar", "qux": "doe"}, {1} : {"name": "Saras", "age": 23}
and then search the db based on name and get the requested key back. I need something like this, but can't come close to getting it right.
incr id //correct
(integer) 3
get id //correct
"3"
SADD id {"name" : "Saras"} //wrong
SADD myset {"name" : "Saras"} //correct
(integer) 1
First is getting this part right.
Second is somehow getting the key from the value i.e.
if name==="Saras"
then key=1
which I find tough. Alternatively, I can store it directly as an array of objects and use a simple for loop:
for (var i = 0; i < userCache.users.length; i++) {
    if (userCache.users[i].userId == userId && userCache.users[i].deviceId == deviceId) {
        return i;
    }
}
Kindly suggest which route is best, ideally with some implementation.
What I found to work was using a unique identifier as the key, stringifying the whole object when storing the data, and applying JSON.parse when extracting it.
Example code:
client
    .setAsync(obj.deviceId.toString(), JSON.stringify(obj)) // key: device id, value: serialized object
    .then((doc) => {
        return client.getAsync(obj.deviceId.toString());     // read it back by key
    })
    .then((doc) => {
        return JSON.parse(doc);                              // deserialize into a plain object
    })
    .catch((err) => {
        return err;
    });
Though stringifying and then parsing it back is a computationally heavy operation that will block the Node.js event loop if the JSON becomes large. I am probably ready to take that hit for the lower complexity, because I know my JSON won't be huge, but this needs to be kept in mind when going for this approach.
Redis is a pretty simple key-value store. Yes, there are other data structures like sets, but it has VERY limited query capabilities. For example, if you want to find data by name, then you would have to do something like this:
SET Name "serialized data of object"
SET Name2 "serialized data of object2"
SET Name3 "serialized data of object3"
then:
GET Name
would return data.
Of course this means that you can't store two entries with the same name.
You can do limited text matching on keys using: http://redis.io/commands/scan
To summarize: I think you should use another tool for complex queries.
The first issue you have, SADD id {"name" : "Saras"} //wrong, is obvious, since the id key is not of type set; it is a string type.
In Redis, the only access point to data is through its key.
As kiss said, perhaps you should be looking for other tools.
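That said, if you want to stay with Redis, one common workaround is to maintain a secondary index yourself: alongside each serialized object, store an extra key mapping the name back to the id. A minimal sketch, reusing the promisified client style from the answer above (the key names here are made up for illustration):
// store the object under its id, plus a name -> id index entry
client
    .setAsync('user:1', JSON.stringify({ name: 'Saras', age: 23 }))
    .then(() => client.setAsync('user:name:Saras', '1'))
    // look up the key from the value: name -> id -> object
    .then(() => client.getAsync('user:name:Saras'))
    .then((id) => client.getAsync('user:' + id))
    .then((doc) => console.log(JSON.parse(doc)))
    .catch((err) => console.error(err));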
Why does this code snippet not write the values back to Excel unless I un-comment the range.values=range.values line?
$('#run').click(function () {
    invokeRun()
        .catch(OfficeHelpers.logError);
});

function invokeRun() {
    return Excel.run(function (context) {
        var range = context.workbook.worksheets.getItem("Sheet1").getRange("A1:B3");
        range.load('values');
        return context.sync()
            .then(function () {
                range.values[1][1] = 99;
                console.log(JSON.stringify(range.values));
                //range.values=range.values
                return context.sync();
            });
    });
}
Array properties are special. I have added a page on my website to describe the topic: Reading and writing array properties.
Summarizing from there, the way that the proxy-object model works, whenever you set a property on an object, the Office.js runtime has a hook into the setter and getter, which is used to intercept the call and add the command to the queue.
Let's take an example of a regular property first. Per the above, whenever you set something like range.format.fill.color = "red", the setter for the color property intercepts the request and internally adds a command into the queue to set the range fill color to red (to be dispatched with the next context.sync)
On the other hand, if all you had was var color = range.format.fill.color
(after a load and a sync, of course), the getter would fire instead of the setter, and the color variable would get the range's current fill color.
Now, those were regular properties. Whenever you set an element of the array, you are effectively accessing the array value as a getter. From a runtime perspective, this line is no different from a slightly more verbose version:
var array = range.values;
array[r][c] = '-';
Because the getter for range.values returns a perfectly plain JS array object, accessing it and then setting its value does nothing to propagate it back to the original Range object.
If you want the values to get reflected back, the best thing is to get a reference to the array right after the sync (i.e., var array = range.values, just as above), then set the values on the array as needed, and then finally set it back to the object: range.values = array.
It means you could also modify the values array in place, and then assign the values property back to itself at the completion of the loop (range.values = range.values). However, this looks awkward, as if it’s a no-op, whereas in reality it is not. So personally, I prefer to retrieve the array at the beginning and assign it to its own variable, then do any necessary modifications, and finally set the full array back.
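Applied to the snippet from the question, that pattern looks like this (a minimal sketch):
function invokeRun() {
    return Excel.run(function (context) {
        var range = context.workbook.worksheets.getItem("Sheet1").getRange("A1:B3");
        range.load('values');
        return context.sync()
            .then(function () {
                var values = range.values;  // plain JS array snapshot of the loaded values
                values[1][1] = 99;          // modify the local copy
                range.values = values;      // the setter queues the write on the proxy object
                return context.sync();      // dispatch the queued command to Excel
            });
    });
}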
UPDATE to clarify the above:
To be very clear, the arrays returned by accessing the .values, .formulas, etc., ARE pure vanilla JS arrays. That's actually the crux of the problem: that in order for Office.js to return pure objects, it means that those pure objects can't be "spiked" with the ability to reflect changes.
For what it's worth, we actually have an upcoming feature that should be rolling out in a month or two, where we will be introducing an object.set syntax, as in:
range.set({
    values: [[1, 2], [3, 4]],
    format: {
        fill: {
            color: "purple"
        }
    }
});
This will make it more convenient to set multiple properties on the same object, but it might also make the array properties easier to deal with.
I have an entity with an assigned string Id in NHibernate, and I have a little problem when getting an entity by Id.
Example...
Suppose I have a database record like this...
Id Description
-------------------
AAA MyDescription
Now, if I use the Get method with the search id "aaa"...
MYENTITYTYPE entity = Session.Get<MYENTITYTYPE>("aaa");
...it returns the right entity, but the Id field (entity.Id) is "aaa", while I wish it were "AAA".
In summary, I would like the Get method to return the Id exactly as it is stored in the database, with the same case.
Is this possible? How can I do it?
Interesting question. My guess is that it's not possible, because the Id might exist before the DB call. Consider the following:
var foo = session.Load<Foo>("aaa"); //no DB call, foo is a proxy
Console.WriteLine(foo.Id); //Prints "aaa";
var bar = foo.Bar; //Forces loading
Console.WriteLine(foo.Id); //No matter what, the Id can't change at this point
This illustrates another reason why primary keys with meaning are usually a bad idea, especially if their input is not controlled.
Now, if instead of Get you use a query, you will get the right-cased Id:
//example with LINQ; you can use HQL, Criteria, etc
var foo = session.Query<Foo>().Single(x => x.Id == "aaa");
The drawback is that you will always go to the DB, even if the entity is already loaded.
Now, if you define your entity as {Id, Code, Description}, where Id is a synthetic POID (I recommend Hilo or Guid) and Code is the existing string Id, you will avoid the potential bugs caused by using Get instead of a query with the code.