Erroneous ReSharper multiple enumeration warning?

This code:
IEnumerable<IEnumerable<int>> numbas = new[] {new[] {0, 1}, new[] {2}, new[] {3, 4, 5}};
var flattened = numbas.SelectMany(a => a);
extracts a single flattened enumerable of numbers from several sources. ReSharper warns that it's possible that a (the second one) is being enumerated multiple times -- but this is silly; each source is enumerated only once. Yes, the symbol a is going to be enumerated multiple times, but there will be a different source under it each time.
Did I miss something, or is this an erroneous warning coming out of Resharper?

Yes, this is an erroneous warning. You can see this if you take a look at the implementation of SelectMany - there's only one enumeration of each nested element:
foreach (TSource element in source) {
    foreach (TResult subElement in selector(element)) {
        yield return subElement;
    }
}
Here's the YouTrack issue for this: http://youtrack.jetbrains.com/issue/RSRP-413613
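If you want to convince yourself empirically, here is a minimal sketch (a self-contained console program; the Counted helper is illustrative and not part of the original question) that counts how often each source is enumerated:
using System;
using System.Collections.Generic;
using System.Linq;

class Program {
    static int enumerations = 0;

    // Each full enumeration of a sequence produced by this method bumps the counter once.
    static IEnumerable<int> Counted(params int[] values) {
        enumerations++;
        foreach (var v in values) yield return v;
    }

    static void Main() {
        var sources = new[] { Counted(0, 1), Counted(2), Counted(3, 4, 5) };
        var flattened = sources.SelectMany(a => a).ToList();
        Console.WriteLine(string.Join(", ", flattened)); // 0, 1, 2, 3, 4, 5
        Console.WriteLine(enumerations);                 // 3 - each source enumerated exactly once
    }
}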

RemoveMember for rapidjson nested objects

Are there examples of removing a member from a nested JSON object?
For example, given the JSON snippet below, how would I remove member "c" using the rapidjson library?
{
  "a": 1,
  "b": {"c": 2, "d": 3}
}
I am not looking for hardcoded removal like a.RemoveMember("c"); I am looking for code examples that remove a member from a rapidjson document using the member iterator.
All the examples I see are for ConstMemberIterator, but RemoveMember can only be called with a MemberIterator.
From the documentation at https://rapidjson.org/md_doc_tutorial.html, I am looking for an example code snippet for the following functions:
MemberIterator RemoveMember(MemberIterator): Remove a member by iterator (constant time complexity).
MemberIterator EraseMember(MemberIterator): similar to the above but it preserves order of members (linear time complexity).
MemberIterator EraseMember(MemberIterator first, MemberIterator last): remove a range of members, preserves order (linear time complexity).
I don't know if this resolves your issue or not (see http://rapidjson.org/md_doc_pointer.html):
// Erase a member or element, return true if the value exists
bool success = Pointer("/b/c").Erase(d);
assert(success);
Edited
Yes, but it will erase the entire b object. The result will be {"a": 1}
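For the member-iterator route the question actually asks about, here is a minimal self-contained sketch; FindMember, MemberEnd, and EraseMember are the relevant parts of rapidjson's Value interface, applied to the question's JSON:
#include <cassert>
#include <iostream>
#include "rapidjson/document.h"
#include "rapidjson/stringbuffer.h"
#include "rapidjson/writer.h"

int main() {
    rapidjson::Document d;
    d.Parse("{\"a\":1,\"b\":{\"c\":2,\"d\":3}}");
    assert(!d.HasParseError());

    // Locate the nested object, then the member to delete, via MemberIterator.
    rapidjson::Value& b = d["b"];
    rapidjson::Value::MemberIterator it = b.FindMember("c");
    if (it != b.MemberEnd()) {
        b.EraseMember(it);   // preserves the order of the remaining members
    }

    // Serialize to confirm the result: {"a":1,"b":{"d":3}}
    rapidjson::StringBuffer buffer;
    rapidjson::Writer<rapidjson::StringBuffer> writer(buffer);
    d.Accept(writer);
    std::cout << buffer.GetString() << std::endl;
    return 0;
}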

Elements not getting added to list in Groovy script

Something weird is happening while adding elements to a list in Groovy.
Scenario:
There are two lists, list1 and list2. list1 contains objects of type X and list2 is empty. list1 is populated from a Java file, and while iterating over list1 in a Groovy script I am adding objects to list2.
But the elements are not getting added; list2 remains empty.
If I set a breakpoint on the line and evaluate the expression manually, the element is added. But when stepping through normally, execution suddenly jumps to a seemingly random line.
No exception is thrown.
Have created list as below:
List<X> dataToBeRemoved = new ArrayList<>()
Iterating the list as below:
for (X data in XList) {
    if (something) {
        dataToBeRemoved.add(data)
    }
}
I am new to Groovy. If anyone has ever faced this kind of issue, please guide. Thanks.
You didn't ask, but type parameters don't get you much.
List elementsToRemove = []
And, in this case even better:
List elementsToRemove = allElements.findAll { ...some condition... }
After that, it's impossible to tell from your code. Questions such as "Why doesn't Groovy work?" are hard to answer.
You can define an empty list simply by using
def mySmallList = []
and you may also use findAll to filter the list:
mySmallList = myBigList.findAll { /* some condition */ }
Please check the link https://groovyconsole.appspot.com/script/5127180895911936
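A small self-contained Groovy sketch of the findAll approach suggested above (the Item class and the condition are illustrative assumptions, not from the original question):
class Item {
    int value
}

List<Item> allItems = [new Item(value: 1), new Item(value: 5), new Item(value: 10)]

// Collect every element matching the condition in a single pass
List<Item> dataToBeRemoved = allItems.findAll { it.value > 3 }

assert dataToBeRemoved*.value == [5, 10]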

Mongoose/Mongo: why does $set auto-sort the data keys on update? [duplicate]

If I create an object like this:
var obj = {};
obj.prop1 = "Foo";
obj.prop2 = "Bar";
Will the resulting object always look like this?
{ prop1 : "Foo", prop2 : "Bar" }
That is, will the properties be in the same order that I added them?
The iteration order for objects follows a certain set of rules since ES2015, but it does not (always) follow the insertion order. Simply put, the iteration order is a combination of the insertion order for string keys and ascending order for number-like keys:
// key order: 1, foo, bar
const obj = { "foo": "foo", "1": "1", "bar": "bar" }
Using an array or a Map object can be a better way to achieve this. Map shares some similarities with Object and guarantees the keys to be iterated in order of insertion, without exception:
The keys in Map are ordered while keys added to object are not. Thus, when iterating over it, a Map object returns keys in order of insertion. (Note that in the ECMAScript 2015 spec objects do preserve creation order for string and Symbol keys, so traversal of an object with ie only string keys would yield keys in order of insertion)
As a note, property order in objects wasn't guaranteed at all before ES2015. Definition of an Object from ECMAScript Third Edition (pdf):
4.3.3 Object
An object is a member of the type Object. It is an unordered collection of properties each of which contains a primitive value, object, or function. A function stored in a property of an object is called a method.
YES (but not always insertion order).
Most Browsers iterate object properties as:
Positive integer keys in ascending order (and strings like "1" that parse as ints)
String keys, in insertion order (ES2015 guarantees this and all browsers comply)
Symbol names, in insertion order (ES2015 guarantees this and all browsers comply)
Some older browsers combine categories #1 and #2, iterating all keys in insertion order. If your keys might parse as integers, it's best not to rely on any specific iteration order.
Current Language Spec (since ES2015): insertion order is preserved, except in the case of keys that parse as positive integers (e.g. "7" or "99"), where behavior varies between browsers. For example, Chrome/V8 does not respect insertion order when the keys parse as numeric.
Old Language Spec (before ES2015): Iteration order was technically undefined, but all major browsers complied with the ES2015 behavior.
Note that the ES2015 behavior was a good example of the language spec being driven by existing behavior, and not the other way round. To get a deeper sense of that backwards-compatibility mindset, see http://code.google.com/p/v8/issues/detail?id=164, a Chrome bug that covers in detail the design decisions behind Chrome's iteration order behavior.
Per one of the (rather opinionated) comments on that bug report:
Standards always follow implementations, that's where XHR came from, and Google does the same thing by implementing Gears and then embracing equivalent HTML5 functionality. The right fix is to have ECMA formally incorporate the de-facto standard behavior into the next rev of the spec.
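To see the categories described in this answer in practice, here is a quick sketch (runnable in any ES2015+ engine; the key names are illustrative):
const obj = {};
obj.b = "b";
obj["2"] = "two";
obj[Symbol("s")] = "sym";
obj["1"] = "one";
obj.a = "a";

// Integer-like keys first (ascending), then string keys in insertion order
console.log(Object.keys(obj));     // ["1", "2", "b", "a"]

// Reflect.ownKeys appends symbol keys, in insertion order, at the end
console.log(Reflect.ownKeys(obj)); // ["1", "2", "b", "a", Symbol(s)]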
Property order in normal Objects is a complex subject in JavaScript.
While ES5 explicitly specified no order, ES2015 defined an order in certain cases, and successive changes to the specification since have increasingly defined the order (even, as of ES2020, the for-in loop's order). Given the following object:
const o = Object.create(null, {
    m: {value: function() {}, enumerable: true},
    "2": {value: "2", enumerable: true},
    "b": {value: "b", enumerable: true},
    0: {value: 0, enumerable: true},
    [Symbol()]: {value: "sym", enumerable: true},
    "1": {value: "1", enumerable: true},
    "a": {value: "a", enumerable: true},
});
This results in the following order (in certain cases):
Object {
    0: 0,
    1: "1",
    2: "2",
    b: "b",
    a: "a",
    m: function() {},
    Symbol(): "sym"
}
The order for "own" (non-inherited) properties is:
Positive integer-like keys in ascending order
String keys in insertion order
Symbols in insertion order
Thus, there are three segments, which may alter the insertion order (as happened in the example). And positive integer-like keys don't stick to the insertion order at all.
In ES2015, only certain methods followed the order:
Object.assign
Object.defineProperties
Object.getOwnPropertyNames
Object.getOwnPropertySymbols
Reflect.ownKeys
JSON.parse
JSON.stringify
As of ES2020, all others do (some in specs between ES2015 and ES2020, others in ES2020), which includes:
Object.keys, Object.entries, Object.values, ...
for..in
The most difficult to nail down was for-in because, uniquely, it includes inherited properties. That was done (in all but edge cases) in ES2020. The following list from the linked (now completed) proposal provides the edge cases where the order is not specified:
Neither the object being iterated nor anything in its prototype chain is a proxy, typed array, module namespace object, or host exotic object.
Neither the object nor anything in its prototype chain has its prototype change during iteration.
Neither the object nor anything in its prototype chain has a property deleted during iteration.
Nothing in the object's prototype chain has a property added during iteration.
No property of the object or anything in its prototype chain has its enumerability change during iteration.
No non-enumerable property shadows an enumerable one.
Conclusion: Even in ES2015 you shouldn't rely on the property order of normal objects in JavaScript. It is prone to errors. If you need ordered named pairs, use Map instead, which purely uses insertion order. If you just need order, use an array or Set (which also uses purely insertion order).
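A short sketch of that contrast (any ES2015+ engine): a Map keeps pure insertion order even for numeric-like keys, while a plain object pulls them to the front:
const m = new Map();
m.set("2", "two");
m.set("b", "b");
m.set("1", "one");
console.log([...m.keys()]); // ["2", "b", "1"] - insertion order

const o = { "2": "two", "b": "b", "1": "one" };
console.log(Object.keys(o)); // ["1", "2", "b"] - integer-like keys sorted to the front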
At the time of writing, most browsers did return properties in the same order as they were inserted, but it was explicitly not guaranteed behaviour so shouldn't have been relied upon.
The ECMAScript specification used to say:
The mechanics and order of enumerating the properties ... is not specified.
However in ES2015 and later non-integer keys will be returned in insertion order.
This whole answer is in the context of spec compliance, not what any engine does at a particular moment or historically.
Generally, no
The actual question is very vague.
will the properties be in the same order that I added them
In what context?
The answer is: it depends on a number of factors. In general, no.
Sometimes, yes
Here is where you can count on property key order for plain Objects:
ES2015 compliant engine
Own properties
Object.getOwnPropertyNames(), Reflect.ownKeys(), Object.getOwnPropertySymbols(O)
In all cases these methods include non-enumerable property keys and order keys as specified by [[OwnPropertyKeys]] (see below). They differ in the type of key values they include (String and / or Symbol). In this context String includes integer values.
Object.getOwnPropertyNames(O)
Returns O's own String-keyed properties (property names).
Reflect.ownKeys(O)
Returns O's own String- and Symbol-keyed properties.
Object.getOwnPropertySymbols(O)
Returns O's own Symbol-keyed properties.
[[OwnPropertyKeys]]
The order is essentially: integer-like Strings in ascending order, non-integer-like Strings in creation order, Symbols in creation order. Depending on which function invokes this, some of these types may not be included; a short sketch follows the list below.
The specific language is that keys are returned in the following order:
... each own property key P of O [the object being iterated] that is an integer index, in ascending numeric index order
... each own property key P of O that is a String but is not an integer index, in property creation order
... each own property key P of O that is a Symbol, in property creation order
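A minimal sketch of those rules, also showing that Object.getOwnPropertyNames includes non-enumerable keys while Object.keys does not (the property names here are illustrative):
const o = { b: 1, 2: 2, a: 3 };
Object.defineProperty(o, "hidden", { value: 4, enumerable: false });

// [[OwnPropertyKeys]] order: integer index "2" first, then "b", "a", "hidden" in creation order
console.log(Object.getOwnPropertyNames(o)); // ["2", "b", "a", "hidden"]

// Object.keys filters to enumerable properties; same ordering in modern (ES2020+) engines
console.log(Object.keys(o));                // ["2", "b", "a"]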
Map
If you're interested in ordered maps you should consider using the Map type introduced in ES2015 instead of plain Objects.
As of ES2015, property order is guaranteed for certain methods that iterate over properties, but not others. Unfortunately, the methods which are not guaranteed to have an order are generally the most often used:
Object.keys, Object.values, Object.entries
for..in loops
JSON.stringify
But, as of ES2020, property order for these previously untrustworthy methods will be guaranteed by the specification to be iterated over in the same deterministic manner as the others, due to the finished proposal: for-in mechanics.
Just like with the methods which have a guaranteed iteration order (like Reflect.ownKeys and Object.getOwnPropertyNames), the previously-unspecified methods will also iterate in the following order:
Numeric array keys, in ascending numeric order
All other non-Symbol keys, in insertion order
Symbol keys, in insertion order
This is what pretty much every implementation does already (and has done for many years), but the new proposal has made it official.
Although the current specification leaves for..in iteration order "almost totally unspecified", real engines tend to be more consistent:
The lack of specificity in ECMA-262 does not reflect reality. In discussion going back years, implementors have observed that there are some constraints on the behavior of for-in which anyone who wants to run code on the web needs to follow.
Because every implementation already iterates over properties predictably, it can be put into the specification without breaking backwards compatibility.
There are a few weird cases which implementations currently do not agree on, and in such cases the resulting order will continue to be unspecified. For property order to be guaranteed (a small sketch follows this list):
Neither the object being iterated nor anything in its prototype chain is a proxy, typed array, module namespace object, or host exotic object.
Neither the object nor anything in its prototype chain has its prototype change during iteration.
Neither the object nor anything in its prototype chain has a property deleted during iteration.
Nothing in the object's prototype chain has a property added during iteration.
No property of the object or anything in its prototype chain has its enumerability change during iteration.
No non-enumerable property shadows an enumerable one.
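As a small sketch of the now-specified for..in order, including an inherited property at the end (the property names are illustrative; the output shown is what ES2020-conforming engines produce):
const proto = { inherited: true };
const child = Object.create(proto);
child.b = 1;
child["2"] = 2;
child.a = 3;

const seen = [];
for (const key in child) {
    seen.push(key);
}
// Own keys first ([[OwnPropertyKeys]] order), then keys from the prototype chain
console.log(seen); // ["2", "b", "a", "inherited"]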
In modern browsers you can use the Map data structure instead of an object.
MDN: Map
A Map object can iterate its elements in insertion order...
In ES2015 it does, but not in the way you might think
The order of keys in an object wasn't guaranteed until ES2015. It was implementation-defined.
However, in ES2015 it was specified. Like many things in JavaScript, this was done for compatibility purposes and generally reflected an existing unofficial standard among most JS engines (with you-know-who being an exception).
The order is defined in the spec, under the abstract operation OrdinaryOwnPropertyKeys, which underpins all methods of iterating over an object's own keys. Paraphrased, the order is as follows:
All integer index keys (stuff like "1123", "55", etc) in ascending numeric order.
All string keys which are not integer indices, in order of creation (oldest-first).
All symbol keys, in order of creation (oldest-first).
It's silly to say that the order is unreliable - it is reliable, it's just probably not what you want, and modern browsers implement this order correctly.
Some exceptions include methods of enumerating inherited keys, such as the for .. in loop. The for .. in loop doesn't guarantee order according to the specification.
As others have stated, you have no guarantee as to the order when you iterate over the properties of an object. If you need an ordered list of multiple fields, I suggest creating an array of objects:
var myarr = [{somefield1: 'x', somefield2: 'y'},
             {somefield1: 'a', somefield2: 'b'},
             {somefield1: 'i', somefield2: 'j'}];
This way you can use a regular for loop and have the insertion order. You could then use the Array sort method to sort this into a new array if needed.
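For example, a copy of myarr sorted by somefield1 (a sketch, assuming that field holds strings):
var sorted = myarr.slice().sort(function (a, b) {
    return a.somefield1.localeCompare(b.somefield1);
});
// sorted holds the entries ordered 'a', 'i', 'x' by somefield1; myarr itself is untouched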
A major difference between Object and Map, with an example: the order of iteration in a loop. A Map preserves the order in which entries were set, whereas an Object does not.
See:
OBJECT
const obj = {};
obj.prop1 = "Foo";
obj.prop2 = "Bar";
obj['1'] = "day";
console.log(obj)
OUTPUT: {1: "day", prop1: "Foo", prop2: "Bar"}
MAP
const myMap = new Map()
// setting the values
myMap.set("foo", "value associated with 'a string'")
myMap.set("Bar", 'value associated with keyObj')
myMap.set("1", 'value associated with keyFunc')
OUTPUT (entries come back in insertion order, even for the numeric-like key "1"):
[
  ["foo", "value associated with 'a string'"],
  ["Bar", "value associated with keyObj"],
  ["1", "value associated with keyFunc"]
]
Just found this out the hard way.
Using React with Redux, the state container whose keys I want to traverse in order to generate children is refreshed every time the store changes (as per Redux's immutability concepts).
Thus, instead of Object.keys(valueFromStore) I used Object.keys(valueFromStore).sort(), so that I at least have a stable alphabetical order for the keys.
For a 100% fail-safe solution you could use nested objects and do something like this:
const obj = {};
obj.prop1 = {content: "Foo", index: 0};
obj.prop2 = {content: "Bar", index: 1};

for (let i = 0; i < Object.keys(obj).length; i++) {
    for (const prop in obj) {
        if (obj[prop].index == i) {
            console.log(obj[prop].content);
            break;
        }
    }
}
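A shorter way to get the same output from the obj defined above, sorting the entries by their stored index (a sketch; the index property is the one introduced in this answer):
Object.values(obj)
    .sort((x, y) => x.index - y.index)
    .forEach(entry => console.log(entry.content)); // "Foo", then "Bar"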
From the JSON standard:
An object is an unordered collection of zero or more name/value pairs, where a name is a string and a value is a string, number, boolean, null, object, or array.
(emphasis mine).
So, no you can't guarantee the order.

chrome.storage.local strange behaviour: confused by a duplicate object

After a lot of bug-hunting, I managed to narrow my problem down to this bit of code:
dup = {a: [1]}
chrome.storage.local.set({x: [dup, dup]});
chrome.storage.local.get(["x"], function(o) {console.log(JSON.stringify(o['x']));});
This prints out: [{"a":[1]},null]
Which I find to be a pretty strange behaviour. So my questions are:
Is this intentional? Is it documented?
Can you recommend a good solution to bypass this limitation?
My current idea is to use JSON.stringify (which handles this case correctly) and later parse the string. But that just seems wasteful.
Thanks.
No, it is not intentional, and it should be reported as a bug: https://crbug.com/606955 (and now it is fixed as of Chrome 52!).
As I explained in the bug report, the cause of the bug is that the objects are identical. If your object dup only contains simple values (i.e. no nested arrays or objects, only primitive values such as strings, numbers, booleans, null, ...), then a shallow clone of the object is sufficient:
dup = {a: [1]}
dup2 = Object.assign({}, dup);
chrome.storage.local.set({x: [dup, dup2]});
If you need support for nested objects, then you have to make a deep clone. There are many existing libraries or code snippets for that, so I won't repeat it here. A simple way to prepare values for chrome.storage is by serializing it to JSON and then parsing it again (then all objects are unique).
dup = {a: [1]}
var valueToSave = JSON.parse(JSON.stringify([dup, dup]));
chrome.storage.local.set({x: valueToSave});
// Or:
var valueToSave = [ dup, JSON.parse(JSON.stringify(dup)) ];
chrome.storage.local.set({x: valueToSave});
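As a side note, in newer Chrome versions a deep copy can also be made without the JSON round-trip, assuming structuredClone is available in your extension context (a sketch, not from the original answer):
var dup = {a: [1]};
var valueToSave = [dup, structuredClone(dup)]; // the clone is a distinct object graph
chrome.storage.local.set({x: valueToSave});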

Subsonic load objects by a list of ids

Is it possible to load objects by a list of ids using subsonic ActiveRecord?
My code looks like:
IList<Video> videos = Video.Find(v => videoIds.Contains(v.ID));
I get an exception: The method 'Contains' is not supported
Do I do something wrong ... or I hit one of subsonic's limitations?
Thanks, Radu
After more research I found a way to achieve this:
List<int> videoIds = new List<int>(){1, 2, 3, 4, 5};
SqlQuery query = new Select().From<Video>().Where("ID").In(videoIds);
List<Video> videos = query.ExecuteTypedList<Video>();
FYI: SubSonic's LINQ parser does not like generic Lists combined with Contains:
// does not work
List<int> videoIds = new List<int>() {1,2,3,4,5};
var videos = Video.Find(v => videoIds.Contains(v.ID));
// should work
IEnumerable<int> videoIds = new List<int>() {1,2,3,4,5};
var videos = Video.Find(v => videoIds.Contains(v.ID));
Notice the difference?
It sounds strange, but whenever you want to use Contains() with SubSonic, you first have to cast your List to an IEnumerable to prevent the NotSupportedException.
This is the one-liner I believe you were looking for:
var colMatchingVideos = Video.Find(objVideo => colVideoIds.Any(iVideoId => objVideo.ID == iVideoId)).ToList();
Also, I would strongly advise you to avoid using string literals for columns; instead you could use Video.Columns.Id or expressions like Where(o => o.Id). This ensures that if you change your column names in the database, you get a compile-time error. It helps a lot with maintainability.
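Putting that advice together with the accepted approach, a sketch of the fluent query using the generated column constant instead of the "ID" string literal (assuming SubSonic generated a Columns class for the Video table):
List<int> videoIds = new List<int> { 1, 2, 3, 4, 5 };
SqlQuery query = new Select()
    .From<Video>()
    .Where(Video.Columns.Id)   // generated constant instead of the "ID" string literal
    .In(videoIds);
List<Video> videos = query.ExecuteTypedList<Video>();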
