I was at a meetup recently, and one of the talks was about how you can use Webpack to require just the pieces of a package that you need; I believe this is called tree shaking. I was wondering: is there a way to do this without Webpack? For instance, can you specify exactly the pieces of code you need rather than the whole node module?
Any information about this would be great. I am just looking to learn something new.
Cheers,
There are a couple of pretty simple ways:
In ES6, you can do what is called destructuring.
Here's an example with arrays:
var a, b, rest;
[a, b] = [10, 20];
console.log(a);
// expected output: 10
console.log(b);
// expected output: 20
[a, b, ...rest] = [10, 20, 30, 40, 50];
console.log(rest);
// expected output: [30,40,50]
This is destructuring by index, where a = array[0], b = array[1], and so on. Notice the ... syntax: in a destructuring target it collects the remaining elements (often called rest syntax; the same three dots used to expand values are called the spread operator in ES6). Here is a link to that if you are curious about what it does, or how to use it.
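A small sketch of rest and spread in action (the variable names here are just illustrative):

```javascript
// Rest: collects the leftover elements into an array.
const [first, ...others] = [10, 20, 30];
console.log(first);  // 10
console.log(others); // [20, 30]

// Spread: expands arrays in place.
const combined = [...[1, 2], ...[3, 4]];
console.log(combined); // [1, 2, 3, 4]

// Spread into a function call.
console.log(Math.max(...[5, 9, 2])); // 9
```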
You can also do the same with objects, consider:
const someRandomObject = {
a: 1,
b: 2,
};
const {a} = someRandomObject;
console.log(a) // expected output: 1
You are destructuring, by name, only the properties you need from the object, so you are not pulling in a bunch of unused stuff. If you are not using ES6, you can do something similar with:
const someRandomObject = {
a: 1,
b: 2,
};
const a = someRandomObject.b;
console.log(a) // expected output: 2
Same thing as above: you are pulling out of someRandomObject only the property you want, and nothing else. Note that this form takes whatever value is on the right-hand side, so the name of the variable does not have to match the property name. These two approaches are functionally equivalent (I believe).
I was trying to make an interface and initialize an instance of it with another object's properties using spread syntax.
I found something weird and wanted to find out why this can happen.
This is an example of what I want.
A: The object that I want to make as a result
A = { a: number, b: number, c: number, d: number }
B: The object I've got with DB query
B = { a: number, b: number, c: number, e: string }
C: The object that has a value I will insert manually
C = { d: boolean }
So I've declared an interface A, and used spread syntax to make an object of A with B object.
A_object = {...B_object }
What I expected was an object of A which has properties 'a', 'b', and 'c', or an error.
But the result was quite different.
Property 'e' also appeared, which doesn't exist in interface A.
Can anybody explain why this happens?
The spread operator is a JavaScript feature, and JavaScript does not know about the declared properties of TypeScript interfaces. All properties of B_object will therefore appear in A_object when A_object = { ...B_object }.
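You can see this at runtime with plain JavaScript: spread copies every enumerable own property, and no interface information survives compilation (B_object here mirrors the B shape from the question, with made-up values):

```javascript
const B_object = { a: 1, b: 2, c: 3, e: 'extra' };

// Spread copies every enumerable own property; no type info exists at runtime.
const A_object = { ...B_object };

console.log(Object.keys(A_object)); // ['a', 'b', 'c', 'e'], so 'e' comes along too
```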
From a type-safety perspective, this is fine. Everything needed to construct a valid A instance is also present in B, so spreading B into A is considered to be valid.
You are probably looking for excess property checks. Those are not necessarily needed in a structural type system and there are only a select few places where TypeScript does perform them. And spreading objects is not one of those places. There is an open Issue requesting this and you can give it a thumbs up if you want to see it implemented.
Why does object destructuring throw an error if there is no var keyword in front of it?
{a, b} = {a: 1, b: 2};
throws SyntaxError: expected expression, got '='
The following three examples work without problems
var {a, b} = {a: 1, b: 2};
var [c, d] = [1, 2];
[e, f] = [1, 2];
Bonus question: Why do we not need a var for array destructuring?
I ran into the problem doing something like
function foo() {
var {a, b} = objectReturningFunction();
// Now a and b are local variables in the function, right?
// So why can't I assign values to them?
{a, b} = objectReturningFunction();
}
The issue stems from curly braces having multiple meanings in JavaScript.
When { appears at the start of a statement, it always represents a block, which can't be assigned to. If it appears later in the statement, as part of an expression, then it represents an object literal.
The var makes this distinction, since a declaration can't be followed by a block statement; grouping parentheses work as well:
( {a, b} = objectReturningFunction() );
From their docs: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Destructuring_assignment#assignment_separate_from_declaration_2
Notes: The parentheses ( ... ) around the assignment statement are required when using object literal destructuring assignment without a declaration.
{a, b} = {a: 1, b: 2} is not valid stand-alone syntax, as the {a, b} on the left-hand side is considered a block and not an object literal.
However, ({a, b} = {a: 1, b: 2}) is valid, as is var {a, b} = {a: 1, b: 2}
Your ( ... ) expression needs to be preceded by a semicolon, or it may be interpreted as a call to a function on the previous line.
If you write JavaScript without semicolons, then the 'assignment without declaration' syntax should be preceded by a semicolon for it to work predictably:
let a, b
;({a, b} = objectReturningFunction()) // <-- note the preceding ;
Just wanted to highlight this as it caught me out, and hopefully can save others some time figuring out why it doesn't work and/or produces weird results with code formatters like prettier.
Indeed, it's actually right there in the accepted answer (last line of the quoted docs) but easy to miss, especially without seeing an example!
Here's another way:
let {} = {a, b} = objectReturningFunction()
Pros:
No parenthesis needed
No semicolons needed
The extra assignment is a guaranteed no-op (given that no weird things are going on - also, your transpiler might not realize this)
Cons:
Looks a bit weird, although in my opinion no weirder than the !function(){...}() IIFE.
Might be confusing as to why it's there. It is guaranteed to throw people off on the first encounter, so I would advise against using it as a one-off.
Consider:
const colors = { r: 204, g: 51, b: 102, hex: "#cc3366" };
Here's a gist of some of the ways of destructuring:
Destructuring to new variables
let { r, g, b } = colors;
// initializes variables r, g, b
Destructuring to new variables with different names
let { r: red, g: green, b: blue } = colors;
// initializes variables red, green, blue
Destructuring to existing variables
let r, g, b;
...
({ r, g, b } = colors);
Destructuring to existing variables with different names
let red, green, blue;
...
({ r: red, g: green, b: blue } = colors);
Destructuring into another object with same property names
let myColor = { r: 0, g: 0, b: 0 };
...
({ r: myColor.r, g: myColor.g, b: myColor.b } = colors);
Destructuring into another object with different property names
let myColor = { red: 0, green: 0, blue: 0 };
...
({ r: myColor.red, g: myColor.green, b: myColor.blue } = colors);
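For reference, the gist above collected into one runnable snippet (renamed to palette and x/y to avoid clashing declarations):

```javascript
const palette = { r: 204, g: 51, b: 102, hex: '#cc3366' };

// New variables with the same names
const { r, g, b } = palette;
console.log(r, g, b); // 204 51 102

// New variables with different names
const { r: red, g: green, b: blue } = palette;
console.log(red, green, blue); // 204 51 102

// Existing variables: note the parentheses around the whole assignment
let x, y;
({ r: x, g: y } = palette);
console.log(x, y); // 204 51

// Into another object's properties
const myColor = { red: 0, green: 0, blue: 0 };
({ r: myColor.red, g: myColor.green, b: myColor.blue } = palette);
console.log(myColor); // { red: 204, green: 51, blue: 102 }
```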
I've noticed that if we insert an object {a: 1, b: 2, c: 3}, it will be stored in the database as {c: 3, b: 2, a: 1}. Why does MongooseJS do this?
Is it for a performance gain, or is there some other logic behind it?
Could anyone please explain this to me in detail?
There is no such thing as property order in objects. If the order is important to you, use an array.
The for...in statement iterates over the enumerable properties of an object, in original insertion order. For each distinct property [...]
However, this seems to be implementation (browser) dependent.
In objects you can't rely on the order of the properties, as different iteration methods may give different results.
Ordering of properties in objects is complex, as this answer explains: https://stackoverflow.com/a/38218582/893780
Even though it's not specified in the standard, normal property keys are stored in insertion order.
Internally, Mongoose uses a faster code path when cloning objects, that reverses the property order. It basically does this:
let oldObject = { a: 1, b: 2, c: 3 };
let newObject = {};
let keys = Object.keys(oldObject);
let i = keys.length;
while (i--) {
let k = keys[i];
let val = oldObject[k];
newObject[k] = val;
}
Because normal keys are stored in insertion order, newObject will be the reverse of oldObject:
{ c: 3, b: 2, a: 1 }
However, because this can cause issues in queries, Mongoose added an option retainKeyOrder to prevent this reversal.
However, according to the same answer that I linked to above, you still can't rely on any order being imposed when using Object.keys() or for..in.
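As a concrete illustration of why "insertion order" is not the whole story: integer-like keys are traversed first, in ascending numeric order, before string keys, regardless of when they were inserted:

```javascript
const obj = {};
obj.b = 1;
obj[2] = 2;
obj.a = 3;
obj[1] = 4;

// Integer-like keys come first, sorted numerically;
// then string keys follow in insertion order.
console.log(Object.keys(obj)); // ['1', '2', 'b', 'a']
```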
From the Mozilla Developer Network:
[1,4,9].map(Math.sqrt)
will yield:
[1,2,3]
Why then does this:
['1','2','3'].map(parseInt)
yield this:
[1, NaN, NaN]
I have tested in Firefox 3.0.1 and Chrome 0.3; just as a disclaimer, I know this is not cross-browser functionality (no IE).
I found out that the following will accomplish the desired effect. However, it still doesn’t explain the errant behavior of parseInt.
['1','2','3'].map(function(i){return +i;}) // returns [1,2,3]
The callback function in Array.map has three parameters:
From the same Mozilla page that you linked to:
"callback is invoked with three arguments: the value of the element, the index of the element, and the Array object being traversed."
So if you pass a function like parseInt, which actually expects two arguments, the second argument it receives will be the index of the element.
In this case, you ended up calling parseInt with radix 0, 1 and 2 in turn. The first is the same as not supplying the parameter, so it defaulted based on the input (base 10, in this case). Base 1 is an impossible number base, and 3 is not a valid number in base 2:
parseInt('1', 0); // OK - gives 1
parseInt('2', 1); // FAIL - 1 isn't a legal radix
parseInt('3', 2); // FAIL - 3 isn't legal in base 2
So in this case, you need the wrapper function:
['1','2','3'].map(function(num) { return parseInt(num, 10); });
or with ES2015+ syntax:
['1','2','3'].map(num => parseInt(num, 10));
(In both cases, it's best to explicitly supply a radix to parseInt as shown, because otherwise it guesses the radix based on the input. In some older browsers, a leading 0 caused it to guess octal, which tended to be problematic. It will still guess hex if the string starts with 0x.)
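To make the radix-guessing behavior concrete:

```javascript
console.log(parseInt('0x10'));   // 16: the 0x prefix makes parseInt guess hex
console.log(parseInt('10'));     // 10: plain digits are read as base 10 in modern engines
console.log(parseInt('08', 10)); // 8: an explicit radix avoids the old octal guess
console.log(parseInt('ff', 16)); // 255
```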
map passes along a 2nd argument, which in many cases messes up parseInt's radix parameter.
If you're using underscore you can do:
['10','1','100'].map(_.partial(parseInt, _, 10))
Or without underscore:
['10','1','100'].map(function(x) { return parseInt(x, 10); });
You could solve this problem by using Number as the iteratee function:
var a = ['0', '1', '2', '10', '15', '57'].map(Number);
console.log(a);
Without the new operator, Number can be used to perform type conversion. However, it differs from parseInt: it does not do partial parsing, and it returns NaN if the whole string cannot be converted. For instance:
console.log(parseInt("19asdf")); // 19
console.log(Number("19asf"));    // NaN
I'm going to wager that it's something funky going on with the parseInt's 2nd parameter, the radix. Why it is breaking with the use of Array.map and not when you call it directly, I do not know.
// Works fine
parseInt( 4 );
parseInt( 9 );
// Breaks! Why?
[1,4,9].map( parseInt );
// Fixes the problem
[1,4,9].map( function( num ){ return parseInt( num, 10 ) } );
You can use an ES2015/ES6 arrow function and just pass the number to parseInt. With plain decimal strings, the radix effectively defaults to 10 (though parseInt will guess base 16 for strings starting with 0x).
[10, 20, 30].map(x => parseInt(x))
Or you can explicitly specify radix for better readability of your code.
[10, 20, 30].map(x => parseInt(x, 10))
In the example above, the radix is explicitly set to 10.
Another (working) quick fix:
var parseInt10 = function(x){return parseInt(x, 10);}
['0', '1', '2', '10', '15', '57'].map(parseInt10);
//[0, 1, 2, 10, 15, 57]
You can solve that issue like this:
array.map(x => parseInt(x))
Example:
var arr = ["3", "5", "7"];
console.log(
arr.map(x => parseInt(x))
);
parseInt, IMHO, should be avoided for this very reason. You can wrap it to make it safer in contexts like these:
const safe = {
parseInt: (s, opt) => {
const { radix = 10 } = opt ? opt : {};
return parseInt(s, radix);
}
}
console.log( ['1','2','3'].map(safe.parseInt) );
console.log(
['1', '10', '11'].map(e => safe.parseInt(e, { radix: 2 }))
);
lodash/fp caps iteratee arguments to 1 by default to avoid these gotchas. Personally I have found these workarounds to create as many bugs as they avoid. Blacklisting parseInt in favor of a safer implementation is, I think, a better approach.
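A generic alternative to wrapping parseInt specifically is a tiny unary helper (unary is a hypothetical name here, similar in spirit to lodash's _.unary) that caps how many arguments a callback receives:

```javascript
// Wrap any function so it only ever sees its first argument;
// map's index and array arguments are discarded.
const unary = fn => arg => fn(arg);

console.log(['1', '2', '3'].map(unary(parseInt))); // [1, 2, 3]
```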
There is an issue with enumerating Object.keys() in node.js that I do not understand. With the following code:
Object.prototype.tuple = function() {
var names = Object.keys(this);
console.log("Dump of names:");
console.log(names);
console.log("FOR loop using indexes:");
for (var k = 0; k < names.length; k++)
{
console.log(names[k]);
}
console.log("FOR loop using enumeration:");
for (var z in names)
{
console.log(z);
}
return this;
};
var x = {a:0, b:0, c:0}.tuple();
I get the following results on the console:
Dump of names:
[ 'a', 'b', 'c' ]
FOR loop using indexes:
a
b
c
FOR loop using enumeration:
0
1
2
tuple
Could somebody explain where the extra "tuple" in the second loop comes from? While it is defined as a function on Object.prototype, it is neither an own property of the x object nor included in the names array.
I am using node.js version 0.8.20.
The first loop goes over the own properties of x (Object.keys() returns only own properties), while the second goes over the enumerable properties of the names array, including those inherited up the prototype chain, which is where tuple comes from. (Note also that for...in over an array iterates its indexes, not its values, which is why you see 0, 1, 2 rather than a, b, c.)
Thanks to Jonathan Lonowski for clarifications.
I think what #Kuba mentioned above is not correct.
Object.keys, Object.getOwnPropertyNames, and other similar methods behave slightly differently from one another. Their behavior is related to a property attribute named enumerable.
I am going to dinner with my friends, so for now I can only give you a helpful link illustrating it. Sorry about that.
https://developer.mozilla.org/en-US/docs/Enumerability_and_ownership_of_properties
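Building on that enumerable attribute: a common way to avoid the problem entirely is to define the prototype method as non-enumerable, so for...in never sees it. This is a sketch of the fix, not part of the original question's code:

```javascript
// Define tuple as non-enumerable so it stays out of for...in loops.
Object.defineProperty(Object.prototype, 'tuple', {
  value: function () {
    const names = Object.keys(this);
    const seen = [];
    for (const z in names) {
      seen.push(z); // only '0', '1', '2': array indexes, no inherited 'tuple'
    }
    console.log(seen);
    return this;
  },
  enumerable: false, // the default for defineProperty, stated here for clarity
  writable: true,
  configurable: true
});

({ a: 0, b: 0, c: 0 }).tuple(); // logs [ '0', '1', '2' ]
```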