Undefined variables, value == false versus !value
I have a problem with a very simple piece of code written in JavaScript; could you help me, please?
Here's what I think I have understood so far about JavaScript and variables:
I found an exercise file in an online course and tried to do it, but I didn't get the result expected in the lesson; the main problem was that I was comparing the value with "if (value == false) { ... }" while the solution used "if (!value) { ... }".
So I decided to write a very short piece of code in order to try it myself, but I'm getting mixed results. In the example below I would expect this JS code to generate two identical alerts ("foo is equal to false"), but instead the first if statement shows "foo IS NOT equal to false" while the second if shows (as expected) "foo is equal to false".
This is what I wrote:
var foo = undefined;
if (foo == false) {
alert("foo is equal to false");
} else {
alert("foo is not equal to false"); // Javascript executes this row
}
if (!foo) {
alert("foo is equal to false"); // Javascript executes this row
} else {
alert("foo is not equal to false");
}
AFAIK the two ifs should do the same work, and in fact when I replaced "var foo = undefined;" in the first line with "var foo = 0;" it worked as expected; and 0 is another value that should evaluate to false, or at least that's what I remember.
Could you tell me what I'm doing wrong?
The == algorithm (the Abstract Equality Comparison Algorithm) isn't something where you can simply assume an outcome; you need to know the details of how it works.
For example, null and undefined are a special case: they do not undergo any type conversion other than being considered equal to each other.
Otherwise there's typically a type conversion that tries to reduce both operands to a common type. This often ends up being a toNumber conversion.
That's why:
null == undefined; // true
null == 0; // false
+null == '0' // true
So if you know how the algorithm works, you know that undefined never equals anything except undefined and null, but other types that are not strictly equal may be coerced down to types that are equal.
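A quick sketch you can run in Node or a browser console to verify that special case (the expected values in the comments follow directly from the rule above):

```javascript
// undefined and null are loosely equal only to each other;
// they never coerce to booleans, numbers, or strings under ==.
const results = [
  undefined == null,      // true  (the special case in the spec)
  undefined == undefined, // true
  undefined == false,     // false (no coercion happens)
  undefined == 0,         // false
  undefined == "",        // false
];
console.log(results.join()); // "true,true,false,false,false"
```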
So doing if (!x) vs if (x == false) are entirely different tests.
if (!x) performs a toBoolean conversion.
if (x == false) uses a complex algorithm to decide the proper conversion.
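To see the two tests diverge, you can compare them side by side over a handful of sample values (a sketch; Boolean(x) models the toBoolean conversion that if (!x) negates):

```javascript
// For each sample, compare the truthiness test with loose equality to false.
const samples = [undefined, null, 0, "0", "", NaN];
const rows = samples.map(x => ({
  value: String(x),
  bangTest: !x,             // what if (!x) checks
  looseEqFalse: x == false  // what if (x == false) checks
}));
console.table(rows);
```

Note how undefined, null, and NaN are "falsy" (bangTest is true) yet not loosely equal to false, while "0" is truthy yet loosely equal to false.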
So with if (x == false), if x is undefined, it is determined not to be equal to false; yet if x is 0 or even "0", it will be considered equal to false.
0 == false; // true
"0" == false; // true
undefined is not equal to false, but when you try to evaluate:
if (undefined)
the condition is always treated as false, because undefined is falsy.
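You can confirm both halves of that statement in one snippet:

```javascript
// In a condition, undefined is simply falsy; equality is a separate question.
let branch;
if (undefined) {
  branch = "then";
} else {
  branch = "else";
}
console.log(branch);             // "else"
console.log(undefined == false); // false
```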
more info: http://www.mapbender.org/JavaScript_pitfalls:_null,_false,_undefined,_NaN
Truth and equivalence with true are two different things in JavaScript.
The if (...) executes the first statement if the ... is "truthy", not when it is "equal" to any other particular value, so your second conditional should look like
if (!foo) {
alert("foo is falsy"); // Javascript executes this row
} else {
alert("foo is truthy");
}
There are quite a few "falsy" values in JavaScript: NaN, "", 0, -0, false, null, undefined. All other values are truthy.
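That list is easy to check directly, along with a few values people often expect to be falsy but aren't:

```javascript
// Every value in this list converts to false in a boolean context.
const falsyValues = [NaN, "", 0, -0, false, null, undefined];
const allFalsy = falsyValues.every(v => !v);
console.log(allFalsy); // true

// Objects, arrays, and the non-empty string "0" are all truthy.
console.log(Boolean({}), Boolean([]), Boolean("0")); // true true true
```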
The ! operator returns false for any truthy value and true for any falsy value, so !x is the same as (x ? false : true) for all x.
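That equivalence can be verified mechanically over a mixed bag of values:

```javascript
// !x and (x ? false : true) agree for every probe value.
const probes = [0, 1, "", "0", null, undefined, NaN, {}, [], true, false];
const consistent = probes.every(x => !x === (x ? false : true));
console.log(consistent); // true
```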