It is possible to declare a JavaScript variable both with and without the var keyword.
var a = 100; // this works!
 b = 200; // and this does too!
It is also possible to declare a variable without initialisation.
var c; // this is just as acceptable! 
But then why is the same NOT true for a variable without var? Why can it not be declared without initialisation?
var c; 
 d; // causes a reference error to occur!
Why?
- You should read this developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/… – Daniel Tate, Dec 7, 2017 at 3:29
- @DanielTate Thank you! – Grateful, Dec 7, 2017 at 6:46
2 Answers
First, what you're seeing is legacy behavior. Assignment to an undeclared symbol traditionally meant, implicitly, that a global symbol should be created (declared) and set to the given value. Thus
x = 1;
when x has not been declared was taken to be an implicit instantiation of a global symbol.
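As a minimal sketch of that behavior (non-strict mode; the variable name is just an example), the implicitly created symbol ends up as a property of the global object:

x = 1;                      // never declared anywhere
console.log(x);             // 1
console.log(globalThis.x);  // 1, because the assignment created x on the global object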
The mention of an undeclared symbol, as in:
x;
is a ReferenceError, because reading a symbol (unlike assigning to one) never creates it implicitly.
In modern JavaScript, and when "strict" mode is in force because of a
"use strict";
statement (or because of other influences, as may be the case with Node.js code), the implicit creation of global symbols is also erroneous.
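A quick sketch of that (the variable name is just an example):

"use strict";
z = 3; // ReferenceError: z is not defined, since implicit globals are disallowed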
Generally, implicit global symbol instantiation is considered a bad idea. Global symbols in browser JavaScript are quite problematic because the global namespace is so horribly polluted. Thankfully, it's easy to wrap code in a function scope to create a "safe space" for symbols without fear of the browser imposing weird global names.
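One common way to get that "safe space" (just a sketch) is an immediately invoked function expression, which keeps declarations out of the global namespace:

(function () {
    var count = 0;          // local to this function; never touches the global object
    count += 1;
    console.log(count);     // 1
}());
console.log(typeof count);  // "undefined", because count never leaked out of the function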
You can do that in non-strict mode:
 var a = 100; // this works!
 b = 200; // and this does too!
In non-strict mode, someVar = someValue will, if someVar does not already exist, implicitly declare it as a global and assign someValue to it.
For that case:
var c; 
 d; // causes a reference error to occur!
Line 1 (var c;) declares c, which is valid syntax. Line 2 (d;) reads d, but d has never been declared, so it throws a ReferenceError. If d had been declared or assigned anywhere before that line, it would NOT cause an error.
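A quick sketch of that last point (non-strict mode; the names are just examples):

d = 10;          // implicit global created by the assignment
console.log(d);  // 10, no error because d exists now

console.log(e);  // ReferenceError: e is not defined; it was never declared or assigned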