
I have a simple array and counter:

var data = [1];
var i = 0;

The shortcut assignment produces 2:

data[i++] *= 2
// 2

I was expecting 3. data[i++] is multiplied by 2, so 1 * 2 is 2, and then that is assigned to data[i++], which now becomes 2, and then after the statement is evaluated the side effect of ++ causes it to be 3.

The following also gives me an unexpected result. Oddly enough, it produces NaN.

var data [1];
var i = 0;
data[i++] = data[i++] * 2;
// NaN

I was expecting 3 again. data[i++] first evaluates to 1, and then is multiplied by 2, and then that 2 value is assigned to data[i++], which is then incremented after the statement completes, causing it to be 3.

What am I missing here?

asked Apr 9, 2015 at 4:11
  • Aren't you missing an = sign in var data [1]; so it becomes var data = [1];? Commented Apr 9, 2015 at 4:14
  • @GregL that was a typo when writing up this question. Surely though it was there in my tests. Commented Apr 9, 2015 at 4:15
  • i++ first returns i, and then adds 1 to i. So data[i++] first assigns data[0], and then i becomes 1. Commented Apr 9, 2015 at 4:16

2 Answers

  • var data [1]; is not valid JavaScript. Did you mean var data = [1];?

  • data[i++] *= 2 is evaluated as follows:

    • i++, as the innermost expression, resolves first: its value is i (i.e. 0), and i increments afterwards to 1.

    • data[0] is looked up, and multiplied by two; since data[0] is 1, data[0] gets assigned the value of 1 * 2, i.e. 2.

    • The value of the outermost expression is returned: 2. ++ increments only what it was applied to (i), and not the whole expression.

  • data[i++] = data[i++] * 2 evaluates as follows:

    • The first i++ evaluates to 0 and modifies i to 1, as before.

    • The second i++ evaluates to 1 and modifies i to 2.

    • The expression then evaluates as data[0] = data[1] * 2. data[1] is undefined, and undefined * 2 is not a number (NaN).

  • In general, it is strongly recommended to avoid having two increment/decrement operators in the same expression. Different languages (and indeed, different compilers of the same language) have wildly different ideas of what should be done. In many languages, it is declared "undefined behaviour" in the language specification.
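To tie the steps above together, here is a runnable sketch of both cases, using only the array and counter from the question:

```javascript
// First case: the compound assignment.
var data = [1];
var i = 0;
data[i++] *= 2;        // i++ yields 0 and then sets i to 1, so this is data[0] *= 2
console.log(data[0]);  // 2 — the value 1 * 2 is stored back at index 0
console.log(i);        // 1 — ++ incremented only i, not the expression's result

// Second case: two i++ in one statement.
data = [1];
i = 0;
data[i++] = data[i++] * 2;  // left side resolves first, so this is data[0] = data[1] * 2
console.log(data[0]);       // NaN — data[1] is undefined, and undefined * 2 is NaN
console.log(i);             // 2 — both i++ side effects applied
```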

answered Apr 9, 2015 at 4:14



Instead of i++, use ++i. In your case, you're first returning i and then incrementing it, while you're looking to increment first and return the value afterwards.
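A minimal sketch of the difference between the two operators:

```javascript
var a = 0;
console.log(a++);  // 0 — postfix returns the old value, then increments
console.log(a);    // 1

var b = 0;
console.log(++b);  // 1 — prefix increments first, then returns the new value
console.log(b);    // 1
```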

answered Apr 9, 2015 at 4:16

