## Finding the average output
Let's figure out what your average output should be. Your neural network has 3 inputs in the first layer, 2 nodes in the second layer, and one output. Each weight is randomized to a value in the range `0..1`, so call it `0.5` on average.

The inputs you use in the program are `1, 0, 1`. On each layer, you also have a "bias" input of `-1`. So with average weights of `0.5`, the input layer will do the following:
```
inputs = 1, 0, 1, -1
output = sigmoid(1*0.5 + 0*0.5 + 1*0.5 - 1*0.5) = sigmoid(0.5) = 0.62
```
The second layer has two nodes, each with average input `0.62`. It will do:
```
inputs = 0.62, 0.62, -1
output = sigmoid(0.62*0.5 + 0.62*0.5 - 1*0.5) = sigmoid(0.12) = 0.53
```
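The two steps above can be checked with a short, self-contained sketch. The class and method names here are mine, not from your code; the only assumption is that every weight is pinned to its average value of `0.5`:

```java
public class AverageOutput {
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Forward pass of the 3-2-1 network with every weight set to w.
    static double averageOutput(double w) {
        // First layer: inputs 1, 0, 1 plus the -1 bias input.
        double hidden = sigmoid(1*w + 0*w + 1*w + (-1)*w);  // sigmoid(0.5) ~ 0.62
        // Second layer: two hidden nodes (both equal here) plus the -1 bias.
        return sigmoid(hidden*w + hidden*w + (-1)*w);       // sigmoid(0.12) ~ 0.53
    }

    public static void main(String[] args) {
        System.out.printf("output with average weights: %.2f%n", averageOutput(0.5));
    }
}
```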
So your average output should be `0.53`. I modified your program to sum the outputs and found that this was close (the actual average was `0.528`). As for the percentage of the time the output is above `0.5`, that depends on the distribution of the output, and I don't know that it is easy to compute by hand. But your program shows through experimentation that the answer is roughly 60% of the time.
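Both figures are easy to reproduce with a quick Monte Carlo sketch of the same 3-2-1 network. This is my own re-implementation of the experiment, not your program, and the class and method names are made up:

```java
import java.util.Random;

// Monte Carlo sketch: re-randomize all weights uniformly on 0..1,
// run the forward pass on inputs (1, 0, 1) with a -1 bias on each
// layer, and tally the mean output and the fraction above 0.5.
public class OutputDistribution {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // One forward pass with fresh random weights drawn from rng.
    static double forward(Random rng) {
        double[] in = {1, 0, 1, -1};        // three inputs plus the bias input
        double[] hidden = new double[2];
        for (int j = 0; j < hidden.length; j++) {
            double sum = 0;
            for (double x : in) sum += x * rng.nextDouble();
            hidden[j] = sigmoid(sum);
        }
        double sum = -1 * rng.nextDouble(); // bias weight on the output node
        for (double h : hidden) sum += h * rng.nextDouble();
        return sigmoid(sum);
    }

    public static void main(String[] args) {
        Random rng = new Random();
        int n = 100_000, above = 0;
        double total = 0;
        for (int i = 0; i < n; i++) {
            double out = forward(rng);
            total += out;
            if (out > 0.5) above++;
        }
        // Expect a mean near 0.53 and roughly 60% of outputs above 0.5.
        System.out.printf("mean = %.3f, above 0.5: %.1f%%%n",
                          total / n, 100.0 * above / n);
    }
}
```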
## Random weights
I think that your choice of random weights from the range `0..1` is the source of your confusion. If you were to choose random weights in the range `-1..1`, like this:
```java
weights[j][i] = Math.random()*2 - 1.0;
```
then your output would be `0.5` on average, and the percentage of outputs greater than `0.5` would be 50% (I modified your program to verify this). Perhaps that is what you were expecting.
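The same Monte Carlo sketch with the weights drawn from `-1..1` illustrates why (again my own re-implementation, with made-up names): every node's weighted sum is now symmetric around 0, so the output is symmetric around `sigmoid(0) = 0.5`.

```java
import java.util.Random;

// Same 3-2-1 forward pass, but with each weight drawn uniformly from
// -1..1 via rng.nextDouble()*2 - 1.0. Each weighted sum is symmetric
// around 0, so the output is symmetric around sigmoid(0) = 0.5.
public class SymmetricWeights {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    static double weight(Random rng) { return rng.nextDouble() * 2 - 1.0; }

    static double forward(Random rng) {
        double[] in = {1, 0, 1, -1};   // three inputs plus the bias input
        double[] hidden = new double[2];
        for (int j = 0; j < hidden.length; j++) {
            double sum = 0;
            for (double x : in) sum += x * weight(rng);
            hidden[j] = sigmoid(sum);
        }
        double sum = -1 * weight(rng); // bias weight on the output node
        for (double h : hidden) sum += h * weight(rng);
        return sigmoid(sum);
    }

    public static void main(String[] args) {
        Random rng = new Random();
        int n = 100_000, above = 0;
        double total = 0;
        for (int i = 0; i < n; i++) {
            double out = forward(rng);
            total += out;
            if (out > 0.5) above++;
        }
        // Expect a mean near 0.5 and about 50% of outputs above 0.5.
        System.out.printf("mean = %.3f, above 0.5: %.1f%%%n",
                          total / n, 100.0 * above / n);
    }
}
```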
As for your neural net code itself, it appears to be correct as far as I can tell.