Hey guys, I'm not new to neural networks -- I've been studying them off and on for a couple of years now. However, I've never been able to make them work quite right. I wonder if you all might be able to answer a few questions to clear things up.
First, here is a quick sketch of my NN's architecture: http://cmxx.tripod.com/nnv3/nn_arch.htm
Note the lines are the interconnections, and all of them have weights that I try to train. I suspect that's incorrect, but none of my resources says so explicitly. So, my questions:
*) Should the input layer have weighted inputs, or should it simply pass the inputs through to the hidden layer?
*) If it just passes them through, then what the heck is the point of the input layer?
*) Next, should the weights between the hidden layer's outputs and the output layer's inputs be trained (using the backprop algorithm), as well as the weights between the input layer's outputs and the hidden layer's inputs?
Sorry about the wording of the last question, but I wanted to be very clear about exactly which weights I'm referring to.
Incidentally, I'm using C++ to implement the neural network.
Thanks all for your time,