Neural Networks

Hey guys; I'm not new to neural networks -- I've been studying them off and on for a couple of years now. However, I've never been able to make them work quite right. I wonder if you all might be able to answer a few questions to clear things up.
First, here is a quick sketch of my NN's architecture:
http://cmxx.tripod.com/nnv3/nn_arch.htm
Note that the lines are the interconnections, and they all have weights which I try to train. I suspect that's incorrect, but none of my resources says so explicitly. So, first question:
*) Should the input layer have weighted inputs, or should it simply pass the inputs through to the hidden layer?
*) If so, then what the heck is the point of the input layer? :)
*) Next, should the weights between the hidden layer's outputs and the output layer's inputs be trained (using the backprop algorithm), as well as the weights between the input layer's outputs and the hidden layer's inputs?
Sorry about the wording of the last question, but I had to be sure it was clear which weights I'm referring to.
Incidentally, I'm using C++ to implement the neural network.
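For concreteness, here's roughly the shape of what I've built -- just a sketch with placeholder names, not my actual code:

```cpp
#include <cstddef>
#include <vector>

// Sketch of the architecture in question. Every layer currently
// carries a weight for each incoming connection -- including the
// input layer, which is exactly the part I'm unsure about.
struct Layer {
    std::size_t numNeurons = 0;
    // weights[i][j]: weight from incoming value j into neuron i
    std::vector<std::vector<double>> weights;
};

struct Network {
    Layer input;   // weighted, or just a pass-through?
    Layer hidden;
    Layer output;
};
```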

Thanks all for your time,
-- CM

Comments

  • : *) Should the input layer have weighted inputs, or should it simply pass the inputs through to the hidden layer?
    : *) If so, then what the heck is the point of the input layer? :)
    : *) Next, should the weights between the hidden layer's outputs and the output layer's inputs be trained (using the backprop algorithm), as well as the weights between the input layer's outputs and the hidden layer's inputs?
    : Sorry about the wording of the last question, but I had to be sure it was clear which weights I'm referring to.
    : Incidentally, I'm using C++ to implement the neural network.
    :

    OK, here's the pro. ;-)

    I don't really know a lot, but here are my answers.

    1. The input layer probably shouldn't be called an "input layer," since it contains neurons. I may be wrong, but I wouldn't name it that.
    2. They could be there to let the network handle sufficiently complex things, like XOR. (If you don't understand this, just tell me -- I don't know what you know.)

    [code]
    i N N
    N O
    i N N
    ^ ^
    [/code]
    Do you mean the weights marked with ^?
    If yes, then yes -- they should be trained; that's what the neurons are for.
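    For what it's worth, the backprop update for one of those marked weights looks roughly like this (a sketch only; the learning rate eta and the sigmoid-derivative term are the usual textbook ones, not anybody's actual code):

```cpp
// One gradient step for a single output-layer weight, assuming a
// sigmoid activation and squared error. 'h' is the hidden output
// feeding the weight, 'out' the neuron's output, 'target' the goal.
double updatedWeight(double w, double h, double out, double target,
                     double eta) {
    double delta = (target - out) * out * (1.0 - out); // error * sigmoid'
    return w + eta * delta * h;
}
```

    The hidden-layer weights get the same kind of step, except their error term is propagated back from the output deltas.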
  • : : *) Should the input layer have weighted inputs, or should it simply pass the inputs through to the hidden layer?
    : : *) If so, then what the heck is the point of the input layer? :)
    : : *) Next, should the weights between the hidden layer's outputs and the output layer's inputs be trained (using the backprop algorithm), as well as the weights between the input layer's outputs and the hidden layer's inputs?
    : : Sorry about the wording of the last question, but I had to be sure it was clear which weights I'm referring to.
    : : Incidentally, I'm using C++ to implement the neural network.
    : :
    :
    : OK, here's the pro. ;-)
    :
    : I don't really know a lot, but here are my answers.
    :
    : 1. The input layer probably shouldn't be called an "input layer," since it contains neurons. I may be wrong, but I wouldn't name it that.
    : 2. They could be there to let the network handle sufficiently complex things, like XOR. (If you don't understand this, just tell me -- I don't know what you know.)
    :
    : [code]
    : i N N
    : N O
    : i N N
    : ^ ^
    : [/code]
    : Do you mean the weights marked with ^?
    : If yes, then yes, they should be trained, that's what the neurons are for.
    :
    Yes, the weights you've marked are the ones I'm asking about. Why do you say the input layer shouldn't be referred to as the 'input layer'?
    Do you mean it's actually a hidden layer? That conflicts directly with my reference material, which states that the input layer should contain one neuron per independent input parameter (that is, per variable that should affect the output). In fact, the number of input-layer neurons is how I calculate the number of hidden-layer neurons (numHiddenLayer = 2/3 * numInputLayer + numOutputLayer).
    XOR is actually what I'm trying to train this neural network to emulate; it's the simplest nonlinear problem I've come across.
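    As a quick sanity check of that sizing rule (it's only the heuristic from my reference, rounded to a whole neuron -- not anything official):

```cpp
#include <cmath>

// Hidden-layer size heuristic: 2/3 * inputs + outputs, rounded.
int hiddenNeurons(int numInput, int numOutput) {
    return static_cast<int>(std::round(2.0 / 3.0 * numInput + numOutput));
}
```

    For XOR (2 inputs, 1 output) it gives round(2.33) = 2 hidden neurons, which is indeed the smallest hidden layer that can represent XOR.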

    -- CM
  • : : *) Should the input layer have weighted inputs, or should it simply pass the inputs through to the hidden layer?
    : : *) If so, then what the heck is the point of the input layer? :)
    : : *) Next, should the weights between the hidden layer's outputs and the output layer's inputs be trained (using the backprop algorithm), as well as the weights between the input layer's outputs and the hidden layer's inputs?
    : : Sorry about the wording of the last question, but I had to be sure it was clear which weights I'm referring to.
    : : Incidentally, I'm using C++ to implement the neural network.
    : :
    :
    : OK, here's the pro. ;-)
    :
    : I don't really know a lot, but here are my answers.
    :
    : 1. The input layer probably shouldn't be called an "input layer," since it contains neurons. I may be wrong, but I wouldn't name it that.
    : 2. They could be there to let the network handle sufficiently complex things, like XOR. (If you don't understand this, just tell me -- I don't know what you know.)
    :
    : [code]
    : i N N
    : N O
    : i N N
    : ^ ^
    : [/code]
    : Do you mean the weights marked with ^?
    : If yes, then yes, they should be trained, that's what the neurons are for.
    :


    Neural networks are stupid; there's very little application for them, and they aren't going anywhere.
  • I really shouldn't address this one, but I'm going to anyway. Please don't flame: if you don't feel neural networks are useful, then ignore requests for help on them -- unless, of course, you have some programming method as simple as neural networks that works as well on, say, nonlinear artificial intelligence. (If I recall correctly, the developers of Halo used a hybrid neural-network/genetic form of AI in their learning algorithms.)

    -- CM
    : : : *) Should the input layer have weighted inputs, or should it simply pass the inputs through to the hidden layer?
    : : : *) If so, then what the heck is the point of the input layer? :)
    : : : *) Next, should the weights between the hidden layer's outputs and the output layer's inputs be trained (using the backprop algorithm), as well as the weights between the input layer's outputs and the hidden layer's inputs?
    : : : Sorry about the wording of the last question, but I had to be sure it was clear which weights I'm referring to.
    : : : Incidentally, I'm using C++ to implement the neural network.
    : : :
    : :
    : : OK, here's the pro. ;-)
    : :
    : : I don't really know a lot, but here are my answers.
    : :
    : : 1. The input layer probably shouldn't be called an "input layer," since it contains neurons. I may be wrong, but I wouldn't name it that.
    : : 2. They could be there to let the network handle sufficiently complex things, like XOR. (If you don't understand this, just tell me -- I don't know what you know.)
    : :
    : : [code]
    : : i N N
    : : N O
    : : i N N
    : : ^ ^
    : : [/code]
    : : Do you mean the weights marked with ^?
    : : If yes, then yes, they should be trained, that's what the neurons are for.
    : :
    :
    :
    : Neural networks are stupid; there's very little application for them, and they aren't going anywhere.
    :

  • : :
    : : Neural networks are stupid; there's very little application for them, and they aren't going anywhere.
    : :
    :
    :

    Flame sounds fun.

    Some things require NNs.

    Could you teach a program that doesn't use NNs to, for example, speak without a sound database?
    Could you have a chess program that learns from its defeats to perform better without anything NN-based?
  • : Hey guys; I'm not new to neural networks -- I've been studying them off and on for a couple of years now. However, I've never been able to make them work quite right. I wonder if you all might be able to answer a few questions to clear things up.
    : First, here is a quick sketch of my NN's architecture:
    : http://cmxx.tripod.com/nnv3/nn_arch.htm
    : Note that the lines are the interconnections, and they all have weights which I try to train. I suspect that's incorrect, but none of my resources says so explicitly. So, first question:
    : *) Should the input layer have weighted inputs, or should it simply pass the inputs through to the hidden layer?
    : *) If so, then what the heck is the point of the input layer? :)
    : *) Next, should the weights between the hidden layer's outputs and the output layer's inputs be trained (using the backprop algorithm), as well as the weights between the input layer's outputs and the hidden layer's inputs?
    : Sorry about the wording of the last question, but I had to be sure it was clear which weights I'm referring to.
    : Incidentally, I'm using C++ to implement the neural network.
    :
    : Thanks all for your time,
    : -- CM
    :
    :


    Hi, I played around with NNs a (little) while back. I'm no pro, but I have been able to do some simple character recognition with my implementation, so it should work...

    This is how I see it: the neurons in the input layer have no inputs (and thus no weights); they just pass your data on to the first hidden layer (or straight to the output layer if there are no hidden layers). In other words, they have a fixed output.
    'What the heck' is the input layer for? If every layer's input comes from another layer, there doesn't have to be any difference between the first hidden layer and the rest. That makes both the implementation (especially in an OO language like C++) and the mathematics behind the network easier.

    As for training the weights: yes, train every weight you can (input-to-hidden, hidden-to-hidden, and hidden-to-output). If you don't train a weight, it's useless and a waste of processor power and memory ;)
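    To illustrate (a throwaway sketch, not my character-recognition code -- the weights below are hand-picked to realize XOR rather than trained): the 'input layer' is just the raw data, and only the hidden and output neurons carry weights.

```cpp
#include <cmath>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Forward pass of a 2-2-1 network. The inputs x1, x2 are passed
// straight through; each weighted sum's last term is a bias.
double xorNet(double x1, double x2) {
    double h1 = sigmoid(20 * x1 + 20 * x2 - 10);   // ~ OR(x1, x2)
    double h2 = sigmoid(-20 * x1 - 20 * x2 + 30);  // ~ NAND(x1, x2)
    return sigmoid(20 * h1 + 20 * h2 - 30);        // ~ AND(h1, h2)
}
```

    Trained weights would look different, but the structure -- no weights before the input layer -- stays the same.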

    Have fun,
    Pruyque
  • Most often in the literature, "input layer" refers to a set of straight connections to the data. The input layer does not do anything besides show up with the data. This confusion has led many to describe neural networks in terms of the number of hidden layers (where the action usually happens) rather than the total number of layers, because some people count the input layer and others don't.


    -Will Dwinnell
    [link=http://matlabdatamining.blogspot.com/]Data Mining in MATLAB[/link]
