In one of your illustrative backprop videos on engine capacity and performance, you incorporated a simple activation formula that converted the input to a number between 0 and 1, e.g. a 6-cylinder input became 6/max_capacity = 0.75. For the life of me, I cannot find where and how you inserted this conversion formula in the network. Do you use absolute or relative addressing of neurons, as one would in Excel? Not sure.
Second question, perhaps philosophical: is the premise of ANNs that there will always be a solution to any problem? In the real world, and mathematically, solutions sometimes do NOT exist. Are ANNs not just a hyped-up version of the science of Econometrics?
Sorry for playing devil's advocate; I love ANNs, and I'm registering for my PhD in an application involving ANNs, hence all the questions.
For the cylinders node I am using something called min-max scaling to make the node value fall between 0 and 1, as you noted. Whereas in a spreadsheet that would amount to dividing every cell in a column by 8, here it corresponds to using a weight value of 1/8 = 0.125. So if a human-readable 8 is entered in the node labeled "cylinders", it gets multiplied by the weight 0.125 and thus converted to a 1 in the input layer of the backprop network.
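To make the idea concrete, here is a minimal sketch (my own illustration, not Simbrain's actual code) of min-max scaling implemented as a fixed input weight, assuming 8 is the maximum cylinder count in the data:

```python
MAX_CYLINDERS = 8  # assumed maximum value in the data set

def scale_cylinders(raw_value):
    """Min-max scale a raw cylinder count into [0, 1].

    Dividing by the max is the same as multiplying by a fixed
    input weight of 1/8 = 0.125.
    """
    weight = 1.0 / MAX_CYLINDERS  # 0.125
    return raw_value * weight

print(scale_cylinders(8))  # 1.0
print(scale_cylinders(6))  # 0.75
```

So the "conversion formula" lives in the weight on the connection into the input layer, not in a separate cell or node.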
Philosophical questions are fine, I am a philosopher by training actually! There are theorems relating to neural networks that show that neural networks can in principle do anything a standard digital computer can. For example, you can implement any logic gate in a simple network with threshold units. This is easy to do in Simbrain. Also there are a bunch of theorems about how certain types of networks can compute certain classes of functions. So if a computer can do it, a neural network can do it in principle. But that doesn't mean neural networks are the best way to approach every problem.
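Here is a quick sketch of the logic-gate point, written in Python rather than built in Simbrain: a single threshold unit with the right weights and threshold computes AND or OR (the weight and threshold values are my choices for illustration):

```python
def threshold_unit(inputs, weights, threshold):
    """Fire (output 1) if the weighted sum of inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def AND(a, b):
    # Both inputs must be on for the weighted sum to reach 2.
    return threshold_unit([a, b], [1, 1], threshold=2)

def OR(a, b):
    # Either input alone pushes the weighted sum to 1.
    return threshold_unit([a, b], [1, 1], threshold=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b))
```

Since NOT can be handled the same way (a negative weight and a threshold of 0), and any digital circuit is built from such gates, this is the intuition behind the equivalence results.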
There are similar correspondences between statistics and neural networks. Many neural network architectures are equivalent or nearly equivalent to certain statistical methods, so that you often have a choice of which approach to use. Bishop and Sarle are good on this.
So already neural networks are not the ultimate solution to every problem.
Then of course, not every problem can be addressed using computational methods to begin with. Neural networks are just one more computational tool in the arsenal, but if computational methods are not appropriate to a problem, then ANNs won't help.
One downside of neural networks is that they have a kind of black-box feel, whereas the corresponding statistical methods (like regression, which I gather is the basis of much of econometrics) are better understood. So I think there was a movement for a while away from neural networks toward probability and statistics. E.g., why use the cars network when multivariate multiple regression does the same thing, and with more analytic tools behind it? So in that case neural networks might not be the best approach. However, neural networks are flourishing now, and tools have emerged that allow deep, many-layered nets to succeed at problems where statistical approaches have not (at least that's my sense; I'm not in industry). So my sense is they remain viable.
That was the long answer. The short answer is no, I don't think there is any premise behind ANN research that there will always be a solution to any problem.