LEADER |
03618nam a22005055i 4500 |
001 |
978-0-387-31240-8 |
003 |
DE-He213 |
005 |
20151204140642.0 |
007 |
cr nn 008mamaa |
008 |
100301s2006 xxu| s |||| 0|eng d |
020 |
|
|
|a 9780387312408
|9 978-0-387-31240-8
|
024 |
7 |
|
|a 10.1007/0-387-31240-4
|2 doi
|
040 |
|
|
|d GrThAP
|
050 |
|
4 |
|a QA75.5-76.95
|
072 |
|
7 |
|a UY
|2 bicssc
|
072 |
|
7 |
|a UYA
|2 bicssc
|
072 |
|
7 |
|a COM014000
|2 bisacsh
|
072 |
|
7 |
|a COM031000
|2 bisacsh
|
082 |
0 |
4 |
|a 004.0151
|2 23
|
100 |
1 |
|
|a Nikolaev, Nikolay Y.
|e author.
|
245 |
1 |
0 |
|a Adaptive Learning of Polynomial Networks
|h [electronic resource] :
|b Genetic Programming, Backpropagation and Bayesian Methods /
|c by Nikolay Y. Nikolaev, Hitoshi Iba.
|
264 |
|
1 |
|a Boston, MA :
|b Springer US,
|c 2006.
|
300 |
|
|
|a XIV, 316 p.
|b online resource.
|
336 |
|
|
|a text
|b txt
|2 rdacontent
|
337 |
|
|
|a computer
|b c
|2 rdamedia
|
338 |
|
|
|a online resource
|b cr
|2 rdacarrier
|
347 |
|
|
|a text file
|b PDF
|2 rda
|
490 |
1 |
|
|a Genetic and Evolutionary Computation
|
505 |
0 |
|
|a Inductive Genetic Programming -- Tree-Like PNN Representations -- Fitness Functions and Landscapes -- Search Navigation -- Backpropagation Techniques -- Temporal Backpropagation -- Bayesian Inference Techniques -- Statistical Model Diagnostics -- Time Series Modelling -- Conclusions.
|
520 |
|
|
|a This book provides theoretical and practical knowledge for the development of algorithms that infer linear and nonlinear models. It offers a methodology for inductive learning of polynomial neural network models from data. The design of such tools contributes to better statistical data modelling when addressing tasks from various areas like system identification, chaotic time-series prediction, financial forecasting and data mining. The main claim is that the model identification process involves several equally important steps: finding the model structure, estimating the model weight parameters, and tuning these weights with respect to the adopted assumptions about the underlying data distribution. When the learning process is organized according to these steps, performed either together, one after the other, or separately, one may expect to discover models that generalize well (that is, predict well). The book offers statisticians a shift in focus from the standard linear models toward highly nonlinear models that can be found by contemporary learning approaches. Specialists in statistical learning will read about alternative probabilistic search algorithms that discover the model architecture, and neural network training techniques that identify accurate polynomial weights. They will be pleased to find out that the discovered models can be easily interpreted, and that these models are amenable to diagnosis by standard statistical means. Covering the three fields of evolutionary computation, neural networks and Bayesian inference, the book is oriented toward a large audience of researchers and practitioners.
|
650 |
|
0 |
|a Computer science.
|
650 |
|
0 |
|a Computers.
|
650 |
|
0 |
|a Artificial intelligence.
|
650 |
1 |
4 |
|a Computer Science.
|
650 |
2 |
4 |
|a Theory of Computation.
|
650 |
2 |
4 |
|a Artificial Intelligence (incl. Robotics).
|
650 |
2 |
4 |
|a Computing Methodologies.
|
700 |
1 |
|
|a Iba, Hitoshi.
|e author.
|
710 |
2 |
|
|a SpringerLink (Online service)
|
773 |
0 |
|
|t Springer eBooks
|
776 |
0 |
8 |
|i Printed edition:
|z 9780387312392
|
830 |
|
0 |
|a Genetic and Evolutionary Computation
|
856 |
4 |
0 |
|u http://dx.doi.org/10.1007/0-387-31240-4
|z Full Text via HEAL-Link
|
912 |
|
|
|a ZDB-2-SCS
|
950 |
|
|
|a Computer Science (Springer-11645)
|