A Connectionist Machine for Genetic Hillclimbing
by David H. Ackley

Kluwer International Series in Engineering and Computer Science

Hardcover

Published: December 2009
Ships: 7 to 10 business days
RRP $488.99
$338.25 (31% off)
or 4 easy payments of $84.56

Other Available Formats

  • Paperback (published 17th October 2011): $169.73

'It is a good source for those interested in a concrete application of Boltzmann machines or (at several places) thoughtful treatise on their potential impact on the broader fields of artificial intelligence and machine learning.' (B.P. Buckles, Computing Reviews, January 1989)

1. Introduction.
   1.1. Satisfying hidden strong constraints.
   1.2. Function optimization.
      1.2.1. The methodology of heuristic search.
      1.2.2. The shape of function spaces.
   1.3. High-dimensional binary vector spaces.
      1.3.1. Graph partitioning.
   1.4. Dissertation overview.
   1.5. Summary.
2. The model.
   2.1. Design goal: Learning while searching.
      2.1.1. Knowledge representation.
      2.1.2. Point-based search strategies.
      2.1.3. Population-based search strategies.
      2.1.4. Combination rules.
      2.1.5. Election rules.
      2.1.6. Summary: Learning while searching.
   2.2. Design goal: Sustained exploration.
      2.2.1. Searching broadly.
      2.2.2. Convergence and divergence.
      2.2.3. Mode transitions.
      2.2.4. Resource allocation via taxation.
      2.2.5. Summary: Sustained exploration.
   2.3. Connectionist computation.
      2.3.1. Units and links.
      2.3.2. A three-state stochastic unit.
      2.3.3. Receptive fields.
   2.4. Stochastic iterated genetic hillclimbing.
      2.4.1. Knowledge representation in SIGH.
      2.4.2. The SIGH control algorithm.
      2.4.3. Formal definition.
   2.5. Summary.
3. Empirical demonstrations.
   3.1. Methodology.
      3.1.1. Notation.
      3.1.2. Parameter tuning.
      3.1.3. Non-termination.
   3.2. Seven algorithms.
      3.2.1. Iterated hillclimbing-steepest ascent (IHC-SA).
      3.2.2. Iterated hillclimbing-next ascent (IHC-NA).
      3.2.3. Stochastic hillclimbing (SHC).
      3.2.4. Iterated simulated annealing (ISA).
      3.2.5. Iterated genetic search-Uniform combination (IGS-U).
      3.2.6. Iterated genetic search-Ordered combination (IGS-O).
      3.2.7. Stochastic iterated genetic hillclimbing (SIGH).
   3.3. Six functions.
      3.3.1. A linear space-"One Max".
      3.3.2. A local maximum-"Two Max".
      3.3.3. A large local maximum-"Trap".
      3.3.4. Fine-grained local maxima-"Porcupine".
      3.3.5. Flat areas-"Plateaus".
      3.3.6. A combination space-"Mix".
4. Analytic properties.
   4.1. Problem definition.
   4.2. Energy functions.
   4.3. Basic properties of the learning algorithm.
      4.3.1. Motivating the approach.
      4.3.2. Defining reinforcement signals.
      4.3.3. Defining similarity measures.
      4.3.4. The equilibrium distribution.
   4.4. Convergence.
   4.5. Divergence.
5. Graph partitioning.
   5.1. Methodology.
      5.1.1. Problems.
      5.1.2. Algorithms.
      5.1.3. Data collection.
      5.1.4. Parameter tuning.
   5.2. Adding a linear component.
   5.3. Experiments on random graphs.
   5.4. Experiments on multilevel graphs.
6. Related work.
   6.1. The problem space formulation.
   6.2. Search and learning.
      6.2.1. Learning while searching.
      6.2.2. Symbolic learning.
      6.2.3. Hillclimbing.
      6.2.4. Stochastic hillclimbing and simulated annealing.
      6.2.5. Genetic algorithms.
   6.3. Connectionist modelling.
      6.3.1. Competitive learning.
      6.3.2. Back propagation.
      6.3.3. Boltzmann machines.
      6.3.4. Stochastic iterated genetic hillclimbing.
      6.3.5. Harmony theory.
      6.3.6. Reinforcement models.
7. Limitations and variations.
   7.1. Current limitations.
      7.1.1. The problem.
      7.1.2. The SIGH model.
   7.2. Possible variations.
      7.2.1. Exchanging parameters.
      7.2.2. Beyond symmetric connections.
      7.2.3. Simultaneous optimization.
      7.2.4. Widening the bottleneck.
      7.2.5. Temporal credit assignment.
      7.2.6. Learning a function.
8. Discussion and conclusions.
   8.1. Stability and change.
   8.2. Architectural goals.
      8.2.1. High potential parallelism.
      8.2.2. Highly incremental.
      8.2.3. "Generalized Hebbian" learning.
      8.2.4. Unsupervised learning.
      8.2.5. "Closed loop" interactions.
      8.2.6. Emergent properties.
   8.3. Discussion.
      8.3.1. The processor/memory distinction.
      8.3.2. Physical computation systems.
      8.3.3. Between mind and brain.
   8.4. Conclusions.
      8.4.1. Recapitulation.
      8.4.2. Contributions.
References.
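
For orientation: Chapter 3 compares the algorithms above on simple bit-string test functions such as "One Max", whose score is just the number of 1-bits in the vector. The short Python sketch below illustrates that function together with a generic iterated hillclimber with random restarts. It is only an illustrative baseline in the spirit of the IHC variants listed in 3.2, not the book's SIGH algorithm, and all function and parameter names in it are ours rather than the author's.

    import random

    def one_max(bits):
        # "One Max" benchmark: the score is simply the number of 1-bits.
        return sum(bits)

    def iterated_hillclimb(f, n_bits=20, restarts=5, seed=0):
        # Generic iterated hillclimbing with random restarts: from a random
        # start, keep any single-bit flip that improves f; when no flip helps,
        # restart, remembering the best vector seen overall.
        rng = random.Random(seed)
        best, best_val = None, float("-inf")
        for _ in range(restarts):
            x = [rng.randint(0, 1) for _ in range(n_bits)]
            val = f(x)
            improved = True
            while improved:
                improved = False
                for i in range(n_bits):
                    x[i] ^= 1                  # tentatively flip bit i
                    new_val = f(x)
                    if new_val > val:          # keep strictly improving flips
                        val, improved = new_val, True
                    else:
                        x[i] ^= 1              # otherwise undo the flip
            if val > best_val:
                best, best_val = list(x), val
        return best, best_val

    if __name__ == "__main__":
        best, best_val = iterated_hillclimb(one_max)
        print(best_val, best)  # One Max has no local optima, so this reaches all 1s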

ISBN: 9780898382365
ISBN-10: 089838236X
Series: Kluwer International Series in Engineering and Computer Science
Audience: Professional
Format: Hardcover
Language: English
Number Of Pages: 260
Published: December 2009
Publisher: Kluwer Academic Publishers
Country of Publication: US
Dimensions (cm): 24.77 x 16.51 x 1.91
Weight (kg): 0.64