Function Approximation With XCS: Hyperellipsoidal Conditions, Recursive Least Squares, and Compaction

An important strength of learning classifier systems (LCSs) lies in the combination of genetic optimization techniques with gradient-based approximation techniques. The chosen approximation technique develops locally optimal approximations, such as accurate classification estimates, Q-value predictions, or linear function approximations. The genetic optimization technique is designed to distribute these local approximations efficiently over the problem space. Together, the two components develop a distributed, locally optimized problem solution in the form of a population of expert rules, often called classifiers. In function approximation problems, the XCSF classifier system develops a problem solution in the form of overlapping, piecewise linear approximations. This paper shows that XCSF performance on function approximation problems additively benefits from (1) improved representations, (2) improved genetic operators, and (3) improved approximation techniques. Additionally, this pa...
Authors: Martin V. Butz, Pier Luca Lanzi, Stewart W. Wilson
Type: Journal
Year: 2008
Where: IEEE Transactions on Evolutionary Computation (TEC)
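The abstract names two of the ingredients studied in the paper: hyperellipsoidal conditions, which define the region of input space a classifier covers, and recursive least squares (RLS), which fits each classifier's local linear approximation. The sketch below is a minimal illustration of those two pieces only, using a common hyperellipsoid matching rule and the textbook RLS update; the class and parameter names (Classifier, shape, p_init, lam) are assumptions for illustration, not the authors' implementation, and the genetic component that evolves and distributes the conditions is not shown.

```python
import numpy as np


class Classifier:
    """Illustrative XCSF-style classifier: a hyperellipsoidal condition
    plus a local linear model updated by recursive least squares (RLS).
    Names and defaults are assumptions, not the paper's code."""

    def __init__(self, center, shape, lam=1.0, p_init=1000.0):
        self.center = np.asarray(center, dtype=float)  # ellipsoid center m
        self.shape = np.asarray(shape, dtype=float)    # positive-definite matrix A
        n = self.center.size
        self.w = np.zeros(n + 1)                       # linear weights, incl. bias
        self.P = np.eye(n + 1) * p_init                # RLS inverse correlation matrix
        self.lam = lam                                 # RLS forgetting factor

    def matches(self, x, threshold=1.0):
        """Hyperellipsoidal condition: (x - m)^T A (x - m) <= threshold."""
        d = np.asarray(x, dtype=float) - self.center
        return float(d @ self.shape @ d) <= threshold

    def predict(self, x):
        """Local linear prediction with a bias term."""
        xe = np.concatenate(([1.0], np.asarray(x, dtype=float)))
        return float(self.w @ xe)

    def rls_update(self, x, y):
        """Standard RLS update of the local linear model toward target y."""
        xe = np.concatenate(([1.0], np.asarray(x, dtype=float)))
        Px = self.P @ xe
        k = Px / (self.lam + xe @ Px)              # gain vector
        self.w += k * (y - self.w @ xe)            # weight correction
        self.P = (self.P - np.outer(k, Px)) / self.lam


# Hypothetical usage: fit a local linear model of f(x) = sin(2*pi*x) near x = 0.5.
cl = Classifier(center=[0.5], shape=[[25.0]])      # matches roughly |x - 0.5| <= 0.2
rng = np.random.default_rng(0)
for x in rng.random((500, 1)):
    if cl.matches(x):
        cl.rls_update(x, np.sin(2 * np.pi * x[0]))
print(cl.predict([0.5]))                           # should be close to sin(pi) = 0
```

In the full system described by the abstract, a population of such classifiers covers the input space with overlapping conditions, predictions are typically combined across the matching classifiers, and the genetic algorithm adapts the placement and shape of the ellipsoids.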