The Cerebellar Model Articulation Controller (CMAC)

Contents

Introduction

How CMAC Works

Using CMAC for Automated Classification

Using CMAC Output to Represent Class Label

Kernel Addition Training Algorithm

New Grid Spacings

CMAC for Weka

Other CMAC Links

Introduction

The Cerebellar Model Articulation Controller, or CMAC, is a class of sparse, coarse-coded associative memory algorithms that mimic the functionality of the mammalian cerebellum. CMAC was originally proposed by James Albus in 1975 as a function modeller for robotic controllers, but it has since been used extensively in reinforcement learning and as a classifier.

How CMAC Works

The input space is quantised using a set of overlapping tiles, as shown below. For input spaces of high dimensionality, the tiles are hypercubes. A query is performed by first activating all the tiles that contain the query point. The activated tiles in turn activate memory cells, which hold the stored values (the weights of the system). Summing these values produces the overall output. The figure below depicts a one-dimensional input along the horizontal direction.

A change in the value of the input vector results in a change in the set of activated tiles, and therefore in the set of memory cells participating in the CMAC output. The output is thus stored in a distributed fashion: the output corresponding to any point in input space is derived from the values stored in a number of memory cells.
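The query mechanism described above can be sketched in a few lines. This is a minimal one-dimensional illustration under assumed details (evenly offset tilings and a wrapped weight table per tiling); the class and parameter names are my own, not from the original.

```python
# Minimal 1-D CMAC query sketch: each tiling is a shifted uniform grid;
# a query activates one tile per tiling and sums the associated weights.
import numpy as np

class Cmac1D:
    def __init__(self, n_tilings=8, tile_width=0.5, n_cells=512):
        self.n_tilings = n_tilings
        self.tile_width = tile_width
        # One evenly spaced offset per tiling, as in standard tile coding.
        self.offsets = np.arange(n_tilings) * tile_width / n_tilings
        # One weight table per tiling (the "memory cells").
        self.weights = np.zeros((n_tilings, n_cells))
        self.n_cells = n_cells

    def active_cells(self, x):
        # Index of the tile containing x in each shifted tiling.
        idx = np.floor((x + self.offsets) / self.tile_width).astype(int)
        return idx % self.n_cells  # wrap into the finite memory

    def query(self, x):
        cells = self.active_cells(x)
        return self.weights[np.arange(self.n_tilings), cells].sum()

cmac = Cmac1D()
# Nearby inputs share some of their activated tiles; distant ones share none.
a = cmac.active_cells(1.00)
b = cmac.active_cells(1.30)
print((a == b).sum(), "of", cmac.n_tilings, "tiles shared")  # -> 4 of 8
```

Because nearby queries share activated cells, a weight update at one point generalises to its neighbourhood, which is the distributed storage property described above.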

The memory size required by a CMAC depends on the number of tilings and the size of the tiles. If the tiles are large, so that each covers a large proportion of the input space, a coarse division of the input space is achieved, but local phenomena have a wide area of influence. If the tiles are small, a fine division is achieved and local phenomena have a small area of influence.
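This trade-off can be made concrete with a back-of-envelope estimate, assuming a plain grid with no hashing; the function name and parameter values are my own illustration.

```python
# Rough memory estimate for a CMAC without hashing: illustrative only.
import math

def cmac_memory(n_tilings, input_range, tile_width, n_dims):
    # Tiles needed along one axis (plus one for the tiling offset overhang).
    tiles_per_axis = math.ceil(input_range / tile_width) + 1
    return n_tilings * tiles_per_axis ** n_dims

# Coarse tiles: few cells, wide generalisation.
print(cmac_memory(8, 10.0, 2.0, 2))   # -> 288 cells
# Fine tiles: many cells, narrow generalisation.
print(cmac_memory(8, 10.0, 0.5, 2))   # -> 3528 cells
```

The exponential growth with dimensionality is why practical CMAC implementations often hash cell indices into a smaller memory.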

Using CMAC for Automated Classification

My work on CMAC has concentrated on its use as an automated classifier. By this I mean the problem where we have a dataset consisting of records, some of which have been labelled with a class and some of which have not. The task is to assign the unlabelled records to the appropriate class, using what we know about the labelled records. For example, an insurance company might want to classify its customers into risk categories, based on its past knowledge of the size of claims that people have made. Somehow we have to train the CMAC so that its output predicts the right class.

Using CMAC Output to Represent Class Label

In order to function as a classifier, there must be a way of mapping the output of a CMAC onto a class label. The most common way is to use a set of thresholds, so that class i is indicated when the output falls between threshold i and threshold i+1. In effect we map from a scalar variable to a nominal variable. Unfortunately, this approach has problems. For example, what if an unlabelled instance is equally close to class i and to class i+2? The output is likely to be half-way between the two, so it may be classified as class i+1.
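The failure mode described above can be sketched directly; the threshold values here are my own illustration, not from the original.

```python
# Sketch of the scalar-to-class threshold mapping: class i is indicated
# when the output falls between threshold i-1 and threshold i.
import bisect

thresholds = [0.5, 1.5, 2.5]  # boundaries between classes 0, 1, 2, 3

def classify(output):
    # Number of thresholds below the output gives the class index.
    return bisect.bisect(thresholds, output)

print(classify(1.0))  # output at the centre of class 1 -> 1
print(classify(3.0))  # output at the centre of class 3 -> 3
# An instance equally close to classes 1 and 3 yields an output of ~2.0,
# which the thresholds misreport as the unrelated class 2.
print(classify(2.0))  # -> 2
```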

An alternative is to use a vector mapping, where each class is represented on a scale of 0 to 1 (or some other arbitrary maximum). The output corresponding to class i can then be regarded as the probability of belonging to class i. In the example above, we now have an output of 0.5 for class i and an output of 0.5 for class i+2. Not only does the classifier now provide information about uncertainty, but it is also easy to apply Bayes' Theorem, weighting each class probability by the a priori class probability estimated from the number of training instances.
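The vector mapping with prior weighting can be sketched as follows; the output and prior values are my own illustration of the tie-breaking behaviour, not figures from the original.

```python
# Sketch of the vector-output mapping with Bayes weighting.
import numpy as np

# CMAC output per class for one query: equal support for classes 0 and 2.
class_outputs = np.array([0.5, 0.0, 0.5])
# Prior class probabilities estimated from training-set counts.
priors = np.array([0.2, 0.3, 0.5])

posterior = class_outputs * priors
posterior /= posterior.sum()      # normalise to probabilities
print(posterior)                  # the prior breaks the tie toward class 2
print(int(posterior.argmax()))    # -> 2
```

Unlike the threshold scheme, the tie between the two distant classes is visible in the output vector, and the prior resolves it rather than inventing an intermediate class.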

Kernel Addition Training Algorithm (KATA)

If the vector representation of outputs is chosen, it is possible to use one CMAC to represent each class. Each CMAC builds a density function for one class only, representing the density of training instances belonging to that class. In this case there is no need for an iterative training algorithm: the density function is just a histogram, and can be formed by counting the number of training instances that fall into each cell. Because the cells overlap, the discretisation of the histogram is smoothed. This smoothing can be increased by using alternative kernel functions, such as cosine or polynomial kernels. Building a smoothed histogram by adding kernel functions is well known and is often attributed to Parzen.

The Kernel Addition Training Algorithm (KATA) can train a CMAC in a single pass over the training data. This is an advantage for large datasets, and contrasts with artificial neural networks and other techniques relying on error minimisation, where the training set must be presented to the algorithm many times. A CMAC trained using KATA also has training time linear in the number of samples, giving it good potential to scale to large datasets.
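The single-pass idea can be sketched on a simple grid. The kernel shape, grid parameters, and function names below are assumptions of mine for illustration; the original does not fix these details.

```python
# One-pass sketch of kernel-addition density estimation on a 1-D grid:
# each training sample adds a kernel bump to the cells around its tile.
import numpy as np

tile_width, n_cells = 1.0, 20
density = np.zeros(n_cells)

def add_sample(x, kernel_halfwidth=2):
    centre = int(x / tile_width)
    for c in range(centre - kernel_halfwidth, centre + kernel_halfwidth + 1):
        if 0 <= c < n_cells:
            # Triangular kernel: weight decays linearly with cell distance.
            density[c] += 1.0 - abs(c - centre) / (kernel_halfwidth + 1)

# Single pass over the training data: cost is linear in the sample count.
for x in [4.2, 4.8, 5.1, 9.7]:
    add_sample(x)
print(density.round(2))
```

Each sample is touched exactly once and updates a constant number of cells, which is where the linear training time comes from.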

New Grid Spacings

Tables of grid spacings were published by Parks and Militzer (1991). These include grid spacing offsets for 2-dimensional to 10-dimensional problems. I have calculated new tables for data of 11 to 20 dimensions using a Genetic Algorithm (Cornforth, 2002), and these are linked below. The full source code is also provided below.

Table for 11 dimensional spaces

Table for 12 dimensional spaces

Table for 13 dimensional spaces

Table for 14 dimensional spaces

Table for 15 dimensional spaces

Table for 16 dimensional spaces

Table for 17 dimensional spaces

Table for 18 dimensional spaces

Table for 19 dimensional spaces

Table for 20 dimensional spaces

Source code for the GA program to produce tables of grid spacings

CMAC for Weka

My CMAC code has been ported to Weka and can be downloaded here. It assumes that you have Weka installed on your machine. Download Cmac for Weka version 3-4-9 (Sun Java v1.4) here. Download Cmac for Weka version 3-5-4 (Sun Java v5) here. To install CMAC into your version of Weka:

1. Download cmac-?-?-?.jar into the same directory as weka.jar (let's call it mypath).

2. Locate the entry "mypath/weka.jar" in your classpath, and add "mypath/cmac-?-?-?.jar".

3. To run from a terminal, use: java weka.classifiers.functions.CmacKata

The jar files also contain source code and Javadocs.

Other CMAC Links

My Publications on CMAC

Wikipedia page on the Cerebellar Model Articulation Controller

Carnegie Mellon's School of Computer Science: Intro and CMAC code

The Web Page of James Albus, creator of the CMAC Model

My Home Page

Copyright David Cornforth 2007
This page not endorsed by NASA or Microsoft.