Connectionism is a technical term for a group of related
techniques, including Artificial Neural Networks, Semantic
Networks, and a few other similar ideas. My present focus is on
neural networks (though I am looking for resources on the other
techniques). Neural networks are programs designed to simulate
the workings of the brain. They consist of a network of small
mathematical nodes that work together to form patterns of
information. They have tremendous potential and currently seem
to be having a great deal of success with image processing and
robot control.
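At its simplest, each node computes a weighted sum of its inputs and
squashes the result through a nonlinearity. A minimal sketch in Python
(the weights here are arbitrary, just to show the mechanics):

    import math

    def neuron(inputs, weights, bias):
        # Weighted sum of the inputs plus a bias, squashed by a sigmoid
        # so the output always falls between 0 and 1.
        activation = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-activation))

    # Two inputs with arbitrary weights; larger weighted sums push the
    # output toward 1, smaller ones toward 0.
    print(neuron([0.5, -1.0], [0.8, 0.2], bias=0.1))

Networks are built by feeding the outputs of such nodes into the inputs
of others, and learning algorithms adjust the weights.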
These are libraries of code or classes for use in programming within
the Connectionist field. They are not meant as stand-alone
applications, but rather as tools for building your own applications.
- ANSI-C Neural Networks
This site contains ANSI-C source code for 8 types of neural
nets, including:
- Adaline Network
- Backpropagation
- Hopfield Model
- (BAM) Bidirectional Associative Memory
- Boltzmann Machine
- Counterpropagation
- (SOM) Self-Organizing Map
- (ART1) Adaptive Resonance Theory
They were designed to help turn the theory of a particular
network model into the design for a simulator implementation,
and to help with embedding an actual application into a
particular network model.
- BELIEF
BELIEF is a Common Lisp implementation of the Dempster and Kong
fusion and propagation algorithm for Graphical Belief Function
Models and the Lauritzen and Spiegelhalter algorithm for
Graphical Probabilistic Models. It includes code for
manipulating graphical belief models such as Bayes Nets and
Relevance Diagrams (a subset of Influence Diagrams) using both
belief functions and probabilities as basic representations of
uncertainty. It uses the Shenoy and Shafer version of the
algorithm, so one of its unique features is that it supports
both probability distributions and belief functions. It also
has limited support for second order models (probability
distributions on parameters).
- bpnn.py
A simple back-propagation ANN in Python.
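Going by that description, training it on a toy problem presumably
looks something like the sketch below; the NN class name, constructor
arguments, and train/test methods are assumptions on my part, so check
the module source before relying on them.

    from bpnn import NN   # assumes bpnn.py is on your Python path

    # XOR training data as [inputs, target] pairs.
    patterns = [
        [[0, 0], [0]],
        [[0, 1], [1]],
        [[1, 0], [1]],
        [[1, 1], [0]],
    ]

    net = NN(2, 2, 1)    # 2 inputs, 2 hidden, 1 output (assumed signature)
    net.train(patterns)  # back-propagation training
    net.test(patterns)   # print the network's output for each pattern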
- CONICAL
CONICAL is a C++ class library for building simulations common
in computational neuroscience. Currently its focus is on
compartmental modeling, with capabilities similar to GENESIS and
NEURON. A model neuron is built out of compartments, usually
with a cylindrical shape. When small enough, these open-ended
cylinders can approximate nearly any geometry. Future classes
may support reaction-diffusion kinetics and more. A key feature
of CONICAL is its cross-platform compatibility; it has been
fully co-developed and tested under Unix, DOS, and Mac OS.
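CONICAL's own classes are described in its documentation; as a rough
sketch of what a single passive compartment boils down to (not
CONICAL's API, just the underlying idea), here is a forward-Euler
membrane update in Python:

    # Forward-Euler integration of one passive compartment:
    #   tau * dV/dt = -(V - E_rest) + R * I_inject
    tau, R, E_rest = 10.0, 1.0, -65.0   # ms, megohm, mV (illustrative)
    dt, V = 0.1, -65.0

    for step in range(1000):
        I = 10.0 if step > 100 else 0.0          # current pulse after 10 ms
        V += dt * (-(V - E_rest) + R * I) / tau  # relax toward E_rest + R*I

    print(V)   # approaches E_rest + R*I while the pulse is on

A full compartmental model chains many such compartments together, with
coupling currents between neighbors and active channel models on top.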
- IDEAL
IDEAL is a test bed for work in influence diagrams and
Bayesian networks. It contains various inference algorithms
for belief networks and evaluation algorithms for influence
diagrams. It contains facilities for creating and editing
influence diagrams and belief networks.
IDEAL is written in pure Common Lisp and so it will run in
Common Lisp on any platform. The emphasis in writing IDEAL has
been on code clarity and providing high level programming
abstractions. It is thus very suitable for experimental
implementations that use or extend belief network
technology.
At the highest level, IDEAL can be used as a subroutine
library which provides belief network inference and influence
diagram evaluation as a package. The code is documented in a
detailed manual and so it is also possible to work at a lower
level on extensions of belief network methods.
IDEAL comes with an optional graphic interface written in
CLIM. If your Common Lisp also has CLIM, you can run the
graphic interface.
- Jet's Neural Architecture
Jet's Neural Architecture is a C++ framework for neural net
projects. The goals of this project were to make a fast, flexible
neural architecture that isn't tied to one kind of net, and to make
sure that end users could easily write useful applications. All the
documentation is also easily readable.
- Matrix Class
A simple, fast, efficient C++ Matrix class designed for
scientists and engineers. The Matrix class is well suited for
applications with complex math algorithms. As a demonstration
of the Matrix class, it was used to implement the backward error
propagation algorithm for a multi-layer feed-forward artificial
neural network.
- nunu
nunu is a multi-layered, scriptable, back-propagation neural network.
It is built to be used for intensive computation problems scripted
in shell scripts. It is written in C++ using the STL, and is based
on material from chapter 6 of "Introduction to the Theory of Neural
Computation" by John Hertz, Anders Krogh, and Richard G. Palmer.
- Pulcinella
Pulcinella is written in Common Lisp, and appears as a library of
Lisp functions for creating, modifying, and evaluating valuation
systems. Alternatively, the user can choose to interact with
Pulcinella via a graphical interface (only available in Allegro
CL). Pulcinella provides primitives to build and evaluate
uncertainty models according to several uncertainty calculi,
including probability theory, Dempster-Shafer's theory of belief
functions, and the possibility theory of Zadeh, Dubois, and
Prade. A User's Manual is available on request.
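As an illustration of the belief-function calculus itself (not of
Pulcinella's Lisp interface), Dempster's rule of combination merges two
mass assignments over a frame of discernment and renormalizes away the
conflicting mass:

    def dempster_combine(m1, m2):
        # m1 and m2 map frozensets (subsets of the frame) to masses
        # that each sum to 1.
        combined, conflict = {}, 0.0
        for a, ma in m1.items():
            for b, mb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb
        # Divide by the non-conflicting mass (1 - K) to renormalize.
        return {s: m / (1.0 - conflict) for s, m in combined.items()}

    A, B = frozenset(['a']), frozenset(['b'])
    both = A | B
    print(dempster_combine({A: 0.6, both: 0.4}, {B: 0.3, both: 0.7}))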
- S-ElimBel
S-ElimBel is an algorithm that computes the belief in a
Bayesian network, implemented in MIT Scheme. The algorithm has
the virtue of being rather easy to understand, and it can be
applied to any kind of Bayesian network, whether singly or
multiply connected. It is, however, less powerful than the
standard belief propagation algorithm: the computation must be
redone entirely for each new piece of evidence added to the
network, and the algorithm must be run once for each node whose
belief is wanted.
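The task itself, computing a posterior marginal given evidence, can be
shown by brute-force enumeration over a tiny two-node network (a sketch
of the problem, not of S-ElimBel, which organizes the same sums far
more cleverly by eliminating variables one at a time):

    # Tiny network: Rain -> WetGrass.  Compute P(Rain | WetGrass = True)
    # by summing the joint distribution over all assignments.
    p_rain = {True: 0.2, False: 0.8}
    p_wet_given_rain = {True: 0.9, False: 0.1}   # P(WetGrass=True | Rain)

    def joint(rain, wet):
        p_wet = p_wet_given_rain[rain]
        return p_rain[rain] * (p_wet if wet else 1.0 - p_wet)

    unnorm = {r: joint(r, True) for r in (True, False)}
    total = sum(unnorm.values())
    print({r: p / total for r, p in unnorm.items()})
    # P(Rain=True | wet) = 0.18 / (0.18 + 0.08), about 0.69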
- Software for Flexible Bayesian Modeling
This software implements flexible Bayesian models for regression
and classification applications that are based on multilayer
perceptron neural networks or on Gaussian processes. The
implementation uses Markov chain Monte Carlo methods. Software
modules that support Markov chain sampling are included in the
distribution, and may be useful in other applications.
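The Markov chain Monte Carlo idea can be sketched with a generic
random-walk Metropolis sampler over a single parameter; the package
itself uses far more sophisticated samplers, so this shows only the
flavor of the approach:

    import math, random

    def metropolis(log_post, x0, steps=20000, scale=0.5):
        # Random-walk Metropolis: propose a Gaussian jitter and accept
        # it with probability min(1, posterior ratio).  The visited
        # points approximate draws from the posterior.
        x, lp = x0, log_post(x0)
        samples = []
        for _ in range(steps):
            prop = x + random.gauss(0.0, scale)
            lp_prop = log_post(prop)
            if math.log(random.random()) < lp_prop - lp:
                x, lp = prop, lp_prop
            samples.append(x)
        return samples

    # Posterior of one network weight w under a standard-normal prior
    # and one noisy observation y = w * x_in + noise (numbers illustrative).
    y, x_in, sigma = 1.0, 2.0, 0.5
    log_post = lambda w: -0.5 * w**2 - 0.5 * ((y - w * x_in) / sigma) ** 2
    samples = metropolis(log_post, 0.0)
    print(sum(samples) / len(samples))   # posterior mean, about 0.47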
- Spiderweb2
A C++ artificial neural net library. Spiderweb2 is a complete
rewrite of the original Spiderweb library, and it has grown into
a much more flexible and object-oriented system. The biggest
change is that each neuron object is responsible for its own
activations and updates, with the network providing only the
scheduling. This is a very powerful change, and it allows
easy modification and experimentation with various network
architectures and neuron types.
- Symbolic Probabilistic Inference (SPI)
Contains Common Lisp function libraries to implement SPI-type Bayesian nets.
Documentation is very limited.
Features:
- Probabilities, Local Expression Language Utilities, Explanation,
Dynamic Models, and a Tcl/Tk-based GUI.
- TresBel
Libraries containing (Allegro) Common Lisp code for belief functions
(a.k.a. Dempster-Shafer evidential reasoning) as a representation
of uncertainty. Very little documentation. Has a limited GUI.
- Various (C++) Neural Networks
Example neural net codes from the book,
The Pattern Recognition Basics of AI.
These are simple example codes for these various neural nets.
They work well as a good starting point for simple experimentation
and for learning what the code behind the simulators is like; a
small Python sketch of the Hopfield model follows the list. The
types of networks available on this site (implemented in C++) are:
- The Backprop Package
- The Nearest Neighbor Algorithms
- The Interactive Activation Algorithm
- The Hopfield and Boltzmann Machine Algorithms
- The Linear Pattern Classifier
- ART I
- Bi-Directional Associative Memory
- The Feedforward Counter-Propagation Network
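For instance, the Hopfield model stores patterns in a symmetric weight
matrix and recalls them by repeatedly updating units toward agreement
with their weighted inputs. A minimal sketch in Python (the site's
code is C++; this is just the idea):

    import random

    def hopfield_recall(weights, state, steps=100):
        # Asynchronous update: pick a unit at random and set it to the
        # sign of its weighted input until the state settles.
        n = len(state)
        for _ in range(steps):
            i = random.randrange(n)
            total = sum(weights[i][j] * state[j] for j in range(n) if j != i)
            state[i] = 1 if total >= 0 else -1
        return state

    # Store one pattern with the Hebb rule: w_ij = p_i * p_j, zero diagonal.
    pattern = [1, -1, 1, -1]
    n = len(pattern)
    weights = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
               for i in range(n)]
    print(hopfield_recall(weights, [1, 1, 1, -1]))   # recalls [1, -1, 1, -1]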
These are various applications, software kits, etc., meant for research
in the field of Connectionism. Their ease of use will vary, as they
were designed to meet a particular research interest more than to be
easy-to-use commercial packages.
- Aspirin - MIGRAINES
(am6.tar.Z on ftp site)
The software that we are releasing now is for creating and
evaluating feed-forward networks such as those used with the
backpropagation learning algorithm. The software is aimed both at
the expert programmer/neural network researcher who may wish to tailor
significant portions of the system to his/her precise needs, and
at casual users who wish to use the system with an absolute
minimum of effort.
- DDLab
DDLab is an interactive graphics program for research into the
dynamics of finite binary networks, relevant to the study of
complexity, emergent phenomena, neural networks, and aspects of
theoretical biology such as gene regulatory networks. A network
can be set up with any architecture between regular CA (1d or
2d) and "random Boolean networks" (networks with arbitrary
connections and heterogeneous rules). The network may also have
heterogeneous neighborhood sizes.
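A random Boolean network of this kind is easy to sketch: each node has
its own wiring and its own rule table, and all nodes update
synchronously. A toy version in Python (not DDLab's implementation):

    import random

    n, k = 8, 3   # 8 nodes, each reading from 3 (possibly repeated) nodes
    wiring = [[random.randrange(n) for _ in range(k)] for _ in range(n)]
    # A heterogeneous rule per node: a lookup table over all 2**k
    # possible input states.
    rules = [[random.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def step(state):
        new = []
        for node in range(n):
            idx = 0
            for src in wiring[node]:         # pack inputs into a table index
                idx = (idx << 1) | state[src]
            new.append(rules[node][idx])
        return new

    state = [random.randint(0, 1) for _ in range(n)]
    for _ in range(10):                       # the trajectory eventually
        print(state)                          # falls into an attractor cycle
        state = step(state)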
- GENESIS
GENESIS (short for GEneral NEural SImulation System) is a
general purpose simulation platform which was developed to
support the simulation of neural systems ranging from complex
models of single neurons to simulations of large networks made
up of more abstract neuronal components. GENESIS has provided
the basis for laboratory courses in neural simulation at both
Caltech and the Marine Biological Laboratory in Woods Hole, MA,
as well as several other institutions. Most current GENESIS
applications involve realistic simulations of biological neural
systems. Although the software can also model more abstract
networks, other simulators are more suitable for backpropagation
and similar connectionist modeling.
- JavaBayes
The JavaBayes system is a set of tools, containing a
graphical editor, a core inference engine and a parser.
JavaBayes can produce:
- the marginal distribution for any variable in a network.
- the expectations for univariate functions (for example,
expected value for variables).
- configurations with maximum a posteriori probability.
- configurations with maximum a posteriori expectation for
univariate functions.
- Jbpe
Jbpe is a back-propagation neural network editor/simulator.
Features
- Standard back-propagation network creation.
- Saving a network as a text file, which can be edited and loaded
back.
- Saving/loading a binary file.
- Learning from a text file (with structure specified below); the
number of learning periods or a desired network energy can be
specified as the stopping criterion.
- Network recall.
- neuralnets
neuralnets is a text-based program that allows someone to build,
configure, train, and run a neural network application. The code is
written in Java and is easily extended or included within other code.
The application comes ready to go with a back-prop algorithm included.
Well-known applications include stock market and weather prediction,
scheduling, image recognition, expert systems, and research: basically
anywhere you may need to make a complicated decision.
- Neural Network Generator
The Neural Network Generator is a genetic algorithm for the
topological optimization of feedforward neural networks. It
implements the Semantic Changing Genetic Algorithm and the
Unit-Cluster Model. The Semantic Changing Genetic Algorithm is
an extended genetic algorithm that allows fast dynamic
adaptation of the genetic coding through population
analysis. The Unit-Cluster Model is an approach to the
construction of modular feedforward networks with a "backbone"
structure.
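The two algorithms named above are specific to this package, but the
general idea of a genetic algorithm over network topologies can be
sketched with a bitstring that masks candidate connections (a toy
illustration with a stand-in fitness function, not the package's
method):

    import random

    GENES, POP, GENERATIONS = 12, 20, 40   # one bit per candidate connection

    def fitness(mask):
        # Stand-in objective: a real system would train the masked
        # network and score its accuracy; here we reward sparsity.
        return -sum(mask)

    def crossover(a, b):
        cut = random.randrange(1, GENES)
        return a[:cut] + b[cut:]

    def mutate(mask, rate=0.05):
        return [bit ^ (random.random() < rate) for bit in mask]

    pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP // 2]            # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP - len(parents))]
        pop = parents + children

    print(max(pop, key=fitness))             # best connection mask found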
NOTE: To compile this on Linux requires one change in the Makefiles.
You will need to change '-ltermlib' to '-ltermcap'.
- Neureka ANS (nn/xnn)
nn is a high-level neural network specification language. The
current version is best suited for feed-forward nets, but
recurrent models can and have been implemented, e.g. Hopfield
nets, Jordan/Elman nets, etc. In nn, it is easy to change
network dynamics. The nn compiler can generate C code or
executable programs (so there must be a C compiler available),
with a powerful command line interface (but everything may also
be controlled via the graphical interface, xnn). It is possible
for the user to write C routines that can be called from inside
the nn specification, and to use the nn specification as a
function that is called from a C program. Please note that no
programming is necessary in order to use the network models that
come with the system (`netpack').
xnn is a graphical front end to networks generated by the nn
compiler, and to the compiler itself. The xnn graphical
interface is intuitive and easy to use for beginners, yet
powerful, with many possibilities for visualizing network data.
NOTE: You have to run the install program that comes with this
to get the license key installed. It gets put (by default) in
/usr/lib. If you (like myself) want to install the package
somewhere other than in the /usr directory structure (the
install program gives you this option) you will have to set up
some environmental variables (NNLIBDIR & NNINCLUDEDIR are
required). You can read about these (and a few other optional
variables) in appendix A of the documentation (pg 113).
- NEURON
NEURON is an extensible nerve modeling and simulation
program. It allows you to create complex nerve models by
connecting multiple one-dimensional sections together to form
arbitrary cell morphologies, and allows you to insert multiple
membrane properties into these sections (including channels,
synapses, ionic concentrations, and counters). The interface was
designed to present the neural modeler with an intuitive
environment and to hide the details of the numerical methods used
in the simulation.
- PDP++
As the field of Connectionist modeling has grown, so has the need
for a comprehensive simulation environment for the development and
testing of Connectionist models. Our goal in developing PDP++ has been
to integrate several powerful software development and user interface
tools into a general purpose simulation environment that is both user
friendly and user extensible. The simulator is built in the C++
programming language, and incorporates a state-of-the-art script
interpreter with the full expressive power of C++. The graphical user
interface is built with the Interviews toolkit, and allows full access
to the data structures and processing modules out of which the
simulator is built. We have constructed several useful graphical
modules for easy interaction with the structure and the contents of
neural networks, and we've made it possible to change and adapt many
things. At the programming level, we have set things up in such a way
as to make user extensions as painless as possible. The programmer
creates new C++ objects, which might be new kinds of units or new
kinds of processes; once compiled and linked into the simulator, these
new objects can then be accessed and used like any other.
- RNS
RNS (Recurrent Network Simulator) is a simulator for recurrent
neural networks. Regular neural networks are also supported. The
program uses a derivative of the back-propagation algorithm, but
also includes other (not that well tested) algorithms.
Features include
- freely choosable connections, no restrictions besides memory
or CPU constraints
- delayed links for recurrent networks
- fixed values or thresholds can be specified for weights
- (recurrent) back-propagation, Hebb, differential Hebb, simulated
annealing and more
- patterns can be specified with bits, floats, characters, or numbers,
and random bit patterns with chosen Hamming distances can be generated for you
- user definable error functions
- output results can be used without modification as input
- Simple Neural Net (in Python)
Simple neural network code, which implements a class for 3-level
networks (input, hidden, and output layers). The only learning
rule implemented is simple backpropagation. No documentation (or
even comments) at all, because this is simply code that I use to
experiment with. Includes modules containing sample datasets
from Carl G. Looney's NN book. Requires the Numeric
extensions.
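For comparison, a 3-layer network with plain backpropagation can be
written in a few lines of modern NumPy (the module above predates
NumPy and uses the old Numeric package; this sketch only mirrors the
same structure):

    import numpy as np

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)    # input -> hidden
    W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)    # hidden -> output
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

    for _ in range(20000):
        h = sigmoid(X @ W1 + b1)                 # hidden activations
        out = sigmoid(h @ W2 + b2)               # output activations
        d_out = (out - y) * out * (1 - out)      # squared-error gradient
        d_h = (d_out @ W2.T) * h * (1 - h)       # backpropagated to hidden
        W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
        W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

    print(out.round(2))   # should be close to the XOR targets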
- SCNN
SCNN is a universal simulation system for Cellular Neural
Networks (CNN). CNN are analog processing neural networks
with regular and local interconnections, governed by a set of
nonlinear ordinary differential equations. Due to their local
connectivity, CNN are realized as VLSI chips which operate
at very high speed.
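The standard (Chua-Yang) CNN cell obeys dx/dt = -x + A*y + B*u + z,
where the feedback template A and control template B act only on the
cell's local neighborhood and the output y is a piecewise-linear
squashing of the state x. A one-dimensional Euler-step sketch in
Python (the template values are illustrative):

    A = [0.0, 2.0, 0.0]   # feedback template over a 3-cell neighborhood
    B = [0.0, 1.0, 0.0]   # control template
    z = 0.0               # bias

    def output(x):
        # Standard CNN output: y = 0.5 * (|x + 1| - |x - 1|)
        return 0.5 * (abs(x + 1.0) - abs(x - 1.0))

    def step(x, u, dt=0.05):
        y = [output(v) for v in x]
        new = []
        for i in range(len(x)):
            acc = -x[i] + z
            for k in (-1, 0, 1):              # local neighborhood only
                j = i + k
                if 0 <= j < len(x):
                    acc += A[k + 1] * y[j] + B[k + 1] * u[j]
            new.append(x[i] + dt * acc)
        return new

    x = [0.0] * 5
    u = [0.0, 0.0, 1.0, 0.0, 0.0]             # constant input "image"
    for _ in range(200):
        x = step(x, u)
    print([round(output(v), 2) for v in x])   # center cell saturates at +1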
- Semantic Networks in Python
The semnet.py module defines several simple classes for
building and using semantic networks. A semantic network is a
way of representing knowledge, and it enables the program to
do simple reasoning with very little effort on the part of the
programmer.
The following classes are defined:
- Entity: This class represents a noun; it is
something which can be related to other things, and about
which you can store facts.
- Relation: A Relation is a type of relationship
which may exist between two entities. One special relation,
"IS_A", is predefined because it has special meaning (a sort
of logical inheritance).
- Fact: A Fact is an assertion that a relationship
exists between two entities.
With these three object types, you can very quickly define knowledge
about a set of objects, and query them for logical conclusions.
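A minimal version of the three classes might look like the sketch
below (a sketch of the concept; semnet.py's actual interface may
differ):

    class Entity:
        def __init__(self, name):
            self.name, self.facts = name, []

    class Relation:
        def __init__(self, name):
            self.name = name

    class Fact:
        def __init__(self, subject, relation, obj):
            self.subject, self.relation, self.obj = subject, relation, obj
            subject.facts.append(self)   # index the fact under its subject

    IS_A = Relation("IS_A")

    def holds(entity, relation, obj):
        # A fact holds if stated directly, or inherited via an IS_A parent.
        for f in entity.facts:
            if f.relation is relation and f.obj is obj:
                return True
            if f.relation is IS_A and holds(f.obj, relation, obj):
                return True
        return False

    bird, canary, fly = Entity("bird"), Entity("canary"), Entity("fly")
    CAN = Relation("CAN")
    Fact(canary, IS_A, bird)
    Fact(bird, CAN, fly)
    print(holds(canary, CAN, fly))   # True: inherited through IS_A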
- SNNS
Stuttgart Neural Net Simulator (version 4.1). An awesome neural
net simulator. Better than any commercial simulator I've seen. The
simulator kernel is written in C (it's fast!). It supports over 20
different network architectures, has 2D and 3D X-based graphical
representations, the 2D GUI has an integrated network editor, and can
generate a separate NN program in C. SNNS is very powerful, though
a bit difficult to learn at first. To help with this it comes with
example networks and tutorials for many of the architectures.
ENZO, a supplementary system, allows you to evolve your networks with
genetic algorithms.
There is a Debian package of SNNS available, so just get it (and use
alien to convert it to RPM if you need to).
- SPRLIB/ANNLIB
SPRLIB (Statistical Pattern Recognition Library) was developed
to support the easy construction and simulation of pattern
classifiers. It consists of a library of functions (written in C)
that can be called from your own program. Most of the well-known
classifiers are present (k-nn, Fisher, Parzen, ....), as well as
error estimation and dataset generation routines.
ANNLIB (Artificial Neural Networks Library) is a neural network
simulation library based on the data architecture laid down by
SPRLIB. The library contains numerous functions for creating,
training and testing feed-forward networks. Training algorithms
include back-propagation, pseudo-Newton, Levenberg-Marquardt,
conjugate gradient descent, BFGS, and more. Furthermore, due to
the general applicability of the data structures, it is possible
to build Kohonen maps and other more exotic network architectures
using the same data types.
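As an illustration of the simplest classifier on that list (the
library itself is called from C), k-nearest-neighbor classification
fits in a few lines of Python:

    from collections import Counter

    def knn_classify(train, point, k=3):
        # train is a list of (feature_vector, label) pairs; vote among
        # the k samples closest to point in squared Euclidean distance.
        dist = lambda a: sum((x - y) ** 2 for x, y in zip(a, point))
        nearest = sorted(train, key=lambda s: dist(s[0]))[:k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]

    train = [((0.0, 0.0), 'a'), ((0.1, 0.2), 'a'), ((1.0, 1.0), 'b'),
             ((0.9, 0.8), 'b'), ((0.2, 0.1), 'a')]
    print(knn_classify(train, (0.8, 0.9)))   # 'b'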
- TOOLDIAG
TOOLDIAG is a collection of methods for statistical pattern
recognition. The main area of application is classification. The
application area is limited to multidimensional continuous
features, without any missing values. No symbolic features
(attributes) are allowed. The program is implemented in the C
programming language and was tested in several computing
environments.
- XNBC
XNBC v8 is a simulation tool for neuroscientists interested in
simulating biological neural networks with a user-friendly tool.
XNBC is a software package for simulating biological neural networks.
Four neuron models are available: three phenomenological models (xnbc,
leaky integrator, and conditional burster) and an ion-conductance based
model. Inputs to the simulated neurons can be provided by experimental
data stored in files, allowing the creation of 'hybrid' networks.