2021-10-04 Helmut Mayer <helmut@cs.sbg.ac.at>
    * introduced streaming of pattern sets; currently only CSVFilter supports streaming
    * refactored topological sorting, now using Kahn's algorithm, which is much faster
    * fixed a bug in BackpropTrainer (it trained well, but sometimes too much)
    * refactored Rprop (had minor bugs in very specific situations)
    * refactored TrainingSignalGenerator (Trainer.test() now uses the set TSG for error)
    * refactored Function.ReLU, should be a bit faster
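The Kahn topological sort mentioned above can be sketched roughly as follows. This is a generic illustration of the algorithm, not Boone's actual neuron-sorting code; class and method names here are hypothetical:

```java
import java.util.*;

// Minimal sketch of Kahn's algorithm: repeatedly emit nodes with
// in-degree zero and decrement the in-degrees of their successors.
// Runs in O(V + E); throws if the graph contains a cycle.
public class KahnSort {
    public static List<Integer> sort(int n, int[][] edges) {
        List<List<Integer>> succ = new ArrayList<>();
        for (int i = 0; i < n; i++) succ.add(new ArrayList<>());
        int[] inDegree = new int[n];
        for (int[] e : edges) {          // e = {source, sink}
            succ.get(e[0]).add(e[1]);
            inDegree[e[1]]++;
        }
        Deque<Integer> ready = new ArrayDeque<>();
        for (int i = 0; i < n; i++)
            if (inDegree[i] == 0) ready.add(i);
        List<Integer> order = new ArrayList<>();
        while (!ready.isEmpty()) {
            int v = ready.poll();
            order.add(v);
            for (int w : succ.get(v))
                if (--inDegree[w] == 0) ready.add(w);
        }
        if (order.size() != n)
            throw new IllegalStateException("graph has a cycle");
        return order;
    }
}
```

Unlike repeated dependency scans, each link is visited exactly once, which is what makes this approach much faster on large nets.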

2021-08-13 Helmut Mayer <helmut@cs.sbg.ac.at>
    * removed minError and momentum from BackpropTrainer (minError also from TrainingSignalGenerator)
    * improved speed of link training
    * introduced a 'trainable' flag in BrainPart

2021-08-05 Helmut Mayer <helmut@cs.sbg.ac.at>
    * added Layer base class
    * added CNN support (based on merge from student project)
    * added ConvolutionLayer
    * added PoolingLayer (using MaxPooling)

2021-05-07 Helmut Mayer <helmut@cs.sbg.ac.at>
    * added Adam and RMSprop trainers
    * introduced batch size in trainer setBatchSize()
    * improved shuffling of pattern set using Fisher-Yates algorithm
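The Fisher-Yates shuffle referenced above can be sketched like this. A generic illustration only; the class and method names are hypothetical, not Boone's API:

```java
import java.util.Random;

// Minimal sketch of an in-place Fisher-Yates shuffle: walk the array
// from the end and swap each element with a uniformly chosen element
// at or before it. Every permutation is equally likely, in O(n).
public class Shuffle {
    public static void fisherYates(int[] indices, Random rng) {
        for (int i = indices.length - 1; i > 0; i--) {
            int j = rng.nextInt(i + 1);  // uniform in [0, i]
            int tmp = indices[i];
            indices[i] = indices[j];
            indices[j] = tmp;
        }
    }
}
```

Shuffling an index array rather than the patterns themselves keeps the pattern set untouched while still randomizing presentation order.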

2016-04-16 Helmut Mayer <helmut@cosy.sbg.ac.at>
    * merged DataSet and PatternSet into a single PatternSet
    * SpikeSet is now a subclass of PatternSet
    * changed subset() into subSet(), now the last index is not included in the subset
    * fixed a bug in Neuron: for a neuron n, n.getLinkTo(n) returned a wrong link
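The new exclusive upper bound of subSet() follows the usual Java half-open convention (as in String.substring or List.subList). A hypothetical illustration using List.subList to show the semantics:

```java
import java.util.List;

// Half-open interval convention: the last index is excluded, so
// slicing {A, B, C, D} with (1, 3) yields {B, C}, and a slice
// (first, last) always contains exactly last - first elements.
public class HalfOpenDemo {
    public static List<String> slice(List<String> patterns, int first, int last) {
        return patterns.subList(first, last);  // last is exclusive, like subSet(first, last)
    }
}
```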

2015-12-10 Helmut Mayer <helmut@cosy.sbg.ac.at>
    * refactored IO of Boone data completely using JDOM
    * in the process IOElement, XMLUtils, and VarArray have been removed
    * loading of nets and sets is now achieved by a static method, e.g., NeuralNet.load()

2015-09-02 Helmut Mayer <helmut@cosy.sbg.ac.at>
    * removed attribute 'previousOutput' in Neuron
    * it was maintained for all networks just to detect the quite special case of a stabilized recurrent network
    * simplified concept of neuron links by having only a single list of links of a neuron (two lists before)
    * simplified concept of directed link, which is now only one link (before it was two directed links)
    * removed attribute 'backwardValue' in Link
    * removed attribute 'loopOutputToInput' in Neuron
    * a self-link is now simply a Link with source = sink

2015-08-28 Helmut Mayer <helmut@cosy.sbg.ac.at>
    * refactored the VarArray in most classes
    * there is now a class NeuronList representing all lists of neurons, e.g., the tick list
    * added an attribute 'layer' to Neuron
    * the layer numbers are assigned via the method NeuralNet.isFeedForward()
    * so there is now a method to determine whether a net is feed-forward or recurrent
    * in the NetFactory weights and biases are now initialized to [-0.1, 0.1) instead of [-1.0, 1.0)
    * removed NeuralNet.randomize(), as there is a more general randomize() anyway

2015-05-15 Helmut Mayer <helmut@cosy.sbg.ac.at>
    * refactored Network and Pattern IO to a new filter concept
    * a filter may be set for networks and patterns handling the interpretation of data
    * the default filter is the BooneFilter, so old applications working with Boone files do not have to be changed
    * there is only one ID of a BrainPart instead of two to avoid confusions and keep it simple
    * added CSV filter for pattern files in CSV format
    * new method join() in DataSet
    * integrated LabelPatternSet into PatternSet

2015-03-27 Helmut Mayer <helmut@cosy.sbg.ac.at>
	* changed the main train() method in Trainer
	* the number of epochs is no longer given as parameter to train()
	* instead the epochs are set by setEpochs(), renamed from the former setTrainingCycles()
	* the training data set has to be set with Trainer.setTrainingData()
	* the test data set has to be set with Trainer.setTestData()
	* the trainers have been simplified by eliminating the part trainers
	* the trainTurn() methods have been deleted
	* the resetTraining() is now named reset() and is abstract
	* for efficient training an errorSignal has been added to Neuron, a gradient value to BrainPart
	* training data may now be shuffled after each epoch of training with Trainer.setShuffle()

2015-01-23 Helmut Mayer <helmut@cosy.sbg.ac.at>
	* added name field to a pattern of a DataSet, so each pattern may have a name

2015-01-23 Helmut Mayer <helmut@cosy.sbg.ac.at>
	* Spiking Neural Net implementation by J. Mory has been added in previous months
	* All spike stuff is in new dir src/spike
	* A sample test application is in test/boone/SpikingNetTest
	* Information on SNNs has been added to the documentation

2010-10-24 August Mayer <amayer@cosy.sbg.ac.at>
	* Moved Boone java sources to src/ subdirectory,
	  tests to the test/ subdirectory.
	* More usage of Java Generics: clone() methods etc.
	* Using RuntimeExceptions with clone() to require
	  less explicit exception handling.
	* Multiple Javadoc updates and clarifications.
	* Neuron: clone correctly, even if the activation function is null.
	* FeedForwardNetFactory, RecurrentNetFactory: correctly handle null trainers.
	* Fixed implementation of getTargetPatternSize()
	* Dropped method Neuron.setActivationFn(String).
	* Boone v0.9.4 released.

2006-05-22 August Mayer <amayer@cosy.sbg.ac.at>
    * When the Neuron(isInputNeuron, isOutputNeuron) constructor is used, and
      isInputNeuron is true, then set the activation function to Identity.
    * Also, fixed NullPointerException when loading networks with null trainer.
    * Boone v0.9.3 released.

2006-05-19 August Mayer <amayer@cosy.sbg.ac.at>
    * Fixed SNNS network file loading. 
    * Added a cache for the hidden neuron count.
    * Removed all separate NetError error functions, except sumSquareError;
    * Fixed NRMSE calculation. 
    * Fixed setPreviousOutput typo in Neuron class.
    * Trainer.resetTraining now checks for null PartTrainers.
    * Added a workaround for loading Boone networks saved in NetJEN XS.
    * Various assorted small fixes and extensions.
    * Boone v0.9.2 released.

2006-01-13 August Mayer <amayer@cosy.sbg.ac.at>
    * (based on diff between release 0.9 and release 0.9.1)
    * Fixed: Aliases were put into the main class list in the Boone class, which caused
        problems when enumerating the available classes.
    * Function class enhancements:
        - Added interfaces in Function:
            - Function.ComboFunction (tags Function.Scaled, Function.Composition) marks
                functions that need other functions to produce meaningful results.
            - Function.Spline (implemented in Function.CatmullRomSpline,
                Function.NaturalSpline) declares the methods common for spline functions
                (double[] getData(), void setData(double[]) ).
        - Added a setData method for the Spline classes, to be able to change the
            data array.
    * NetError.netErrors method can now optionally return per-pattern and per-neuron
        error information. This makes it possible to test the network only once to get
        all desired test values, and even allows for arbitrary error calculations after
        network testing.
    * Additional methods in the PatternSet class:
        - size() returns the number of (input) patterns in the pattern set.
        - getInputPatternSize() returns the size of a single input pattern.
        - getTargetPatternSize() returns the size of a single target pattern.
        - subset(int first, int last) returns a new PatternSet with a subset of the
            patterns. (Shallow copy).
    * Additional methods for IOElement:
        - getAttributeCount() returns the number of attributes in the IOElement.
        - getIntAttribute, getFloatAttribute, getDoubleAttribute now also return the
            specified defaultValue if the attribute doesn't exist (is null)
            (...how did this slip by?!)
    * Fixed SNNSPatternFile writer:
        - The wrong number of input/output units was written to the file in the former release.
        - Internal Null pointer fixes.
    * Added a checkSize(int) method, which expands the underlying array to the given
        size, if necessary. This can be used for optimisation, e.g. when objects are
        added in a loop and it is known beforehand how many objects will be added.
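The checkSize(int) behaviour described above can be sketched like this. A generic illustration; the class and field names are assumptions, not the actual Boone code:

```java
import java.util.Arrays;

// Sketch of a grow-on-demand buffer: checkSize(n) expands the backing
// array to at least n slots, so callers that know the final count up
// front can avoid repeated re-allocations while adding in a loop.
public class GrowableArray {
    private Object[] data = new Object[8];
    private int size = 0;

    public void checkSize(int needed) {
        if (needed > data.length) {
            // grow to the larger of the request and double the capacity
            data = Arrays.copyOf(data, Math.max(needed, data.length * 2));
        }
    }

    public void add(Object o) {
        checkSize(size + 1);
        data[size++] = o;
    }

    public int size() { return size; }
    public int capacity() { return data.length; }
}
```

Calling checkSize(n) once before a loop of n add() calls turns n potential re-allocations into at most one.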

2005-05-03 August Mayer <amayer@cosy.sbg.ac.at>
    * Adding SNNS Network file reader/writer, from netJENXS.
    * Adding Catmull-Rom and Natural Spline activation functions, from netJENXS
        (written by Andreas Ban).

2005-04-26 August Mayer <amayer@cosy.sbg.ac.at>
    * Added Implementations for SSE, MSE, NRMSE, SQEP, and Classification
      Error, as well as a method to calculate all of them in a single run.

2005-04-21 August Mayer <amayer@cosy.sbg.ac.at>
    * Renaming DistanceError again to SquareError, after some discussion.
      This function is a default implementation especially for Backprop trainers and is
      meant to return only a part of the first derivative; for more info, see the Javadoc.
    * Added specific documentation about what the error signal is to
      TrainingSignalGenerator.SquareError, and referenced it from the javadocs
      of BackpropTrainer and RpropTrainer.

2005-04-20 August Mayer <amayer@cosy.sbg.ac.at>
    * loadObject threw an exception if an object entry existed in the file,
      but had no "class" attribute. Now also returns null. This fixes loading
      of Boone-default recurrent networks.
    * Releasing 0.8.3. Still on Java 1.4.

2005-04-20 August Mayer <amayer@cosy.sbg.ac.at>
    * Added "aliases" to the Boone class, for compatibility of saved files
      with newer Boone versions.
    * Added the NetError class, which provides methods for calculating the
      error of whole networks.
    * Renamed the "SquareSumError" TrainingSignalGenerator to "DifferenceError".
      The former name was historically caused, but has become completely wrong.
      With the new compatibility aliases, old files will still be loaded
      correctly.

2005-02-25 August Mayer <amayer@cosy.sbg.ac.at>
    * Releasing 0.8.2.
    * This release will be the last one to use Java 1.4.

2005-02-08 August Mayer <amayer@cosy.sbg.ac.at>
    * Added default training cycle number to the Trainer class. This is convenient for GUIs.
      The variable isn't internally used by Boone.
      This number is also stored and loaded from/to files.

2005-02-08 August Mayer <amayer@cosy.sbg.ac.at>
    * Added Neuron.isHiddenNeuron() method, for convenience.

2005-01-17 August Mayer <amayer@cosy.sbg.ac.at>
    * Boone IO can now store and load arbitrary objects, not only Storables.
        However, only Storables can save or load object parameters and sub-objects.
        This is useful for storing objects without state.
    * Added a getHiddenNeuronCount() method to NeuralNet. Only neurons are counted which
        are neither input nor output neuron. Please note that this method iterates through
        the neuron list and is thus "slow".

2004-11-25 August Mayer <amayer@cosy.sbg.ac.at>
    * Big change: Storable.load now throws an IOElement.LoadException.
        This LoadException is derived from IOException, so the usual
        "catch (IOException e)" around a load method is sufficient.
        LoadExceptions are now thrown if classes couldn't be found during
        loading; this makes problems much more obvious. The downside is that
        networks can't be loaded anymore if some classes are not available.
    * The above change causes many modifications in the interfaces of many
        objects (BrainPart, NeuralNet, Neuron, Link, ...).
        If you have overridden the load method, please take care to add a
        "throws IOElement.LoadException" clause to the load(IOElement) method.
    * If an object's class isn't registered, store the full class name of the
        object; on loading, try to instantiate it using this full name.
        The consequence is that classes don't have to be registered anymore.
        YAY! Nevertheless, they can be registered, if requested.

2004-11-09 August Mayer <amayer@cosy.sbg.ac.at>
    * Added Brain class (Container for NeuralNets and PatternSets;
      can also load and save files containing these classes.)
    * Made PatternSet cloneable.
    * When cloning neural networks, also clone the Properties.
    * NeuralNet.load(File) now clears some lists before loading, so that
        there are no remains of the previous network after loading.

2004-11-09 August Mayer <amayer@cosy.sbg.ac.at>
    * NetFactory now sets the names of neurons and links to something nice.
    * NeuralNet.load and PatternSet.load aren't static anymore. They now load
      the network into their own instance.
    * IOHandler.load has now the facility to take an object instance as the
      root object of the hierarchy. This is used for the facility above.
    * Boone Release 0.8.1.

2004-11-02 August Mayer <amayer@cosy.sbg.ac.at>
    * Preparing and releasing Boone v0.8.
    * Note: This Boone version is mostly untested;
      please only switch if absolutely necessary.

2004-10-07 August Mayer <amayer@cosy.sbg.ac.at>
    * Removed loadObject, storeObject methods from the IOHandler.
      Substitutes can be found in IOElement.
      Added a new constructor to directly create an IOElement from a Storable.
    * Re-modeled the IOHandler interface, for better programmer usability.
    * Made PatternSet a Storable, and implemented load and store methods.

2004-09-24 August Mayer <amayer@cosy.sbg.ac.at>
    * Changed the properties of NeuralNet and PatternSet back to a HashMap.
        We don't need the keys always to be Strings, but we need the ability
        to store other things there. (See above.)
    * Removed the ID, Name attributes of BrainParts.

2004-09-15 August Mayer <amayer@cosy.sbg.ac.at>
    * Made PatternSet inner classes static. Oops.
    * Made SNNSPatternFile.read return a SNNSPatternFile.PatternSet object,
        not a generic Object anymore.
    * Removed io sub-package import from Boone class. Oops.
    * Various fixes for the JavaDoc comments.
    * Release of Boone 0.7.

2004-09-14 August Mayer <amayer@cosy.sbg.ac.at>
    * I/O framework change. Now the NeuralNet and PatternSet classes
        have simple load() and save() methods. Still kept the IOElements,
        though.
    * Adapted the other classes to the new I/O. Modified the test classes.

2004-08-04 August Mayer <amayer@cosy.sbg.ac.at>
    * Added LGPL license to Boone.
    * Updated build.xml to pack the source.
    * Boone release 0.6.2.

2004-07-13 August Mayer <amayer@cosy.sbg.ac.at>
    * Moved getBasicName() from Boone to ClassHelper class.
    * Added getField, setField methods to ClassHelper. These methods can
        be used to set object attributes in a very simple manner.
    * Javadoc updates because of these changes.
    * Release 0.6.1.

2004-07-08 August Mayer <amayer@cosy.sbg.ac.at>
    * Added toArray(), set(float[]) to Position.

2004-06-22 August Mayer <amayer@cosy.sbg.ac.at>
    * Added package descriptions.
    * Merged in some updates of boone.util classes.
    * Added changelog, README to release.
    * Release 0.6.

2004-06-22 August Mayer <amayer@cosy.sbg.ac.at>
    * Changed XML root element name "oonet" to "boone". Just modify
        the data files; they should continue working with the new I/O framework.
    * Version 0.6.

2004-06-06 August Mayer <amayer@cosy.sbg.ac.at>
    * GREAT I/O REWRITE, part 1 of many...
    * Moving files around, out of boone.util,
        and from boone.io into boone.io.[format].
    * Position changed to use Float instead of double - sufficient and faster(?)

2004-05-24 August Mayer <amayer@cosy.sbg.ac.at>
    * Turn off zeroInputAfterFirstCycle flag by default.
    * Changed the internal NetFactory methods to protected access.
    * Assorted javadoc enhancements.
    * Removed TickList, Tickable. Renamed tickIt() to propagateOutput().
        MAJOR CHANGE!

2004-05-13 August Mayer <amayer@cosy.sbg.ac.at>
    * Linked BrainPart.setPartTrainer and PartTrainer.setPart.
        If one is called, the other one is also done automatically.
    * Corrected bug in BrainPart: clone() didn't work if PartTrainer was null.
    * Documentation fixes.
    * NetFactory: Added templates for the Neurons and Links to the 
        createFeedForward and createRecurrent methods, and also to the
        underlying factory classes. Modified SimpleFFTest to use that.
    * Added new version of doc/boone-manual.sxw. Still not final, though.
    * Boone release 0.5.2

2004-04-29 August Mayer <amayer@cosy.sbg.ac.at>
    * Fixes for replaceNeuron and replaceLink:
        - added BrainPart.takePartTrainerFrom method
        - NeuralNet.replaceNeuron, Link.replaceFrom now call this method.
    * Link() constructor without arguments now initializes weight in [-1, 1).
    * RpropTrainer now tolerates null activation function.
    * build.xml now creates single release package, instead of many archives.

2004-04-21 August Mayer <amayer@cosy.sbg.ac.at>
    * Bugfix - didn't really save the props....    
    * Added descriptions of randomization interval of weights.
    * Boone release 0.5.1

2004-04-07 August Mayer <amayer@cosy.sbg.ac.at>
    * Split out Proben1PatternSet class, which was Proben1PatternFile.PatternSet before.
    * Added Properties props to PatternSet class.
        - Added support for writing and reading PatternSet properties
            to the XNetPatternFile and SNNSPatternFile handlers.
    * Added StreamParser.readSpaceOnly(), which doesn't skip over comments.
    * Added Conversion.escapeNewlines() and Conversion.unescapeNewlines(),
        which escapes \ to \\, NL to \n and CR to \r.
    * Cleaned up javadoc; version for building with ant is 0.5.
    * Release 0.5

2004-04-01 August Mayer <amayer@cosy.sbg.ac.at>
    * Errors in standard trainers
        Oops, when counting backwards, start at count()-1 and finish when >= 0 ...!
    * Fixes in PatternIOTest
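The backward-counting fix above amounts to the standard reverse loop bounds; a small generic illustration (the class and method names are hypothetical):

```java
import java.util.List;

// Correct reverse iteration: start at the last valid index, count()-1,
// and continue while the index is still >= 0. Starting at count() reads
// one past the end; stopping at i > 0 silently skips index 0.
public class ReverseLoop {
    public static String joinReversed(List<String> items) {
        StringBuilder sb = new StringBuilder();
        for (int i = items.size() - 1; i >= 0; i--) {
            sb.append(items.get(i));
        }
        return sb.toString();
    }
}
```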

2004-03-31 August Mayer <amayer@cosy.sbg.ac.at>
    * Boone Version 0.4.99 (0.5 pre) - needs to be TESTED!
    * Enhanced javadoc comments
    * TrainingSignalGenerator.calcSignal doesn't return a value anymore.
    * Moved boone.io.snns.SNNSPatternFile to boone.io.SNNSPatternFilter.
    * Added Registry.RegistryIterator sub-class.
    * PartTrainer.train doesn't return a value anymore either, see below.
    
    * FileHandler changes:
        - Added abstract getDefaultExtension, getMIMEType methods to boone.util.io.FileHandler
        - Added an automatic appendExtension method 
        - added getDefaultExtension, getMIMEType implementations to the standard file handlers.

    * Changes in Trainer:
        - Removed TrainingListener clutter
        - Train methods now don't return a value; use test
        - New test methods return a single value
        - Test method testing multiple patterns now sums up the error and returns the single value
        - Removed unsupervised training methods. Use "null" or similar for the target, instead.

    * Neural net:
        - Changed the calculation style: it didn't work correctly for general recurrent
            networks. Automatic loop-back of values for recurrent networks now uses the
            linkInput, not the externalInput.
        - Bug fixed: link inputs weren't reset after innervate was finished (Phew!! At last!!)
        - Added option for resetting the external input to 0 after the first cycle,
            for recurrent networks. This is necessary now, as the loopback happens to the
            linkInput, not the external input anymore.
        - Added innervateCycle method.
        - Changes in replaceNeuron.
        - Always re-create tick list on innervate() when ORDER_RANDOM is selected

    * Neuron:
        - Added getLinkInput(), setLinkInput() methods
        - looping back values to the input now uses the linkInput, not the externalInput anymore.
            Also see above.

    * Link:
        - propagate() now takes a boolean instead of a neuron. That's all it needs.
        - added method getOtherNeuron to get the "other" neuron of the link.
            This is especially interesting for directed links.
        - removed the replaceSource, replaceSink methods; added a replaceNeuron method instead.
        - setUndirected now adds and removes the link correctly from the source/sink
            neurons' link lists. This was done only in setSink and setSource before
            and triggered errors.

    * Function:
        - added abstract getMinValue, getMaxValue methods, returning the minimum and maximum
            values of the function results. Useful for scaling. Use +/- Infinity if unknown.
        - added correct min, max functions for each standard function.

    * NetFactory:
        - JavaDoc fixes
        - createNeuralNet for recurrent networks now sets random calc order by default.

2004-02-25 August Mayer <amayer@cosy.sbg.ac.at>

    * Added iterator over the "kind" ClassInfo;
      also added generalized public RegistryIterator class.

2004-01-08 August Mayer <amayer@cosy.sbg.ac.at>
    
    * Added "props" Properties to NeuralNet; these are stored too.
    * Added getMinValue(), getMaxValue() functions to the Function class.
      Also, added implementations of these functions to all internal
      functions.
      Note: If you implemented your own functions, you'll need to add
      implementations for these too.
    * Release 0.4.1

2003-12-11 August Mayer <amayer@cosy.sbg.ac.at>

    * Moved Referentiable to the boone.util.registry package.
    * Corrected Javadoc.
    * Release 0.4

2003-12-10 August Mayer <amayer@cosy.sbg.ac.at>

    * New NetFactory implementation. Use like SimpleNetFactory,
        with a (nearly) unchanged interface, or use the new facilities.
        Also: Removed SimpleNetFactory - now incorporated in NetFactory.
        NOTE: Nearly untested right now!!
    * Moving files around:
        boone.util.StreamParser, boone.util.XMLUtils -> boone.util.io
        boone.util.io.{ClassInfo, Param, Registry}  -> boone.util.registry
    * Added the boone.io.Proben1PatternFilter handler.
    * Changed the way boone.Boone automatically creates its static instance,
        and how it cooperates with the boone.Standard class.
        This should work better now.
    * Small fixes - trimmed the imports, etc.

2003-10-09 August Mayer <amayer@cosy.sbg.ac.at>

    * Detected and repaired clone bug - parts of PartTrainers weren't
        correctly initialized.

2003-10-01 August Mayer <amayer@cosy.sbg.ac.at>

    * Renamed Function.Composite to Function.Scaled,
        added Function.Composition which combines two functions.
    * Added clone support to the whole package. This is much faster
        than serializing and de-serializing.
    * Added value caching to the Sigmoid function.
        Sigmoid is not a Singleton anymore, because of this.
    * Release 0.3

2003-09-23 August Mayer <amayer@cosy.sbg.ac.at>

    * Bug fixes.
        - neuron sorting works again.
        - Neuron.addNextNeurons (sic?) removed. Not needed anymore.

2003-09-22 August Mayer <amayer@cosy.sbg.ac.at>

    * Added getLinkInput, setLinkInput; this should be useful for recurrent
        networks, as a replacement for the initial activation values.
    * MAJOR CHANGE: Moved a number of classes to boone.util and boone.util.io.
        This package should contain code that can be shared in other projects;
        the intended use is for NetJEN.
        A separate .jar will be created for this sub-package.
        Classes moved were:
            To boone.util: ags.util.*
            To boone.util.io:
                boone.registry.*
                boone.io.{FileHandler, IOElement, Referentiable, Storable}
        (But NOT boone.io.PatternSet; this class is specific for neural nets).
    * Moved the implementations from Boone to IOElements, Registry
        and ExceptionHandler.

2003-09-16 August Mayer <amayer@cosy.sbg.ac.at>

    * Reworked the boone.registry package. Desc is now called ClassInfo,
        and there is no Registrable interface anymore.

2003-09-13 August Mayer <amayer@cosy.sbg.ac.at>
    
    * changed identifying names for objects in the Registry to be
        just the class name, without packages or enclosing classes.
        This is sufficient and simpler to use.
    * added a Function.create(name) to create Functions from function names.
    * added a Neuron.setActivationFn(String) method to set an activation
        function using its name.

2003-09-13 August Mayer <amayer@cosy.sbg.ac.at>

    * resetTraining isn't called automatically in setLearnRate anymore.
        The system got confused by unusual initialization orders,
        e.g. when using Boone with NetJEN.
        Note that for some kinds of trainers it may be necessary to call
        resetTraining after setLearnRate, to propagate the new learning rate
        to the part trainers.
    * documentation updates; added output in case of errors in Param.
    * corrected loading of the reference to the part in the PartTrainers.
        For this, the BrainPart abstract class now implements Storable, because
        it needs to do more on loading.
    * added Neuron.getLinkTo(Neuron) method which returns the output link
        from this to the given neuron.
    * added setter methods for lastErrorGradientSum, errorGradientSum in
        RpropTrainer. This is needed for storage.

2003-09-04 August Mayer <amayer@cosy.sbg.ac.at>

    * added a Neuron.getLinkTo(Neuron) method, to find links
        from Neuron A to Neuron B.

2003-09-03 August Mayer <amayer@cosy.sbg.ac.at>

    * learnRate is now stored in Trainer; changed rest of framework to fit.
      ATTENTION: API changes, next version will be 0.3.
    * PartTrainer doesn't define empty beginTurn, endTurn, beginEpoch,
        endEpoch anymore.
    * Removed obsolete @TODOs in Javadoc comments.
    * ant build.xml doesn't remove dist directory anymore.

2003-08-30 August Mayer <amayer@cosy.sbg.ac.at>

    * Release of boone v0.2.
	The minor version number increases because of the API change.

2003-08-29 August Mayer <amayer@cosy.sbg.ac.at>

    * Changed "name" field in Desc from some explicitly given name
        to the class name of the desired class.

2003-08-28 August Mayer <amayer@cosy.sbg.ac.at>

    * Changed the PatternSet class: inputPatterns, outputPatterns are now a VarArray
        - Fixes in XNetPatternFile for this change.
        - Updating the tests.
    * Moving the test data files to a subdirectory test/testFiles.
    * Added a SNNS pattern file handler, which can read and write SNNS pattern files.
        - Adding a test class for the SNNS Pattern file handler.
        - Also adding some test SNNS pattern data files.
        - Moved everything to the boone.io.snns package.
    * Small syntax and bug fixes: ags.util.Conversion, ags.util.XMLUtils

2003-08-22 August Mayer <amayer@cosy.sbg.ac.at>

    * ATTENTION: MAJOR API CHANGE; MODIFICATIONS WILL BE REQUIRED.
    * Renames:
        - Trainer.trainAll to Trainer.train
        - Trainer.train to Trainer.trainTurn
        - ErrorFunction to TrainingSignalGenerator
        - ErrorFunction.calcError to calcSignal
    * Added a TrainingSignalGenerator link to the Trainer base class.
        - Note, by default, this is null.
    * Made HebbTrainer work again. Untested.
    * Made BackpropTrainer to actually be a BackpropMomentumTrainer,
        with the momentum set to 0 by default.
        Removed BackpropMomentumTrainer.

2003-08-21 August Mayer <amayer@cosy.sbg.ac.at>

    * Added Trainer reference to the PartTrainer class, changed
        standard trainers to use this reference.
    * Added setLearnRate, getLearnRate to the Trainer class. By default,
        these methods throw an UnsupportedOperationException if not
        overwritten.
    * Moved standard descriptions out of boone.Boone
        to the boone.Standard class.

2003-06-25 August Mayer <amayer@cosy.sbg.ac.at>

    * Many changes, as Boone v0.1.1 was released
    * Boone v0.1.2:
        - optimization in the backprop trainers
        - small changes
        - documentation changes

2003-05-07 August Mayer <amayer@cosy.sbg.ac.at>

    * debugged BackpropMomentumTrainer

2003-05-06 August Mayer <amayer@cosy.sbg.ac.at>

    * found the bug why it didn't learn: beginTurn() wasn't called correctly...
    * changed NetJENXS emulation to work again with Boone

2003-05-05 August Mayer <amayer@cosy.sbg.ac.at>

    * added:
        - Link.directed, handling in Neuron.tickIt and factory
        - Neuron.loopOutputToInput, handling in Neuron.tickIt and factory
    * added a randomize() method to NeuralNet, Neuron and Link.
    * small corrections

2003-05-02 August Mayer <amayer@cosy.sbg.ac.at>

    * changed numXYZs() to getXYZCount(). Still think this is ugly.
    * added a SimpleNetFactory, for easy creation of networks.
        NetFactory is more flexible and elegant, but SimpleNetFactory is way
        shorter and uses less fluff.

2003-04-29 August Mayer <amayer@cosy.sbg.ac.at>

    * replaceNeuron, replaceLink, replaceTrainer

2003-04-28 August Mayer <amayer@cosy.sbg.ac.at>
    
    * added add(position, neuron)
    * removed flushTickList(); use setTickList(null) instead.
    * added Referentiable interface, support
    
2003-04-23 August Mayer <amayer@cosy.sbg.ac.at>

    * retrofitted HopfieldDeltaTrainer, HebbTrainer
    * added longName to the desc object; name now is an identifier,
      suitable e.g. for XML element names (without spaces)
    * recording loaded trainers for references by PartTrainers.
    * added calcOrder flag
    * added tickList creation for the new calc orders.

2003-04-22 August Mayer <amayer@cosy.sbg.ac.at>

    * some changes in Position
    * removed Neuron.setPosition(x,y,z);
      use Neuron.getPosition().set(x,y,z) instead.
    * retrofitted Backprop, BackpropMomentum, Rprop trainers
        to use static PartTrainers, and to have Descs for them in Boone,
        and to be correctly loaded.

2003-04-14 August Mayer <amayer@cosy.sbg.ac.at>

    * created the new NetFactory super-duper-class, with simple static methods.
    * added Param.Obj.
    * reworked Param and Function persistence system.
    * added kind to Desc object.
    * added Position description to Boone class.
    * made Neuron storable via standard Boone introspection mechanism.

2003-04-10 August Mayer <amayer@cosy.sbg.ac.at>

    * reworked HebbTrainer, HopfieldDeltaTrainer

2003-04-09 August Mayer <amayer@cosy.sbg.ac.at>

    * re-designed the I/O implementation:
        - renaming initialize() to load()
        - removing passing down the Trainer
        - added Neuron and Link and NeuralNet to the registered objects
          (in the Boone object)
        - changed the way classes are re-created (now from the
          "class" attribute of the sub-objects, using a static method in
          the Boone class).
    * trimmed the Trainer implementations (RpropTrainer etc.) for usage
      with the changed base class
        - using the Listener / EpochListener interface instead of
          a resetTraining method.
        - changed the mode of accessing the TickList and other properties
          in neuron etc.
    * added Listener / EpochListener support to the train() methods
    * added a toBool method to the ags.util.Conversion class
    * many more small changes ...

2003-04-08 August Mayer <amayer@cosy.sbg.ac.at>

    * changelog started.
    * renamed package from "oonet" to "boone".
    * removed TrainingException class.
    * moved the Tickable interface inside the TickList class.
    * created registry.BooneRegistry, moving all meta-info into this place.
    * PartTrainer: removed ticks, reset, isEvaluating/train()
    * diction change: Turn to Epoch, (one pattern presentation) to Network Turn
        - TurnBased -> EpochListener
        - Trainer.Listener.beginTrainPattern -> beginTurn (!)
    * removed NeuralNet.reset()
    * removed NeuralNet.run()

End of Changelog.
