One of the current trends in Artificial Neural Network (ANN) research is the use of activation functions other than the classics, such as the sigmoid or Heaviside function. Periodic activation functions such as sine, which have long been considered unsuitable for ANNs, have been shown to yield results that are better than, or at worst equal to, those achieved with sigmoid activation functions. When trying to solve classic ANN benchmark problems (with an analytical approach), we found that the best results are obtained when different activation functions are used.

Another topic of interest to our project is the evolution of ANNs. The Department of Computer Science at the University of Salzburg runs a research program called "siGis" (which stands for "Salzburg Interest Group on Integrated Systems") that uses a system for evolving the structure of ANNs (NetGEN). Taking these two developments into account, it seems clear that an evolvable activation function could lead to an improvement in ANN performance.

The first approach taken by siGis was to use function tables, where a mutation completely replaces the function. For this project, we chose an approach that allows the activation function to be changed in a more subtle way: we interpolate the activation function from an evolvable point set using cubic spline interpolation. The resulting function fulfills the criteria for an activation function in the SNNS, being smooth in the first derivative and continuous in the second derivative, both within the interval and at its boundaries.

This paper deals with the process of implementing and testing this feature. The first, theoretical part discusses the advantages of different activation functions and then turns to the theory and implementation of cubic splines as activation functions. The second part evaluates the results using an analytical approach.
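To make the interpolation scheme concrete, the following Python sketch shows one way an activation function could be built from an evolvable point set via cubic splines. It is an illustration only, not the SNNS implementation described later; the knot positions, the point values, the "natural" boundary condition, and the clamping of inputs outside the knot interval are all assumptions made for this example.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical evolvable point set: knot positions and values that a
# genetic operator could mutate to reshape the activation function.
knots = np.array([-3.0, -1.5, 0.0, 1.5, 3.0])
values = np.array([-1.0, -0.8, 0.0, 0.8, 1.0])

# Interpolating cubic spline through the points; the interpolant has
# continuous first and second derivatives at the interior knots.
# The "natural" boundary condition is one possible choice, not
# necessarily the one used in the SNNS implementation.
spline = CubicSpline(knots, values, bc_type="natural")

def activation(net_input):
    """Spline-based activation function; inputs outside the knot
    range are clamped to the boundary (one way to handle them)."""
    x = np.clip(net_input, knots[0], knots[-1])
    return spline(x)

# Example: evaluate the evolved activation on a few net inputs.
print(activation(np.array([-5.0, -0.5, 0.5, 5.0])))
```

Mutating the point set (e.g., perturbing a single value in `values`) deforms the activation function only locally and smoothly, which is the subtle mode of change referred to above, in contrast to a function-table mutation that swaps the function wholesale.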