From the Lab
Experiments that made it out alive.
GitHub · 2025 · KAN · Transformers · Code
KANGPT
Mathew Vanherreweghe
A transformer-based language model that replaces the traditional MLP layers with Kolmogorov-Arnold Network (KAN) layers. Instead of the usual Linear → GELU → Linear stack, KANGPT uses learnable Chebyshev polynomial basis functions, achieving performance comparable to GPT-2 through an alternative computational approach.
View on GitHub
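To make the idea concrete, here is a minimal sketch of a Chebyshev-basis KAN layer in PyTorch. This is not the repository's implementation; the class name ChebyKANLayer, the degree parameter, and the initialization are illustrative assumptions about how such a layer can be built.

# Minimal sketch of a Chebyshev-basis KAN layer (illustrative; not
# KANGPT's actual code). Each input feature is expanded into Chebyshev
# polynomials T_0..T_degree, and a learnable coefficient tensor mixes
# them into output features, standing in for Linear -> GELU -> Linear.
import torch
import torch.nn as nn


class ChebyKANLayer(nn.Module):
    def __init__(self, in_features: int, out_features: int, degree: int = 4):
        super().__init__()
        self.degree = degree
        # One learnable coefficient per (input, output, polynomial order).
        self.coeffs = nn.Parameter(
            torch.randn(in_features, out_features, degree + 1)
            / (in_features * (degree + 1)) ** 0.5
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Chebyshev polynomials live on [-1, 1]; tanh squashes inputs there.
        x = torch.tanh(x)
        # Build T_0(x), T_1(x), ... via the recurrence T_k = 2x T_{k-1} - T_{k-2}.
        T = [torch.ones_like(x), x]
        for _ in range(2, self.degree + 1):
            T.append(2 * x * T[-1] - T[-2])
        basis = torch.stack(T[: self.degree + 1], dim=-1)  # (..., in, degree+1)
        # Contract over input features and polynomial orders.
        return torch.einsum("...id,iod->...o", basis, self.coeffs)


# Drop-in usage where a transformer block's MLP would normally sit:
layer = ChebyKANLayer(in_features=768, out_features=768, degree=4)
out = layer(torch.randn(2, 16, 768))  # (batch, seq, hidden)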
arXiv · 2025 · Neural Networks · Geometry
Scale-Agnostic Kolmogorov-Arnold Geometry in Neural Networks
Mathew Vanherreweghe, Michael H. Freedman, Keith M. Adams
This research extends previous findings about geometric structure in neural networks to realistic, high-dimensional settings. We examined how two-layer MLPs learn MNIST digit classification and found that Kolmogorov-Arnold geometry (KAG) emerges during training and appears consistently across spatial scales.
Read on arXiv
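For background, the geometry takes its name from the Kolmogorov-Arnold representation theorem, which states that any continuous function of n variables on a bounded domain can be written as a finite superposition of continuous univariate functions:

f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)

KAN architectures, like the one in KANGPT above, parameterize these univariate functions directly with learnable bases.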