Evolving Kolmogorov-Arnold Networks

From ISLAB/CAISR

Title: Evolving Kolmogorov-Arnold Networks
Summary: This project aims to enhance the architecture of Kolmogorov-Arnold Networks (KANs) by optimizing key components such as loss functions, activation functions, initialization methods, and learning processes to improve their performance and interpretability.
Keywords:
TimeFrame: Fall 2024
References:
Prerequisites:
Author:
Supervisor: Mohammed Ghaith Altarabichi
Level: Master
Status: Open


Kolmogorov-Arnold Networks (KANs), recently proposed by researchers at MIT, are a promising alternative to traditional Multi-Layer Perceptrons (MLPs), with reported advantages in both accuracy and interpretability. The goal of this project is to further advance the KAN architecture by improving its computational graph along several research directions, including optimizing loss functions, refining activation functions, developing more effective initialization schemes, and improving the learning process. A minimal sketch of a KAN layer is given after the references below.

https://arxiv.org/abs/2404.19756
https://arxiv.org/abs/2407.13044
https://arxiv.org/abs/2407.20667
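
For illustration, the sketch below shows a single KAN layer in PyTorch. It is a simplification, not the reference implementation: the per-edge univariate functions are expanded in a fixed Gaussian radial basis rather than the B-splines used in the original paper, and the class and parameter names (KANLayer, num_basis, grid_range) are chosen here for readability.

import torch
import torch.nn as nn
import torch.nn.functional as F

class KANLayer(nn.Module):
    """One KAN layer: a learnable univariate function on every edge, summed at each output node."""
    def __init__(self, in_dim, out_dim, num_basis=8, grid_range=(-2.0, 2.0)):
        super().__init__()
        # Fixed Gaussian basis centers shared by all edges (a simplification of the
        # B-spline grid in the original paper); the width follows the grid spacing.
        centers = torch.linspace(grid_range[0], grid_range[1], num_basis)
        self.register_buffer("centers", centers)
        self.width = (grid_range[1] - grid_range[0]) / (num_basis - 1)
        # Per-edge basis coefficients: they parameterize phi_{j,i} for the edge i -> j.
        self.coef = nn.Parameter(0.1 * torch.randn(out_dim, in_dim, num_basis))
        # Residual base weights, analogous to the silu(x) term in the original formulation.
        self.base_weight = nn.Parameter(0.1 * torch.randn(out_dim, in_dim))

    def forward(self, x):
        # x: (batch, in_dim)
        basis = torch.exp(-((x.unsqueeze(-1) - self.centers) / self.width) ** 2)  # (batch, in_dim, num_basis)
        # Output j is sum_i phi_{j,i}(x_i), with each phi expanded in the Gaussian basis.
        spline_out = torch.einsum("bik,oik->bo", basis, self.coef)
        base_out = F.silu(x) @ self.base_weight.T
        return spline_out + base_out

# Usage: a two-layer KAN on a toy batch.
model = nn.Sequential(KANLayer(2, 5), KANLayer(5, 1))
x = torch.randn(16, 2)
print(model(x).shape)  # torch.Size([16, 1])

Unlike an MLP layer, where a fixed nonlinearity follows a learned linear map, each edge here carries its own learnable function whose outputs are summed at the target node. The project's research directions (loss functions, activation functions, initialization, learning process) all act on components of such a layer, for example the choice of basis, the initialization of the coefficients, or the regularization applied to them.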