KANs and beyond: On NN-based PDE solvers with high precision
| Category | Applied Mathematics |
|---|---|
| Date | 2025-06-12 (Thu) 10:30–12:00 |
| Venue | Online |
| Speaker | Yixuan Wang (Caltech) |
| Host | Youngjoon Hong |
| Notes | |
We drew inspiration from the Kolmogorov–Arnold representation theorem, which represents higher-dimensional functions as compositions of sums of 1D functions, and proposed KANs, which have learnable activation functions parameterized by splines. Compared with MLPs, KANs enjoy much better scaling laws and interpretability, since a much smaller network of 1D functions suffices whenever the underlying function admits such a smooth representation; in practice, KANs can be made arbitrarily wide and deep. We demonstrate the wide applicability of this paradigm with examples of function fitting, PDE solving (PINNs), applications to scientific problems, and large-scale problems. I will also discuss high-precision training of PINNs, with application to singularity formation in fluid dynamics.
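To make the idea concrete, here is a minimal, illustrative sketch (not the speaker's implementation) of a KAN-style layer: each edge carries its own learnable 1D function, here a piecewise-linear spline on a fixed knot grid, and the layer output is a sum of these 1D functions, following the Kolmogorov–Arnold form. All names (`KANLayer`, `grid_size`, etc.) are hypothetical choices for this sketch.

```python
import numpy as np

class KANLayer:
    """Toy KAN-style layer: y_j = sum_i phi_ij(x_i), with each phi_ij a
    learnable 1D piecewise-linear spline on a shared knot grid.
    Illustrative sketch only, not the reference KAN implementation."""

    def __init__(self, in_dim, out_dim, grid_size=8, x_min=-1.0, x_max=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.grid = np.linspace(x_min, x_max, grid_size)  # shared knot locations
        # Learnable values of each phi_ij at the knots: (in_dim, out_dim, grid_size)
        self.coef = rng.normal(scale=0.1, size=(in_dim, out_dim, grid_size))

    def __call__(self, x):
        # x: (batch, in_dim) -> y: (batch, out_dim)
        batch, in_dim = x.shape
        out_dim = self.coef.shape[1]
        y = np.zeros((batch, out_dim))
        for i in range(in_dim):
            for j in range(out_dim):
                # Evaluate the piecewise-linear spline phi_ij at x[:, i]
                y[:, j] += np.interp(x[:, i], self.grid, self.coef[i, j])
        return y

layer = KANLayer(in_dim=2, out_dim=3)
y = layer(np.array([[0.2, -0.5], [0.0, 0.9]]))
print(y.shape)  # (2, 3)
```

In a trainable KAN the knot values `self.coef` would be optimized by gradient descent, and real implementations typically use B-splines plus a residual base activation rather than plain linear interpolation; this sketch only shows the "sum of learnable 1D functions" structure the abstract describes.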
Zoom Address: https://snu-ac-kr.zoom.us/my/youngjoonhong