Multi-Layer Perceptron
Krishna Kumar
The University of Texas at Austin, Chishiki-AI
08/2025 (original)
This topic introduces the fundamental concepts of using neural networks for function approximation, a core task in scientific machine learning (SciML). Starting from the basic building block, the perceptron, we will build up to a single-layer network and demonstrate its capacity to approximate a simple function, connecting theory to practice using PyTorch.
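As a preview of what this topic covers, the sketch below fits a single-hidden-layer network to a simple target function in PyTorch. The specific choices here (a sine target, 32 tanh units, the Adam optimizer, the learning rate, and the step count) are illustrative assumptions, not the exact setup used later in the topic.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: the target function, layer width, optimizer,
# and hyperparameters are assumptions chosen for demonstration.
torch.manual_seed(0)

# Training data: sample sin(x) on [-pi, pi]
x = torch.linspace(-torch.pi, torch.pi, 200).unsqueeze(1)
y = torch.sin(x)

# A single hidden layer of perceptron-like units with a tanh activation,
# followed by a linear output layer
model = nn.Sequential(
    nn.Linear(1, 32),
    nn.Tanh(),
    nn.Linear(32, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

initial_loss = loss_fn(model(x), y).item()
for _ in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
final_loss = loss_fn(model(x), y).item()
```

After training, the mean-squared error should drop well below its initial value, showing that even a small network can approximate a smooth one-dimensional function.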
The materials in this topic are collected in a Jupyter notebook that can be run to reproduce the results shown on the topic pages. Access to the notebook is described in the Lab page at the end of this topic: Lab: Neural Networks and Function Approximation. If you would like to run the code as you work through the materials, consult that Lab page for instructions.
CVW material development is supported by NSF OAC awards 1854828, 2321040, 2323116 (UT Austin) and 2005506 (Indiana University)