Yu-Chao (Morris) Huang
Incoming PhD Student @ University of North Carolina, Chapel Hill
Hey, my name is Yu-Chao Huang (黃禹超 in Chinese). I also go by Morris. I am an incoming PhD student at UNC CS, advised by Prof. Tianlong Chen and Prof. Guorong Wu. Previously, I had the opportunity to intern at NU, working under Prof. Han Liu in the MAGICS lab. Before that, I earned my M.Sc. in Physics under the guidance of Prof. Hsi-Sheng Goan at National Taiwan University, followed by a research assistant position at the National Center for Theoretical Sciences.
The equation \( |\text{Morris}\rangle = \alpha\,|\text{AI}\rangle + \beta\,|\text{Science}\rangle \) reflects my superposition of research passions at the intersection of AI and science. My research interests lie in the following directions:
- Developing methodologies with theoretical guarantees to ensure both optimality and practical applicability. [ICML’24, ArXiv, ArXiv]
- Theoretical understanding of the computational and statistical properties of deep learning models. [ICLR’25]
- Tackling interdisciplinary scientific problems from the machine learning perspective — physics-based modeling, quantum computing. [ArXiv, ArXiv]
- Advancing domain adaptation and scientific discovery frameworks for Large Language Models. [EMNLP’24]
Feel free to reach out if you think we should connect!
Somewhere, something incredible is waiting to be known.
— Carl Sagan
- [Apr, 2025] ~ $ Starting my PhD studies at the University of North Carolina at Chapel Hill this Fall. 🎉 I’ve set some ambitious goals!
- [Mar, 2025] ~ $ Paper accepted to ICLR’25 DeLTa Workshop: Statistical Foundations of Conditional Diffusion Transformers.
- [Jan, 2025] ~ $ Paper accepted to ICLR’25: On Statistical Rates of Conditional Diffusion Transformer: Approximation and Estimation.
- [Nov, 2024] ~ $ New preprint! QTTT, a test-time training framework using quantum auto-encoders to adapt to data shifts and circuit noise.
- [Sep, 2024] ~ $ Paper accepted to EMNLP’24 Findings: Two Tales of Persona in LLMs: A Survey of Role-Playing and Personalization.
- [Jul, 2024] ~ $ New preprint! L2O-\( g^{\dagger} \) is a quantum-aware learned optimizer that leverages quantum geometry and LSTM networks to efficiently train variational quantum algorithms (VQAs) with strong generalization and no hyperparameter tuning.
- [May, 2024] ~ $ Paper accepted to ICML’24: BiSHop: Bi-Directional Cellular Learning for Tabular Data with Generalized Sparse Modern Hopfield Model.
Selected Publications
* These authors contributed equally to this work.
Machine Learning Demo - Hopfield Networks

The Nobel Prize in Physics 2024 was awarded to John Hopfield and Geoffrey Hinton! (see here) The Hopfield network is inspired by the Ising model: it acts as a dynamic energy system in which neurons interact to reach stable, low-energy states, much like particles settling into equilibrium in a physical system. The energy function is given by \[ E = -\frac{1}{2} \sum_{i \neq j} W_{ij} s_i s_j - \sum_{i} b_i s_i, \] where neuron interactions mimic energy exchanges, guiding the network to "remember" stored patterns (see a nice blog post).
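The energy descent above can be sketched in a few lines of NumPy. This is a minimal classical Hopfield network with Hebbian storage and sign-threshold recall — an illustrative sketch only, not the code from any of our papers; the names `store`, `energy`, and `recall` are made up for this demo:

```python
import numpy as np

def store(patterns):
    """Hebbian rule: W = (1/N) * sum_p p p^T, with no self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s, b=None):
    """E = -1/2 s^T W s - b^T s (bias b defaults to zero)."""
    b = np.zeros_like(s, dtype=float) if b is None else b
    return -0.5 * s @ W @ s - b @ s

def recall(W, s, steps=10):
    """Iterate s <- sign(W s) until a fixed point (a low-energy state)."""
    s = s.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s).astype(int)
        s_new[s_new == 0] = 1          # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Store one +/-1 pattern, then recover it from a corrupted copy.
p = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = store(p[None, :])
noisy = p.copy()
noisy[0] *= -1                          # flip one bit
print(np.array_equal(recall(W, noisy), p))  # → True
```

Recall drives the state downhill in the energy landscape: the corrupted state has strictly higher energy than the stored pattern, and one update step falls back into the stored minimum.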
Check out our paper on utilizing modern Hopfield networks for tabular learning [ICML'24].