Morris Huang

Incoming Ph.D. Student @ University of North Carolina, Chapel Hill

Hey, my name is Yu-Chao Huang (黃禹超 in Chinese). I also go by Morris. I am an incoming Ph.D. student at UNC CS, advised by Professor Tianlong Chen and Professor Guorong Wu. Previously, I had the opportunity to intern at NU, working under Professor Han Liu in the MAGICS lab. Before that, I earned my M.Sc. in Physics under the guidance of Prof. Hsi-Sheng Goan at National Taiwan University, followed by a research assistant position at the National Center for Theoretical Sciences.

The equation \(|\text{Morris}\rangle = \alpha\,|\text{AI}\rangle + \beta\,|\text{Science}\rangle\) reflects my superposition of research passions at the intersection of AI and science. My research interests lie in the following directions:

  1. Developing methodologies with theoretical guarantees to ensure both optimality and practical applicability. [ICML’24, ArXiv, ArXiv]
  2. Theoretical understanding of the computational and statistical properties of deep learning models. [ICLR’25]
  3. Tackling interdisciplinary scientific problems from the machine learning perspective — physics-based modeling, quantum computing. [ArXiv, ArXiv]
  4. Advancing domain adaptation and scientific discovery frameworks for Large Language Models. [EMNLP’24]

Feel free to reach out if you think we should connect!

Somewhere, something incredible is waiting to be known.
— Carl Sagan

  1. Sept. 2018 - June 2022
    National Central University B.Sc. in Physics
  2. Sept. 2022 - June 2024
    National Taiwan University M.Sc. in Physics
  3. July 2024 - Feb. 2025
    NCTS-Physics Research Assistant
  4. Aug. 2025
    UNC Chapel Hill Incoming Ph.D. Student

Selected Publications

  1. EMNLP
    Two Tales of Persona in LLMs: A Survey of Role-Playing and Personalization
    Yu-Min Tseng*, Yu-Chao Huang*, Teng-Yun Hsiao*, Yu-Ching Hsu, and 3 more authors
    Findings of the Association for Computational Linguistics: EMNLP 2024, 2024
    * These authors contributed equally to this work
  2. ICML
    BiSHop: Bi-Directional Cellular Learning for Tabular Data with Generalized Sparse Modern Hopfield Model
    Chenwei Xu*, Yu-Chao Huang*, Jerry Yao-Chieh Hu*, Weijian Li, and 3 more authors
    International Conference on Machine Learning (ICML), 2024
    * These authors contributed equally to this work
  3. ICLR
    On Statistical Rates of Conditional Diffusion Transformers: Approximation, Estimation and Minimax Optimality
    Jerry Yao-Chieh Hu*, Weimin Wu*, Yi-Chen Lee*, Yu-Chao Huang*, and 2 more authors
    International Conference on Learning Representations (ICLR), 2025
    * These authors contributed equally to this work
  4. Under Review
    L2O-g†: Learning to Optimize Parameterized Quantum Circuits with Fubini-Study Metric Tensor
    Yu-Chao Huang, and Hsi-Sheng Goan
    arXiv preprint arXiv:2407.14761, 2024
  5. ArXiv
    Test-Time Training with Quantum Auto-Encoder: From Distribution Shift to Noisy Quantum Circuits
    Damien Jian*, Yu-Chao Huang*, and Hsi-Sheng Goan
    arXiv preprint arXiv:2411.06828, 2024
    * These authors contributed equally to this work

Machine Learning Demo - Hopfield Networks


The Nobel Prize in Physics 2024 was awarded to John Hopfield and Geoffrey Hinton! (see here) The Hopfield network, inspired by the Ising model, acts as a dynamic energy system in which neurons interact to reach stable, low-energy states, much like particles settling into equilibrium in a physical system. The energy function is given by \[ E = -\frac{1}{2} \sum_{i \neq j} W_{ij} s_i s_j - \sum_{i} b_i s_i, \] where neuron interactions mimic energy exchanges, guiding the network to "remember" stored patterns (see a nice blog post). :brain: :sparkles: Check out our paper on utilizing modern Hopfield networks for tabular learning [ICML'24].
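The dynamics above can be sketched in a few lines of NumPy: a minimal classical Hopfield network (not the modern Hopfield model of the ICML'24 paper) that stores a pattern with the Hebbian rule \(W_{ij} = \frac{1}{N}\sum_p x_i^p x_j^p\) and retrieves it from a corrupted copy via asynchronous sign updates, which monotonically lower the energy \(E\) defined above. The toy pattern and function names are illustrative assumptions, not from any referenced work.

```python
import numpy as np

def store(patterns):
    """Hebbian storage: W = (1/N) * sum_p x^p (x^p)^T, zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)   # no self-interaction (i != j in the energy sum)
    return W

def energy(W, s, b=None):
    """E = -1/2 * sum_{i!=j} W_ij s_i s_j - sum_i b_i s_i."""
    e = -0.5 * s @ W @ s
    if b is not None:
        e -= b @ s
    return e

def retrieve(W, s, steps=100, seed=0):
    """Asynchronous updates: set one neuron at a time to the sign of its
    local field, which never increases the energy."""
    s = s.copy()
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Store one +/-1 pattern, corrupt two bits, and let the dynamics relax.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1], dtype=float)
W = store(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                      # flip two neurons
recovered = retrieve(W, noisy)
print(np.array_equal(recovered, pattern))   # → True
print(energy(W, noisy), energy(W, recovered))
```

Each asynchronous flip can only keep the energy the same or lower it, so the corrupted state rolls downhill into the stored pattern's basin of attraction: here the energy drops from \(-0.5\) at the noisy state to \(-3.5\) at the recovered fixed point.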