Haw-Shiuan Chang

張浩軒


I am a postdoctoral scientist at Amazon AGI Foundations, and I am currently on the job market. My research goal is to reduce the hallucination of LLMs in fundamental ways. I am especially interested in 1) identifying whether the next word generated by an LLM comes from the LLM’s guess, memory, or causal inference, and 2) designing knowledge representations that facilitate communication between humans and LLMs. Previously, I received my PhD from the University of Massachusetts Amherst, advised by Professor Andrew McCallum, and worked with Yu-Chiang Frank Wang and Kuan-Ta Chen at Academia Sinica, Taiwan. I received my BS from the EECS Undergraduate Honors Program at National Yang Ming Chiao Tung University (NYCU), Taiwan.

selected publications

  1. WSDM
    To Copy, or not to Copy; That is a Critical Issue of the Output Softmax Layer in Neural Sequential Recommenders
    Haw-Shiuan Chang, Nikhil Agarwal, and Andrew McCallum
    In Proceedings of the 17th ACM International Conference on Web Search and Data Mining, 2024
  2. ACL Findings
    Revisiting the Architectures like Pointer Networks to Efficiently Improve the Next Word Distribution, Summarization Factuality, and Beyond
    Haw-Shiuan Chang*, Zonghai Yao*, Alolika Gon, and 2 more authors
    In Findings of the Association for Computational Linguistics: ACL 2023 (Findings of ACL), 2023
  3. ACL
    Multi-CLS BERT: An Efficient Alternative to Traditional Ensembling
    Haw-Shiuan Chang*, Ruei-Yao Sun*, Kathryn Ricci*, and 1 more author
    In Annual Meeting of the Association for Computational Linguistics (ACL), 2023
  4. Thesis
    Modeling the Multi-mode Distribution in Self-Supervised Language Models
    Haw-Shiuan Chang
    PhD Thesis, University of Massachusetts Amherst, 2022
  5. ACL
    Softmax Bottleneck Makes Language Models Unable to Represent Multi-mode Word Distributions
    Haw-Shiuan Chang, and Andrew McCallum
    In Annual Meeting of the Association for Computational Linguistics (ACL), 2022
  6. ML
    Using Error Decay Prediction to Overcome Practical Issues of Deep Active Learning for Named Entity Recognition
    Haw-Shiuan Chang, Shankar Vembu, Sunil Mohan, and 2 more authors
    Machine Learning, 2020
  7. EDM Short
    Modeling Exercise Relationships in E-Learning: A Unified Approach
    Haw-Shiuan Chang, Hwai-Jung Hsu, and Kuan-Ta Chen
    In International Conference on Educational Data Mining (EDM), 2015