I am currently a first-year PhD student in the Speech Processing and Machine Learning Lab at National Taiwan University, supervised by Prof. Hung-yi Lee. My research interests include speech foundation models, spoken language models, model compression, neuron analysis, and model merging. I am eager to explore new research areas and am currently looking for research internships in 2025. If there are any possibilities for research collaboration, please feel free to contact me.

🔥 News

  • 2024.09:  🎉🎉 Two papers accepted at SLT 2024 main conference track. See you in Macao 🇲🇴!
  • 2024.06:  🎉🎉 Two papers accepted at Interspeech 2024. See you in Greece 🇬🇷!
  • 2023.09:  🎉🎉 One paper accepted at ASRU 2023.

📝 Publications

SLT 2024

Property Neurons in Self-Supervised Speech Transformers

Tzu-Quan Lin, Guan-Ting Lin, Hung-yi Lee, Hao Tang

  • In this work, we identify a set of property neurons in the feedforward layers of Transformers to study how speech-related properties, such as phones, gender, and pitch, are stored.
Interspeech 2024

DAISY: Data Adaptive Self-Supervised Early Exit for Speech Representation Models

Tzu-Quan Lin, Hung-yi Lee, Hao Tang

  • This work introduces a novel early exit method for speech self-supervised models that enhances the speed of HuBERT with minimal performance loss.
ASRU 2023

MelHuBERT: A Simplified HuBERT on Mel Spectrograms

Tzu-Quan Lin, Hung-yi Lee, Hao Tang

Project

  • MelHuBERT simplifies the model architecture and loss function of HuBERT, achieving comparable performance while saving 33.5% of MACs per second of speech.
Submitted to TASLP (under review)

Compressing Transformer-based self-supervised models for speech processing

Tzu-Quan Lin, Tsung-Huan Yang, Chun-Yao Chang, Kuang-Ming Chen, Tzu-hsun Feng, Hung-yi Lee, Hao Tang

Project

  • This work proposes evaluating model compression methods with three different metrics: MACs, number of parameters, and real-time factor. We find that different compression methods excel at different metrics.

📖 Education

  • 2024.09 - now, PhD in Electrical Engineering (EE), Data Science and Smart Networking, National Taiwan University
  • 2022.09 - 2024.06, Master in CSIE, Networking and Multimedia, National Taiwan University
  • 2018.09 - 2022.06, Bachelor in Department of Computer Science and Information Engineering (CSIE), National Taiwan University

🏆 Honors and Awards

  • Interspeech 2024 Travel Grant

💻 Internships

  • 2021.07 - 2021.09, aetherAI, Taipei, Taiwan.