About Me
I am Ryokan Ri (李 凌寒), a research engineer at LINE Corporation.
My primary research interest lies in Natural Language Processing (NLP). My research topics include:
- Multilingual representation learning
- Machine translation
- The intersection of linguistics and machine learning
In my spare time, I enjoy cooking 🍳 and playing music 🎸.
Links:
CV
Google Scholar
Github
Education
- Apr’ 2018 - : Tsuruoka Lab., Graduate School of Information Science and Technology, The University of Tokyo
- Apr’ 2020 - Mar’ 2023: Ph.D. student
- Apr’ 2018 - Mar’ 2020: Master’s student
- Apr’ 2013 - Mar’ 2018: College of Arts and Sciences, The University of Tokyo
- Apr’ 2015 - Mar’ 2018: Bachelor of Arts (linguistics)
- Jun’ 2016 - Jun’ 2017: The University of Sydney (Student Exchange)
- Apr’ 2013 - Mar’ 2015: Natural Science Ⅰ
Experience
Awards and Honors
Publications
International Conferences / Journals
- I. Yamada and R. Ri, “LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation”, 2024. [arXiv]
- R. Ri, R. Ueda, J. Naradowsky, “Emergent Communication with Attention”, in Proceedings of the Annual Meeting of the Cognitive Science Society, 2023. [eScholarship] [arXiv]
- R. Ri and Y. Tsuruoka, “Pretraining with Artificial Language: Studying Transferable Knowledge in Language Models”, in Proceedings of the 60th ACL (long), 2022. [anthology] [arXiv]
- R. Ri, I. Yamada, and Y. Tsuruoka, “mLUKE: The Power of Entity Representations in Multilingual Pretrained Language Models”, in Proceedings of the 60th ACL (long), 2022. [anthology] [arXiv]
- S. Nishikawa, R. Ri, I. Yamada, Y. Tsuruoka, and I. Echizen, “EASE: Entity-Aware Contrastive Learning of Sentence Embedding”, in Proceedings of NAACL, 2022. [anthology] [arXiv]
- R. Ri, Y. Hou, R. Marinescu, and A. Kishimoto, “Finding Sub-task Structure with Natural Language Instruction”, in Proceedings of the First Workshop on Learning with Natural Language Supervision, 2022. [anthology]
- R. Ri, T. Nakazawa, and Y. Tsuruoka, “Modeling Target-side Inflection in Placeholder Translation”, in Proceedings of Machine Translation Summit XVIII Volume 1: Research Track, 2021. [anthology] [arXiv]
- R. Ri, T. Nakazawa, and Y. Tsuruoka, “Zero-pronoun Data Augmentation for Japanese-to-English Translation”, in Proceedings of the 8th Workshop on Asian Translation, 2021. [anthology] [arXiv]
- S. Nishikawa, R. Ri, and Y. Tsuruoka, “Data Augmentation with Unsupervised Machine Translation Improves the Structural Similarity of Cross-lingual Word Embeddings”, in ACL-IJCNLP 2021 Student Research Workshop, 2021. [anthology] [arXiv]
- R. Ri and Y. Tsuruoka, “Revisiting the Context Window for Cross-lingual Word Embeddings”, in Proceedings of the 58th ACL (long), 2020. [anthology] [arXiv]
- M. Rikters, T. Nakazawa, and R. Ri, “The University of Tokyo’s Submissions to the WAT 2020 Shared Task”, in Proceedings of the 7th Workshop on Asian Translation, 2020. [anthology]
- M. Rikters, R. Ri, T. Li, and T. Nakazawa, “Document-aligned Japanese-English Conversation Parallel Corpus”, in Proceedings of the Fifth Conference on Machine Translation, 2020. [anthology]
- M. Rikters, R. Ri, T. Li, and T. Nakazawa, “Designing the Business Conversation Corpus”, in Proceedings of the 6th Workshop on Asian Translation, 2019. [anthology] [arXiv]
- R. Ri and Y. Tsuruoka, “What do Multilingual Neural Machine Translation Models Learn about Typology?”, in The First Workshop on Typology for Polyglot NLP, 2019. [PDF]
Book
- I. Yamada, M. Suzuki, K. Yamada, R. Ri, “大規模言語モデル入門 (Introduction to Large Language Models)”, Gijutsu-Hyohron Co., Ltd., 2023. [Amazon]