Ibraheem Muhammad Moosa

LanguageX Lab, Department of Computer Science and Engineering, Penn State University

University Park, PA 16802

Hi, I am Moosa. I am a first-year Computer Science PhD student at Penn State University, supervised by Dr. Wenpeng Yin.

My current research focuses on understanding in-context learning in language models, including how task instructions can be used to achieve better results. I am also interested in the training dynamics of language models and how that knowledge can be used to achieve better continual learning in these models.

Previously, I worked on multilingual language models. Specifically, I studied how to improve tokenization using transliteration in order to train better multilingual language models. Before that, I was at Samsung R&D Institute Bangladesh, where I worked on Samsung Internet and MyFiles.

Some of the things I enjoy outside of work:

  1. Learning about history
  2. Reading books
  3. Watching movies
  4. Trying out different cuisines, and
  5. Running and hiking.

The best way to reach out to me is via email.

selected publications

  1. Does Transliteration Help Multilingual Language Modeling?
    Ibraheem Muhammad Moosa, Mahmud Elahi Akhter, and Ashfia Binte Habib