Research engineer in the OPTIMAL lab under Prof. Francesco Orabona. Leading development of a bilingual Arabic-English LLM at the 3B parameter and 5T token scale.
Research engineer at KAUST (in Prof. Francesco Orabona's lab) working on pretraining bilingual Arabic-English LLMs. I've spent the last few years getting my hands dirty with large-scale distributed training. Most recently, I released AraMix, a SOTA Arabic pretraining dataset.
Previously at SDAIA as a founding member of the ALLaM team, where I helped build Saudi Arabia's flagship Arabic language model.
Founding member of the ALLaM team. Because the team was initially small, I worked across the full LLM pipeline.
Selected for a fellowship program focused on AI for education. Built an AI-based learning management system for the Ministry of Education, which I presented to the Minister at GAIN 2024; the system has since started piloting in public schools.