Reading the Pseudo-code for Transformers

Transformers are the algorithms underlying large language models such as GPT, BERT, and Llama. In this seminar, Dr. Yixiang Wu of MTSU's Department of Mathematical Sciences will work through "Formal Algorithms for Transformers," a paper by the DeepMind team that gives precise pseudo-code for the algorithms that define transformers. To follow these algorithms, attendees need only a basic understanding of linear algebra (matrix multiplication) and probability theory (conditional probability).
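
As a taste of the level of machinery involved (this is not material from the talk itself), below is a minimal NumPy sketch of single-head scaled dot-product attention, the core operation of a transformer. It uses nothing beyond matrix multiplication and a softmax, which turns each row of scores into a conditional probability distribution over token positions. All names and shapes here are illustrative.

import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def single_head_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention.

    X:             (seq_len, d_model) matrix of token embeddings.
    W_q, W_k, W_v: (d_model, d_head) projection matrices.
    Returns a (seq_len, d_head) matrix of attended values.
    """
    Q = X @ W_q  # queries
    K = X @ W_k  # keys
    V = X @ W_v  # values
    d_head = Q.shape[-1]
    # Each row of `weights` is a probability distribution over positions:
    # how much each token attends to every other token.
    scores = Q @ K.T / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)
    return weights @ V

# Tiny usage example with random data: 4 tokens, model dimension 8, head dimension 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 2)) for _ in range(3))
print(single_head_attention(X, W_q, W_k, W_v).shape)  # (4, 2)

As the sketch suggests, once the matrix shapes are understood, the whole computation reduces to a handful of matrix products and a normalization, which is exactly the level of background the seminar assumes.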

See the video here.