The Evolution of Transformers

Partner: Udemy
Affiliate Name:
Area:
Description: Transformers are the dominant neural architecture in modern AI. This course covers what inspired their creation, how they have evolved and become more efficient, and how they grew from their origins in natural language processing into the most widely used neural architecture across domains. The course primarily covers theory and general concepts. There are a few coding exercises and equations scattered throughout, but if you are looking for a deep technical dive, you should probably look elsewhere. The most technical parts of the course involve building multi-headed attention and transformer blocks from scratch using PyTorch. Some of the topics covered in this course include:

- Efficient transformers
- Longformer
- Reformer
- Vision Transformer (ViT)
- wav2vec 2.0
- Stable Diffusion
- The Perceiver family
- Decision Transformer
- ChatGPT, and how to combine generative pre-trained transformer (GPT) models with reinforcement learning from human feedback (RLHF)
- Tokenizers and positional encoding strategies
- Fine-tuning a language model, such as BERT, using PyTorch and Hugging Face
- How to think about attention from a sequence perspective and a graph perspective, and how these perspectives can yield insights into improving transformers

The course spends more time on text than on other data types, reflecting the recent surge of interest in LLMs. It aims to demystify LLMs like ChatGPT and to help you learn how to explain them to non-technical audiences.
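To give a flavor of the "from scratch" exercises mentioned above: the core of multi-headed attention is projecting the input into per-head queries, keys, and values, applying scaled dot-product attention in each head, and recombining the heads. The sketch below is an illustrative NumPy version of that math, not the course's actual code (the course uses PyTorch); the function names and weight shapes here are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, Wq, Wk, Wv, Wo):
    """Minimal multi-head self-attention sketch.

    x: (seq_len, d_model); Wq/Wk/Wv/Wo: (d_model, d_model).
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project, then split d_model into (num_heads, d_head) and move heads first.
    q = (x @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention per head: (heads, seq, seq).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # Recombine heads back into (seq_len, d_model), then output projection.
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

rng = np.random.default_rng(0)
d_model, seq_len, num_heads = 8, 4, 2
x = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) for _ in range(4))
y = multi_head_attention(x, num_heads, Wq, Wk, Wv, Wo)
print(y.shape)  # (4, 8): one d_model-sized output vector per input position
```

A PyTorch version replaces the explicit matrix multiplies with `nn.Linear` layers and batches the leading dimension, but the shape bookkeeping is the same.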
Category: IT & Software > Other IT & Software > Data Science
Partner ID:
Price: 19.99
Commission:
Source: Impact