Coding a Multimodal (Vision) Language Model from scratch in PyTorch with full explanation
Published 2024-08-07