MedAI #41: Efficiently Modeling Long Sequences with Structured State Spaces | Albert Gu
Published 2022-04-21