Quanquan Gu: "Learning Over-parameterized Neural Networks: From Neural Tangent Kernel to Mean-fi..." Published 2020-07-06