Function Calling in Ollama vs OpenAI
Published 2024-02-12