Locally-hosted, offline LLM w/LlamaIndex + OPT (open source, instruction-tuning LLM)
Published 2023-05-02

Similar videos:
10:22  LangChain - Using Hugging Face Models locally (code walkthrough)
09:53  "okay, but I want GPT to perform 10x for my specific use case" - Here is how
04:40  How to use Llama Index with a local model instead of OpenAI
17:11  LocalGPT: OFFLINE CHAT FOR YOUR FILES [Installation & Code Walkthrough]
14:50  Run Your Own ChatGPT-like LLM on Your Windows PC!
17:32  Talk to Your Documents, Powered by Llama-Index
12:10  LangChain: Run Language Models Locally - Hugging Face Models
24:36  LangChain + HuggingFace's Inference API (no OpenAI credits required!)
13:57  LangChain + Retrieval Local LLMs for Retrieval QA - No OpenAI!!!
11:06  Talk to YOUR DATA without OpenAI APIs: LangChain
19:08  Deploy FULLY PRIVATE & FAST LLM Chatbots! (Local + Production)
08:14  Mistral-7B with LocalGPT: Chat with YOUR Documents
13:50  Private GPT4All: Chat with PDF with Local & Free LLM using GPT4All, LangChain & HuggingFace
16:21  ChatGPT for your data with Local LLM
14:03  How to Build a Custom Knowledge ChatGPT Clone in 5 Minutes
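
The title describes pointing LlamaIndex at a locally downloaded, open-source OPT model instead of the OpenAI API. A minimal sketch of that kind of setup is below; it assumes a recent llama-index release with the HuggingFace LLM and embedding integrations installed (module paths and class names have changed across versions, and the video, published in 2023, likely uses older ones), and the model and folder names are illustrative.

```python
# Sketch: fully offline LlamaIndex pipeline with a local OPT model.
# Assumes llama-index >= 0.10 plus the llama-index-llms-huggingface and
# llama-index-embeddings-huggingface packages; adjust names for other versions.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex, Settings
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Local, open-source LLM (instruction-tuned OPT variant) served via transformers.
Settings.llm = HuggingFaceLLM(
    model_name="facebook/opt-iml-max-1.3b",
    tokenizer_name="facebook/opt-iml-max-1.3b",
    context_window=2048,      # OPT's context length
    max_new_tokens=256,
    device_map="auto",
)

# Local embedding model so retrieval also stays offline.
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Index documents from a local folder and query them without any API calls.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("What do these documents say about local LLMs?"))
```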