Build local LLM applications using Python and Ollama

Learn to create LLM applications on your own system using Ollama and LangChain in Python | Completely private and secure
Rating: 4.45 (132 reviews)
Platform: Udemy
Language: English
Category: Data Science
Students: 7,595
Content: 2 hours
Last update: Feb 2025
Regular price: $54.99

What you will learn

Download and install Ollama for running LLM models on your local machine

Set up and configure the Llama LLM model for local use

Customize LLM models using command-line options to meet specific application needs

Save and deploy modified versions of LLM models in your local environment

Develop Python-based applications that interact with Ollama models securely (see the first sketch after this list)

Call and integrate models via Ollama’s REST API for seamless interaction with external systems (sketched below)

Explore OpenAI compatibility within Ollama to extend the functionality of your models (sketched below)

Build a Retrieval-Augmented Generation (RAG) system to process and query large documents efficiently

Create fully functional LLM applications using LangChain, Ollama, and tools like agents and retrieval systems to answer user queries (see the RAG sketch after this list)
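
The sketches below illustrate, in rough form, a few of the topics listed above. First, a minimal example of calling a locally running model through the official ollama Python package; the model name llama3.2 is an assumption and can be swapped for any model you have pulled locally.

```python
# Minimal sketch of talking to a locally running Ollama server via the
# official `ollama` Python package (pip install ollama).
# Assumes the Ollama server is running and `llama3.2` has been pulled.
import ollama

response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Explain RAG in one sentence."}],
)
# Newer client versions also allow attribute access: response.message.content
print(response["message"]["content"])
```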
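
The same local server also exposes a plain HTTP REST API (by default on port 11434), so any language or external system can call it. A hedged sketch using the requests library; again, the model name is an assumption.

```python
# Calling Ollama's REST API directly at its default local endpoint.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": False},
)
# With stream disabled, the full completion is returned in one JSON object.
print(resp.json()["response"])
```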
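
Ollama also provides an OpenAI-compatible endpoint, so code written against the openai client can be pointed at the local server instead. The API key below is a placeholder required by the client but ignored by Ollama.

```python
# Using the OpenAI Python client against Ollama's OpenAI-compatible /v1 endpoint.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
completion = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(completion.choices[0].message.content)
```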
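
Finally, a rough sketch of the Retrieval-Augmented Generation pattern with LangChain and Ollama. Import paths vary between LangChain releases (this assumes the langchain-ollama, langchain-community, and faiss-cpu packages), and the embedding model nomic-embed-text and the file my_document.txt are placeholders, not part of the course description.

```python
# Rough RAG sketch: split a document, embed and index the chunks locally,
# retrieve the most relevant chunks for a question, then answer with a local LLM.
from langchain_ollama import ChatOllama, OllamaEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_text_splitters import RecursiveCharacterTextSplitter

# Split the source document into overlapping chunks
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
chunks = splitter.split_text(open("my_document.txt").read())

# Embed the chunks with a local embedding model and build a FAISS index
vectorstore = FAISS.from_texts(chunks, OllamaEmbeddings(model="nomic-embed-text"))

# Retrieve the chunks most relevant to the user's question
question = "What are the key findings of the document?"
docs = vectorstore.similarity_search(question, k=3)
context = "\n\n".join(doc.page_content for doc in docs)

# Ask the local chat model, grounding its answer in the retrieved context
llm = ChatOllama(model="llama3.2")
answer = llm.invoke(f"Answer using only this context:\n\n{context}\n\nQuestion: {question}")
print(answer.content)
```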

Udemy ID: 6222463
Course created: 10/7/2024
Course indexed: 10/10/2024
Submitted by: Bot