
Local LLMs via Ollama & LM Studio - The Practical Guide

Run open large language models like Gemma, Llama or DeepSeek locally to perform AI inference on consumer hardware.
4.8 ★★★★★ · 6,220 students · Created by Maximilian Schwarzmüller · Last updated Oct 23, 2025 · 🌐 English

What you'll learn

Explore & understand open-LLM use cases
Achieve 100% privacy & agency by running highly capable open LLMs locally
Select & run open LLMs like Gemma 3 or Llama 4
Utilize Ollama & LM Studio to run open LLMs locally
Analyze text, documents and images with open LLMs
Integrate locally running open LLMs into custom AI-powered programs & applications (see the sketch after this list)
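
To give a taste of the integration skill listed above: Ollama exposes a local HTTP API that any program can call. Below is a minimal Python sketch, assuming Ollama is running on its default port (11434) and that the illustrative model name "gemma3" matches a model you have already pulled:

```python
# Minimal sketch: query a locally running Ollama server from Python.
# Assumes Ollama is serving on its default port 11434 and a model such
# as "gemma3" has been pulled (the model name here is illustrative).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_llm(prompt: str, model: str = "gemma3") -> str:
    """Send a prompt to the local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON response instead of a stream
    }).encode("utf-8")
    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    return body["response"]

if __name__ == "__main__":
    print(ask_local_llm("In one sentence, why does local inference preserve privacy?"))
```

Because the request never leaves your machine, both the prompt and the model's reply stay fully private, which is exactly the privacy and agency point made above.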

Requirements

Basic understanding of LLM functionality & usage
NO programming or advanced technical expertise is required
If you want to run models locally, at least 8 GB of (V)RAM is required

Student Feedback

4.8 ★★★★★ · Course Rating
5★ 75% · 4★ 15% · 3★ 5% · 2★ 5% · 1★ 5%
Sarah J. · ★★★★★ · 2 weeks ago

This course was absolutely amazing! The instructor explained everything clearly and the projects were very helpful.

Michael T. · ★★★★ · 1 month ago

Great content, highly recommended for beginners. Just wish there were more practice exercises.

David K. · ★★★★★ · 2 months ago

Best course on this topic I've taken so far. Worth every penny (even better since I got it for free!).

Local LLMs via Ollama & LM Studio - The Practical Guide
$9.99 (regularly $119.99 · 92% off)
🎫 Coupon code: D_1025
30-Day Money-Back Guarantee
This course includes:
  • 📺 4h on-demand video
  • 📱 Access on mobile and TV
  • ♾️ Full lifetime access
  • 🏆 Certificate of completion