DataScience Show

How to Use Ollama for Effortless Local LLM Deployment

Mirko Peters - M365 Specialist
Jun 21, 2025

You can use Ollama to run LLMs locally in just a few steps. Running models on your own machine keeps your data private and saves money: there are no cloud fees, and your information never leaves your device. Ollama can also cut model inference time by up to 50% compared to cloud platforms, so you get results faster. It runs on Windows, macOS, or …
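As a minimal sketch of what "a few steps" looks like in practice: after installing Ollama and pulling a model (for example, `ollama pull llama3`), the local server listens on port 11434 and you can query it over its REST API. The model tag `llama3` and the prompt below are just placeholders; substitute any model you have pulled.

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
# Assumes `ollama serve` is running on the default port (11434) and that a
# model has already been pulled, e.g. with `ollama pull llama3`.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",  # example model tag; any pulled model works
    "prompt": "Explain local LLM deployment in one sentence.",
    "stream": False,    # request one complete JSON response, not a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(result["response"])  # the generated text
```

Everything in this sketch stays on localhost, which is the point: no cloud fees, and no data leaving your machine.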

