
Ollama — LLM on Your Laptop in 5 Minutes

30. 01. 2024 · 1 min read

“I want to try an LLM locally but don’t want to set up CUDA.” Ollama is the answer: one command to install, one command to run a model. Think of it as Docker for LLMs.

Why Local Inference

  • Privacy: Data never leaves your machine
  • Offline: Works without internet
  • Cost: $0 per token
  • Latency: No network roundtrip

OpenAI-Compatible API

Point your existing code at localhost:11434: Ollama exposes an OpenAI-compatible API, so LangChain, LlamaIndex, and other frameworks integrate without changes.
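
A minimal sketch of this with the official openai Python client, assuming Ollama is running locally and serving its OpenAI-compatible endpoint at http://localhost:11434/v1, and that the model named below has already been pulled. The API key is a placeholder; Ollama does not check it.

```python
from openai import OpenAI

# Point the standard OpenAI client at the local Ollama server.
# The key is required by the client library but ignored by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="mistral",  # any model already pulled locally works here
    messages=[{"role": "user", "content": "Summarize Ollama in one sentence."}],
)
print(response.choices[0].message.content)
```

Any of the models below works the same way; just swap the model name.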

  • mistral (7B): Versatile, decent Czech language support
  • codellama: Code generation
  • phi-2 (2.7B): Ultra lightweight, surprisingly capable
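
For a quick smoke test without an OpenAI client, you can also call Ollama's native REST API directly. A minimal sketch, assuming the requests package is installed and the model has been pulled with ollama pull mistral:

```python
import requests

# Non-streaming request to Ollama's native generate endpoint.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mistral", "prompt": "Why run an LLM locally?", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```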

Local AI Is a Reality

Every developer can now run a quality LLM locally. Ollama is a must-have tool.

ollama · local ai · llm · developer tools