Featured Tool: Ollama

Run large language models locally with a simple CLI interface

Open Source · Self-Hosted · Offline Capable

About

Ollama makes it easy to run large language models locally. It bundles model weights, configuration, and data into a single package defined by a Modelfile, and it supports Llama 2, Mistral, Gemma, and many other models through a simple command-line interface.
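A minimal Modelfile sketch to illustrate the packaging idea (the base model, parameter value, and system prompt here are illustrative, not from the listing; FROM, PARAMETER, and SYSTEM are standard Modelfile instructions):

```
# Base the custom model on a model you have already pulled
FROM llama2
# Sampling temperature is a tunable generation parameter
PARAMETER temperature 0.7
# System prompt applied to every conversation
SYSTEM "You are a concise technical assistant."
```

You would build and run such a model with `ollama create mymodel -f Modelfile` followed by `ollama run mymodel`.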


Details

Price: Free
Platform: Local/Desktop
Difficulty: Beginner (1/5)
License: MIT
Added: Jan 29, 2026
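The Local/Desktop platform entry reflects that Ollama runs as a local daemon: besides the CLI, it exposes an HTTP API on port 11434. A minimal sketch of building the JSON request body for its /api/generate endpoint (the model name and prompt are illustrative):

```python
import json

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Return the JSON body for a POST to Ollama's /api/generate endpoint,
    served by the local daemon at http://localhost:11434 by default."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

# With stream=False the daemon returns one complete JSON response
# instead of a stream of partial tokens.
print(build_generate_request("llama2", "Why is the sky blue?"))
```

Any HTTP client can then POST this body to `http://localhost:11434/api/generate` while the daemon is running.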

Similar Tools

llama.cpp (Featured)
Port of Meta's LLaMA model in C/C++ for efficient CPU inference
Open Source · Self-Hosted · Offline
Difficulty: Intermediate
vLLM (Featured)
High-throughput LLM serving engine with PagedAttention
Open Source · Self-Hosted · Offline · GPU 16GB+
Difficulty: Intermediate

Text Generation Inference (TGI)
Hugging Face's high-performance text generation server
Open Source · Self-Hosted · Offline · GPU 16GB+
Difficulty: Advanced