Self-Hosting Cognee: Choosing an LLM on Ollama

Testing the Cognee RAG framework with local LLMs served by Ollama - gpt-oss, qwen3, deepseek-r1, and others - with real-world results, configurations, and performance insights. https://www.glukhov.org/post/2025/12/selfhosting-cognee-quickstart-llms-comparison/ #SelfHosting #LLM #AI #RAG #Python #Ollama #Hardware #Docker #OpenSource
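As a rough sketch of what such a setup involves, the steps below pull the models under test into a local Ollama instance and point Cognee at it via environment variables. The exact variable names (`LLM_PROVIDER`, `LLM_ENDPOINT`, etc.) are assumptions based on common Cognee/Ollama configurations; check the Cognee documentation for the authoritative list.

```shell
# Pull the local models being compared (names as published in the Ollama registry)
ollama pull gpt-oss
ollama pull qwen3
ollama pull deepseek-r1

# Hypothetical .env fragment pointing Cognee at the local Ollama server,
# which exposes an OpenAI-compatible API on port 11434 by default
cat >> .env <<'EOF'
LLM_PROVIDER="ollama"
LLM_MODEL="qwen3"
LLM_ENDPOINT="http://localhost:11434/v1"
LLM_API_KEY="ollama"
EOF
```

Swapping the model under test is then a matter of changing `LLM_MODEL` and re-running the same ingestion and query workload, which keeps the comparison between models fair.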
