Posts

Modern Alerting Systems Design for Observability Teams

A practical pillar page on alerting design, routing, noise reduction, and human response across observability systems, paging tools, and chat platforms.

AI Systems: Self-Hosted Assistants, RAG, and Local Infrastructure

Build self-hosted AI systems with OpenClaw, Hermes, RAG, and local LLM infrastructure. Learn to orchestrate assistants with memory, retrieval, routing, and observability.

Anthropic Closes Claude Loophole for Agent Tools

Anthropic blocks Claude subscriptions in agent tools like OpenClaw, forcing API usage. What changed, who is affected, and practical workarounds.

LLM Self-Hosting and AI Sovereignty

Why and how self-hosted LLMs support AI sovereignty: control, data residency, and compliance for orgs and nations.

vLLM Quickstart: High-Performance LLM Serving in 2026

Complete vLLM setup guide with Docker, OpenAI API compatibility, and PagedAttention optimization. Compare vLLM vs Ollama vs Docker Model Runner for production.

Hermes AI Assistant - Install, Setup, Workflow, and Troubleshooting

Self-hosted Hermes Agent install, quickstart, config, workflow, and troubleshooting, with provider setup, tool sandboxing, gateway tips, and diagnostics.

Ollama vs vLLM vs LM Studio: Best Way to Run LLMs Locally in 2026?

Choosing the best way to run LLMs locally? Compare Ollama, vLLM, TGI, SGLang, LM Studio, LocalAI, and 8+ other tools by API support, hardware compatibility, tool calling, and production readiness.

Vane (Perplexica 2.0) Quickstart With Ollama and llama.cpp

Self-host Vane (Perplexica 2.0) with Docker, wire it to SearxNG, and use local LLMs via Ollama or llama.cpp. History, features, API.