Local AI on Windows
A self-hosted AI environment for Windows that integrates Ollama, Open WebUI, and MCP for local language model management and chat interaction.
This project provides a complete self-hosted AI stack for Windows, combining Ollama for running language models locally, Open WebUI for a user-friendly chat interface, and the Model Context Protocol (MCP) for connecting external tools to those models. It includes a sample MCP-based tool server for managing employee leave, exposed via OpenAPI for seamless integration. This setup offers full control, privacy, and flexibility without relying on cloud services.
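The leave-management logic behind such a tool server could be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the employee names, the in-memory store, and the function names `get_leave_balance`/`apply_leave` are all hypothetical, and the MCP/OpenAPI wiring is only indicated in comments.

```python
# Hypothetical sketch of the leave-management logic behind the sample
# MCP tool server. In the real stack, functions like these would be
# registered as MCP tools and exposed over OpenAPI; here they are
# plain Python for illustration.

# In-memory leave ledger (placeholder data, not from the project).
_leave_balances = {"alice": 20, "bob": 15}


def get_leave_balance(employee: str) -> int:
    """Return the remaining leave days for an employee."""
    if employee not in _leave_balances:
        raise KeyError(f"Unknown employee: {employee}")
    return _leave_balances[employee]


def apply_leave(employee: str, days: int) -> int:
    """Deduct leave days for an employee and return the new balance."""
    balance = get_leave_balance(employee)
    if days <= 0:
        raise ValueError("days must be positive")
    if days > balance:
        raise ValueError(f"insufficient balance: {balance} day(s) left")
    _leave_balances[employee] = balance - days
    return _leave_balances[employee]
```

In the assembled stack, an MCP tool server would register functions like these as callable tools; Open WebUI can then discover them through the OpenAPI endpoint and let the locally running model invoke them during chat.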
API Development
Developer Tools
Productivity & Workflow