kibitz

The coding agent for professionals with MCP integration.

Stars: 104
Forks: 14
Watchers: 104
Issues: 39
kibitz is a coding agent that integrates with Model Context Protocol (MCP) servers over WebSockets. Users can configure an Anthropic API key, a system prompt, and custom context providers for each project, giving the agent richer context for coding tasks. It is aimed at developers and professionals who want tailored, AI-driven coding workflows with flexible, project-specific configuration.

Key Features

Integration with Model Context Protocol (MCP) servers
Anthropic API key configuration
Project-specific context settings
Customizable system prompts
WebSocket-based MCP connectivity
Support for multiple projects with isolated configurations
UI-based settings management
Flexible server URI configuration
Kinode build support
Open-source installation and setup

Use Cases

Collaborating with AI agents for coding tasks
Integrating external context providers for smarter code generation
Managing multiple coding projects with individual AI configurations
Streamlining onboarding for teams needing project-based AI settings
Experimenting with Anthropic models in a customizable coding environment
Rapid prototyping with context-aware AI suggestions
Extending support for additional AI providers via MCP servers
Automating routine code reviews with configurable AI prompts
Setting up development environments with consistent AI configurations
Working in organizations with segmented, secure AI agent setups

README

kibitz

The coding agent for professionals

https://github.com/user-attachments/assets/3f8df448-1c81-4ff2-8598-c48283a4dc00

Prerequisites

  • git
  • npm

Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/nick1udwig/kibitz.git
   cd kibitz
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Run the development server:

   ```bash
   npm run dev
   ```

4. Open http://localhost:3000 in your browser.
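If port 3000 is already in use, the dev server can likely be started on another port. This is an assumption based on kibitz appearing to be a Next.js app (its NEXT_PUBLIC_* build variables suggest so), in which case the standard Next.js port flag applies:

```bash
# Assumption: kibitz uses the standard Next.js dev server, so the -p flag
# can be passed through npm to pick a different port.
npm run dev -- -p 3001
# then open http://localhost:3001 instead of :3000
```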

Configuration

1. Open the Settings panel in the UI.
2. Enter your Anthropic API key (Get one here).
3. Optionally set a system prompt.
4. Configure MCPs by running them using ws-mcp and then connecting to them in the Settings page (a sketch follows below).

Note: configuration is PER-PROJECT. When creating a new project, it will use some, but not all, of the current project's configuration: the API key, model, and system prompt will be copied over, but MCP servers will not.
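A minimal sketch of step 4, assuming ws-mcp can be launched via uvx and that it exposes a wrapped stdio MCP server on a local WebSocket port. The flag names and the example server below are illustrative assumptions rather than the confirmed ws-mcp CLI; check the ws-mcp README for the exact invocation:

```bash
# Illustrative only: the flags and the wrapped server are assumptions,
# not the confirmed ws-mcp CLI -- see the ws-mcp README for the real invocation.
# Wrap a stdio MCP server (here the reference fetch server) behind a WebSocket:
uvx ws-mcp --command "uvx mcp-server-fetch" --port 3001

# Then, in kibitz's Settings page, add the server with a URI such as:
#   ws://localhost:3001
```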

Building for Kinode

1. Add a base path to the endpoint by building with NEXT_PUBLIC_BASE_PATH set (it MUST start with a /).
2. Change the default WS-MCP server URI by setting NEXT_PUBLIC_DEFAULT_WS_URI (it MUST start with a /).

For example:

```bash
NEXT_PUBLIC_BASE_PATH=/kibitz:kibitz:nick.kino NEXT_PUBLIC_DEFAULT_WS_URI=/fwd-ws:kibitz:nick.kino npm run build
```

Then copy the contents of out/ into the package's pkg/ui/ directory.
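A sketch of that final copy step, assuming a hypothetical Kinode package checkout at ../my-kinode-package (substitute your own package path):

```bash
# Copy the static export into the Kinode package's UI directory.
# ../my-kinode-package is a placeholder path, not part of the kibitz repo.
rm -rf ../my-kinode-package/pkg/ui/*
cp -r out/* ../my-kinode-package/pkg/ui/
```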

Star History

[Star history chart]

Repository Owner

nick1udwig (User)

Repository Details

Language: TypeScript
Default Branch: main
Size: 11,496 KB
Contributors: 4
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

TypeScript: 98.57%
CSS: 1.23%
JavaScript: 0.2%

Topics

ai anthropic claude llm mcp tool-use

Related MCPs

Discover similar Model Context Protocol servers

  • MCP Manager for Claude Desktop

    A desktop app to manage Model Context Protocol (MCP) servers for Claude Desktop on MacOS.

    MCP Manager for Claude Desktop provides a user-friendly interface to manage Model Context Protocol (MCP) servers, enabling Claude to access private data, APIs, and local or remote services securely from a MacOS desktop. It facilitates rapid configuration and integration with a wide variety of MCP servers, including productivity tools, databases, and web APIs. The app runs locally to ensure data privacy and streamlines connecting Claude to new sources through simple environment and server settings management.

    • 270
    • MCP
    • zueai/mcp-manager
  • Kanboard MCP Server

    MCP server for seamless AI integration with Kanboard project management.

    Kanboard MCP Server is a Go-based server implementing the Model Context Protocol (MCP) for integrating AI assistants with the Kanboard project management system. It enables users to manage projects, tasks, users, and workflows in Kanboard directly via natural language commands through compatible AI tools. With built-in support for secure authentication and high performance, it facilitates streamlined project operations between Kanboard and AI-powered clients like Cursor or Claude Desktop. The server is configurable and designed for compatibility with MCP standards.

    • 15
    • MCP
    • bivex/kanboard-mcp
  • Insforge MCP Server

    A Model Context Protocol server for seamless integration with Insforge and compatible AI clients.

    Insforge MCP Server implements the Model Context Protocol (MCP), enabling smooth integration with various AI tools and clients. It allows users to configure and manage connections to the Insforge platform, providing automated and manual installation methods. The server supports multiple AI clients such as Claude Code, Cursor, Windsurf, Cline, Roo Code, and Trae via standardized context management. Documentation and configuration guidelines are available for further customization and usage.

    • 3
    • MCP
    • InsForge/insforge-mcp
  • Kubectl MCP Server

    Natural language Kubernetes management for AI assistants using the Model Context Protocol.

    Kubectl MCP Server enables AI assistants such as Claude and Cursor to interact with Kubernetes clusters using natural language through the Model Context Protocol (MCP). It supports a wide range of Kubernetes operations including resource management, Helm integration, monitoring, diagnostics, and advanced security features. The server is designed to handle context-aware commands, maintain session memory, and provide intelligent command construction and explanations. Integration with multiple AI assistants and flexible transport protocols are supported for a seamless user experience.

    • 734
    • MCP
    • rohitg00/kubectl-mcp-server
  • mcp

    Universal remote MCP server connecting AI clients to productivity tools.

    WayStation MCP acts as a remote Model Context Protocol (MCP) server, enabling seamless integration between AI clients like Claude or Cursor and a wide range of productivity applications, such as Notion, Monday, Airtable, Jira, and more. It supports multiple secure connection transports and offers both general and user-specific preauthenticated endpoints. The platform emphasizes ease of integration, OAuth2-based authentication, and broad app compatibility. Users can manage their integrations through a user dashboard, simplifying complex workflow automations for AI-powered productivity.

    • 27
    • MCP
    • waystation-ai/mcp
  • Exa MCP Server

    Fast, efficient web and code context for AI coding assistants.

    Exa MCP Server provides a Model Context Protocol (MCP) server interface that connects AI assistants to Exa AI’s powerful search capabilities, including code, documentation, and web search. It enables coding agents to retrieve precise, token-efficient context from billions of sources such as GitHub, StackOverflow, and documentation sites, reducing hallucinations in coding agents. The platform supports integration with popular tools like Cursor, Claude, and VS Code through standardized MCP configuration, offering configurable access to various research and code-related tools via HTTP.

    • 3,224
    • MCP
    • exa-labs/exa-mcp-server