TOD - TheOneDev CLI Tool
TOD (TheOneDev) is a powerful command-line tool for OneDev 13+ that streamlines your development workflow: run CI/CD jobs against local changes, check out pull requests into a local working directory, and more. It also offers a comprehensive Model Context Protocol (MCP) server with tools and prompts, letting you interact with OneDev 13+ through AI assistants in an intelligent and natural way.
Features
- MCP (Model Context Protocol) server for AI tool integration
- Run CI/CD jobs against local changes without committing/pushing
- Run jobs against specific branches or tags
- Checkout pull requests locally
- Check and migrate build specifications to the latest version
- Real-time streaming of job logs to your console
- Configuration management via config files
- Cross-platform support (Windows, macOS, Linux)
Installation
To install tod, place the tod binary somewhere on your PATH (a concrete example follows at the end of this section).
Download Pre-built Binaries
https://code.onedev.io/onedev/tod/~builds?query=%22Job%22+is+%22Release%22
Build Binary from Source
Requirements:
- Go 1.22.1 or higher
Steps:
- Clone the repository:
git clone https://code.onedev.io/onedev/tod.git
cd tod
- Build the binary:
go build
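Once you have a tod binary (downloaded or built), a common way to put it on your PATH on macOS or Linux is shown below; /usr/local/bin is just one typical choice, and any directory already on your PATH works.
# Make the binary executable and move it onto your PATH
chmod +x tod
sudo mv tod /usr/local/bin/
# Verify that the shell can find it
which tod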
Configuration
TOD uses a configuration file to store commonly used settings, eliminating the need to specify them repeatedly.
Config File Location
Create a config file at: $HOME/.todconfig
Config File Format
The configuration uses INI format:
server-url=https://onedev.example.com
access-token=your-personal-access-token
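Because the access token grants access to your OneDev account, it is a good idea to keep the config file readable only by you. A minimal sketch for macOS/Linux (the values are placeholders):
# Create the config file and restrict its permissions
cat > ~/.todconfig <<'EOF'
server-url=https://onedev.example.com
access-token=your-personal-access-token
EOF
chmod 600 ~/.todconfig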
Commands
mcp - Start MCP Server
Start the Model Context Protocol server for AI tool integration.
Syntax:
tod mcp [OPTIONS]
Options:
- --log-file <file> - Specify log file path for debug logging
Example:
# Start MCP server
tod mcp
# Start with debug logging
tod mcp --log-file /tmp/tod-mcp.log
For detailed information about available MCP tools and their parameters, see MCP Documentation.
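If you want to sanity-check the server from a shell before wiring it into an AI assistant, and assuming tod mcp speaks MCP over stdio (the usual transport for CLI-launched MCP servers; this is an assumption, not something stated above), a rough smoke test might look like this:
# Send a single JSON-RPC initialize request on stdin and print whatever comes back
# (the protocolVersion and clientInfo values here are illustrative)
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' | tod mcp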
run-local - Run Jobs Against Local Changes
Run CI/CD jobs against your uncommitted local changes without the commit/push/run/check loop.
Syntax:
tod run-local [OPTIONS] <job-name>
Options:
- --working-dir <dir> - Specify working directory (defaults to current directory). The working directory is expected to be inside a git repository with one of its remotes pointing to a OneDev project
- --param <key=value> or -p <key=value> - Specify job parameters (can be used multiple times)
Examples:
# Basic usage
tod run-local ci
# With parameters
tod run-local -p database=postgres -p environment=test ci
# Specify working directory
tod run-local --working-dir /path/to/project ci
How it works:
- Stashes your local changes
- Creates a temporary commit
- Pushes it to a temporary ref on the server
- Runs the specified job
- Streams logs back to your terminal
- Cancels the job if you press Ctrl+C
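For the curious, the sketch below shows roughly the same idea done by hand, purely as an illustration; the ref name refs/tod/local-run is made up here, the remote is assumed to be named origin, and tod's actual implementation details may differ.
# Snapshot uncommitted changes as a dangling commit without touching the working tree
SNAPSHOT=$(git stash create "local run snapshot")
# Push the snapshot (or HEAD if there are no local changes) to a throwaway ref
git push origin "${SNAPSHOT:-HEAD}":refs/tod/local-run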
run - Run Jobs Against Branches or Tags
Run CI/CD jobs against specific branches or tags in the repository.
Syntax:
tod run [OPTIONS] <job-name>
Options:
- --working-dir <dir> - Specify working directory (defaults to current directory). The working directory is expected to be inside a git repository with one of its remotes pointing to a OneDev project
- --branch <branch> - Run against a specific branch (mutually exclusive with --tag)
- --tag <tag> - Run against a specific tag (mutually exclusive with --branch)
- --param <key=value> or -p <key=value> - Specify job parameters (can be used multiple times)
Examples:
# Run against main branch
tod run --branch main ci
# Run against a tag
tod run --tag v1.2.3 release
# Run with parameters
tod run --branch develop -p environment=staging ci
checkout - Checkout Pull Requests
Checkout pull requests locally for testing and review.
Syntax:
tod checkout [OPTIONS] <pull-request-reference>
Options:
- --working-dir <dir> - Specify working directory (defaults to current directory). The working directory is expected to be inside a git repository with one of its remotes pointing to a OneDev project
Example:
# Checkout pull request #123
tod checkout 123
# Checkout in specific directory
tod checkout --working-dir /path/to/project 456
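After checking out a pull request you can inspect it with ordinary git commands; the snippet below assumes the target branch is main, so substitute your repository's actual target branch.
# Summarize what the pull request changes relative to the target branch
git log --oneline main..HEAD
git diff --stat main...HEAD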
check-build-spec - Check and Migrate Build Specifications
Check your .onedev-buildspec.yml file for validity and migrate it to the latest version if needed.
Syntax:
tod check-build-spec [OPTIONS]
Options:
- --working-dir <dir> - Directory containing the build spec file (defaults to current directory). The working directory is expected to be inside a git repository with one of its remotes pointing to a OneDev project
Example:
# Check build spec in current directory
tod check-build-spec
# Check build spec in specific directory
tod check-build-spec --working-dir /path/to/project
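One way to catch an outdated or invalid build spec early is to run the check from a git pre-commit hook. This is only a sketch, and it assumes the command exits with a non-zero status when the spec needs attention:
#!/bin/sh
# .git/hooks/pre-commit  (remember: chmod +x .git/hooks/pre-commit)
# Block the commit if the build spec fails validation
tod check-build-spec || {
    echo "Build spec check failed; fix or migrate it before committing." >&2
    exit 1
}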
Usage Examples
Complete Workflow Example
- Set up configuration:
# Create ~/.todconfig
echo "server-url=https://onedev.example.com" > ~/.todconfig
echo "access-token=your-token-here" >> ~/.todconfig
- Test local changes:
# Run CI against your uncommitted changes
cd /path/to/onedev-git-repository
tod run-local ci
- Run against specific branch:
# Run the ci job against the main branch
cd /path/to/onedev-git-repository
tod run --branch main ci
- Checkout a pull request:
# Checkout pull request #123
cd /path/to/onedev-git-repository
tod checkout 123
Parameter Usage
# Multiple parameters of the same key
tod run-local -p env=test -p env=staging -p db=postgres ci
Important Notes for Running Local Jobs
Nginx Configuration
If OneDev is running behind Nginx, configure it to disable HTTP buffering for real-time log streaming:
location /~api/streaming {
proxy_pass http://localhost:6610/~api/streaming;
proxy_buffering off;
}
See OneDev Nginx setup documentation for details.
Security Considerations
If the job accesses job secrets, make sure each secret's authorization field is cleared so that all jobs are allowed to use it. Authorizing all branches is not sufficient, because local changes are pushed to a temporary ref that does not belong to any branch.
Performance Tips
- Large repositories: Use appropriate clone depth in checkout steps instead of full history
- External dependencies: Implement caching for downloads and intermediate files
- Build optimization: Cache slow-to-generate intermediate files
Contributing
TOD is part of the OneDev ecosystem. For contributions, issues, and feature requests, visit the OneDev project.
License
See license.txt for license information.