This is the fourth iteration of the Archon project, building upon V3 by adding a comprehensive Streamlit UI for managing all aspects of Archon, along with Docker support. The system retains the core LangGraph workflow and MCP support from V3, but now provides a unified interface for environment configuration, database setup, documentation crawling, agent service management, and MCP integration.
What makes V4 special is its guided setup process that walks users through each step of configuring and running Archon. The Streamlit UI eliminates the need for manual configuration of environment variables, database setup, and service management, making Archon much more accessible to users without extensive technical knowledge.
The core remains an intelligent documentation crawler and RAG (Retrieval-Augmented Generation) system built using Pydantic AI, LangGraph, and Supabase. The system crawls the Pydantic AI documentation, stores content in a vector database, and provides Pydantic AI agent code by retrieving and analyzing relevant documentation chunks.
This version continues to support both local LLMs with Ollama and cloud-based LLMs through OpenAI/OpenRouter.
- Comprehensive Streamlit UI: Unified interface for all Archon functionality
- Docker Support: Containerized deployment with automated build and run scripts
- Guided Setup Process: Step-by-step instructions for configuration
- Environment Variable Management: Configure all settings through the UI
- Database Setup: Automated creation of Supabase tables and indexes
- Documentation Crawler: Fetch and process documentation for RAG
- Agent Service Management: Start/stop the agent service from the UI
- MCP Integration: Configure and manage MCP for AI IDE integration
- Multiple LLM Support: OpenAI, OpenRouter, and local Ollama models
- Multi-Agent Workflow: LangGraph coordinates multiple agents within a single workflow
- Docker (optional but preferred)
- Python 3.11+
- Supabase account (for vector database)
- OpenAI/OpenRouter/Anthropic API key or Ollama for local LLMs
- Clone the repository:
```bash
git clone https://github.com/coleam00/archon.git
cd archon/iterations/v4-streamlit-ui-overhaul
```
- Run the Docker setup script:
```bash
# This will build both containers and start Archon
python run_docker.py
```
- Access the Streamlit UI at http://localhost:8501.
Note: `run_docker.py` will automatically:
- Build the MCP server container
- Build the main Archon container
- Run Archon with the appropriate port mappings
- Use environment variables from the `.env` file if it exists
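For context, here is a minimal sketch of what such a script might do (the image names and exact `docker` flags are assumptions; ports 8501 and 8100 match the container setup described under Docker Architecture below):

```python
# Illustrative only; the real run_docker.py may differ in detail.
import os
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Build the MCP server container, then the main Archon container.
run(["docker", "build", "-t", "archon-mcp", "mcp/"])
run(["docker", "build", "-t", "archon", "."])

# Pass through a .env file if one exists.
env_args = ["--env-file", ".env"] if os.path.exists(".env") else []

# Run Archon with the Streamlit UI (8501) and Graph Service (8100) exposed.
run(["docker", "run", "-d", "--name", "archon",
     "-p", "8501:8501", "-p", "8100:8100", *env_args, "archon"])
```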
- Clone the repository:
```bash
git clone https://github.com/coleam00/archon.git
cd archon/iterations/v4-streamlit-ui-overhaul
```
- Install dependencies:
```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```
- Start the Streamlit UI:
```bash
streamlit run streamlit_ui.py
```
- Access the Streamlit UI at http://localhost:8501.
The Streamlit interface will guide you through each step with clear instructions and interactive elements. The setup involves a number of steps, but it goes quickly!
The Streamlit UI provides the following tabs:
- Intro: Overview and guided setup process
- Environment: Configure API keys and model settings
- Database: Set up your Supabase vector database
- Documentation: Crawl and index the Pydantic AI documentation
- Agent Service: Start and monitor the agent service
- Chat: Interact with Archon to create AI agents
- MCP: Configure integration with AI IDEs
The Environment tab allows you to set and manage all environment variables through the UI:
- Base URL for API endpoints
- API keys for LLM providers
- Supabase connection details
- Model selections for different agent roles
- Embedding model configuration
All settings are saved to an `env_vars.json` file, which is automatically loaded when Archon starts.
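For illustration, loading such a file into the process environment might look like the following (the key names shown are hypothetical; the actual keys are whatever the UI saves):

```python
import json
import os

# Load UI-defined settings from env_vars.json into the environment.
if os.path.exists("env_vars.json"):
    with open("env_vars.json") as f:
        env_vars = json.load(f)  # e.g. {"LLM_API_KEY": "...", "SUPABASE_URL": "..."}
    for key, value in env_vars.items():
        os.environ.setdefault(key, str(value))
```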
The Database tab simplifies the process of setting up your Supabase database:
- Select embedding dimensions based on your model
- View SQL commands for table creation
- Get instructions for executing SQL in Supabase
- Clear existing data if needed
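As an illustration of the dimension-aware SQL the tab generates, a sketch along these lines (the table and column names here are assumptions; the real commands live in `utils/site_pages.sql`):

```python
# Map common embedding models to their vector dimensions (illustrative values).
EMBEDDING_DIMENSIONS = {
    "text-embedding-3-small": 1536,
    "nomic-embed-text": 768,  # a common Ollama embedding model
}

def create_table_sql(model: str) -> str:
    """Render pgvector table-creation SQL sized for the chosen embedding model."""
    dim = EMBEDDING_DIMENSIONS[model]
    return f"""
    create extension if not exists vector;
    create table if not exists site_pages (
        id bigserial primary key,
        url text,
        content text,
        embedding vector({dim})
    );
    """

print(create_table_sql("text-embedding-3-small"))
```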
The Documentation tab provides an interface for crawling and managing documentation:
- Start and monitor the crawling process with progress tracking
- View logs of the crawling process
- Clear existing documentation
- View database statistics
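Conceptually, each crawled page goes through a chunk-embed-store pipeline. A hedged sketch of that loop follows (the chunking strategy, function names, and placeholder credentials are assumptions; the actual crawler is `archon/crawl_pydantic_ai_docs.py`):

```python
# Illustrative RAG ingestion loop; the real crawler may differ in detail.
from openai import OpenAI
from supabase import create_client

openai_client = OpenAI()
supabase = create_client("https://YOUR-PROJECT.supabase.co", "YOUR-SERVICE-KEY")

def chunk(text: str, size: int = 1000) -> list[str]:
    """Naive fixed-size chunking; real chunkers respect headings and code blocks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def ingest_page(url: str, page_text: str) -> None:
    for piece in chunk(page_text):
        embedding = openai_client.embeddings.create(
            model="text-embedding-3-small", input=piece
        ).data[0].embedding
        supabase.table("site_pages").insert(
            {"url": url, "content": piece, "embedding": embedding}
        ).execute()
```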
The Agent Service tab allows you to manage the agent service:
- Start, restart, and stop the service
- Monitor service output in real-time
- Clear output logs
- Auto-refresh for continuous monitoring
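Behind the UI, managing the service amounts to supervising a subprocess. A minimal sketch (the `graph_service.py` entry point is from this repo; the rest is illustrative):

```python
import subprocess

# Start the Graph Service as a child process and stream its output,
# roughly what the Agent Service tab does behind its start/stop buttons.
process = subprocess.Popen(
    ["python", "graph_service.py"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)

for line in process.stdout:  # monitor output in real time
    print(line, end="")

# Stopping the service is then just:
# process.terminate()
```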
The MCP tab simplifies the process of configuring MCP for AI IDEs:
- Select your IDE (Windsurf, Cursor, or Cline)
- Generate configuration commands or JSON
- Copy configuration to clipboard
- Get step-by-step instructions for your specific IDE
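The generated configuration is typically a small JSON snippet telling the IDE how to launch the MCP server. A hedged example of the general shape (the exact keys and launch command vary by IDE and are assumptions here):

```python
import json

# Illustrative MCP configuration for an AI IDE; the MCP tab generates the
# exact command or JSON for Windsurf, Cursor, or Cline.
mcp_config = {
    "mcpServers": {
        "archon": {
            "command": "python",
            "args": ["mcp/mcp_server.py"],
        }
    }
}

print(json.dumps(mcp_config, indent=2))
```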
- `streamlit_ui.py`: Comprehensive web interface for managing all aspects of Archon
- `graph_service.py`: FastAPI service that handles the agentic workflow
- `run_docker.py`: Script to build and run Archon Docker containers
- `Dockerfile`: Container definition for the main Archon application
- `mcp/`: Model Context Protocol server implementation
  - `mcp_server.py`: MCP server script for AI IDE integration
  - `Dockerfile`: Container definition for the MCP server
- `archon/`: Core agent and workflow implementation
  - `archon_graph.py`: LangGraph workflow definition and agent coordination
  - `pydantic_ai_coder.py`: Main coding agent with RAG capabilities
  - `crawl_pydantic_ai_docs.py`: Documentation crawler and processor
- `utils/`: Utility functions and database setup
  - `utils.py`: Shared utility functions
  - `site_pages.sql`: Database setup commands
  - `env_vars.json`: Environment variables defined in the UI are stored here (listed in `.gitignore`; the file is created automatically)
- Docker Containers: Run Archon in isolated containers with all dependencies included
  - Main container: Runs the Streamlit UI and graph service
  - MCP container: Provides MCP server functionality for AI IDEs
- Local Python: Run directly on your system with a Python virtual environment
The Docker implementation consists of two containers:
- Main Archon Container:
  - Runs the Streamlit UI on port 8501
  - Hosts the Graph Service on port 8100
  - Built from the root `Dockerfile`
  - Handles all agent functionality and user interactions
- MCP Container:
  - Implements the Model Context Protocol for AI IDE integration
  - Built from `mcp/Dockerfile`
  - Communicates with the main container's Graph Service
  - Provides a standardized interface for AI IDEs like Windsurf, Cursor, and Cline
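Since the MCP container talks to the main container's Graph Service over HTTP, a request from the MCP server might look roughly like this (the `/invoke` route and payload shape are assumptions; only the port is documented above):

```python
import requests

# The Graph Service listens on port 8100 in the main container.
GRAPH_SERVICE_URL = "http://localhost:8100"

# Hypothetical call: forward a user request from the MCP server to the
# LangGraph workflow and return the agent's reply.
response = requests.post(
    f"{GRAPH_SERVICE_URL}/invoke",
    json={"message": "Build me a Pydantic AI agent that searches the web"},
    timeout=120,
)
response.raise_for_status()
print(response.json())
```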
When running with Docker, the `run_docker.py` script automates building and starting both containers with the proper configuration.
Contributions are welcome! Please feel free to submit a Pull Request.