| tags | deployspec |
|---|---|
| license | MIT |
An autonomous research assistant powered by LLMs, designed to perform deep information gathering, multi-step reasoning, and structured report synthesis.
- Multi-Agent Orchestration: A specialized `Orchestrator` decomposes goals into sub-tasks, delegated to focused `SubAgent`s for deep investigation.
- Robust ReAct Framework: Built on a unified `BaseReActAgent` abstraction, ensuring consistent reasoning, acting, and context management across all agent roles.
- Real-time Event Streaming: Powered by a custom `StreamHandler`, providing live feedback on Thoughts, Actions, and tool outputs.
- Powerful Toolset:
  - Smart Search: Integrated with DuckDuckGo for up-to-date web discovery.
  - High-Precision Scraping: Uses `Trafilatura` and `Playwright` for clean content extraction and JavaScript rendering support.
  - Multi-Format Parsing: Built-in support for PDF analysis and Markdown conversion via `MarkItDown`.
- Interactive UI: A modern Streamlit-based dashboard for visualizing the research process and managing loops.
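To make the ReAct pattern behind these agents concrete, here is a minimal, self-contained sketch of a Thought → Action → Observation loop that emits streamable events. This is illustrative only, not the project's actual `BaseReActAgent` or `StreamHandler` code; the event shapes and the `react_loop` name are assumptions.

```python
from typing import Callable

def react_loop(llm: Callable[[str], dict], tools: dict, goal: str,
               max_steps: int = 5) -> list[dict]:
    """Run a Thought -> Action -> Observation loop, collecting events."""
    events: list[dict] = []
    context = goal
    for _ in range(max_steps):
        step = llm(context)  # the model proposes a thought plus an action
        events.append({"type": "thought", "text": step["thought"]})
        if step["action"] == "finish":  # the model decides it is done
            events.append({"type": "answer", "text": step["input"]})
            break
        observation = tools[step["action"]](step["input"])
        events.append({"type": "action", "tool": step["action"]})
        events.append({"type": "observation", "text": observation})
        context += f"\nObservation: {observation}"
    return events

# Demo with a scripted stand-in for the LLM and a toy search tool:
script = iter([
    {"thought": "I should search the web.", "action": "search",
     "input": "uv package manager"},
    {"thought": "I have enough information.", "action": "finish",
     "input": "uv is a fast Python package manager."},
])
events = react_loop(lambda ctx: next(script),
                    {"search": lambda q: f"results for {q!r}"},
                    "What is uv?")
```

In the real project, each appended event would be pushed through the stream handler so the UI can render thoughts and tool outputs live rather than after the run completes.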
- Core: Python 3.12+
- Agent Framework: Custom ReAct orchestration with OpenAI-compatible API integration.
- UI: Streamlit
- Package Management: uv
Ensure you have uv installed, then run:
```bash
# Clone the repository
git clone <repo-url>
cd deep-research

# Install dependencies and set up the virtual environment
uv sync
```

Create a `.env` file in the project root:
```bash
OPENAI_API_KEY=your_api_key_here
OPENAI_BASE_URL=https://api.openai.com/v1
MODEL_NAME=gpt-4o  # Or your preferred model
```

Start the interactive research interface:
```bash
uv run dr
```

Project structure under `src/deep_research/`:

- `agents/`: Core agent roles (`Orchestrator`, `SubAgent`, `Validator`) inheriting from a shared `BaseAgent`.
- `core/`: Infrastructure layer including `LLMClient`, `Log`, and `StreamHandler`.
- `tools/`: Extensible tool modules and the `ToolRegistry` manager.
- `prompts/`: Role-specific system instructions and templates.
- `models.py`: Shared data models for events and pulses.
- `config.py`: Global configuration and environment management.
- `gui.py`: Streamlit frontend implementation.
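To illustrate the kind of register-and-dispatch pattern a tool-registry manager typically uses, here is a hypothetical sketch. The real `ToolRegistry` API in `tools/` may differ; the method names below (`register`, `call`, `names`) are assumptions for illustration.

```python
from typing import Callable

class ToolRegistry:
    """Hypothetical registry mapping tool names to callables."""

    def __init__(self) -> None:
        self._tools: dict[str, Callable[..., str]] = {}

    def register(self, name: str):
        """Decorator that records a tool function under `name`."""
        def wrap(fn: Callable[..., str]) -> Callable[..., str]:
            self._tools[name] = fn
            return fn
        return wrap

    def call(self, name: str, *args, **kwargs) -> str:
        """Dispatch to a registered tool, failing loudly on unknown names."""
        if name not in self._tools:
            raise KeyError(f"Unknown tool: {name}")
        return self._tools[name](*args, **kwargs)

    def names(self) -> list[str]:
        return sorted(self._tools)

registry = ToolRegistry()

@registry.register("search")
def search(query: str) -> str:
    # A stand-in for the DuckDuckGo-backed search tool.
    return f"search results for {query!r}"
```

A registry like this lets the `Orchestrator` expose the available tool names to the LLM prompt and route each proposed action by name, without agents importing tool modules directly.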
This project is licensed under the MIT License - see the LICENSE file for details.