A comprehensive AI-powered research platform that combines autonomous research agents with interactive chat capabilities.
- **Autonomous Research**: AI agents conduct multi-step research using web search, document analysis, and academic sources
- **Interactive Chat**: Chat with your research results using local LLMs
- **URL Tracking**: Automatically collects and organizes all website links found during research
- **Real-time Monitoring**: Live view of the research process, server logs, and agent intelligence
- **Local First**: Runs entirely on your hardware with local LLMs via llama.cpp
Note: If the video doesn't play directly on GitHub, you can view it here
- Python 3.8+
- Node.js 16+
- llama.cpp server
- Research-capable LLM model (e.g., Qwen, Llama)
```bash
git clone https://github.com/gopinath87607/LocalDeepResearch.git
cd LocalDeepResearch
```
```bash
# Create an isolated environment with Python 3.10.0
conda create -n LocalDeepResearch_env python=3.10.0
conda activate LocalDeepResearch_env

# Or using venv
python3.10 -m venv LocalDeepResearch_env
source LocalDeepResearch_env/bin/activate  # On Windows: LocalDeepResearch_env\Scripts\activate

pip install -r requirements.txt
```
```bash
# Backend setup
cd backend
pip install -r requirements.txt

# Frontend setup
cd ../frontend
npm install
```

Start the llama.cpp servers:

```bash
# Main research model (port 8080)
./llama-server -m path/to/your-model.gguf --host 0.0.0.0 --port 8080

# ReaderLM for web extraction (port 8081) - optional
./llama-server -m path/to/reader-lm.gguf --host 0.0.0.0 --port 8081
```

Then edit `inference/run_react_infer_with_progress.sh` to set your API key and endpoints:
```bash
export SERPER_KEY_ID="your-serper-api-key"
export API_BASE="http://localhost:8080/v1"
export READERLM_ENDPOINT="http://localhost:8081/v1"
```
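On the backend side, these variables can be read with local fallbacks. A minimal sketch (the variable names come from this README; the `ResearchConfig` helper and the defaults are assumptions, not the project's actual code):

```python
import os
from dataclasses import dataclass

@dataclass
class ResearchConfig:
    """Hypothetical config holder; the real backend may differ."""
    serper_key: str
    api_base: str
    readerlm_endpoint: str

def load_config() -> ResearchConfig:
    # Fall back to the local llama.cpp ports used in this README.
    return ResearchConfig(
        serper_key=os.environ.get("SERPER_KEY_ID", ""),
        api_base=os.environ.get("API_BASE", "http://localhost:8080/v1"),
        readerlm_endpoint=os.environ.get("READERLM_ENDPOINT", "http://localhost:8081/v1"),
    )

config = load_config()
print(config.api_base)
```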
```bash
# Start backend
cd backend
python main.py

# Start frontend (new terminal)
cd frontend
npm start
```

Visit http://localhost:3000 and start researching!
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│ React Frontend  │     │ Flask Backend   │     │   llama.cpp     │
│                 │◄───►│                 │◄───►│  LLM Servers    │
│ - Dashboard     │     │ - Research API  │     │                 │
│ - Chat UI       │     │ - WebSocket     │     │ Main:   :8080   │
│ - URL Display   │     │ - URL Tracking  │     │ Reader: :8081   │
└─────────────────┘     └─────────────────┘     └─────────────────┘
                                │
                                ▼
                        ┌─────────────────┐
                        │ Research Tools  │
                        │                 │
                        │ - Web Search    │
                        │ - Visit Pages   │
                        │ - Scholar       │
                        │ - Python Code   │
                        └─────────────────┘
```
- Serper API: Required for web search - get a free key at serper.dev
- Other APIs: Optional depending on tools used
- Main Model: Any instruction-following model (Tongyi-DeepResearch, Jan-v1, Jan-v1-2509-gguf, etc.)
- ReaderLM: jinaai/ReaderLM-v2 for better web content extraction
- Tongyi-DeepResearch-30B-A3B: Available on HuggingFace or ModelScope
- Enter your research question
- Watch real-time progress in the monitoring panels
- Review collected URLs and research intelligence
- Read comprehensive results
After research completes:
- Chat interface appears below results
- Ask follow-up questions about findings
- Get clarifications and deeper insights
- Context-aware responses based on research
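Conceptually, context-aware chat can be as simple as prepending the finished research report to each chat turn. A sketch of that idea (the prompt format and function name are illustrative assumptions, not the project's actual implementation):

```python
def build_chat_prompt(report: str, history: list[tuple[str, str]], question: str) -> str:
    """Assemble a prompt that grounds the chat model in the research report."""
    lines = [
        "You are answering follow-up questions about the research report below.",
        "--- REPORT ---",
        report,
        "--- CHAT ---",
    ]
    for user_msg, assistant_msg in history:
        lines.append(f"User: {user_msg}")
        lines.append(f"Assistant: {assistant_msg}")
    lines.append(f"User: {question}")
    lines.append("Assistant:")
    return "\n".join(lines)

prompt = build_chat_prompt(
    report="Local LLMs can run research agents on consumer hardware.",
    history=[("What hardware?", "A single GPU is typically enough.")],
    question="Which models work best?",
)
print(prompt)
```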
- `backend/`: Flask API server and research orchestration
- `frontend/`: React dashboard and user interface
- `inference/`: Research agent scripts and tools
- `tools/`: Individual research capability modules
- Create the tool in `backend/tools/tool_name.py`
- Register it in the research agent configuration
- Test with isolated queries
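As an illustration, a new tool module might look like the following. The `run` signature and the registry dict are assumptions for the sketch; check the actual agent configuration for the real interface:

```python
# backend/tools/word_count.py - hypothetical example tool
def run(query: str) -> dict:
    """Toy tool: report basic statistics about the query text."""
    words = query.split()
    return {
        "tool": "word_count",
        "words": len(words),
        "chars": len(query),
    }

# Hypothetical registration: map a tool name to its entry point.
TOOL_REGISTRY = {"word_count": run}

result = TOOL_REGISTRY["word_count"]("local deep research")
print(result)  # {'tool': 'word_count', 'words': 3, 'chars': 19}
```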
See docs/api.md for detailed API reference.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- DeepResearch by Alibaba-NLP for the core research agent implementation and methodologies
- llama.cpp for local LLM inference
- Serper for web search API
- Research methodologies inspired by various AI research frameworks
- Documentation
- Issue Tracker
- Discussions
All output is saved under `output_dir = f"/home/XXX/DeepResearch/outputs/session_{session_id}"`
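The per-session directory layout can be reproduced in a few lines; this sketch uses a temporary directory in place of the hard-coded home path above:

```python
import os
import tempfile

def make_session_dir(base: str, session_id: str) -> str:
    """Create (if needed) and return the per-session output directory."""
    output_dir = os.path.join(base, f"session_{session_id}")
    os.makedirs(output_dir, exist_ok=True)
    return output_dir

base = tempfile.mkdtemp()
path = make_session_dir(base, "demo123")
print(os.path.basename(path))  # session_demo123
```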
⭐ Star this repo if you find it useful!


