Intelligent Research System

An end-to-end research automation system that handles the entire process from query to final report, leveraging multiple search sources and semantic similarity to produce comprehensive research results.

Overview

This system automates the research process by:

  1. Processing and enhancing user queries
  2. Executing searches across multiple engines (Serper, Google Scholar, arXiv)
  3. Ranking and filtering results based on relevance
  4. Generating comprehensive research reports

Features

  • Query Processing: Enhances user queries with additional context and classifies them by type and intent
  • Multi-Source Search: Executes searches across Serper (Google), Google Scholar, and arXiv
  • Intelligent Ranking: Uses Jina AI's Reranker to prioritize the most relevant results
  • Result Deduplication: Removes duplicate results across different search engines
  • Modular Architecture: Easily extensible with new search engines and LLM providers (see the handler sketch below)
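
The modular design means a new search source can usually be added by implementing a small handler class and registering it with the search executor. The sketch below is illustrative only: BaseSearchHandler, its search() signature, the module path, and register_handler() are assumptions about the handler interface, not the project's actual API.

from typing import Dict, List

# Hypothetical sketch of adding a new search source; names are assumed.
from execution.api_handlers.base_handler import BaseSearchHandler  # assumed path

class SemanticScholarHandler(BaseSearchHandler):
    """Example handler for an additional academic search source."""

    def search(self, query: str, num_results: int = 10) -> List[Dict]:
        # Call the external API here and map each hit to the common
        # result fields used by the rest of the system.
        return [
            {"title": "...", "url": "...", "snippet": "...", "source": "semantic_scholar"}
        ]

# Registering the handler with the executor (method name is assumed):
# search_executor.register_handler("semantic_scholar", SemanticScholarHandler())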

Components

  • Query Processor: Enhances and classifies user queries
  • Search Executor: Executes searches across multiple engines
  • Result Collector: Processes and organizes search results
  • Document Ranker: Ranks documents by relevance
  • Report Generator: Synthesizes information into a coherent report (coming soon)

Getting Started

Prerequisites

  • Python 3.8+
  • API keys for:
    • Serper API (for Google and Scholar search)
    • Groq (or other LLM provider)
    • Jina AI (for reranking)

Installation

  1. Clone the repository:
git clone https://github.com/yourusername/sim-search.git
cd sim-search
  2. Install dependencies:
pip install -r requirements.txt
  3. Create a configuration file:
cp config/config.yaml.example config/config.yaml
  4. Edit the configuration file to add your API keys:
api_keys:
  serper: "your-serper-api-key"
  groq: "your-groq-api-key"
  jina: "your-jina-api-key"
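
If you need the same settings from your own scripts, a minimal way to read them is with PyYAML, as sketched below. The key layout matches the example above; the project's config module may expose its own loader instead.

# Minimal sketch: read API keys from config/config.yaml with PyYAML.
import yaml

with open("config/config.yaml") as f:
    config = yaml.safe_load(f)

serper_key = config["api_keys"]["serper"]
groq_key = config["api_keys"]["groq"]
jina_key = config["api_keys"]["jina"]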

Usage

Basic Usage

from query.query_processor import QueryProcessor
from execution.search_executor import SearchExecutor
from execution.result_collector import ResultCollector

# Initialize components
query_processor = QueryProcessor()
search_executor = SearchExecutor()
result_collector = ResultCollector()

# Process a query
processed_query = query_processor.process_query("What are the latest advancements in quantum computing?")

# Execute search
search_results = search_executor.execute_search(processed_query)

# Process results
processed_results = result_collector.process_results(search_results)

# Print top results
for i, result in enumerate(processed_results[:5]):
    print(f"{i+1}. {result['title']}")
    print(f"   URL: {result['url']}")
    print(f"   Snippet: {result['snippet'][:100]}...")
    print()
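
To see what the relevance-ranking step does conceptually, the snippet below sends the collected snippets to Jina AI's public rerank endpoint and prints the top matches by score. This is a sketch only: the project's ranking module wraps this call behind its own interface, and the model name shown is just one of Jina's reranker models.

# Sketch: rerank collected results against the original query using
# Jina AI's rerank API (https://api.jina.ai/v1/rerank).
import os
import requests

response = requests.post(
    "https://api.jina.ai/v1/rerank",
    headers={"Authorization": f"Bearer {os.environ['JINA_API_KEY']}"},
    json={
        "model": "jina-reranker-v2-base-multilingual",  # example model name
        "query": "What are the latest advancements in quantum computing?",
        "documents": [r["snippet"] for r in processed_results[:20]],
        "top_n": 5,
    },
)

for item in response.json()["results"]:
    print(f"{item['relevance_score']:.3f}  {processed_results[item['index']]['title']}")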

Testing

Run the test scripts to verify functionality:

# Test search execution
python test_search_execution.py

# Test all search handlers
python test_all_handlers.py

Project Structure

sim-search/
├── config/                 # Configuration management
├── query/                  # Query processing
├── execution/              # Search execution
│   └── api_handlers/       # Search API handlers
├── ranking/                # Document ranking
├── test_*.py               # Test scripts
└── requirements.txt        # Dependencies

LLM Providers

The system supports multiple LLM providers through the LiteLLM interface:

  • Groq (currently using Llama 3.1-8b-instant)
  • OpenAI
  • Anthropic
  • OpenRouter
  • Azure OpenAI
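
Because providers are reached through LiteLLM, switching models is largely a matter of changing the model string and supplying the matching API key. A minimal sketch, independent of the project's own wrapper code (model strings are examples; see the LiteLLM documentation for your provider):

# Sketch: calling different providers through LiteLLM's unified interface.
# LiteLLM reads provider API keys from environment variables such as GROQ_API_KEY.
from litellm import completion

response = completion(
    model="groq/llama-3.1-8b-instant",                # Groq
    # model="gpt-4o",                                 # OpenAI
    # model="anthropic/claude-3-5-sonnet-20240620",   # Anthropic
    messages=[{"role": "user", "content": "Summarize recent advances in quantum error correction."}],
)
print(response.choices[0].message.content)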

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Jina AI for their embedding and reranking APIs
  • Serper for their Google search API
  • Groq for their fast LLM inference