# Project Overview
## Description
This project is a server-side application, built with Python as an MCP (Model Context Protocol) server around GPT Researcher, that facilitates research-related operations. It provides functionality to manage researchers, handle resources, process queries, and generate in-depth research reports. Reusable utility functions streamline responses, handle exceptions gracefully, and format data for client consumption. A `Dockerfile` is provided for easy containerization and deployment.
## Features
### Server Functionality
The main server functionalities are defined in `server.py` (a minimal sketch follows this list):
- **research_resource**: Manages research resources.
- **deep_research**: Conducts detailed research operations.
- **write_report**: Creates comprehensive reports based on researched data.
- **get_research_sources**: Retrieves information sources for research.
- **get_research_context**: Provides contextual information tied to research.
- **research_query**: Handles incoming research-related queries.
- **run_server**: Initializes and runs the server.
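
The snippet below is a minimal sketch of how these pieces could fit together using the `mcp` Python SDK's `FastMCP` helper and the `gpt_researcher` package. The tool names mirror the list above, but the signatures and the `GPTResearcher` calls are illustrative assumptions, not the repository's exact code.
```python
# Minimal sketch only -- signatures and GPTResearcher usage are assumptions.
from gpt_researcher import GPTResearcher
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("gpt-researcher")

@mcp.tool()
async def deep_research(query: str) -> str:
    """Conduct detailed research for a query and return the gathered context."""
    researcher = GPTResearcher(query=query)
    await researcher.conduct_research()
    return researcher.get_research_context()

@mcp.tool()
async def write_report(query: str) -> str:
    """Create a comprehensive report based on the researched data."""
    researcher = GPTResearcher(query=query)
    await researcher.conduct_research()
    return await researcher.write_report()

def run_server() -> None:
    """Initialize and run the MCP server (assumes SSE transport, default port 8000)."""
    mcp.run(transport="sse")

if __name__ == "__main__":
    run_server()
```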
### Utility Functions
The `utils.py` file provides additional support (illustrative sketches follow this list), including:
- **Response Handling**:
- `create_error_response`
- `create_success_response`
- **Error & Exception Management**:
- `handle_exception`
- **Data Operations**:
- `get_researcher_by_id`
- `format_sources_for_response`
- `format_context_with_sources`
- `store_research_results`
- `create_research_prompt`
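
The response helpers can be pictured as below; the exact payload shapes are assumptions and only illustrate the pattern of returning a uniform envelope to clients.
```python
# Illustrative only -- the real payload shapes in utils.py may differ.
from typing import Any

def create_success_response(data: Any) -> dict:
    """Wrap a successful result in a consistent envelope."""
    return {"status": "success", "data": data}

def create_error_response(message: str) -> dict:
    """Wrap error details in the same envelope shape."""
    return {"status": "error", "message": message}

def handle_exception(exc: Exception) -> dict:
    """Convert an unexpected exception into a client-safe error response."""
    return create_error_response(f"{type(exc).__name__}: {exc}")
```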
### Docker Support
The included `Dockerfile` allows for simple containerized deployment (a sketch follows this list):
- Uses a lightweight Python 3.13 image.
- Installs required dependencies from `requirements.txt`.
- Configures the application to run via `server.py` on port `8000` using `CMD ["python", "server.py"]`.
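
A Dockerfile matching that description might look roughly like this; the actual file in the repository may differ in base image tag and copy paths.
```dockerfile
# Sketch of a Dockerfile consistent with the description above (paths are assumptions).
FROM python:3.13-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application source (assumes server.py and utils.py live in src/)
COPY src/ .

# The server listens on port 8000
EXPOSE 8000

CMD ["python", "server.py"]
```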
## Setup and Usage
### Prerequisites
- Python 3.13 or later.
- `pip` for dependency management.
- Docker (optional, for containerized deployment).
### Installation
1. Clone this repository.
2. Install dependencies:
``` bash
pip install -r requirements.txt
```
3. Run the application:
``` bash
python server.py
```
### Using Docker
Build and run the application as a Docker container:
1. Build the Docker image:
``` bash
docker build -t research-app .
```
2. Run the Docker container:
``` bash
docker run -p 8000:8000 research-app
```
The application will be accessible at `http://localhost:8000`.
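
Once the server is up, an MCP-capable client can connect to it. The snippet below is a sketch using the official `mcp` Python SDK and assumes the server exposes an SSE transport at `/sse`; adjust the URL if the project uses a different transport or path.
```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Assumes an SSE endpoint at /sse on port 8000.
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```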
## Folder Structure
```
|-- src/
|   |-- server.py        # Main server logic
|   |-- utils.py         # Reusable utility functions
|-- Dockerfile           # Containerization setup
|-- requirements.txt     # Dependencies file
|-- README.md            # Documentation (this file)
```