Refactor module structure and improve Docker configuration

Reorganized module directories under `phoenix_technologies` for better namespace clarity and maintainability. Updated the Dockerfile to use an environment variable for the application entry point, enhancing flexibility in deployment. Additionally, revamped the README to reflect the new structure and provide clearer project documentation.
ThomasTaroni 2025-05-03 11:03:15 +02:00
parent 6bab336883
commit eec1b34517
7 changed files with 117 additions and 67 deletions

Dockerfile

@@ -4,6 +4,9 @@ FROM python:3.13-slim
# Set environment variable for Python
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
+
+# Use an environment variable to define the script path
+ENV APP_ENTRYPOINT "/app/mcp/gpt_researcher/server.py"
# Set working directory within the container
WORKDIR /app
@@ -22,4 +25,4 @@ COPY src/ /app/
EXPOSE 8000
# Set the default command to run the app with `uvicorn`
CMD ["python", "server.py"]
CMD ["python", "$APP_ENTRYPOINT"]

README.md
# GPT Researcher

## Project Description
**GPT Researcher** is a Python-based tool designed to help researchers and knowledge seekers generate detailed research insights efficiently and accurately. It combines machine-learning-backed methodologies with intuitive features for processing and analyzing research data.

The project revolves around managing research queries, gathering insights from sources, and producing detailed research outputs. It supports both quick general searches and deep explorations of specific topics, and it includes a robust system for managing logs, exceptions, and user-defined custom responses.

The application is containerized with Docker, making it easy to deploy in any environment, and is built from modular components for research management, response formatting, and the overall research workflow.

## Features
## Features
### Server Functionality
The main server functionalities are defined in `server.py`, which includes:
- **research_resource**: Management of research resources.
- **deep_research**: Conducts detailed research operations.
- **write_report**: Creates comprehensive reports based on researched data.
- **get_research_sources**: Retrieves information sources for research.
- **get_research_context**: Provides contextual information tied to research.
- **research_query**: Handles incoming research-related queries.
- **run_server**: Initializes and runs the server.
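
As a rough illustration of how these functions are typically exposed over the MCP protocol, here is a hypothetical sketch assuming the official `mcp` Python SDK (FastMCP); the actual wiring lives in `server.py` and may differ:

``` python
# Hypothetical sketch, not the project's actual code: shows how server.py
# might register one of the tools above with the official `mcp` SDK.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("GPT Researcher")


@mcp.tool()
async def deep_research(query: str) -> str:
    """Conduct a detailed research operation on `query`."""
    ...  # delegate to the research backend


def run_server() -> None:
    """Initialize and run the MCP server."""
    mcp.run()


if __name__ == "__main__":
    run_server()
```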
### Key Highlights:
1. **Research Operations**
   - Perform deep dives into specific topics through a systematic research mechanism.
   - Quickly search for general information in a fast and lightweight fashion (`quick_search`).
   - Retrieve necessary sources for research with flexibility.
2. **Resource Management**
   - Organize and store research results for later analysis.
   - Format research sources and context to create user-friendly reports.
   - Generate customizable research prompts for optimized research requests.
3. **Logging and Error Handling**
   - Employ a custom logging handler (`CustomLogsHandler`) to manage logs effectively.
   - Provide user-friendly error responses with flexible exception handling.
4. **Report Generation**
   - Automate the creation of research reports from reusable templates.
   - Aggregate research resources and contexts while ensuring professional-quality formatting.
5. **Web Server Support**
   - Run a backend server with the `run_server` function for handling client-side operations and queries.
6. **Containerized Deployment**
   - Fully Dockerized for fast setup and consistent deployment across environments.

### Utility Functions
The `utils.py` file provides additional support, including:
- **Response Handling**:
  - `create_error_response`
  - `create_success_response`
- **Error & Exception Management**:
  - `handle_exception`
- **Data Operations**:
  - `get_researcher_by_id`
  - `format_sources_for_response`
  - `format_context_with_sources`
  - `store_research_results`
  - `create_research_prompt`
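
As a purely hypothetical sketch of how the response-handling helpers might be shaped (the names come from the list above; the payload shapes are assumptions, and the real signatures live in `utils.py`):

``` python
# Hypothetical sketch, not the project's actual code: assumes the helpers
# build dict-shaped envelopes that the server serializes for clients.
from typing import Any


def create_success_response(data: Any) -> dict:
    """Wrap a payload in a uniform success envelope."""
    return {"status": "success", "data": data}


def create_error_response(message: str, code: int = 500) -> dict:
    """Wrap an error message in a uniform error envelope."""
    return {"status": "error", "error": {"message": message, "code": code}}


def handle_exception(exc: Exception) -> dict:
    """Convert an unexpected exception into a user-friendly error response."""
    return create_error_response(f"{type(exc).__name__}: {exc}")
```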
## Installation
To get the project up and running locally, follow these steps:
### Prerequisites
- Python 3.13+
- Docker Engine (for containerized deployments)
- `pip` for managing necessary Python packages
### Steps:
1. **Clone the Repository**:
   ``` sh
   git clone <repository-url>
   cd gpt_researcher
   ```
2. **Install Dependencies** (if you are working outside of Docker):
   ``` sh
   pip install -r requirements.txt
   ```
3. **Run the Application**:
   ``` sh
   python src/mcp/gpt_researcher/server.py
   ```
4. **Using Docker**: Build and run the Docker container:
   ``` sh
   docker build -t gpt-researcher .
   docker run -p 8000:8000 gpt-researcher
   ```
   The app will be available at `http://localhost:8000`.
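
Because the Dockerfile reads the entry point from the `APP_ENTRYPOINT` environment variable, the container can be pointed at a different script without rebuilding, provided the `CMD` actually expands the variable (see the Dockerfile note above). A hypothetical override, with an example path that must exist inside the image:

``` sh
# Example only: override the entry point at run time.
docker run -p 8000:8000 \
  -e APP_ENTRYPOINT=/app/mcp/gpt_researcher/server.py \
  gpt-researcher
```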
## File Structure
### Overview:
``` plaintext
src/
├── mcp/
│   ├── gpt_researcher/
│   │   ├── server.py    # Main entrypoint for research workflow and server
│   │   └── utils.py     # Utility functions for research formatting and storage
Dockerfile               # Defines the container runtime environment
requirements.txt         # Python dependencies for the project
```
### Key Components:
1. **`server.py`**:
   - Implements the main research operations: `deep_research`, `quick_search`, and `write_report`.
   - Contains functions to retrieve research sources, manage contexts, and execute research prompts.
2. **`utils.py`**:
   - Provides utilities for managing research storage, formatting sources and context for readable outputs, and generating responses to different types of user interactions.
3. **`Dockerfile`**:
   - A configuration file for containerizing the application using the official Python 3.13-slim image.
## Usage Instructions
### Available Endpoints
Once the server is running, you can interact with the application by sending requests to its available endpoints. Below are the core functionalities, followed by a hypothetical client sketch:
1. **Quick Research**:
- Quickly get general insights based on a query.
2. **Deep Research**:
- Run comprehensive research on a specific topic.
3. **Custom Research Reports**:
- Combine sources and analysis to create detailed user reports.
4. **Access Research Context**:
- Retrieve the context and sources behind a result for deeper understanding.
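
Because this is an MCP server (see the module docstring below), clients typically drive these operations through MCP tool calls rather than plain REST. A hypothetical sketch, assuming the official `mcp` Python SDK and an SSE endpoint at `http://localhost:8000/sse`; verify the transport, tool names, and argument names against `server.py`:

``` python
# Hypothetical client sketch: the transport URL, tool name, and argument
# name ("query") are assumptions to be checked against server.py.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "deep_research", {"query": "impact of solid-state batteries"}
            )
            print(result)


asyncio.run(main())
```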
## Contributing
We follow a **code of conduct** that expects all contributors to maintain a respectful and inclusive environment.
### Contribution Guidelines:
1. Fork the repository and do your development locally.
2. Create a branch for your changes:
   ``` sh
   git checkout -b feature-name
   ```
3. Make your changes and test them thoroughly.
4. Submit a pull request with a clear description of the changes.
## Code of Conduct
As contributors and maintainers of this project, we pledge to respect all members of the community by fostering a positive and inclusive environment. Instances of abuse, harassment, or unacceptable behavior will not be tolerated.
For details, please refer to the [Code of Conduct](CODE_OF_CONDUCT.md).
## License
This project is licensed under the **MIT License**. For more information, refer to the LICENSE file in the repository.
## Feedback and Support
For queries, suggestions, or feedback, feel free to open an issue in this repository. You can also contact the maintainers directly via email or GitHub.

@@ -1,8 +0,0 @@
"""
GPT Researcher MCP Server
This module provides an MCP server implementation for GPT Researcher,
allowing AI assistants to perform web research and generate reports via the MCP protocol.
"""
__version__ = "0.1.0"

@@ -0,0 +1,8 @@
"""
GPT Researcher MCP Server
This module provides an MCP server implementation for GPT Researcher,
allowing AI assistants to perform web research and generate reports via the MCP protocol.
"""
__version__ = "0.1.0"