mcp-gpt-researcher/Dockerfile
ThomasTaroni 44b91b9375 Refactor codebase to implement MCP server for GPT Researcher
Replaced FastAPI app with an MCP server implementation, enhancing flexibility and modularity for research operations. Deprecated `phoenix_technologies` package, updated server logic, added utility functions, and revised dependencies in `requirements.txt`. Updated Dockerfile and README to align with the new architecture.
2025-04-26 17:54:43 +02:00
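
For context, a minimal sketch of what the MCP server entrypoint (the server.py started by this Dockerfile's CMD) could look like, assuming the official MCP Python SDK's FastMCP class and the gpt-researcher package; the tool name, parameters, port, and SSE transport are illustrative assumptions, not taken from this repository:

from gpt_researcher import GPTResearcher
from mcp.server.fastmcp import FastMCP

# Hypothetical server name and port; the port mirrors the EXPOSE 8000 below
mcp = FastMCP("gpt-researcher", port=8000)

@mcp.tool()
async def research(query: str) -> str:
    """Run GPT Researcher on a query and return the written report."""
    researcher = GPTResearcher(query=query, report_type="research_report")
    await researcher.conduct_research()
    return await researcher.write_report()

if __name__ == "__main__":
    # SSE transport serves the MCP server over HTTP
    mcp.run(transport="sse")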


# Use the official Python image as a base
FROM python:3.13-slim
# Keep Python from writing .pyc files and force unbuffered output
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# Set working directory within the container
WORKDIR /app
# Copy the dependency manifest first so the install layer can be cached
COPY requirements.txt .
# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt
# Copy the application source into the container
COPY src/ /app/
# Expose the port the MCP server listens on
EXPOSE 8000
# Set the default command to start the MCP server
CMD ["python", "server.py"]