litellm-mirror/tests/llm_translation/test_aiohttp_openai.py
Added in "(perf) use aiohttp for custom_openai" (#7514) by Ishaan Jaff (commit d861aa8ff3)

import json
import os
import sys
from datetime import datetime

import pytest

sys.path.insert(
    0, os.path.abspath("../../")
)  # Adds the parent directory to the system path

import litellm


@pytest.mark.asyncio
async def test_aiohttp_openai():
    """Smoke test: route a chat completion through the aiohttp_openai provider
    against a fake OpenAI-compatible endpoint."""
    litellm.set_verbose = True
    response = await litellm.acompletion(
        model="aiohttp_openai/fake-model",
        messages=[{"role": "user", "content": "Hello, world!"}],
        api_base="https://exampleopenaiendpoint-production.up.railway.app/v1/chat/completions",
        api_key="fake-key",
    )
    print(response)
    print(response.model_dump_json(indent=4))
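

# A minimal streaming variant of the test above, included as an illustrative
# sketch only: it assumes the same fake OpenAI-compatible endpoint can serve
# streamed responses. The test name and the stream=True flow are not part of
# the original file.
@pytest.mark.asyncio
async def test_aiohttp_openai_streaming():
    litellm.set_verbose = True
    response = await litellm.acompletion(
        model="aiohttp_openai/fake-model",
        messages=[{"role": "user", "content": "Hello, world!"}],
        api_base="https://exampleopenaiendpoint-production.up.railway.app/v1/chat/completions",
        api_key="fake-key",
        stream=True,
    )
    # With stream=True, litellm.acompletion returns an async iterator of chunks.
    async for chunk in response:
        print(chunk)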