# llama-stack-mirror/fp8_requirements.txt
torch>=2.4.0
accelerate
black==24.4.2
codeshield
fairscale
fastapi
fire
flake8
huggingface-hub
httpx
hydra-core
hydra-zen
json-strong-typing
matplotlib
omegaconf
pandas
Pillow
pre-commit
pydantic==1.10.13
pydantic_core==2.18.2  # note: pydantic_core is a pydantic v2 component and is not used by the pydantic 1.x pin above
python-dotenv
python-openapi
requests
tiktoken
transformers
ufmt==2.7.0
usort==1.0.8
uvicorn
pyzmq  # the "zmq" name on PyPI is a deprecated shim that just pulls in pyzmq
fbgemm-gpu==0.8.0  # FP8 quantized GEMM kernels
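A minimal install sketch, assuming this file is saved under its repo path `fp8_requirements.txt` and a CUDA-capable environment (fbgemm-gpu ships GPU kernels):

```shell
# Create an isolated environment and install the FP8 dependency set.
python -m venv .venv
. .venv/bin/activate
pip install -r fp8_requirements.txt
```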