mirror of https://github.com/meta-llama/llama-stack.git
synced 2025-06-28 02:53:30 +00:00
Updates to setup and requirements for PyPI
This commit is contained in:
parent d802d0f051
commit f7e053e3ba
3 changed files with 17 additions and 12 deletions
README.md (22)
@@ -1,23 +1,29 @@
 # llama-toolchain
 
 This repo contains the API specifications for various components of the Llama Stack as well implementations for some of those APIs like model inference.
 
 The Stack consists of toolchain-apis and agentic-apis. This repo contains the toolchain-apis
 
 
-## Installation and Setup ##
+## Installation
+
+You can install this repository as a [package](https://pypi.org/project/llama-toolchain/) by just doing `pip install llama-toolchain`
+
+If you want to install from source:
+
 ```bash
 mkdir -p ~/local
 cd ~/local
 git clone git@github.com:meta-llama/llama-toolchain.git
 
 conda create -n toolchain python=3.10
 conda activate toolchain
 
 cd llama-toolchain
 pip install -e .
 ```
 
 ## Test with cli
+
 We have built a llama cli to make it easy to configure / run parts of the toolchain
 ```
 llama --help

@@ -31,13 +37,13 @@ options:
 subcommands:
   {download,inference,model,agentic_system}
 ```
 There are several subcommands to help get you started
 
 ## Start inference server that can run the llama models
 ```bash
 llama inference configure
 llama inference start
 ```
 
 
 ## Test client
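The `llama --help` output above lists the top-level subcommands `{download,inference,model,agentic_system}`. As an illustrative sketch only (the real CLI's internals are not part of this diff), a subcommand layout like that is typically wired up with argparse subparsers:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch mirroring the subcommand list shown by
    # `llama --help`; not the actual implementation from the repo.
    parser = argparse.ArgumentParser(prog="llama")
    subparsers = parser.add_subparsers(dest="command", required=True)
    for name in ("download", "inference", "model", "agentic_system"):
        subparsers.add_parser(name, help=f"{name} commands")
    return parser

args = build_parser().parse_args(["inference"])
print(args.command)  # inference
```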
@@ -6,11 +6,12 @@ fairscale
 fastapi
 fire
 flake8
-huggingface-hub
 httpx
+huggingface-hub
 hydra-core
 hydra-zen
 json-strong-typing
+llama_models
 matplotlib
 omegaconf
 pandas

@@ -28,5 +29,3 @@ ufmt==2.7.0
 usort==1.0.8
 uvicorn
 zmq
-
-llama_models[llama3_1] @ git+ssh://git@github.com/meta-llama/llama-models.git
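The requirements change swaps the direct VCS pin `llama_models[llama3_1] @ git+ssh://…` for a plain `llama_models` entry: PyPI rejects uploads whose dependencies use PEP 508 direct URL references, so they must be removed before publishing. A minimal sketch (my own helper, not part of the repo) that flags such lines:

```python
import re

# A PEP 508 direct reference has the form "name @ url"; PyPI refuses to
# host distributions whose requirements contain one.
DIRECT_REF = re.compile(r"^\s*[\w.\[\],-]+\s*@\s*\S+")

def has_direct_reference(requirement: str) -> bool:
    """Return True for requirement lines such as 'pkg @ git+ssh://...'."""
    return bool(DIRECT_REF.match(requirement))

print(has_direct_reference(
    "llama_models[llama3_1] @ git+ssh://git@github.com/meta-llama/llama-models.git"
))  # True
print(has_direct_reference("llama_models"))  # False
```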
setup.py (2)
@@ -16,7 +16,7 @@ def read_requirements():
 
 setup(
     name="llama_toolchain",
-    version="0.0.0.1",
+    version="0.0.1",
     author="Meta Llama",
     author_email="llama-oss@meta.com",
     description="Llama toolchain",
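The version bump from 0.0.0.1 to 0.0.1 also moves the package forward in PEP 440 ordering: release segments compare as numeric tuples, right-padded with zeros, so 0.0.0.1 sorts before 0.0.1. A stdlib-only sketch of that comparison (the third-party `packaging` library is the canonical implementation):

```python
def release_key(version: str, width: int = 4) -> tuple:
    # Split a simple release version on dots, compare numerically,
    # and zero-pad so shorter versions compare correctly.
    parts = tuple(int(p) for p in version.split("."))
    return parts + (0,) * (width - len(parts))

print(release_key("0.0.0.1") < release_key("0.0.1"))  # True
```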