API Updates: fleshing out RAG APIs, introduce "llama stack" CLI command (#51)

* add tools to chat completion request

* use templates for generating system prompts

* Moved ToolPromptFormat and jinja templates to llama_models.llama3.api

* <WIP> memory changes

- inlined AgenticSystemInstanceConfig so API feels more ergonomic
- renamed it to AgentConfig, AgentInstance -> Agent
- added a MemoryConfig and `memory` parameter
- added `attachments` to input and `output_attachments` to the response

- some naming changes

* InterleavedTextAttachment -> InterleavedTextMedia, introduce memory tool

* flesh out memory banks API

* agentic loop has a RAG implementation

* faiss provider implementation

* memory client works

* re-work tool definitions, fix FastAPI issues, fix tool regressions

* fix agentic_system utils

* basic RAG seems to work

* small bug fixes for inline attachments

* Refactor custom tool execution utilities

* Bug fix, show memory retrieval steps in EventLogger

* No need for api_key for Remote providers

* add special unicode character ↵ to showcase newlines in model prompt templates

* remove api.endpoints imports

* combine datatypes.py and endpoints.py into api.py

* Attachment / add TTL api

* split batch_inference from inference

* minor import fixes

* use a single impl for ChatFormat.decode_assistant_message

* use the interleaved_text_media_as_str() utility

* Fix api.datatypes imports

* Add blobfile for tiktoken

* Add ToolPromptFormat to ChatFormat.encode_message so that tools are encoded properly

* templates take an optional --format={json,function_tag} flag (see the sketch after this list)

* Rag Updates

* Add `api build` subcommand -- WIP

* fix

* build + run image seems to work

* <WIP> adapters

* bunch more work to make adapters work

* api build works for conda now

* ollama remote adapter works

* Several smaller fixes to make adapters work

Also, reorganized the pattern of __init__ inside providers so
configuration can stay lightweight

* llama distribution -> llama stack + containers (WIP)

* All the new CLI for api + stack work

* Convert the Fireworks and Together providers to the Adapter format

* Some quick fixes to the CLI behavior to make it consistent

* Updated README phew

* Update cli_reference.md

* llama_toolchain/distribution -> llama_toolchain/core

* Add termcolor

* update paths

* Add a log just for consistency

* chmod +x scripts

* Fix api dependencies not getting added to configuration

* missing import lol

* Delete utils.py; move to agentic system

* Support downloading of URLs for attachments for code interpreter

* Simplify and generalize `llama api build` yay

* Update `llama stack configure` to be very simple also

* Fix stack start

* Allow building an "adhoc" distribution

* Remove `llama api []` subcommands

* Fixes to llama stack commands and update docs

* Update documentation again and add error messages to llama stack start

* llama stack start -> llama stack run

* Change name of build for less confusion

* Add pyopenapi fork to the repository, update RFC assets

* Remove conflicting annotation

* Added a "--raw" option for model template printing

---------

Co-authored-by: Hardik Shah <hjshah@fb.com>
Co-authored-by: Ashwin Bharambe <ashwin@meta.com>
Co-authored-by: Dalton Flanagan <6599399+dltn@users.noreply.github.com>
Authored by Ashwin Bharambe on 2024-09-03 22:39:39 -07:00; committed via GitHub.
commit 7bc7785b0d (parent 35093c0b6f)
141 changed files with 8252 additions and 4032 deletions


@@ -4,5 +4,4 @@
 # This source code is licensed under the terms described in the LICENSE file in
 # the root directory of this source tree.
-from .datatypes import *  # noqa: F401 F403
-from .endpoints import *  # noqa: F401 F403
+from .api import *  # noqa: F401 F403


@@ -5,12 +5,79 @@
 # the root directory of this source tree.
 from datetime import datetime
-from typing import Any, Dict, List, Optional, Protocol
+from enum import Enum
+from typing import Any, Dict, List, Optional, Protocol, Union

 from llama_models.schema_utils import json_schema_type, webmethod
 from pydantic import BaseModel

 from llama_models.llama3.api.datatypes import *  # noqa: F403
-from .datatypes import *  # noqa: F403
+
+
+@json_schema_type
+class ExperimentStatus(Enum):
+    NOT_STARTED = "not_started"
+    RUNNING = "running"
+    COMPLETED = "completed"
+    FAILED = "failed"
+
+
+@json_schema_type
+class Experiment(BaseModel):
+    id: str
+    name: str
+    status: ExperimentStatus
+    created_at: datetime
+    updated_at: datetime
+    metadata: Dict[str, Any]
+
+
+@json_schema_type
+class Run(BaseModel):
+    id: str
+    experiment_id: str
+    status: str
+    started_at: datetime
+    ended_at: Optional[datetime]
+    metadata: Dict[str, Any]
+
+
+@json_schema_type
+class Metric(BaseModel):
+    name: str
+    value: Union[float, int, str, bool]
+    timestamp: datetime
+    run_id: str
+
+
+@json_schema_type
+class Log(BaseModel):
+    message: str
+    level: str
+    timestamp: datetime
+    additional_info: Dict[str, Any]
+
+
+@json_schema_type
+class ArtifactType(Enum):
+    MODEL = "model"
+    DATASET = "dataset"
+    CHECKPOINT = "checkpoint"
+    PLOT = "plot"
+    METRIC = "metric"
+    CONFIG = "config"
+    CODE = "code"
+    OTHER = "other"
+
+
+@json_schema_type
+class Artifact(BaseModel):
+    id: str
+    name: str
+    type: ArtifactType
+    size: int
+    created_at: datetime
+    metadata: Dict[str, Any]
+
+
+@json_schema_type
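
As a quick orientation to the experiment-tracking models consolidated into api.py above, here is a small usage sketch. It is not part of the diff: the import path is an assumption (the hunk does not show the file's package path), and only fields visible in the hunk are used.

```python
# Usage sketch for the models shown in the api.py hunk above.
# The import path below is assumed; adjust it to wherever this api.py lives.
from datetime import datetime, timezone

from llama_toolchain.observability.api.api import Metric, Run  # assumed path

# A run belongs to an experiment and carries free-form metadata.
run = Run(
    id="run-001",
    experiment_id="exp-001",
    status="running",
    started_at=datetime.now(timezone.utc),
    ended_at=None,
    metadata={},
)

# Metrics are timestamped values attached to a run by id.
metric = Metric(
    name="train/loss",
    value=0.42,
    timestamp=datetime.now(timezone.utc),
    run_id=run.id,
)
print(metric)
```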


@@ -1,80 +0,0 @@
-# Copyright (c) Meta Platforms, Inc. and affiliates.
-# All rights reserved.
-#
-# This source code is licensed under the terms described in the LICENSE file in
-# the root directory of this source tree.
-
-from datetime import datetime
-from enum import Enum
-from typing import Any, Dict, Optional, Union
-
-from llama_models.schema_utils import json_schema_type
-from pydantic import BaseModel
-
-
-@json_schema_type
-class ExperimentStatus(Enum):
-    NOT_STARTED = "not_started"
-    RUNNING = "running"
-    COMPLETED = "completed"
-    FAILED = "failed"
-
-
-@json_schema_type
-class Experiment(BaseModel):
-    id: str
-    name: str
-    status: ExperimentStatus
-    created_at: datetime
-    updated_at: datetime
-    metadata: Dict[str, Any]
-
-
-@json_schema_type
-class Run(BaseModel):
-    id: str
-    experiment_id: str
-    status: str
-    started_at: datetime
-    ended_at: Optional[datetime]
-    metadata: Dict[str, Any]
-
-
-@json_schema_type
-class Metric(BaseModel):
-    name: str
-    value: Union[float, int, str, bool]
-    timestamp: datetime
-    run_id: str
-
-
-@json_schema_type
-class Log(BaseModel):
-    message: str
-    level: str
-    timestamp: datetime
-    additional_info: Dict[str, Any]
-
-
-@json_schema_type
-class ArtifactType(Enum):
-    MODEL = "model"
-    DATASET = "dataset"
-    CHECKPOINT = "checkpoint"
-    PLOT = "plot"
-    METRIC = "metric"
-    CONFIG = "config"
-    CODE = "code"
-    OTHER = "other"
-
-
-@json_schema_type
-class Artifact(BaseModel):
-    id: str
-    name: str
-    type: ArtifactType
-    size: int
-    created_at: datetime
-    metadata: Dict[str, Any]