Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-09)
Fix fireworks and update the test
Sadly, don't look for eom_id / eot_id, since providers don't return the last token
This commit is contained in:
parent bbd3a02615
commit dba7caf1d0
4 changed files with 37 additions and 37 deletions
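The test change itself is not shown in this excerpt, so the following is only a hedged sketch of the kind of assertion the commit description implies: instead of checking that the streamed text ends with the literal eom_id / eot_id token (which remote providers strip), a test would check the reported stop reason. The StopReason values and the check_streamed_completion name are illustrative assumptions, not the actual test code.

from enum import Enum


class StopReason(Enum):
    end_of_turn = "end_of_turn"
    end_of_message = "end_of_message"
    out_of_tokens = "out_of_tokens"


def check_streamed_completion(text: str, stop_reason: StopReason) -> None:
    # Do NOT assert text.endswith("<|eot_id|>") or "<|eom_id|>": providers
    # don't return that final special token in the stream.
    assert text, "expected some generated text"
    assert stop_reason in (StopReason.end_of_turn, StopReason.end_of_message)


check_streamed_completion("Hello there!", StopReason.end_of_turn)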
@@ -13,3 +13,13 @@ providers:
     config:
       host: localhost
       port: 7002
+  - provider_id: test-together
+    provider_type: remote::together
+    config: {}
+# if a provider needs private keys from the client, they use the
+# "get_request_provider_data" function (see distribution/request_headers.py)
+# this is a place to provide such data.
+provider_data:
+  "test-together":
+    together_api_key:
+      0xdeadbeefputrealapikeyhere
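The comments in the hunk above describe the provider_data mechanism: a client-supplied block that a provider reads at request time via the "get_request_provider_data" function (see distribution/request_headers.py). Below is a minimal sketch, under stated assumptions, of how a provider might validate the "test-together" entry; the TogetherRequestProviderData model and resolve_together_api_key helper are illustrative names, not the actual llama-stack code.

from pydantic import BaseModel


class TogetherRequestProviderData(BaseModel):
    together_api_key: str


def resolve_together_api_key(raw_provider_data: dict) -> str:
    # raw_provider_data stands in for whatever get_request_provider_data()
    # returns for the "test-together" entry in the run config above.
    data = TogetherRequestProviderData(**raw_provider_data)
    return data.together_api_key


print(resolve_together_api_key({"together_api_key": "0xdeadbeefputrealapikeyhere"}))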