Commit graph

56 commits

Author SHA1 Message Date
Nathan Weinberg
e777d965a1
docs: add addn server guidance for Linux users in Quick Start (#972)
# What does this PR do?

- [x] Addresses issue #971


## Test Plan
Ran docs build locally

## Sources
See discussion linked in the issue

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

Signed-off-by: Nathan Weinberg <nweinber@redhat.com>
Co-authored-by: Mert Parker <mertpaker@gmail.com>
2025-02-05 20:57:51 -08:00
Ihar Hrachyshka
f4343f7dc0
docs: clarify host.docker.internal works for recent podman (#977)
The host.docker.internal alias was implemented in podman 4.7.0: b672ddc792

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>

# What does this PR do?

Follow-up to the previous podman-specific doc update.

## Test Plan

Please describe:
 - tests you ran to verify your changes with result summaries.
 - provide instructions so it can be reproduced.


## Sources

Please link relevant resources if necessary.


## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [x] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-02-05 16:02:05 -08:00
Ihar Hrachyshka
5c8e35a9e2
docs, tests: replace datasets.rst with memory_optimizations.rst (#968)
datasets.rst was removed from torchtune repo.

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>

# What does this PR do?

Replace a missing (404) document with another one that exists. (Removed it from the list when memory_optimizations.rst was already pulled.)


## Test Plan

Please describe:
 - tests you ran to verify your changes with result summaries.
 - provide instructions so it can be reproduced.


## Sources

Please link relevant resources if necessary.


## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [x] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-02-05 11:25:56 -05:00
Ihar Hrachyshka
529708215c
[docs] Make RAG example self-contained (#962)
Before the patch, the example could not be executed verbatim without copy-pasting the client function from the inference example. I think it's better to have examples self-contained, especially in a getting started guide.
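
For context, a minimal sketch of what a self-contained setup looks like, assuming the `LlamaStackClient` from the llama-stack-client Python SDK; the `create_http_client` helper name and the default port are assumptions for illustration:

```python
# Minimal, self-contained client setup so the RAG snippet runs verbatim.
import os

from llama_stack_client import LlamaStackClient


def create_http_client() -> LlamaStackClient:
    # Assumes the Quick Start server is listening on LLAMA_STACK_PORT.
    port = os.environ.get("LLAMA_STACK_PORT", "8321")
    return LlamaStackClient(base_url=f"http://localhost:{port}")


client = create_http_client()
print(client.models.list())  # quick sanity check that the client is wired up
```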

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>

# What does this PR do?

See above.

## Test Plan

Confirmed example can now be executed verbatim.

## Sources

Please link relevant resources if necessary.


## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [x] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-02-04 16:22:50 -08:00
Ihar Hrachyshka
0cbb3e401c
docs: miscellaneous small fixes (#961)
- **[docs] Fix misc typos and formatting issues in intro docs**
- **[docs]: Export variables (e.g. INFERENCE_MODEL) in getting_started**
- **[docs] Show that `llama-stack-client configure` will ask for api key**

# What does this PR do?

Miscellaneous fixes in the documentation; not worth reporting an issue.

## Test Plan

No code changes. Addressed issues spotted when walking through the guide. Confirmed locally.

## Sources

Please link relevant resources if necessary.

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [x] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

---------

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
2025-02-04 15:31:30 -08:00
Bill Murdock
b0dec797a0
Add Podman instructions to Quick Start (#957)
Podman is a popular alternative to Docker, so it would be nice to make it clear that it can also be used to deploy the container for the server. The instructions are a little different because you have to create the directory yourself (unlike Docker, which creates it for you).

# What does this PR do?

- [ ] Add Podman instructions to Quick Start

## Test Plan

Documentation only.


## Sources

I tried it out and it worked.

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
2025-02-04 14:37:02 -08:00
Charlie Doern
26aef50bc5
if client.initialize fails, the example should exit (#954)
# What does this PR do?

The example script can exit gracefully if the boolean returned from initialize is used properly (see the sketch below).
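
A minimal sketch of the pattern, assuming the library-client entry point used by the examples at the time (the import path is an assumption and may differ in newer releases):

```python
# Minimal sketch: bail out of the example if initialization fails.
import sys

from llama_stack.distribution.library_client import LlamaStackAsLibraryClient  # assumed path

client = LlamaStackAsLibraryClient("ollama")
if not client.initialize():
    # initialize() returns a boolean; the example should not continue on failure.
    print("llama stack not built properly", file=sys.stderr)
    sys.exit(1)
```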

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-02-04 13:54:21 -08:00
Nathan Weinberg
7a72082cdd
fix: formatting for ollama note in Quick Start doc (#945)
# What does this PR do?
Fixes formatting for Ollama note found here:
https://llama-stack.readthedocs.io/en/latest/getting_started/index.html#start-ollama

- [ ] Addresses issue (#issue)

## Test Plan
Ran local docs build as described
[here](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md#building-the-documentation)

## Sources
N/A

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

Signed-off-by: Nathan Weinberg <nweinber@redhat.com>
2025-02-03 14:13:57 -08:00
Nathan Weinberg
1e36721686
fix: broken link in Quick Start doc (#943)
# What does this PR do?
Ollama download link is broken on this page:
https://llama-stack.readthedocs.io/en/latest/getting_started/index.html

## Test Plan
N/A

## Sources
https://ollama.com/docs/installation returns a 404; replaced with https://ollama.com/download

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [x] Ran pre-commit to handle lint / formatting issues.
- [x] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [x] Updated relevant documentation.
- [x] Wrote necessary unit or integration tests.

Signed-off-by: Nathan Weinberg <nweinber@redhat.com>
2025-02-03 13:45:35 -08:00
Yuan Tang
34ab7a3b6c
Fix precommit check after moving to ruff (#927)
The lint check in the main branch is failing. This fixes the lint check after we moved to ruff in https://github.com/meta-llama/llama-stack/pull/921. We also need to move to a `ruff.toml` file and fix or ignore some additional checks.

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
2025-02-02 06:46:45 -08:00
Ashwin Bharambe
9351a4b2d7 Update documentation 2025-01-23 17:10:57 -08:00
ehhuang
2fefe8dacd
Update 'first RAG agent' in gettingstarted doc (#867)
# What does this PR do?

Fix documentation to reflect the new API


## Test Plan
Before:

User> What are the top 5 topics that were explained? Only list succinct
bullet points.
inference> I'm ready to help, but we haven't discussed any topics yet!
This is the start of our conversation. What would you like to talk
about? I can summarize our discussion at the end if you'd like.


After: running with the change produces a relevant response (screenshot attached in the PR: https://github.com/user-attachments/assets/a7dece3c-e8b4-4a60-9092-ba544c87dffd).
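
A rough sketch of the shape of the updated RAG agent example, assuming the `Agent`/`AgentConfig` helpers and the `builtin::rag` toolgroup from the llama-stack-client Python SDK of that era; exact field names are assumptions and may have changed since:

```python
# Hedged sketch of a RAG agent using the (assumed) updated client API.
import os

from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent
from llama_stack_client.lib.agents.event_logger import EventLogger
from llama_stack_client.types.agent_create_params import AgentConfig

client = LlamaStackClient(base_url="http://localhost:8321")  # port is an assumption
vector_db_id = "my_documents"  # assumes a vector DB was registered earlier in the guide

agent_config = AgentConfig(
    model=os.environ["INFERENCE_MODEL"],
    instructions="You are a helpful assistant",
    enable_session_persistence=False,
    toolgroups=[{"name": "builtin::rag", "args": {"vector_db_ids": [vector_db_id]}}],
)
rag_agent = Agent(client, agent_config)
session_id = rag_agent.create_session("test-session")

response = rag_agent.create_turn(
    messages=[{"role": "user", "content": "What are the top 5 topics that were explained?"}],
    session_id=session_id,
)
for log in EventLogger().log(response):
    log.print()
```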



## Sources

Please link relevant resources if necessary.


## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.

Co-authored-by: Eric Huang (AI Platform) <erichuang@fb.com>
2025-01-23 17:02:04 -08:00
Hardik Shah
94ffaf468c
More updates to ReadTheDocs (#861)
Improve Contributing section
2025-01-23 12:50:38 -08:00
Hardik Shah
65f07c3d63
Update Documentation (#838)
# What does this PR do?

Update README and other documentation


## Before submitting

- [X] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
2025-01-22 20:38:52 -08:00
Paul McCarthy
e1decaec9d
Fixing small typo in quick start guide (#807)
# What does this PR do?

Fixing small typo in the quick start guide

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
2025-01-17 11:15:55 -08:00
Derek Slager
8ba29b19f2
Minor Quick Start documentation updates. (#692)
Clarifying Python version requirement, fixing a sample command.
2024-12-30 14:19:05 -08:00
raghotham
79f8bc8416
Update index.md 2024-12-30 11:32:28 -08:00
Xi Yan
28ce511986 fix --endpoint docs 2024-12-26 14:32:07 -08:00
Ashwin Bharambe
1274fa4c0d Add documentations for building applications and with some content for agentic loop 2024-12-08 14:56:37 -08:00
Aidan Do
0cb996c18d
doc: quickstart guide errors (#575)
# What does this PR do?

Addresses a few errors I got when running the quick start guide:
https://llama-stack.readthedocs.io/en/latest/getting_started/index.html.
We should keep this up to date to maintain engagement with the
community.

I've annotated the PR below. Could you PTAL 🙏 ?

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
2024-12-06 12:03:31 -08:00
Ashwin Bharambe
0481fa9540 Fix broken links with docs 2024-11-22 20:42:17 -08:00
Ashwin Bharambe
900b0556e7 Much more documentation work, things are getting a bit consumable right now 2024-11-22 14:06:18 -08:00
Ashwin Bharambe
a0a00f1345 Update telemetry to have TEXT be the default log format 2024-11-21 15:18:45 -08:00
Ashwin Bharambe
55c55b9f51 Update Quick Start significantly 2024-11-21 13:20:55 -08:00
Dinesh Yeduguru
b3f9e8b2f2
Restructure docs (#494)
Rendered docs at: https://llama-stack.readthedocs.io/en/doc-simplify/
2024-11-20 15:54:47 -08:00
Xi Yan
b0fdf7552a docs 2024-11-19 16:41:45 -08:00
Xi Yan
c49acc5226 docs 2024-11-19 16:39:40 -08:00
Xi Yan
f78200b189 docs 2024-11-19 16:37:30 -08:00
Xi Yan
2da93c8835 fix 3.2-1b fireworks 2024-11-19 14:20:07 -08:00
Xi Yan
189df6358a codegen docs 2024-11-19 14:16:00 -08:00
Xi Yan
1b0f5fff5a fix curl endpoint 2024-11-19 10:26:05 -08:00
Ashwin Bharambe
e8d3eee095 Fix docs yet again 2024-11-18 23:51:35 -08:00
Ashwin Bharambe
d463d68e1e Update docs 2024-11-18 23:21:25 -08:00
Ashwin Bharambe
7693786322 Use HF names for registering fireworks and together models 2024-11-18 22:34:47 -08:00
Ashwin Bharambe
939056e265 More documentation fixes 2024-11-18 17:06:13 -08:00
Ashwin Bharambe
e40404625b Update to docs 2024-11-18 16:52:48 -08:00
Ashwin Bharambe
afa4f0b19f Update remote vllm docs 2024-11-18 16:34:33 -08:00
Ashwin Bharambe
47c37fd831 Fixes 2024-11-18 16:03:53 -08:00
Ashwin Bharambe
3aedde2ab4 Add a pre-commit for distro_codegen but it does not work yet 2024-11-18 15:21:13 -08:00
Ashwin Bharambe
2a31163178
Auto-generate distro yamls + docs (#468)
# What does this PR do?

Automatically generates
- build.yaml
- run.yaml
- run-with-safety.yaml
- parts of markdown docs

for the distributions.
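
For illustration only (this is not the repo's actual distro_codegen script; names, fields, and layout are hypothetical), a minimal sketch of what generating per-distribution YAML files looks like:

```python
# Hypothetical sketch of auto-generating per-distribution config files.
from pathlib import Path

import yaml  # pyyaml

DISTROS = {
    "ollama": {"providers": {"inference": ["remote::ollama"]}},
    "remote-vllm": {"providers": {"inference": ["remote::vllm"]}},
}


def generate(out_dir: str = "distributions") -> None:
    for name, template in DISTROS.items():
        target = Path(out_dir) / name
        target.mkdir(parents=True, exist_ok=True)
        # build.yaml describes what goes into the image; run.yaml how to run it.
        (target / "build.yaml").write_text(yaml.safe_dump({"name": name, **template}))
        (target / "run.yaml").write_text(yaml.safe_dump({"image_name": name, **template}))


if __name__ == "__main__":
    generate()
```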

## Test Plan

At this point, this only updates the YAMLs and the docs. Some testing
(especially with ollama and vllm) has been performed but needs to be
much more tested.
2024-11-18 14:57:06 -08:00
Dinesh Yeduguru
fdff24e77a
Inference to use provider resource id to register and validate (#428)
This PR changes the way a model id gets translated to the final model name that gets passed through to the provider.
Major changes include:
1) Providers are responsible for registering an object and, as part of the registration, returning the object with the correct provider-specific model name in provider_resource_id.
2) To help with looking a model up by its different names, a new ModelLookup class is created (see the sketch below).
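
An illustrative sketch of the pattern (not the actual llama-stack implementation; the class and field names below are simplified stand-ins):

```python
# Illustrative only: resolve a user-facing model id to the provider's own id
# at registration time, so later calls validate against provider_resource_id.
from dataclasses import dataclass


@dataclass
class Model:
    identifier: str                 # name the user registers, e.g. an HF-style id
    provider_resource_id: str = ""  # name the provider actually serves


class ProviderModelLookup:
    """Maps the various aliases a model may be known by to the provider's id."""

    def __init__(self, aliases: dict[str, str]):
        self.aliases = aliases

    def resolve(self, identifier: str) -> str:
        if identifier not in self.aliases:
            raise ValueError(f"unknown model: {identifier}")
        return self.aliases[identifier]


def register_model(lookup: ProviderModelLookup, model: Model) -> Model:
    # The provider returns the object with its own resource id filled in.
    model.provider_resource_id = lookup.resolve(model.identifier)
    return model


# Usage: different user-facing names map to the same provider resource id.
lookup = ProviderModelLookup({
    "meta-llama/Llama-3.1-8B-Instruct": "llama3.1:8b-instruct-fp16",
    "Llama3.1-8B-Instruct": "llama3.1:8b-instruct-fp16",
})
registered = register_model(lookup, Model("meta-llama/Llama-3.1-8B-Instruct"))
print(registered.provider_resource_id)
```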



Tested all inference providers including together, fireworks, vllm,
ollama, meta reference and bedrock
2024-11-12 20:02:00 -08:00
Ashwin Bharambe
3d7561e55c
Rename all inline providers with an inline:: prefix (#423) 2024-11-11 22:19:16 -08:00
Xi Yan
b0b9c905b3 docs 2024-11-09 10:22:41 -08:00
Xi Yan
cc61fd8083 docs 2024-11-09 09:00:18 -08:00
Xi Yan
0c14761453 docs 2024-11-09 08:57:51 -08:00
Ashwin Bharambe
4986e46188
Distributions updates (slight updates to ollama, add inline-vllm and remote-vllm) (#408)
* remote vllm distro

* add inline-vllm details, fix things

* Write some docs
2024-11-08 18:09:39 -08:00
Xi Yan
bd0622ef10 update docs 2024-11-08 12:47:05 -08:00
Xi Yan
7ee9f8d8ac rename 2024-11-08 10:34:48 -08:00
Xi Yan
b1d7376730 kill tgi/cpu 2024-11-08 10:33:45 -08:00
Xi Yan
8350f2df4c
[docs] refactor remote-hosted distro (#402)
* move docs

* docs
2024-11-07 19:16:38 -08:00