diff --git a/README.md b/README.md
index 6b79173..2d8893a 100644
--- a/README.md
+++ b/README.md
@@ -41,8 +41,11 @@ llama-cpp-python offers a web server which aims to act as a drop-in replacement
 
 To install the server package and get started:
 
 pip install llama-cpp-python[server]
 
+export MODEL=./models/your_model.bin
+python3 -m llama_cpp.server
+
 Navigate to http://localhost:8000/docs to see the OpenAPI documentation.
 
 # Usage
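Once the server is started with the commands added above, it can be exercised from any HTTP client. A minimal sketch in Python, assuming the server's default host and port (localhost:8000) and an OpenAI-compatible `/v1/completions` route; the exact path and schema are not shown in this hunk and should be checked against the server's own docs page:

```python
import requests

# Send a completion request to the locally running llama_cpp.server instance.
# The endpoint path and JSON fields follow the OpenAI completions schema,
# which the server aims to be a drop-in replacement for; adjust them if the
# installed version exposes different routes.
response = requests.post(
    "http://localhost:8000/v1/completions",
    json={
        "prompt": "Q: Name the planets in the solar system. A: ",
        "max_tokens": 64,
        "stop": ["Q:", "\n"],
    },
)
print(response.json()["choices"][0]["text"])
```

The /docs page referenced in the diff lists the routes and request schemas actually served by the installed version, so it is the authoritative reference if the assumed path differs.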