A guidance compatibility layer for llama_cpp_python
llama.cpp Guidance
The llama_cpp_guidance package provides an LLM client compatibility layer between llama_cpp_python and guidance.
Installation
The llama_cpp_guidance package can be installed using pip:
pip install llama_cpp_guidance
Basic Usage
Once installed, you can use the LlamaCpp class like any other guidance-compatible LLM class.
from pathlib import Path

import guidance
from llama_cpp_guidance.llm import LlamaCpp

guidance.llm = LlamaCpp(
    model_path=Path("../path/to/llamacpp/model.gguf"),
    n_gpu_layers=1,
    n_threads=8,
)

program = guidance(
    "The best thing about the beach is {{~gen 'best' temperature=0.7 max_tokens=10}}"
)
output = program()
print(output)

The best thing about the beach is that there’s always something to do.
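Beyond free-form {{gen}}, guidance's Handlebars-style template language also supports constrained choices via {{select}}, which limits the model's output to a fixed set of options. A minimal sketch, assuming the same hypothetical model path as above and guidance's 0.0.x template syntax (the template variables `review` and `choices` are names chosen here for illustration):

```python
from pathlib import Path

# Hypothetical model location -- adjust for your setup.
model_path = Path("../path/to/llamacpp/model.gguf")

# {{select 'sentiment' options=choices}} constrains generation to one
# of the values passed in as `choices`; {{review}} is plain substitution.
template = (
    "Is the following review positive or negative?\n"
    "Review: {{review}}\n"
    "Sentiment: {{select 'sentiment' options=choices}}"
)

# Only execute the program when a model file is actually present.
if model_path.exists():
    import guidance
    from llama_cpp_guidance.llm import LlamaCpp

    guidance.llm = LlamaCpp(model_path=model_path, n_gpu_layers=1, n_threads=8)
    program = guidance(template)
    output = program(review="Loved it!", choices=["positive", "negative"])
    # Generated variables are accessible by name on the program output.
    print(output["sentiment"])
```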
Hashes for llama_cpp_guidance-0.1.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 4b40cdf78a97a902b9f1969b5f49cfc1e0b584c515965bae053bbeb07188510f
MD5 | b1ec5f7173f4306bfd23de33bac06461
BLAKE2b-256 | 3c888cfdb09cf1e19100c41808800905de9c528b62adba55cef53407e9bfab3a