llm-command-r
Access the Cohere Command R family of models via the Cohere API
Installation
Install this plugin in the same environment as LLM.
llm install llm-command-r
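To confirm the plugin was picked up, you can list the plugins LLM has loaded:
llm plugins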
Configuration
You will need a Cohere API key. Configure it like this:
llm keys set cohere
# Paste key here
To use an alternative base URL for the Cohere API, set the COHERE_BASE_URL environment variable.
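For example, to route requests through a different endpoint (the URL below is just a placeholder, not a real service):
export COHERE_BASE_URL=https://cohere-proxy.example.com
llm -m command-r 'Say hello from Command R'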
Usage
This plugin adds two models.
llm -m command-r 'Say hello from Command R'
llm -m command-r-plus 'Say hello from Command R Plus'
The Command R models have the ability to search the web as part of answering a prompt.
You can enable this feature by passing the -o websearch 1 option:
llm -m command-r 'What is the LLM CLI tool?' -o websearch 1
Running a search costs more, because the search results are included in the prompt and consume additional tokens.
The full search results are stored as JSON in the LLM logs.
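One way to review them is LLM's built-in logs command, which can output the most recent logged response as JSON:
llm logs -n 1 --json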
You can also use the command-r-search command provided by this plugin, which includes a list of the documents used to answer your question as part of the output:
llm command-r-search 'What is the LLM CLI tool by simonw?'
Example output:
The LLM CLI tool is a command-line utility that allows users to access large language models. It was created by Simon Willison and can be installed via pip, Homebrew or pipx. The tool supports interactions with remote APIs and models that can be locally installed and run. Users can run prompts from the command line and even build an image search engine using the CLI tool.
Sources:
- GitHub - simonw/llm: Access large language models from the command-line - https://github.com/simonw/llm
- llm, ttok and strip-tags—CLI tools for working with ChatGPT and other LLMs - https://simonwillison.net/2023/May/18/cli-tools-for-llms/
- Sherwood Callaway on LinkedIn: GitHub - simonw/llm: Access large language models from the command-line - https://www.linkedin.com/posts/sherwoodcallaway_github-simonwllm-access-large-language-activity-7104448041041960960-2WRG
- LLM Python/CLI tool adds support for embeddings | Hacker News - https://news.ycombinator.com/item?id=37384797
- CLI tools for working with ChatGPT and other LLMs | Hacker News - https://news.ycombinator.com/item?id=35994037
- GitHub - simonw/homebrew-llm: Homebrew formulas for installing LLM and related tools - https://github.com/simonw/homebrew-llm
- LLM: A CLI utility and Python library for interacting with Large Language Models - https://llm.datasette.io/en/stable/
- GitHub - simonw/llm-prompts: A collection of prompts for use with the LLM CLI tool - https://github.com/simonw/llm-prompts
- GitHub - simonw/llm-cmd: Use LLM to generate and execute commands in your shell - https://github.com/simonw/llm-cmd
Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
cd llm-command-r
python3 -m venv venv
source venv/bin/activate
Now install the dependencies and test dependencies:
llm install -e '.[test]'
To run the tests:
pytest
To generate new recorded VCR cassettes:
PYTEST_COHERE_API_KEY="$(llm keys get cohere)" pytest --record-mode once
Download files
Download the file for your platform.
Source Distribution
llm_command_r-0.3.1.tar.gz (9.0 kB)
Built Distribution
llm_command_r-0.3.1-py3-none-any.whl (9.1 kB)
File details
Details for the file llm_command_r-0.3.1.tar.gz.
File metadata
- Download URL: llm_command_r-0.3.1.tar.gz
- Upload date:
- Size: 9.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9cce12111d87a8e722301864826f51d1ec742b82931744793c08f3f3437461bc |
| MD5 | 4197059acc1af258f5918169b38c04f2 |
| BLAKE2b-256 | 395e92a56423ac9d99bd5647cf68a9c87fe70e6bbc03b7eb924ebc51f9453fe0 |
Provenance
The following attestation bundles were made for llm_command_r-0.3.1.tar.gz:
Publisher: publish.yml on simonw/llm-command-r
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llm_command_r-0.3.1.tar.gz
- Subject digest: 9cce12111d87a8e722301864826f51d1ec742b82931744793c08f3f3437461bc
- Sigstore transparency entry: 189231669
- Sigstore integration time:
- Permalink: simonw/llm-command-r@6a66c04b2e73ef57eb9d3bfd0d247130a4a30887
- Branch / Tag: refs/tags/0.3.1
- Owner: https://github.com/simonw
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@6a66c04b2e73ef57eb9d3bfd0d247130a4a30887
- Trigger Event: release
File details
Details for the file llm_command_r-0.3.1-py3-none-any.whl.
File metadata
- Download URL: llm_command_r-0.3.1-py3-none-any.whl
- Upload date:
- Size: 9.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ca88d58372e32091e9823413d46ee0fc7d448e2edc8b2d26d5612c1bf81763dc |
| MD5 | 483ebc62568ed22a4eddf92369758f15 |
| BLAKE2b-256 | 3c4c63c6b12aea49d6eb903d78cfc5ba06f53b7153b221e44b646fe1c5a50ef4 |
Provenance
The following attestation bundles were made for llm_command_r-0.3.1-py3-none-any.whl:
Publisher: publish.yml on simonw/llm-command-r
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llm_command_r-0.3.1-py3-none-any.whl
- Subject digest: ca88d58372e32091e9823413d46ee0fc7d448e2edc8b2d26d5612c1bf81763dc
- Sigstore transparency entry: 189231670
- Sigstore integration time:
- Permalink: simonw/llm-command-r@6a66c04b2e73ef57eb9d3bfd0d247130a4a30887
- Branch / Tag: refs/tags/0.3.1
- Owner: https://github.com/simonw
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@6a66c04b2e73ef57eb9d3bfd0d247130a4a30887
- Trigger Event: release