llm-command-r

Access the Cohere Command R family of models via the Cohere API

Installation

Install this plugin in the same environment as LLM.

llm install llm-command-r
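
Once installed, you can check that the plugin registered its models by listing the models LLM knows about (the grep filter here is just a convenience):

llm models | grep command-r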

Configuration

You will need a Cohere API key. Configure it like this:

llm keys set cohere
# Paste key here
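
To check that the key was stored, you can read it back with llm keys get (the same command used later to supply the key to the test suite):

llm keys get cohere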

To use an alternative base URL for the Cohere API, set the COHERE_BASE_URL environment variable.
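
For example, to point the plugin at a proxy or self-hosted gateway (the URL below is only a placeholder), set the variable before running a prompt:

export COHERE_BASE_URL='https://cohere-proxy.example.com'
llm -m command-r 'Say hello from Command R'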

Usage

This plugin adds two models.

llm -m command-r 'Say hello from Command R'
llm -m command-r-plus 'Say hello from Command R Plus'

The Command R models can search the web as part of answering a prompt.

You can enable this feature by passing the -o websearch 1 option:

llm -m command-r 'What is the LLM CLI tool?' -o websearch 1

Running a search costs more, because the search results are added to the prompt and consume additional tokens.

The full search results are stored as JSON in the LLM logs.
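
For example, you can inspect the most recent logged response, including the stored search results, using LLM's standard logging options (this assumes the default SQLite logging is enabled):

llm logs -n 1 --json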

You can also use the command-r-search command provided by this plugin, which includes a list of the documents that were used to answer your question as part of its output:

llm command-r-search 'What is the LLM CLI tool by simonw?'

Example output:

The LLM CLI tool is a command-line utility that allows users to access large language models. It was created by Simon Willison and can be installed via pip, Homebrew or pipx. The tool supports interactions with remote APIs and models that can be locally installed and run. Users can run prompts from the command line and even build an image search engine using the CLI tool.

Sources:

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-command-r
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

pytest

To record new VCR cassettes:

PYTEST_COHERE_API_KEY="$(llm keys get cohere)" pytest --record-mode once

Download files

Source Distribution

llm_command_r-0.3.1.tar.gz (9.0 kB)

Built Distribution

llm_command_r-0.3.1-py3-none-any.whl (9.1 kB)

File details

Details for the file llm_command_r-0.3.1.tar.gz.

File metadata

  • Download URL: llm_command_r-0.3.1.tar.gz
  • Upload date:
  • Size: 9.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for llm_command_r-0.3.1.tar.gz

  • SHA256: 9cce12111d87a8e722301864826f51d1ec742b82931744793c08f3f3437461bc
  • MD5: 4197059acc1af258f5918169b38c04f2
  • BLAKE2b-256: 395e92a56423ac9d99bd5647cf68a9c87fe70e6bbc03b7eb924ebc51f9453fe0

Provenance

The following attestation bundles were made for llm_command_r-0.3.1.tar.gz:

Publisher: publish.yml on simonw/llm-command-r

File details

Details for the file llm_command_r-0.3.1-py3-none-any.whl.

File metadata

  • Download URL: llm_command_r-0.3.1-py3-none-any.whl
  • Upload date:
  • Size: 9.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for llm_command_r-0.3.1-py3-none-any.whl

  • SHA256: ca88d58372e32091e9823413d46ee0fc7d448e2edc8b2d26d5612c1bf81763dc
  • MD5: 483ebc62568ed22a4eddf92369758f15
  • BLAKE2b-256: 3c4c63c6b12aea49d6eb903d78cfc5ba06f53b7153b221e44b646fe1c5a50ef4

Provenance

The following attestation bundles were made for llm_command_r-0.3.1-py3-none-any.whl:

Publisher: publish.yml on simonw/llm-command-r
