
llm-mlc


LLM plugin for running models using MLC

Installation

Install this plugin in the same environment as llm.

llm install llm-mlc

Next, run the llm mlc setup command to complete the installation:

llm mlc setup

This will set up Git LFS and use it to fetch some extra dependencies:

Git LFS is not installed. Should I run 'git lfs install' for you?
Install Git LFS? [y/N]: y
Updated Git hooks.
Git LFS initialized.
Downloading prebuilt binaries...
Cloning into '/Users/simon/Library/Application Support/io.datasette.llm/mlc/dist/prebuilt/lib'...
remote: Enumerating objects: 221, done.
remote: Counting objects: 100% (86/86), done.
remote: Compressing objects: 100% (54/54), done.
remote: Total 221 (delta 59), reused 56 (delta 32), pack-reused 135
Receiving objects: 100% (221/221), 52.06 MiB | 9.13 MiB/s, done.
Resolving deltas: 100% (152/152), done.
Updating files: 100% (60/60), done.
Ready to install models in /Users/simon/Library/Application Support/io.datasette.llm/mlc

Finally, install the mlc_chat package. This takes a few extra steps, which are described in detail on the mlc.ai/package site.

If you are on an Apple Silicon M1/M2 Mac you can run this command:

llm mlc pip install --pre --force-reinstall \
  mlc-ai-nightly \
  mlc-chat-nightly \
  -f https://mlc.ai/wheels

The llm mlc pip command ensures that pip will run in the same virtual environment as llm itself.
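Because llm mlc pip forwards its arguments to pip inside that environment, any pip subcommand works through it. For example, one way to confirm the nightly wheels landed where llm can see them (a verification step, not required by the plugin):

```shell
# List packages in llm's environment and filter for the MLC wheels
llm mlc pip list | grep -i mlc
```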

For other systems, follow the instructions on the mlc.ai/package site.

Installing models

After installation you will need to download a model using the llm mlc download-model command.

Here's how to download and install Llama 2:

llm mlc download-model Llama-2-7b-chat

This will download around 8GB of content.

You can also use Llama-2-13b-chat or Llama-2-70b-chat, though these files are a lot larger.
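For example, to download the 13B variant instead, the command mirrors the 7B one above:

```shell
# Larger variant of the same model family; expect a much bigger download
llm mlc download-model Llama-2-13b-chat
```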

The download-model command also takes a URL to one of the MLC repositories on Hugging Face.

For example, to install mlc-chat-WizardLM-13B-V1.2:

llm mlc download-model https://huggingface.co/mlc-ai/mlc-chat-WizardLM-13B-V1.2-q4f16_1

You can see a full list of models you have installed this way using:

llm mlc models

This will also show the model ID you should use to run it, e.g.:

MlcModel: mlc-chat-Llama-2-7b-chat-hf-q4f16_1

Running a prompt through a model

Once you have downloaded and added a model, you can run a prompt like this:

llm -m mlc-chat-Llama-2-7b-chat-hf-q4f16_1 \
  'five names for a cute pet ferret'

Great! Here are five cute and creative name suggestions for a pet ferret:

  1. Ferbie - a playful and affectionate name for a friendly and outgoing ferret.
  2. Mr. Whiskers - a suave and sophisticated name for a well-groomed and dignified ferret.
  3. Luna - a celestial and dreamy name for a curious and adventurous ferret.
  4. Felix - a cheerful and energetic name for a lively and mischievous ferret.
  5. Sprinkles - a fun and playful name for a happy and energetic ferret with a sprinkle of mischief.

Remember, the most important thing is to choose a name that you and your ferret will love and enjoy!

And to send a follow-up prompt to continue the current conversation, use -c:

llm -c 'two more'

Of course! Here are two more cute name ideas for a pet ferret:

  1. Digger - a fun and playful name that suits a pet that loves to dig and burrow, and is also a nod to the ferret's natural instincts as a burrower.
  2. Gizmo - a fun and quirky name that suits a pet with a curious and mischievous personality, and is also a nod to the ferret's playful and inventive nature.
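Other standard llm options work with MLC models as well. For example, llm's -s/--system flag sets a system prompt (a core llm feature rather than something specific to this plugin; how faithfully the model follows it depends on the model):

```shell
# Same model as above, with a system prompt shaping the response style
llm -m mlc-chat-Llama-2-7b-chat-hf-q4f16_1 \
  -s 'You are a terse assistant who answers in a single line' \
  'five names for a cute pet ferret'
```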

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-mlc
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
