Project description

Code style: black

Introduction

Implements a sentence embedding retriever with a local cache backed by the embedding store.

Features

  • Embedding store abstraction class

  • Jina client implementation of the embedding store

  • Save the cache to a Parquet file

  • Load the cache from an existing Parquet file
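The cache-backed retrieval pattern behind these features can be sketched in a few lines. This is an illustrative toy, not embestore's actual implementation: `CachedEmbeddingStore` and `ToyEmbeddingStore` are hypothetical names, and the real store works with numpy arrays and a pandas-backed cache.

```python
from abc import ABC, abstractmethod
from typing import Dict, List, Text, Tuple


class CachedEmbeddingStore(ABC):
    """Sketch: each sentence is embedded once, then served from a local cache."""

    def __init__(self) -> None:
        self._cache: Dict[Text, Tuple[float, ...]] = {}

    @abstractmethod
    def _retrieve_embeddings_from_model(self, sentences: List[Text]) -> List[Tuple[float, ...]]:
        """Compute embeddings for sentences not yet in the cache."""

    def retrieve_embeddings(self, sentences: List[Text]) -> List[Tuple[float, ...]]:
        # Only sentences missing from the cache reach the model.
        missing = [s for s in sentences if s not in self._cache]
        if missing:
            for sentence, vector in zip(missing, self._retrieve_embeddings_from_model(missing)):
                self._cache[sentence] = vector
        return [self._cache[s] for s in sentences]


class ToyEmbeddingStore(CachedEmbeddingStore):
    """Stand-in 'model' so the caching behaviour is observable without any ML dependency."""

    def __init__(self) -> None:
        super().__init__()
        self.model_calls = 0  # counts how often the model is actually invoked

    def _retrieve_embeddings_from_model(self, sentences: List[Text]) -> List[Tuple[float, ...]]:
        self.model_calls += 1
        return [(float(len(s)), float(sum(map(ord, s))) % 97.0) for s in sentences]


store = ToyEmbeddingStore()
first = store.retrieve_embeddings(["I want to listen the music."])
again = store.retrieve_embeddings(["I want to listen the music."])
# the second call is served entirely from the cache; model_calls stays at 1
```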

Installation

Quick Start

Option 1. Using a Jina flow to serve the embedding model

  • To start the Jina flow service with the sentence embedding model
    sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2, clone this GitHub
    repo and serve it with the Docker container.

git clone https://github.com/ycc789741ycc/sentence-embedding-dataframe-cache.git

cd sentence-embedding-dataframe-cache

make serve-jina-embedding
  • Retrieve the embedding
from embestore.jina import JinaEmbeddingStore

JINA_embestore_GRPC = "grpc://0.0.0.0:54321"


query_sentences = ["I want to listen the music.", "Music don't want to listen me."]

jina_embestore = JinaEmbeddingStore(embedding_grpc=JINA_embestore_GRPC)
results = jina_embestore.retrieve_embeddings(sentences=query_sentences)
  • Stop the Docker container

make stop-jina-embedding

Option 2. Using the local sentence embedding model sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2

from embestore.torch import TorchEmbeddingStore

query_sentences = ["I want to listen the music.", "Music don't want to listen me."]


torch_embestore = TorchEmbeddingStore()
results = torch_embestore.retrieve_embeddings(sentences=query_sentences)
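Retrieved embeddings are typically compared with cosine similarity, e.g. for semantic search over the cached sentences. A minimal numpy sketch with toy vectors standing in for real model output (the `cosine_similarity` helper is illustrative, not part of embestore):

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Toy 4-dimensional "embeddings"; real MiniLM-L12-v2 vectors are 384-dimensional.
query = np.array([0.1, 0.3, 0.5, 0.2])
near_duplicate = np.array([0.1, 0.3, 0.5, 0.2])
unrelated = np.array([-0.5, 0.1, -0.2, 0.9])

# An identical vector scores 1.0; a dissimilar one scores lower.
best = cosine_similarity(query, near_duplicate)
worst = cosine_similarity(query, unrelated)
```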

Option 3. Inheriting from the abstract class

from typing import List, Text

import numpy as np
from sentence_transformers import SentenceTransformer

from embestore.base import EmbeddingStore

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2").eval()


class TorchEmbeddingStore(EmbeddingStore):
    def _retrieve_embeddings_from_model(self, sentences: List[Text]) -> np.ndarray:
        return model.encode(sentences)

Save the cache

torch_embestore.save("cache.parquet")

Load from the cache

torch_embestore = TorchEmbeddingStore("cache.parquet")

Road Map

[Done] prototype abstraction

[Done] Unit-test, integration test

[Done] Embedding retriever implementation: PyTorch, Jina

  • [Done] Jina

  • [Done] Sentence Embedding

[Done] Docker service

[Todo] Example, Documentation

[Todo] Embedding monitor

[Todo] pip install support

[Improve] Accelerate the Pandas retriever efficiency

