
aio-sse-chat

An asyncio SSE client module for Python, designed for parsing server-sent event (SSE) streaming responses from LLMs.

Modified from `aiohttp-sse-client`.

Why is this needed?

Typical SSE packages will not recover the correct value from a streaming LLM response (in case you have not escaped `\n` as `\\n`), because they do not parse multi-line data fields correctly. This module parses the response correctly and returns the exact value.

Also, an LLM request usually needs to be submitted as a POST request, while most current asyncio SSE modules raise an error on POST. Although using a POST request to get a streaming response is not best practice, it simplifies the code considerably.
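For background on the newline issue: per the SSE specification, multiple `data:` lines within one event block are joined with `\n`, so a streamed chunk that is exactly a newline arrives on the wire as two empty `data:` lines. A minimal, hypothetical sketch of that joining rule (not this package's actual implementation) shows why line-by-line parsers lose the newline:

```python
def parse_sse_block(block: str) -> str:
    """Parse one SSE event block (the lines before a blank line) into its data payload.

    Per the SSE spec, each "data:" line contributes one segment, a single
    leading space after the colon is stripped, and segments are joined
    with "\n". A parser that treats each "data:" line as a separate event
    would instead yield two empty strings and drop the newline.
    """
    data_lines = []
    for line in block.split("\n"):
        if line.startswith("data:"):
            value = line[len("data:"):]
            if value.startswith(" "):  # spec: strip one leading space
                value = value[1:]
            data_lines.append(value)
    return "\n".join(data_lines)

# A chunk of exactly "\n" is transmitted as two empty data lines:
print(repr(parse_sse_block("data: \ndata: \n")))  # -> '\n'
```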

Installation

pip install aio-sse-chat

Usage

Create your aiohttp session and wrap it with aiosseclient to make the request.

```python
# fastapi (server) side
import asyncio

from fastapi import FastAPI
from sse_starlette.sse import EventSourceResponse

app = FastAPI()

@app.post('/sse')   # all HTTP methods are supported
async def sse_endpoint(data: dict):
    async def f():
        for i in range(10):
            yield '\n'
            await asyncio.sleep(0.2)
    return EventSourceResponse(f())
```

```python
# client side
import asyncio

from aiossechat import aiosseclient

async def main():
    async for event in aiosseclient(url=some_url, method='post', json=some_data):
        print(event, end='', flush=True)   # receives a bare '\n' correctly

asyncio.run(main())
```

