aio-sse-chat
A Python asyncio SSE client module, designed especially for parsing server-sent events (SSE) responses streamed from LLMs.
Modified from aiohttp-sse-client.
Why is this needed?
Typical SSE packages will not recover the exact value from a streaming LLM response (in case the server has not escaped \n
to \\n
), because they do not parse such responses correctly. This module parses the response correctly and returns the exact value.
Also, an LLM request usually needs to be submitted as a POST
request, while most current aio SSE modules raise an error on a POST
request. Although it is not best practice to use a POST
request to get a streaming response, it simplifies the code a lot.
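To see why a bare newline trips up naive parsers: per the EventSource specification, an event whose payload is just '\n' arrives on the wire as two empty data: lines, and the data lines of one event must be re-joined with '\n'. A naive parser that strips empty lines returns '' instead. Below is a minimal, illustrative sketch of spec-compliant data-line joining (not the actual aio-sse-chat implementation):

```python
def join_sse_data(raw_event: str) -> str:
    """Join the `data:` lines of one SSE event per the EventSource spec:
    each data line contributes its content, and the lines are joined
    with '\n'."""
    buf = []
    for line in raw_event.split('\n'):
        if line.startswith('data:'):
            value = line[len('data:'):]
            if value.startswith(' '):  # spec: drop a single leading space
                value = value[1:]
            buf.append(value)
    return '\n'.join(buf)

# An LLM token that is just a newline arrives as two empty data lines:
print(repr(join_sse_data('data:\ndata:\n')))        # -> '\n'
# An ordinary multi-line payload:
print(repr(join_sse_data('data: foo\ndata: bar\n'))) # -> 'foo\nbar'
```

A parser that discards empty data: lines would silently drop the newline token, which is exactly the corruption this package avoids.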
Installation
pip install aio-sse-chat
Usage
Create your aiohttp session and use aiosseclient
to wrap the session when making the request.
# fastapi side
import asyncio
from fastapi import FastAPI
from sse_starlette.sse import EventSourceResponse

app = FastAPI()

@app.post('/sse')  # all HTTP methods are supported
async def sse_endpoint(data: dict):
    async def f():
        for i in range(10):
            yield '\n'
            await asyncio.sleep(0.2)
    return EventSourceResponse(f())

##################
# client side
import aiohttp
from aiossechat import aiosseclient

async with aiohttp.ClientSession() as session:
    async for data in aiosseclient(url=some_url, session=session, method='post', json=some_data):
        print(data, end='', flush=True)  # receives a single '\n' correctly