⚙️ Server Thread
Launch a WSGI or ASGI Application in a background thread with werkzeug or uvicorn.
This library was created for localtileserver
and provides the basis for how it launches an image tile server as a
background thread for visualizing data in Jupyter notebooks.
While this may not be a widely applicable library, it is useful for a few Python packages I have created that require a background service.
🚀 Usage
Use the ServerThread with any WSGI or ASGI Application.
Start by creating an application (this can be a Flask app or a simple one like the example below):
# Create some WSGI Application
from werkzeug import Request, Response

@Request.application
def app(request):
    return Response("howdy", 200)
Then launch the app with the ServerThread class:
import requests
from server_thread import ServerThread
# Launch app in a background thread
server = ServerThread(app)
# Perform requests against the server without blocking
requests.get(f"http://{server.host}:{server.port}/").raise_for_status()
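For context, the pattern that ServerThread packages up can be sketched with only the standard library: serve the app from a daemon thread and issue requests from the main thread. This sketch uses wsgiref rather than werkzeug or uvicorn, so it is illustrative of the technique, not of how server-thread is implemented:

```python
# Sketch: serve a WSGI app from a background daemon thread (stdlib only).
import threading
from urllib.request import urlopen
from wsgiref.simple_server import make_server

def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"howdy"]

# Port 0 asks the OS for any free port, avoiding collisions.
httpd = make_server("127.0.0.1", 0, app)
host, port = httpd.server_address

# serve_forever blocks, so run it in a daemon thread.
thread = threading.Thread(target=httpd.serve_forever, daemon=True)
thread.start()

# The main thread is free to talk to the server without blocking.
body = urlopen(f"http://{host}:{port}/").read()

# Stop serve_forever and let the thread exit cleanly.
httpd.shutdown()
thread.join()
```

ServerThread adds the conveniences on top of this idea: picking the right server for WSGI vs. ASGI apps and exposing `server.host` and `server.port`.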
⬇️ Installation
Get started with server-thread to create applications that require a
WSGI or ASGI application running in the background.
🐍 Installing with conda
Conda makes managing server-thread's dependencies across platforms
easy, and it is the recommended installation method:
conda install -c conda-forge server-thread
🎡 Installing with pip
If you prefer pip, then you can install from PyPI: https://pypi.org/project/server-thread/
pip install server-thread
💭 Feedback
Please share your thoughts and questions on the Discussions board. If you would like to report any bugs or make feature requests, please open an issue.
If filing a bug report, please share a scooby Report:
import server_thread
print(server_thread.Report())
🚀 Examples
Minimal examples for using server-thread with common micro-frameworks.
💨 FastAPI
import requests
from fastapi import FastAPI
from server_thread import ServerThread

app = FastAPI()

@app.get("/")
def root():
    return {"message": "Howdy!"}

server = ServerThread(app)
requests.get(f"http://{server.host}:{server.port}/").json()
⚗️ Flask
import requests
from flask import Flask
from server_thread import ServerThread

app = Flask("testapp")

@app.route("/")
def howdy():
    return {"message": "Howdy!"}

server = ServerThread(app)
requests.get(f"http://{server.host}:{server.port}/").json()
File details
Details for the file server_thread-0.3.0.tar.gz.
File metadata
- Download URL: server_thread-0.3.0.tar.gz
- Upload date:
- Size: 6.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.13.1
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | d2b6219d452cf79fa527bf559c35e7acb59ce38660492a65a185fa5cfddae295 |
| MD5 | afe13176e4f5f8e157870ba4524f23a6 |
| BLAKE2b-256 | de67738f229f73364615252fa208c82cb45a93b72e7d439bd75e0d85d4f5d74b |
File details
Details for the file server_thread-0.3.0-py3-none-any.whl.
File metadata
- Download URL: server_thread-0.3.0-py3-none-any.whl
- Upload date:
- Size: 6.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.13.1
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | bd1e2ee15e0d448e1c0e479aec1417368364ec7785c92c2973a17e9ce458dc00 |
| MD5 | 503628925e209cdb66207438814168d1 |
| BLAKE2b-256 | bf853ffc300dc13ea00a4fe063338ee9352a322a8932338ed45a18473d1a4806 |