
A Chain of Thought logger for langchain


Langchain Logger

The genesis of this project was the need to see and display what is actually happening during a langchain invoke or run call. The cloud service LangSmith appears to capture the internal transactions, or Chain of Thought, of an invoke or run, but what if you want to capture that within your own application, perhaps to show your end user the internal processing of a request?

Without this package you would have to take the result of the invoke and display the Chain of Thought after processing has finished. With it you can capture the Chain of Thought in real time and display its contents as they arrive.

Installation

pip install langchain-logger

Usage

Begin by creating a logger. File loggers work well because you can tail them in real time across workers and threads; the example below uses a standard stream logger.

from langchain_openai import OpenAI

from langchain_logger.callback import ChainOfThoughtCallbackHandler
import logging

# Set up logging for the example
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
handler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
logger.addHandler(handler)

# Create the Chain of Thought callback handler
cot = ChainOfThoughtCallbackHandler(logger=logger)


# Create a new OpenAI instance
llm = OpenAI(callbacks=[cot])

Any chain using this LLM will now have its chain of thought streamed through the logger.

A full example is in example.py
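If you are curious what the handler does under the hood, the pattern is the standard langchain callback interface: each lifecycle event (LLM start, LLM end, and so on) is forwarded to the logger as it happens. Here is a minimal standard-library sketch of that idea; the class and method bodies are illustrative, not the package's actual implementation:

```python
import logging

class MiniCOTHandler:
    """Illustrative stand-in for a chain-of-thought callback handler.

    Mirrors the langchain callback pattern: each lifecycle hook
    logs what the chain is doing, at the moment it happens.
    """

    def __init__(self, logger: logging.Logger):
        self.logger = logger

    def on_llm_start(self, prompts):
        for p in prompts:
            self.logger.info("LLM start: %s", p)

    def on_llm_end(self, text):
        self.logger.info("LLM end: %s", text)

# Demonstration: collect the log messages in memory
records = []

class ListHandler(logging.Handler):
    def emit(self, record):
        records.append(record.getMessage())

logger = logging.getLogger("cot-demo")
logger.setLevel(logging.INFO)
logger.addHandler(ListHandler())

handler = MiniCOTHandler(logger)
handler.on_llm_start(["What is 2 + 2?"])
handler.on_llm_end("4")
print(records)  # both lifecycle events were logged, in order
```

Because the events are logged as they fire, any consumer tailing the logger's output sees the chain's progress in real time rather than only the final result.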

Tips

CrewAI

This works with any layer built on top of langchain where you have access to the LLM, and it looks really impressive when multiple calls or iterations are occurring.

For example, with CrewAI:

from crewai import Agent
from langchain_openai import ChatOpenAI
....
cot = ChainOfThoughtCallbackHandler(logger=logger)
llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.7, callbacks=[cot])

researcher = Agent(
  role='Senior Research Analyst',
  goal='Uncover cutting-edge developments in AI and data science',
  backstory="""You work at a leading tech think tank.
  Your expertise lies in identifying emerging trends.
  You have a knack for dissecting complex data and presenting actionable insights.""",
  verbose=True,
  allow_delegation=False,
  llm=llm
)

Viewing in a browser

A lot of python deployments run on a single server or a shared file system, so it can make sense to log to a file and tail that log to a browser. We suggest taking a look at flask-log-viewer to see how you can do that.
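The "tail" half of that idea is simple enough to sketch: seek to the current end of the log file, then poll for newly appended lines and yield each one as it arrives. This is a hand-rolled stand-in for what a log viewer does internally (flask-log-viewer's actual implementation may differ):

```python
import time

def tail(path, poll_interval=0.2, max_polls=None):
    """Yield lines appended to `path` after this call, like `tail -f`.

    `max_polls` bounds the polling loop so the sketch can terminate;
    a real viewer would stream until the client disconnects.
    """
    f = open(path, "r")
    f.seek(0, 2)  # jump to the current end: only new lines are yielded

    def follow():
        polls = 0
        try:
            while max_polls is None or polls < max_polls:
                line = f.readline()
                if line:
                    yield line.rstrip("\n")
                else:
                    polls += 1
                    time.sleep(poll_interval)
        finally:
            f.close()

    return follow()
```

In a web app you would feed each yielded line to a streaming response; the point here is just that following a log file in real time requires nothing more exotic than seek-and-poll.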

We've included a simple log file helper to pair with the Chain of Thought logger. Start by installing flask and flask-log-viewer:

pip install flask flask-log-viewer

Create your flask app and add the flask-log-viewer blueprint:

from flask import Flask

from langchain_logger.callback import ChainOfThoughtCallbackHandler
from langchain_logger.logger import configure_logger
from langchain_openai import ChatOpenAI
import uuid

log_directory = "tmp/logs"  # directory the logs are stored in
random_log = f"{log_directory}/log_{uuid.uuid4()}.txt"
logger = configure_logger(log_filename=random_log, max_bytes=1024, backup_count=1, max_age_days=3, formatter=None)

cot = ChainOfThoughtCallbackHandler(logger=logger)
llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.7, callbacks=[cot])

app = Flask(__name__)

# log_viewer_blueprint is provided by flask-log-viewer (see its docs for the exact import)
log_viewer = log_viewer_blueprint(base_path=log_directory, allowed_directories=[log_directory])

app.register_blueprint(log_viewer, url_prefix='/logs')

Now browsing to http://localhost:5000/logs/stream/xxxxxx, where xxxxxx is the random_log file name, will stream the log for you!
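If you want to see roughly what a configure_logger-style helper provides without the package, the stdlib equivalent is a size-rotated file logger plus age-based cleanup. This is a sketch under the assumption that the helper is built on RotatingFileHandler; the function name and the pruning logic here are our own illustration, not the package's source:

```python
import logging
import os
import time
from logging.handlers import RotatingFileHandler

def make_rotating_logger(log_filename, max_bytes=1024, backup_count=1,
                         max_age_days=3, formatter=None):
    """Illustrative equivalent of a configure_logger-style helper:
    a size-rotated file logger with age-based pruning of old files."""
    log_dir = os.path.dirname(log_filename) or "."
    os.makedirs(log_dir, exist_ok=True)

    # Prune log files in the same directory older than max_age_days
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(log_dir):
        full = os.path.join(log_dir, name)
        if os.path.isfile(full) and os.path.getmtime(full) < cutoff:
            os.remove(full)

    # Rotate the file once it exceeds max_bytes, keeping backup_count backups
    handler = RotatingFileHandler(log_filename, maxBytes=max_bytes,
                                  backupCount=backup_count)
    if formatter is not None:
        handler.setFormatter(formatter)

    logger = logging.getLogger(log_filename)  # one logger per log file
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)
    return logger
```

Using a fresh randomly named file per session, as in the example above, means each worker or request gets its own stream that the viewer can tail independently.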

Local Dev

If you're developing locally with the example, we tend to use poetry for python management:

poetry install --with dev

This will install langchain_openai, an optional dependency used in the example.
