
Pyrofiler


Toolset for granular, live memory and CPU profiling

Quick start

Context manager that measures execution time:

# examples/simple_profile.py
import pyrofiler
import time

with pyrofiler.timing('Time elapsed'):
    time.sleep(1)
$ python simple_profile.py
Time elapsed : 1.001563310623169

Decorators for profiling functions:

# examples/simple_profile_cpu.py
import pyrofiler

@pyrofiler.cpu_util(description='Cpu usage')
@pyrofiler.timed('Time elapsed')
def sum_series(x, N):
    return sum([x**i/i for i in range(1, N)])

sum_series(.3, 1000_000)
$ python simple_profile_cpu.py
Time elapsed : 0.13478374481201172
Cpu usage : 29.4

Aggregate the results in a common context:

# examples/profile_with_context.py
from pyrofiler import Profiler
import time

prof = Profiler()

with prof.timing('Time 1'):
    time.sleep(1)

with prof.timing('Time 2'):
    time.sleep(1.5)

print('Profiling data recorded:')
print(prof.data)
$ python profile_with_context.py
Time 1 : 1.0011215209960938
Time 2 : 1.5020403861999512
Profiling data recorded:
{'Time 1': 1.0011215209960938, 'Time 2': 1.5020403861999512}

You can use other actions as well, for example appending results to a list in data. Check the documentation for more use cases.
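The append-style aggregation mentioned above can be emulated in plain Python. This is an illustrative sketch of the idea only, not pyrofiler's actual implementation; the `timing_append` name and the `data` dict are hypothetical:

```python
# Illustrative sketch -- not pyrofiler internals.
# Repeated measurements under the same key append to a list.
import time
from collections import defaultdict
from contextlib import contextmanager

data = defaultdict(list)

@contextmanager
def timing_append(desc):
    start = time.perf_counter()
    try:
        yield
    finally:
        data[desc].append(time.perf_counter() - start)

for _ in range(3):
    with timing_append('loop step'):
        time.sleep(0.01)

print(len(data['loop step']))  # -> 3, one entry per iteration
```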

Design

Pyrofiler provides the following types of objects:

  1. Measures, which run as context managers

  2. Decorators, which are built on measures

  3. A Profiler class that uses decorators to aggregate data
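The three layers can be sketched in a few lines of plain Python. This is a hypothetical illustration of the design, assuming nothing about pyrofiler's real internals; the `MiniProfiler` name is invented for the example:

```python
# Hypothetical sketch of the three-layer design described above.
import time
import functools
from contextlib import contextmanager

class MiniProfiler:
    def __init__(self):
        self.data = {}

    @contextmanager
    def timing(self, desc):
        # 1. A measure: a context manager that records a duration
        start = time.perf_counter()
        try:
            yield
        finally:
            self.data[desc] = time.perf_counter() - start

    def timed(self, desc):
        # 2. A decorator built on top of the measure
        def wrap(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                with self.timing(desc):
                    return fn(*args, **kwargs)
            return inner
        return wrap

# 3. The profiler aggregates everything into .data
prof = MiniProfiler()

@prof.timed('work')
def work():
    time.sleep(0.01)

work()
print(prof.data)
```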

Callbacks

The decorators accept an optional callback argument: a function that handles the profiling data. It is called with the profiling result as its first argument, along with any other arguments you passed to the original decorator.

Here, a custom spice argument is provided:

import time
import pyrofiler

def print_spicy_time(elapsed, spice):
    print(f'Spice {spice} took {elapsed} seconds')

@pyrofiler.timed(spice='spicy', callback=print_spicy_time)
def spicy_sleep():
    time.sleep(10)
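The callback pattern itself is easy to demonstrate without pyrofiler installed. Here is a plain-Python sketch of a `timed` decorator that forwards the measured time plus any extra keyword arguments to the callback; the names (`timed`, `records`, `label`) are illustrative, not pyrofiler's API:

```python
# Plain-Python sketch of the callback pattern (pyrofiler not required).
import time
import functools

def timed(callback=print, **cb_kwargs):
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            # The callback gets the elapsed time first,
            # then the extra kwargs given to the decorator.
            callback(time.perf_counter() - start, **cb_kwargs)
            return result
        return inner
    return wrap

records = []

@timed(callback=lambda t, label: records.append((label, t)), label='fast')
def fast():
    time.sleep(0.01)

fast()
print(records)  # one (label, seconds) tuple recorded
```

Collecting into a list like this is one way a callback can aggregate results instead of printing them.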

Similar products

Problems

Existing tools tend to fall into two camps: CLI tools that profile memory and CPU but offer no code API for granular data, or decorator-style libraries with no memory profiling.

A live dashboard also helps; use https://github.com/libvis for that.

Features

  • TODO

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

History

0.1.0 (2020-03-04)

  • First release on PyPI.
