Python client for Fiddler Service
Fiddler Client
Python client for interacting with Fiddler. Provides a user-friendly interface to our REST API and enables event publishing for use with our monitoring features.
Requirements
Requires Python >= 3.6.3 and pip >= 19.0.
Installation
$ pip3 install fiddler-client
API Example Usage
Documentation for the API can be found here. For examples of interacting with our APIs, please check out our Quick Start Guide as well as the notebooks found in our Examples GitHub repository.
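For flavor, here is a hedged, stdlib-only sketch of the kind of event payload the client publishes for monitoring. The feature and output names (`age`, `income`, `probability_churn`) are hypothetical placeholders, and the commented-out call assumes a client already connected via `FiddlerApi`; consult the API documentation for the exact signature.

```python
from datetime import datetime, timezone

# Hypothetical feature/output names; a real event must match the columns
# declared in the model's ModelInfo.
event = {
    "age": 42,
    "income": 55000.0,
    "probability_churn": 0.31,
}

# Event timestamps can be supplied in several formats; ISO 8601 is one of them
# (see the FiddlerTimestamp notes in the 0.7.x changelog entries below).
event_timestamp = datetime(2023, 1, 15, 12, 0, 0, tzinfo=timezone.utc).isoformat()

# With a connected client this would be published along the lines of:
# client.publish_event(project_id, model_id, event, event_timestamp=event_timestamp)
print(event_timestamp)  # 2023-01-15T12:00:00+00:00
```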
Version History
2.4.0
New Features
- Support for enrichments
2.3.0
New Features
- Added support for creating alerts on the `Frequency` metric.
Modifications
- Relax pydantic version to allow any version between 1.9 and 2
2.2.1
Modifications
- Relax pydantic version to allow any version between 1.9 and 2
2.2.0
New Features
- Support segments
2.1.2
Modifications
- Relax pydantic version to allow any version between 1.9 and 2
2.1.1
Modifications
- Update `pyarrow` requirement to 7.0.0.
2.1.0
New Features
- Introduce Model Tasks `NOT_SET` and `LLM`
- Relax Target / Output specification for model tasks `NOT_SET` and `LLM`
- Support custom metrics
Modifications
- `DatasetDataSource` and `EventIdDataSource` will take `dataset_id` instead of `dataset_name`
- `list_baselines()` to return baseline names instead of baseline objects
2.0.8
Modifications
- Relax pydantic version to allow any version between 1.9 and 2
2.0.7
Modifications
- Support string metric types for alert creation, for server 23.7.
2.0.6
Modifications
- Update `pyarrow` requirement to 7.0.0.
2.0.5
Modifications
- Fix for a minor bug in `fdl.DatasetInfo.from_dataframe()`
2.0.4
Modifications
- Update `pyarrow` requirement to 13.0.0.
2.0.3
Modifications
- Relax pandas version for 2.0.
- `get_slice()` `query` parameter reverted to `sql_query`
2.0.2
Modifications
- Fix Parquet conversion issue in `upload_dataset` and `publish_event_batch`
2.0.1
Removed
- The following methods are removed:
- register_model
- upload_model_package
- update_model
- trigger_pre_computation
- _trigger_model_predictions
- generate_sample_events
- list_teams
- list_project_roles
- list_org_roles
- unshare_project
- share_project
- process_avro
- process_csv
New Features
- Add `monitor_components` as an attribute for `CustomFeature` of type `FROM_COLUMNS`. Default is `False`
- Add new statistic type `SUM` to supported alert metrics
- Support `CustomFeature` of types `FROM_VECTOR`, `FROM_TEXT_EMBEDDING` and `FROM_IMAGE_EMBEDDING`
Modifications
- Remove `column` as a parameter in the `add_alert_rule` and `get_alert_rules` functions
- Default `FileType` to Parquet in `upload_dataset` and `publish_event_batch`
- `get_slice()` `sql_query` parameter changed to `query`
1.8.6
Modifications
- Relax pandas version for 2.0.
1.8.4
New Features
- New DeploymentType enum for `MANUAL` deployment
1.8.3
Modifications
- New `columns` parameter in `add_alert_rule` and `get_alert_rules` to support multiple columns, for server version >= 23.3.0
- `get_triggered_alerts` supports `alert_value` as a float as well as a dict
1.8.2
Modifications
- Fixed a bug where `min` and `max` for columns of type `float` in `dataset_info` were cast into `int` after upload
1.8.1
Modifications
- Fixed a bug wherein a null string was sent in the request body if no body was specified.
- Fix `categorical_target_class_details` when passed as an array
- Fix a bug where `fdl.ModelInputType.TEXT` was not being accepted properly
- Fix `categorical_target_class_details` when passed as an empty list
1.8.0
Modifications
- Add new alert type `statistic` for setting alerts
- Add `target_class_order` as a required field of the `ModelInfo` object when `model_task` is `MULTICLASS_CLASSIFICATION`, `RANKING` or `BINARY_CLASSIFICATION`. Only applies for `BINARY_CLASSIFICATION` when the target column is of type `CATEGORY`
- Add `columns` as a parameter in the `add_alert_rule` and `get_alert_rules` functions
- Add deprecation warning for `column` as a parameter in the `add_alert_rule` and `get_alert_rules` functions
1.7.4
Modifications
- Do not typecast columns with strings in `get_slice()`
1.7.3
Modifications
- Send row and column count information to the dataset upload API
1.7.2
Modifications
- Bring back the `WeightingParams` object
1.7.1
Modifications
- Relaxed boto3 version constraint
1.7.0
Removed
- Remove support for initializing the fiddler client with `version=1`
- The following methods are removed:
- get_segment_info
- delete_segment
- deactivate_segment
- activate_segment
- list_segments
- upload_segment
- add_monitoring_config
- publish_parquet_s3
- publish_events_log
1.6.2
Modifications
- Make `dataset_id` a required field in `add_model()`
- Update `max_inferred_cardinality` to 100
New Features
- New method for updating a model artifact: `update_model_artifact`
1.5.3
Modifications
- Fix `add_model_artifact` error for NLP models
- Add `model_info` validation during `add_model`
1.5.2
Modifications
- Add fix for self-signed certificates not working by adding a `verify` param to `FiddlerApi`
1.5.1
Modifications
- Fix in `violation_of_type` to include numpy dtypes such as `int64`
1.5.0
New Features
- New methods for alert rules: `add_alert_rule`, `get_alert_rules`, `delete_alert_rule`
- New method to get triggered alerts for an alert rule: `get_triggered_alerts`
1.4.5
Modifications
- Assert nullable columns in `missing_value_encodings` (if users send non-nullable columns as `missing_value_encodings`, Fiddler converts them to nullable automatically with a warning)
1.4.4
Modifications
- Allow types other than the `Column` `data_type` for `missing_value_encodings`.
1.4.3
Modifications
- Accept the string `'inf'` in `float` columns in `missing_value_encodings`.
1.4.2
New Features
- Support `missing_value_encodings` as a new field of the `model_info` object.
1.4.1
Modifications
- Minor bug fix to handle string `nan` values
1.4.0
Modifications
- Default client initiation is now the v2 client
- `publish_events_batch` is now async; it returns a status id and doesn't wait for the upload to complete.
- Default behavior of all publish data methods in the v2 client is async (`is_sync = False`)
1.3.0
New Features
- New capabilities for Artifact-less Monitoring
1.2.8
Modifications
- Change the `batch_size` argument default to 1000 for `trigger_pre_computation`
- Updated the `delete_model` API default value for the `delete_prod` parameter from False to True. By default, all events associated with the model will be deleted.
1.2.7
Modifications
- Added check for "model" key before access in `from_dict`
- Allow changing `artifact_status` when updating the model
- Add docstrings for `add_model`, `add_model_surrogate` and `add_model_artifact`
1.2.6
Modifications
- Fixed `publish_events_batch_schema` backward compatibility.
1.2.5
Modifications
- Added `add_model_surrogate` and `add_model_artifact` APIs for artifactless monitoring
- Simplified the `add_model` API by removing unnecessary parameters
- Fixed `publish_events_batch_schema` parameter names.
1.2.4
Modifications
- Fixed a type coercion bug that caused some `get_slice` calls to fail cryptically
1.2.3
Modifications
- Map Tree SHAP values from log-odds space to probability space
- Added `add_model` API for artifactless monitoring
- Fixed a bug in the request when creating a model using `add_model`
1.2.2
Modifications
- Fixed a bug that prevented importing the client in some environments.
1.2.1
Modifications
- Removed unnecessary server-client version check that produced an uninformative warning.
1.2.0
New Features
- New `WeightingParams` object. This enables weighted histograms for class-imbalanced models.
Modifications
- `update_model` allows some small modifications in model info for the following fields: `custom_explanation_names`, `preferred_explanation_method`, `display_name`, `description`, `framework`, `algorithm` and `model_deployment_params`
1.1.0
New Features
- Add v2 client. v2 methods can be accessed either via sub-module (`client.v2`) or by instantiating the `FiddlerApi` and passing `version=2`.
Modifications
- Remove handlers from root logger
- Add url, org_id, auth_token and version validation while instantiating the client
- Fix dataset ingestion file extension issue
- Fix init monitoring issue
- Fix `publish_event` request header bug
- Add `publish_events_batch_dataframe` and `upload_dataset_dataframe` methods
- Support for the DatasetInfo class
- Use the `http_client` package, a wrapper over `requests`.
1.0.6
Modifications
- Add client v2 sub-package.
1.0.5
Modifications
- Relax the version requirements for `requests`.
- Add a flag to `init_monitoring` to enable synchronous initialization
1.0.4
Modifications
- Fixed the JSON transformation issue which was forcing a `requests` package upgrade
1.0.3
New Features
- Tree SHAP helper.
Modifications
- `fdl.ModelInfo` has an additional optional parameter to enable Tree SHAP
1.0.2
New Features
- Integrated Gradients Keras TF2 helpers.
Modifications
- Relax `botocore` version requirements.
1.0.1
Modifications
- Minor bug fixes and improvements.
- `run_explanation` has two additional optional arguments (`n_permutation` and `n_background`) allowing users to change the default parameters for Fiddler SHAP explanations.
1.0.0
Inaugural client for Fiddler 22.0! This version includes numerous improvements for stability, performance, and usability.
Compatible with server versions >=22.0.0.
0.8.1.8
Modifications
- Minor bug fixes and improvements.
0.8.1.7
Modifications
- Minor bug fixes and improvements.
0.8.1.6
Modifications
- Add a parameter to the `list_projects` API to get detailed project information
- Minor bug fix for the datetime format.
0.8.1.5
Modifications
- Add a parameter to the `list_projects` API to get detailed project information
- Allow the `run_explanation` API call to pass a list of explanations with `ig_flex` and one of the SHAP algorithms
0.8.1.4
Modifications
- Minor bugfix for categorical feature drift
0.8.1.3
Modifications
- Addressed an issue with categorical features with string literals containing numeric content
0.8.1.2
Modifications
- Implement a ranking surrogate model for ranking task models
0.8.1.1
Modifications
- Change the dependency on the `requests` package to 0.25.1
0.8.1
Modifications
- Improved `SegmentInfo` validation.
- Make the dependency versions less strict.
0.8.0
New Features
- New `publish_events_batch_schema` API call. Publishes a batch events object to the Fiddler Service using the passed `publish_schema` as a template.
- New ranking monitoring capability available with the `publish_events_batch` API
Modifications
- Enforced package versions in setup.py
- `trigger_pre_computation` has an additional optional argument (`cache_dataset`) to enable/disable dataset histogram caching.
- `register_model` has 3 additional optional arguments to enable/disable PDP caching (set to False by default), feature importance caching (set to True by default) and dataset histogram caching (set to True by default).
0.7.6
New Features
- New segment monitoring related functionality (currently in preview):
  - Ability to create and validate `SegmentInfo` objects, plus `upload_segment`, `activate_segment`, `deactivate_segment` and `list_segments` BE calls
Modifications
- Upon connecting to the server, the client now performs a version check for the server by default. Earlier the default was to only do a version check for the client.
0.7.5
New Features
- New `update_event` parameter for the `publish_events_batch` API.
- Changes to `fdl.publish_event()`:
  - Renamed parameter `event_time_stamp` to `event_timestamp`
  - Added new parameter `timestamp_format`, which allows specification of the timestamp format using the `FiddlerTimestamp` class
0.7.4
New Features
- New `initialize_monitoring` API call, which sets up monitoring for a model. Intended to also work retroactively for legacy schemas.
Modifications
- Modified `DatasetInfo.from_dataframe` and `ModelInfo.from_dataset_info` to take an additional `dataset_id` parameter.
- Modified the `outputs` parameter of `ModelInfo.from_dataset_info` to now expect a dictionary specifying the output range in the case of regression tasks.
- Modified the `preferred_explanation_method` parameter of `ModelInfo.from_dataset_info` to accept string names from `custom_explanation_names`. Details in docstring.
- Misc bug fixes and documentation enhancements.
0.7.3
New Features
- Changed the default display for `ModelInfo` and `DatasetInfo` to render HTML instead of plaintext when accessed via Jupyter notebooks
- Added support for GCP Storage ingestion of log events using `fdl.BatchPublishType.GCP_STORAGE`
0.7.2
New Features
- Restructured the following arguments for `fdl.ModelInfo.from_dataset_info()`:
  - Added `categorical_target_class_details`:
    - Mandatory for multiclass classification tasks, optional for binary (unused for regression)
    - Used to specify the positive class for binary classification, and the class order for multiclass classification
  - Modified `target`: no longer optional; models must specify target columns
0.7.1
New Features
- Restructured the following arguments for `fdl.publish_events_batch()`:
  - Added `id_field`: column to extract the `id` value from
  - Added `timestamp_format`: format of the timestamp within the batch object. Can be one of:
    - `fdl.FiddlerTimestamp.INFER`
    - `fdl.FiddlerTimestamp.EPOCH_MILLISECONDS`
    - `fdl.FiddlerTimestamp.EPOCH_SECONDS`
    - `fdl.FiddlerTimestamp.ISO_8601`
  - Removed `default_timestamp`
- Minor bug fixes
Deprecation Warning
- Support for `fdl.publish_events_log` and `fdl.publish_parquet_s3` will soon be deprecated in favor of `fdl.publish_events_batch()`
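To make the timestamp format options above concrete, here is a small standard-library sketch (the specific datetime is arbitrary, and the enum names are only referenced in comments) showing the same instant rendered the three explicit ways:

```python
from datetime import datetime, timezone

# One timezone-aware instant, rendered in each explicit format.
ts = datetime(2021, 6, 1, 9, 30, 0, tzinfo=timezone.utc)

epoch_seconds = int(ts.timestamp())        # fdl.FiddlerTimestamp.EPOCH_SECONDS
epoch_milliseconds = epoch_seconds * 1000  # fdl.FiddlerTimestamp.EPOCH_MILLISECONDS
iso_8601 = ts.isoformat()                  # fdl.FiddlerTimestamp.ISO_8601

print(epoch_seconds, epoch_milliseconds, iso_8601)
```

`INFER` leaves the format detection to the service, so only the three explicit encodings are shown here.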
0.7.0
Dataset Refactor
- Datasets refactored to be members of a Project
  - This is a change promoting Datasets to first-class status within Fiddler. It affects both the UI and several APIs in Fiddler
  - Many APIs utilizing Projects will now require `project_id` to be passed as a parameter
New Features
- Added `fdl.update_model()` to client: updates the specified model, with model binary and package.py, from the specified model_dir
- Added `fdl.get_model()` to client: downloads the model binary, package.py and model.yaml to the given output dir.
- Added `fdl.publish_events_batch()` to client: publishes a batch events object to the Fiddler Service.
  - Note: support for other batch methods, including `fdl.publish_events_log()` and `fdl.publish_parquet_s3()`, will be deprecated in the near future in favor of `fdl.publish_events_batch()`
Changes
- Simplified logic within `fdl.upload_dataset()`
- Added client/server handshake for checking version compatibilities
  - Warning issued in case of mismatch
- Deleted redundant APIs: `fdl.create_surrogate_model()`, `fdl.upload_model_sklearn()`
- Restructured APIs to be more duck-typing-friendly (relaxing data type restrictions)
- Patches for minor bug fixes
0.6.18
Features
- Minor updates to ease use of binary classification labels
0.6.17
Features
- Added new arguments to `ModelInfo.from_dataset_info()`:
  - `preferred_explanation_method` to express a preferred default explanation algorithm for a model
  - `custom_explanation_names` to support user-provided explanation algorithms which the user will implement on their model object via package.py
0.6.16
Features
- Minor improvements to `publish_events_log()` to circumvent datatype conversion issues
0.6.15
Features
- Added strict name checks
0.6.14
Features
- Added client-native multithreading support for `publish_events_log()` using new parameters `num_threads` and `batch_size`
0.6.13
Features
- Added `fdl.generate_sample_events()` to client: API for generating monitoring traffic to test out Fiddler
- Added `fdl.trigger_pre_computation()` to client: triggers various precomputation steps within the Fiddler service based on input parameters
- Optionally add proxies to `FiddlerApi()` init
0.6.12
Features
- Added `fdl.publish_parquet_s3()` to client: publishes a parquet events file from S3 to a Fiddler instance. Experimental and may be expanded in the future.
0.6.10
Features
- Added `fdl.register_model()` to client: registers a model in Fiddler. This will generate a surrogate model, which can be replaced later with the original model.
File details
Details for the file fiddler-client-2.4.0.dev2.tar.gz.
File metadata
- Download URL: fiddler-client-2.4.0.dev2.tar.gz
- Upload date:
- Size: 166.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.8.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `16740c8381f3c125e81e275e682adf90f09910b16e47c7015c0ec0f7eccd90d5` |
| MD5 | `6aff0b91d5e012f8b4c8b77fb19f8812` |
| BLAKE2b-256 | `9bafd5c661a82b67a49c203016fadabdf9083af31b104c0f506ac2edba653f00` |
File details
Details for the file fiddler_client-2.4.0.dev2-py3-none-any.whl.
File metadata
- Download URL: fiddler_client-2.4.0.dev2-py3-none-any.whl
- Upload date:
- Size: 221.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.8.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `1c89877c22cd6c6292e07621982bcc6510d9dd052b022885e161ad91339b9728` |
| MD5 | `6e952d0bfc00df82b0124e305a6c7881` |
| BLAKE2b-256 | `96393a78eb9ea42ab56817e16754ed37ccd812d6686753dccd2ce03e284f8f23` |