=======================
Datarobot-batch-scoring
=======================

Latest version: v1.16.4


1.13.1
======

Bugfixes
--------
* Fix package installation in Python 3 environments.

1.13.0
======

Enhancements
------------
* Brings back support for legacy predictions (api/v1) and adds a new parameter
  for specifying the API version (``--api_version``). Check ``batch_scoring
  --help`` for a list of valid options and the default value.
* Adds a ``--no_verify_ssl`` argument for disabling SSL verification and
  ``--ca_bundle`` for specifying certificate(s) of trusted Certificate Authorities.
* The default timeout is now None, meaning that the code does not enforce a
  timeout on operations against the server. This allows runs with higher
  numbers of threads to complete, particularly on macOS. The value remains
  configurable, and 30 seconds is a reasonable value in most cases.
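
At the transport layer, these SSL options correspond roughly to the following
stdlib ``ssl`` settings. This is a minimal sketch for illustration only; the
tool's actual HTTP stack is not shown here, and the CA bundle path is a
placeholder:

```python
import ssl

# --no_verify_ssl: skip certificate and hostname verification entirely.
# (check_hostname must be disabled before verify_mode can be CERT_NONE.)
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE

# --ca_bundle: verify against a specific set of trusted CAs instead
# (path is a placeholder):
# trusted_ctx = ssl.create_default_context(cafile="/path/to/ca-bundle.pem")
```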

Bugfixes
--------
* Fixed an issue which caused exit codes not to be set correctly by executables
  installed via the standalone installer.
* Fixed an issue which caused the script to crash if one or more boolean options
  were specified in the config file.
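
For illustration, booleans in an ini-style config parse like this with the
stdlib ``configparser`` module. The section and option names below are
hypothetical, not the tool's actual config schema:

```python
import configparser

cfg = configparser.ConfigParser()
# Section/option names are illustrative only.
cfg.read_string("[batch_scoring]\nno_verify_ssl = true\nn_retry = 3\n")

# getboolean() accepts true/false, yes/no, on/off, 1/0 (case-insensitive);
# reading the raw string instead is a common source of boolean-option bugs.
verify_off = cfg.getboolean("batch_scoring", "no_verify_ssl")
retries = cfg.getint("batch_scoring", "n_retry")
```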

1.12.1
======

Bugfixes
--------
* Updates the distribution metadata to include modules critical to the functioning of this library.

1.12.0
======

Enhancements
------------
* Batch scoring now works with Python 3.6 on Windows (offline installs still
  require Python 3.5).
* Logs now include the version, retry attempts, and whether the output file was
  removed.
* New argument ``no-resume`` that allows you to start a new batch-scoring run
  from scratch without being asked about previous runs.
* The version of the dependency ``trafaret`` has been pinned to ``0.10.0`` to
  deal with a breaking change in the interface of that package.


Documentation
-------------
* A new "Version Compatibility" section has been added to the README to help
  surface to users any incompatibilities between versions of ``batch_scoring``
  and versions of ``DataRobot``.

1.11.0
======

New Features
------------
* The new parameter ``field_size_limit`` allows users to specify a larger
  maximum field size than the Python ``csv`` module normally allows. Users can
  set a larger value if they encounter issues with very large text fields, for
  example. Please note that larger values for this parameter may increase
  memory consumption.
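
The underlying limit can be inspected and raised with the stdlib ``csv``
module; raising it is, in effect, what a larger ``field_size_limit`` setting
permits (a sketch of the mechanism, not the tool's implementation):

```python
import csv

# The csv module rejects any field longer than this limit with
# "_csv.Error: field larger than field limit".
old_limit = csv.field_size_limit()      # default is 131072 (128 KB)

# Raise the limit to 10 MB; field_size_limit() returns the previous value.
previous = csv.field_size_limit(10 * 1024 * 1024)
```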

Bugfixes
--------
* Previously, files whose first few lines did not fit within 512KB would error
  during auto-sampling (which finds a reasonable number of rows to send with
  each batch). This issue has been fixed by falling back to a default of 10
  lines per batch in these cases. This value can still be overridden using the
  ``n_samples`` parameter.

* Fixed an issue where the client error message wasn't logged properly.
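
The auto-sampler fallback described above can be sketched like this; the
function name and chunking logic are an illustrative simplification, not the
library's actual code:

```python
def rows_per_batch(first_chunk: bytes, fallback: int = 10) -> int:
    """Estimate rows per batch from the first chunk of a file
    (e.g. the first 512KB); fall back to a default when too few
    complete lines fit in the chunk."""
    complete_lines = first_chunk.count(b"\n")
    return complete_lines if complete_lines > fallback else fallback

# A chunk holding one huge partial line yields the fallback of 10 rows;
# a chunk containing many short lines yields its line count.
```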

1.10.2
======

* Set the default timeout on server responses to infinity.
