This is a major feature release that includes the new `PredictorEvaluationWorkflow` and `DataSourceDesignSpace`, along with convenience methods for waiting on modules and executions and fetching descriptors from Data Sources.
**What’s New**
* A `PredictorEvaluationWorkflow` has been added. This workflow evaluates predictor performance using one or more `PredictorEvaluator`s. The first evaluator we're introducing is the `CrossValidationEvaluator`, which lets you configure k-fold cross-validation and request desired metrics such as RMSE and NDME. (See `citrine.informatics.predictor_evaluation_metrics` for all available metrics.) This workflow is meant to replace the `PerformanceWorkflow`. Instead of returning results as JSON, `PredictorEvaluationWorkflow` results are objects that allow you to fetch results by response, and computed values are now objects that codify the value associated with each requested metric.
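  As a rough sketch of the new workflow (keyword arguments, the `project`/`predictor_id` objects, and the response key below are illustrative assumptions; consult the API docs for exact signatures):

  ```python
  from citrine.informatics.predictor_evaluator import CrossValidationEvaluator
  from citrine.informatics.predictor_evaluation_metrics import RMSE, NDME
  from citrine.informatics.workflows import PredictorEvaluationWorkflow

  # Configure 3-fold cross-validation over a response key and request metrics.
  evaluator = CrossValidationEvaluator(
      name="cv",
      description="3-fold cross-validation",
      responses={"Shear modulus"},  # illustrative response key
      n_folds=3,
      metrics={RMSE(), NDME()},  # see predictor_evaluation_metrics for more
  )

  workflow = PredictorEvaluationWorkflow(
      name="evaluate predictor",
      evaluators=[evaluator],
  )

  # Assumed usage with an existing project and predictor:
  # workflow = project.predictor_evaluation_workflows.register(workflow)
  # execution = workflow.executions.trigger(predictor_id)
  # results = execution.results("cv")  # then fetch metrics by response
  ```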
* A `DataSourceDesignSpace` has been introduced which allows a Design Space to be created directly from a Data Source.
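  A minimal sketch of building one from a GEM Table data source (the table id/version and `project` are placeholder assumptions):

  ```python
  from citrine.informatics.data_sources import GemTableDataSource
  from citrine.informatics.design_spaces import DataSourceDesignSpace

  # Candidates are drawn directly from the rows of the data source.
  data_source = GemTableDataSource(table_id=table_uuid, table_version=1)
  design_space = DataSourceDesignSpace(
      name="from table",
      description="design space enumerated from a GEM Table",
      data_source=data_source,
  )
  # design_space = project.design_spaces.register(design_space)
  ```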
* New utility functions `wait_while_validating` and `wait_while_executing` let you block until a resource finishes validating or executing and then fetch the result.
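  A hedged usage sketch (the keyword arguments and the `project`/`predictor`/`execution` objects are assumptions; check the function docs for exact parameters):

  ```python
  from citrine.jobs.waiting import wait_while_validating, wait_while_executing

  # Block until the module finishes validating, then use the refreshed resource:
  # predictor = wait_while_validating(
  #     collection=project.predictors, module=predictor, print_status_info=True
  # )

  # Block until the execution completes, then fetch its results:
  # execution = wait_while_executing(execution, print_status_info=True)
  ```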
* Descriptors can be fetched from a data source using `DescriptorMethods.descriptors_from_data_source`.
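  For example (assuming `DescriptorMethods` is exposed on a project as `project.descriptors`, and with a placeholder table id):

  ```python
  from citrine.informatics.data_sources import GemTableDataSource

  data_source = GemTableDataSource(table_id=table_uuid, table_version=1)
  # Returns the descriptors that the data source can provide:
  descriptors = project.descriptors.descriptors_from_data_source(data_source)
  for descriptor in descriptors:
      print(descriptor.key)
  ```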
**Improvements**
* The `IngredientQuantityByProcessAndName` variable definition now includes an optional `units` parameter to better support absolute quantities.
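  A sketch of requesting an absolute quantity with units (the import path, keyword names, and the process-template link are assumptions drawn from the GEM Tables variable API):

  ```python
  from citrine.gemtables.variables import (
      IngredientQuantityByProcessAndName,
      IngredientQuantityDimension,
  )

  # Absolute quantities are only meaningful with units attached.
  quantity = IngredientQuantityByProcessAndName(
      name="solvent mass",
      headers=["Mixing", "Solvent", "Mass"],
      process_template=process_template_link,  # placeholder link to a template
      ingredient_name="solvent",
      quantity_dimension=IngredientQuantityDimension.ABSOLUTE,
      units="kg",  # the new optional units parameter
  )
  ```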
* Documentation for TableConfigs and GEM Tables has been updated.
**Deprecations**
* The `PerformanceWorkflow` has been deprecated in favor of the `PredictorEvaluationWorkflow`.
* The `CrossValidationAnalysis` has been deprecated in favor of the `CrossValidationEvaluator`. The latter is used to configure the new `PredictorEvaluationWorkflow`.