actableai.utils.resources.predict.models.create_pipeline(resource_predicted: actableai.utils.resources.predict.ResourcePredictorType, task: actableai.tasks.TaskType) -> Optional[actableai.utils.river.MultiOutputPipeline]
Create a pipeline (model) for resource prediction.
Returns: The river pipeline to use for prediction, or None if not available.
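A minimal usage sketch of create_pipeline. The enum members ResourcePredictorType.MEMORY and TaskType.REGRESSION are illustrative placeholders, not values confirmed by this documentation; substitute the members defined in your installation.
```python
# Minimal sketch: build a pipeline for one resource/task combination.
# The enum members used below are illustrative placeholders.
from actableai.tasks import TaskType
from actableai.utils.resources.predict import ResourcePredictorType
from actableai.utils.resources.predict.models import create_pipeline

pipeline = create_pipeline(ResourcePredictorType.MEMORY, TaskType.REGRESSION)
if pipeline is None:
    print("No resource-prediction model available for this combination")
```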
actableai.utils.resources.predict.predictor.ResourcesPredictorsActor(s3_resources_predictors_bucket: Optional[str] = None, s3_resources_predictors_prefix: Optional[str] = None, resources_predictors_data_path: Optional[str] = None, random_state: Optional[int] = None, backup_state_probability: Optional[float] = None, override_models: bool = False)
Bases: object
Class used to store the different resource predictors. It is assumed that this class will be used as a ray Actor.
add_data(resource_predicted: actableai.utils.resources.predict.ResourcePredictorType, task: actableai.tasks.TaskType, features: dict, target: float, prediction: Optional[float] = None, timestamp: Optional[int] = None, full_features: Optional[dict] = None)¶Add data to the prediction model (train)
Will backup the resources predictors data to either AWS S3 or the filesystem with the pre-defined probability
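A hedged sketch of feeding one observation to the actor. It assumes the handle was obtained with get_actor (documented below) and that get_actor can be called at class level; the feature names, values, and enum members are illustrative placeholders.
```python
import ray

from actableai.tasks import TaskType
from actableai.utils.resources.predict import ResourcePredictorType
from actableai.utils.resources.predict.predictor import ResourcesPredictorsActor

ray.init(ignore_reinit_error=True)
actor = ResourcesPredictorsActor.get_actor()  # see get_actor below

# Record the resource usage observed after running a task; feature names,
# values, and enum members are illustrative placeholders.
ray.get(actor.add_data.remote(
    resource_predicted=ResourcePredictorType.MEMORY,
    task=TaskType.REGRESSION,
    features={"n_rows": 10_000, "n_columns": 25},
    target=512.0,  # measured resource usage for this run
))
```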
get_actor(s3_resources_predictors_bucket: Optional[str] = None, s3_resources_predictors_prefix: Optional[str] = None, resources_predictors_data_path: Optional[str] = None, random_state: Optional[int] = None, backup_state_probability: Optional[float] = None) -> ray.actor.ActorHandle
Get the ray Actor, creating it if it does not already exist.
Returns: The Actor handle.
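A minimal sketch of retrieving the shared actor handle. The bucket and prefix values are placeholders, and it is assumed that get_actor can be called at class level, as its signature suggests.
```python
import ray

from actableai.utils.resources.predict.predictor import ResourcesPredictorsActor

ray.init(ignore_reinit_error=True)

# Placeholder bucket/prefix; the actor is created on the first call and
# reused on subsequent calls.
actor = ResourcesPredictorsActor.get_actor(
    s3_resources_predictors_bucket="my-resources-bucket",
    s3_resources_predictors_prefix="dev",
)
```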
get_model_metrics(resource_predicted: actableai.utils.resources.predict.ResourcePredictorType, task: actableai.tasks.TaskType) -> Dict[str, float]
Get the metrics of one specific model.
Returns: Dictionary with metric names as keys and metric values as values.
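A hedged sketch of reading the current metrics of one model through the actor, under the same assumptions as above (class-level get_actor, illustrative enum members).
```python
import ray

from actableai.tasks import TaskType
from actableai.utils.resources.predict import ResourcePredictorType
from actableai.utils.resources.predict.predictor import ResourcesPredictorsActor

ray.init(ignore_reinit_error=True)
actor = ResourcesPredictorsActor.get_actor()

# Fetch the metrics of the model tracking one resource/task pair;
# the enum members are illustrative placeholders.
metrics = ray.get(actor.get_model_metrics.remote(
    ResourcePredictorType.MEMORY,
    TaskType.REGRESSION,
))
for name, value in metrics.items():
    print(f"{name}: {value}")
```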
migrate(s3_bucket: str, s3_prefix_list: List[str])
Update the models and back up the old training data.
Here is how to call this function:
```python
from actableai.utils.resources.predict.predictor import ResourcesPredictorsActor

ResourcesPredictorsActor.migrate("actable-ai-resources-predictors", ["dev"])
```
In this example we are migrating the dev folder, but you can replace it with prod, or even migrate both at the same time.
How it works: this function first creates a backup folder and moves every existing model, model metric, and training data file into it. It then creates brand new pipelines and trains each of them on the training data previously collected and uploaded to AWS. Finally, it saves these new pipelines, overwriting the previous ones (which remain backed up).
predict(resource_predicted: actableai.utils.resources.predict.ResourcePredictorType, task: actableai.tasks.TaskType, features: dict) -> float
Make a prediction of the resource usage.
Returns: The predicted value.
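A hedged sketch of requesting a resource-usage estimate for an upcoming task, under the same assumptions as the previous examples (class-level get_actor, illustrative feature names, values, and enum members).
```python
import ray

from actableai.tasks import TaskType
from actableai.utils.resources.predict import ResourcePredictorType
from actableai.utils.resources.predict.predictor import ResourcesPredictorsActor

ray.init(ignore_reinit_error=True)
actor = ResourcesPredictorsActor.get_actor()

# Ask for an estimate of the resource usage of an upcoming task; feature
# names, values, and enum members are illustrative placeholders.
predicted = ray.get(actor.predict.remote(
    ResourcePredictorType.MEMORY,
    TaskType.REGRESSION,
    {"n_rows": 10_000, "n_columns": 25},
))
print(f"Predicted resource usage: {predicted}")
```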