gnn_tracking_hpo.tune
Module Contents
Classes
- Dispatcher — For most arguments, see corresponding command line interface.

Functions
- add_common_options
- get_timeout_stopper — Interpret timeout string as seconds.
- simple_run_without_tune — Simple run without tuning for testing purposes.
- main — Dispatch with Ray Tune. For arguments, see Dispatcher.__call__.
- gnn_tracking_hpo.tune.add_common_options(parser: argparse.ArgumentParser)
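To illustrate how a helper like add_common_options extends a parser in place, here is a minimal sketch. The flag names below are assumptions modeled on the Dispatcher keyword arguments (test, cpu, timeout, num_samples), not the actual options registered by this module.

```python
import argparse


def add_common_options_sketch(parser: argparse.ArgumentParser) -> None:
    """Illustrative only: flag names are guessed from Dispatcher's
    keyword arguments, not taken from the real implementation."""
    parser.add_argument("--test", action="store_true", help="Run a short test configuration")
    parser.add_argument("--cpu", action="store_true", help="Run on CPU only")
    parser.add_argument("--timeout", default=None, help="Stop tuning after this much time")
    parser.add_argument("--num-samples", type=int, default=None, help="Number of trials to run")


parser = argparse.ArgumentParser()
add_common_options_sketch(parser)
args = parser.parse_args(["--cpu", "--num-samples", "10"])
```

Mutating the caller's parser (rather than returning a new one) is the usual pattern when several scripts share one set of common options.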
- gnn_tracking_hpo.tune.get_timeout_stopper(timeout: str | None = None) → ray.tune.Stopper | None
Interpret timeout string as seconds.
- gnn_tracking_hpo.tune.simple_run_without_tune(trainable, suggest_config: Callable) → None
Simple run without tuning for testing purposes.
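The idea of a run without tuning is to sample one config and train once, with no scheduler or search. The sketch below uses hypothetical stand-ins (DummyTrainable, a zero-argument suggest_config) rather than the real ray.tune.Trainable interface:

```python
def simple_run_sketch(trainable, suggest_config):
    """Minimal no-tuning run: build one config, train once, return metrics."""
    config = suggest_config()
    trainer = trainable(config)
    return trainer.step()


# Hypothetical stand-ins for a real ray.tune.Trainable and search space:
class DummyTrainable:
    def __init__(self, config):
        self.lr = config["lr"]

    def step(self):
        # A real trainable would run one training iteration here.
        return {"trk.double_majority_pt1.5": 0.9, "lr": self.lr}


result = simple_run_sketch(DummyTrainable, lambda: {"lr": 1e-3})
```

Running the trainable directly like this is useful for debugging, since exceptions surface immediately instead of being swallowed by Ray Tune's trial machinery.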
- class gnn_tracking_hpo.tune.Dispatcher(*, test=False, cpu=False, restore=None, enqueue: None | list[str] = None, only_enqueued=False, fixed: None | str = None, timeout: None | str = None, tags=None, group=None, note=None, fail_fast=False, dname: str | None = None, metric='trk.double_majority_pt1.5', no_tune=False, num_samples: None | int = None, no_scheduler=False, local=False, grace_period=3, no_improvement_patience=10, additional_stoppers=None)
For most arguments, see corresponding command line interface.
- Parameters:
grace_period – Grace period for ASHA scheduler.
no_improvement_patience – Number of iterations without improvement before stopping.
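The stopping rule behind no_improvement_patience can be sketched in plain Python: stop a trial once a given number of consecutive iterations have passed without the metric improving. This is a simplified model of what a stopper like rt_stoppers_contrib.NoImprovementTrialStopper does, not its actual implementation:

```python
class NoImprovementStopperSketch:
    """Stop once `patience` consecutive iterations bring no improvement
    in a metric that is being maximized."""

    def __init__(self, patience: int = 10):
        self.patience = patience
        self.best = float("-inf")
        self.stale = 0

    def should_stop(self, metric: float) -> bool:
        if metric > self.best:
            self.best = metric
            self.stale = 0
        else:
            self.stale += 1
        return self.stale >= self.patience


stopper = NoImprovementStopperSketch(patience=3)
decisions = [stopper.should_stop(m) for m in [0.1, 0.2, 0.2, 0.2, 0.2]]
```

With patience=3, the fifth value above is the third consecutive non-improvement, so only the last decision is True.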
- __call__(trainable: type[ray.tune.Trainable], suggest_config: Callable) → ray.tune.ResultGrid
- Parameters:
trainable – The trainable to run.
suggest_config – A function that returns a config dictionary.
Returns:
A ray.tune.ResultGrid with the results of the run.
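A suggest_config callable simply produces the config dictionary for a trial. The sketch below is a hypothetical example with made-up hyperparameter names; with Optuna's define-by-run API the trial argument would sample values, while here fixed choices are returned for illustration:

```python
def suggest_config_sketch(trial=None):
    """Hypothetical suggest_config: returns a config dict. If a trial
    object were passed (Optuna define-by-run style), values would be
    sampled from it instead of being hard-coded."""
    if trial is not None:
        # e.g. trial.suggest_float("lr", 1e-5, 1e-2, log=True)
        pass
    return {
        "lr": 1e-3,
        "batch_size": 32,
    }


config = suggest_config_sketch()
```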
- get_resources() → dict[str, int]
- get_tuner(trainable: type[ray.tune.Trainable], suggest_config: Callable) → ray.tune.Tuner
- get_no_improvement_stopper() → rt_stoppers_contrib.NoImprovementTrialStopper | None
- get_stoppers() → list[ray.tune.Stopper]
- get_wandb_callbacks() → list[ray.tune.Callback]
- get_callbacks() → list[ray.tune.Callback]
- points_to_evaluate() → list[dict[str, Any]]
- get_optuna_sampler()
- get_optuna_search(suggest_config: Callable) → ray.tune.search.optuna.OptunaSearch
- get_num_samples() → int
Return number of samples/trials to run.
- get_scheduler() → None | ray.tune.schedulers.ASHAScheduler
- get_tune_config(suggest_config: Callable) → ray.tune.TuneConfig
- get_checkpoint_config() → ray.air.CheckpointConfig
- get_run_config() → ray.air.RunConfig
- gnn_tracking_hpo.tune.main(trainable, suggest_config, *args, **kwargs) → ray.tune.ResultGrid
Dispatch with Ray Tune. For arguments, see Dispatcher.__call__.