[Question Title]: Naming of the Keras Tuner trials directory for TensorBoard
[Posted]: 2021-04-26 04:00:22
[Question]:

I'm using Keras Tuner's BayesianOptimization to search for the best hyperparameters for my model, and I'm also using the TensorBoard callback to visualize the performance of each model/trial.

However, the trial names/labels produced by the Tuner are odd (e.g. trial_1dc4838863f2e4e8a84f0e415ee1db33). Is there a way to make the Tuner name the trials "trial_1", "trial_2", and so on, instead of appending that long string of digits and letters?

I couldn't find how to do this in the Keras Tuner documentation, nor whether there is a parameter for it when creating the Tuner instance.

[Question discussion]:

Tags: python-3.x tensorflow keras tensorboard keras-tuner


[Solution 1]:

    I was able to solve this by overriding the BayesianOptimization and BayesianOptimizationOracle classes. It simply names each trial '0', '1', '2', and so on.

    It would be nice if this were more flexible, though, since I may end up doing the same for other hypertuner methods as well.

    from kerastuner.engine import trial as trial_lib
    from kerastuner.tuners.bayesian import (
        BayesianOptimization, BayesianOptimizationOracle)
    
    
    class CustomBayesianOptimizationOracle(BayesianOptimizationOracle):
    
        def __init__(self,
                     objective,
                     max_trials,
                     num_initial_points=None,
                     alpha=1e-4,
                     beta=2.6,
                     seed=None,
                     hyperparameters=None,
                     allow_new_entries=True,
                     tune_new_entries=True):
            super(CustomBayesianOptimizationOracle, self).__init__(
                objective=objective,
                max_trials=max_trials,
                num_initial_points=num_initial_points,
                alpha=alpha,
                beta=beta,
                seed=seed,
                hyperparameters=hyperparameters,
                tune_new_entries=tune_new_entries,
                allow_new_entries=allow_new_entries)
    
            self.trial_id = '0'
    
        def create_trial(self, tuner_id):
            """Create a new `Trial` to be run by the `Tuner`.
    
            A `Trial` corresponds to a unique set of hyperparameters to be run
            by `Tuner.run_trial`.
    
            Args:
          tuner_id: An ID that identifies the `Tuner` requesting a
            `Trial`. `Tuners` that should run the same trial (for instance,
            when running a multi-worker model) should have the same ID.
    
            Returns:
              A `Trial` object containing a set of hyperparameter values to run
              in a `Tuner`.
            """
            # Allow for multi-worker DistributionStrategy within a Trial.
            if tuner_id in self.ongoing_trials:
                return self.ongoing_trials[tuner_id]
    
            if self.max_trials and len(self.trials) >= self.max_trials:
                status = trial_lib.TrialStatus.STOPPED
                values = None
            else:
                response = self._populate_space(self.trial_id)
                status = response['status']
                values = response['values'] if 'values' in response else None
    
            hyperparameters = self.hyperparameters.copy()
            hyperparameters.values = values or {}
            trial = trial_lib.Trial(
                hyperparameters=hyperparameters,
                trial_id=self.trial_id,
                status=status)
    
            if status == trial_lib.TrialStatus.RUNNING:
                self.ongoing_trials[tuner_id] = trial
                self.trials[self.trial_id] = trial
                self._save_trial(trial)
                self.save()
    
            self.trial_id = str(int(self.trial_id) + 1)
    
            return trial
    
    
    class CustomBayesianOptimization(BayesianOptimization):
    
        def __init__(self,
                     hypermodel,
                     objective,
                     max_trials,
                     num_initial_points=2,
                     seed=None,
                     hyperparameters=None,
                     tune_new_entries=True,
                     allow_new_entries=True,
                     **kwargs):
            oracle = CustomBayesianOptimizationOracle(
                objective=objective,
                max_trials=max_trials,
                num_initial_points=num_initial_points,
                seed=seed,
                hyperparameters=hyperparameters,
                tune_new_entries=tune_new_entries,
                allow_new_entries=allow_new_entries)
            super(BayesianOptimization, self).__init__(
                oracle=oracle,
                hypermodel=hypermodel,
                **kwargs)
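
    As a quick illustration of the naming scheme, the overridden oracle above increments a string trial_id, so the trial directories become predictable. The sketch below reproduces just that numbering logic with no kerastuner dependency; the "trial_" directory prefix is an assumption based on how Keras Tuner names each trial's subdirectory after its trial_id.

    ```python
    # Minimal sketch of the sequential trial-ID scheme used by the custom
    # oracle above, with no kerastuner dependency. The "trial_<id>" directory
    # naming is an assumption about how Keras Tuner lays out trial folders.

    def sequential_trial_dirs(num_trials, prefix="trial_"):
        """Generate trial directory names the way the custom oracle numbers them."""
        trial_id = '0'
        dirs = []
        for _ in range(num_trials):
            dirs.append(prefix + trial_id)
            # Same increment as in CustomBayesianOptimizationOracle.create_trial
            trial_id = str(int(trial_id) + 1)
        return dirs

    print(sequential_trial_dirs(3))  # → ['trial_0', 'trial_1', 'trial_2']
    ```

    With these names, the TensorBoard run list sorts in the order the trials were run, instead of by opaque hash.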
    

    [Comments]:

    • I was planning to try going that route, but wanted to double-check whether there was a simpler way first. Still very helpful, thanks! But why did you create a custom class? Couldn't you just edit the existing one? I assume you now need to use 'CustomBayesianOptimization' as your tuner class, right?
    • I didn't want to modify the installed package, which is generally not a good idea, if that's what you mean by editing the existing one. And yes, I'm calling the custom class.