Python tensorflow.keras.optimizers Examples
The following are 3 code examples that use the tensorflow.keras.optimizers module.
Example #1
Source File: utils.py From MultiPlanarUNet with MIT License
def init_optimizer(optimizer_string, logger=None, **kwargs):
    """ Same as 'init_losses', but for optimizers.
    Please refer to the 'init_losses' docstring. """
    optimizer = _init(
        optimizer_string,
        tf_funcs=[optimizers, addon_optimizers],
        custom_funcs=None,
        logger=logger
    )[0]
    return optimizer(**kwargs)
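The `_init` helper used above is not included in the snippet. As a rough sketch of the string-to-callable lookup it appears to perform (the name `resolve` and its exact behavior are assumptions, not MultiPlanarUNet's API), the pattern can be illustrated with standard-library modules standing in for the TensorFlow ones:

```python
import math
import cmath

def resolve(name, modules):
    """Return the first attribute named `name` found in `modules`.

    A simplified stand-in for the `_init` helper referenced above
    (its implementation is not shown in the snippet); the real helper
    also supports custom functions and logging.
    """
    for module in modules:
        func = getattr(module, name, None)
        if func is not None:
            return func
    raise ValueError("'{}' not found in any of the given modules".format(name))

# Standard-library modules stand in for
# tensorflow.keras.optimizers / tensorflow_addons.optimizers:
sqrt = resolve("sqrt", [math, cmath])
print(sqrt(16))  # 4.0 -- found in math first
```

Searching the TensorFlow module before the add-ons module, as `tf_funcs=[optimizers, addon_optimizers]` suggests, means built-in names shadow add-on names of the same spelling.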
Example #2
Source File: utils.py From MultiPlanarUNet with MIT License
def init_activation(activation_string, logger=None, **kwargs):
    """ Same as 'init_losses', but for activation functions.
    Please refer to the 'init_losses' docstring. """
    activation = _init(
        activation_string,
        tf_funcs=[activations, addon_activations],
        custom_funcs=None,
        logger=logger
    )[0]
    return activation
Example #3
Source File: keras_classification_model.py From DeepPavlov with Apache License 2.0
def compile(self, model: Model, optimizer_name: str, loss_name: str,
            learning_rate: Optional[Union[float, List[float]]],
            learning_rate_decay: Optional[Union[float, str]]) -> Model:
    """
    Compile model with given optimizer and loss

    Args:
        model: Keras uncompiled model
        optimizer_name: name of optimizer from keras.optimizers
        loss_name: loss function name (from keras.losses)
        learning_rate: learning rate
        learning_rate_decay: learning rate decay

    Returns:
        compiled model
    """
    optimizer_func = getattr(tensorflow.keras.optimizers, optimizer_name, None)
    if callable(optimizer_func):
        if isinstance(learning_rate, float) and isinstance(learning_rate_decay, float):
            # in this case decay is either given in the config or defaults to learning_rate_decay=0
            self.optimizer = optimizer_func(lr=learning_rate, decay=learning_rate_decay)
        else:
            self.optimizer = optimizer_func()
    else:
        raise AttributeError(
            "Optimizer {} is not defined in `tensorflow.keras.optimizers`".format(optimizer_name))

    loss_func = getattr(tensorflow.keras.losses, loss_name, None)
    if callable(loss_func):
        loss = loss_func
    else:
        raise AttributeError("Loss {} is not defined".format(loss_name))

    model.compile(optimizer=self.optimizer, loss=loss)
    return model
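The core lookup in Example #3 is `getattr(module, name, None)` followed by a `callable()` check. A minimal sketch of that pattern, using the standard-library `math` module only so it runs without TensorFlow installed (the helper name `get_callable` is hypothetical, not part of DeepPavlov):

```python
import math

def get_callable(module, name):
    """Fetch attribute `name` from `module` and verify it is callable,
    mirroring how Example #3 resolves optimizer and loss names with
    getattr(...) plus a callable() check.  `math` stands in for
    tensorflow.keras.optimizers / tensorflow.keras.losses here."""
    func = getattr(module, name, None)
    if callable(func):
        return func
    # Non-existent names AND non-callable attributes both end up here,
    # just as in Example #3's error branch.
    raise AttributeError("{} is not defined in `{}`".format(name, module.__name__))

print(get_callable(math, "floor")(2.7))  # 2
```

Passing a default of `None` to `getattr` turns a missing attribute into a value the `callable()` check rejects, so a single error branch covers both "name not found" and "name is not a function".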