
# Layer weight regularizers

Regularizers allow you to apply penalties on layer parameters or layer activity during optimization. These penalties are summed into the loss function that the network optimizes.

Regularization penalties are applied on a per-layer basis. The exact API will depend on the layer, but many layers (e.g. `Dense`, `Conv1D`, `Conv2D` and `Conv3D`) have a unified API.

These layers expose 3 keyword arguments:

- `kernel_regularizer`: Regularizer to apply a penalty on the layer's kernel
- `bias_regularizer`: Regularizer to apply a penalty on the layer's bias
- `activity_regularizer`: Regularizer to apply a penalty on the layer's output

```python
from keras import layers
from keras import regularizers

layer = layers.Dense(
    units=64,
    kernel_regularizer=regularizers.L1L2(l1=1e-5, l2=1e-4),
    bias_regularizer=regularizers.L2(1e-4),
    activity_regularizer=regularizers.L2(1e-5)
)
```

The value returned by the `activity_regularizer` object gets divided by the input batch size so that the relative weighting between the weight regularizers and the activity regularizers does not change with the batch size.
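You can verify this batch-size invariance directly. Here is a minimal sketch (the `activity_penalty` helper is hypothetical, introduced only for illustration) showing that the summed activity penalty is the same whether the layer sees 4 samples or 32:

```python
from keras import layers, ops, regularizers

def activity_penalty(batch_size):
    layer = layers.Dense(
        units=2,
        kernel_initializer='ones',
        activity_regularizer=regularizers.L2(0.01),
    )
    layer(ops.ones(shape=(batch_size, 3)))
    return float(ops.sum(layer.losses))

# Each output entry is 3.0, so the per-sample penalty is
# 0.01 * 2 * 3.0**2 = 0.18 -- independent of the batch size.
print(activity_penalty(4))   # 0.18
print(activity_penalty(32))  # 0.18
```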

You can access a layer's regularization penalties by calling `layer.losses` after calling the layer on inputs:

```python
from keras import layers
from keras import ops
from keras import regularizers

layer = layers.Dense(
    units=5,
    kernel_initializer='ones',
    kernel_regularizer=regularizers.L1(0.01),
    activity_regularizer=regularizers.L2(0.01),
)
tensor = ops.ones(shape=(5, 5)) * 2.0
out = layer(tensor)

# The kernel regularization term is 0.25
# The activity regularization term (after dividing by the batch size) is 5
print(ops.sum(layer.losses))  # 5.25 (= 5 + 0.25)
```

## Available regularizers

The following built-in regularizers are available as part of the `keras.regularizers` module:

{{autogenerated}}
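As with other Keras objects, a built-in regularizer can typically also be passed by its string identifier, which instantiates it with default parameters. A minimal sketch:

```python
from keras import layers

# Equivalent to passing regularizers.L2() with its default factor
layer = layers.Dense(units=64, kernel_regularizer='l2')
```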

## Creating custom regularizers

### Simple callables

A weight regularizer can be any callable that takes as input a weight tensor (e.g. the kernel of a `Conv2D` layer), and returns a scalar loss. Like this:

```python
from keras import ops

def my_regularizer(x):
    return 1e-3 * ops.sum(ops.square(x))
```
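The callable can then be passed to any of the regularizer arguments shown above, for example (reusing `my_regularizer` from the snippet above):

```python
from keras import layers

# The callable is invoked on the kernel tensor at each training step
layer = layers.Dense(units=64, kernel_regularizer=my_regularizer)
```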

### Regularizer subclasses

If you need to configure your regularizer via various arguments (e.g. `l1` and `l2` arguments in `l1_l2`), you should implement it as a subclass of `keras.regularizers.Regularizer`.

Here's a simple example:

```python
class MyRegularizer(regularizers.Regularizer):

    def __init__(self, strength):
        self.strength = strength

    def __call__(self, x):
        return self.strength * ops.sum(ops.square(x))
```
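An instance can then be attached to a layer just like a built-in regularizer, for instance:

```python
from keras import layers

# Configure the penalty strength at instantiation time
layer = layers.Dense(units=64, kernel_regularizer=MyRegularizer(strength=1e-3))
```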

Optionally, you can also implement the method `get_config` and the class method `from_config` in order to support serialization, just like with any Keras object. Example:

```python
class MyRegularizer(regularizers.Regularizer):

    def __init__(self, strength):
        self.strength = strength

    def __call__(self, x):
        return self.strength * ops.sum(ops.square(x))

    def get_config(self):
        return {'strength': self.strength}
```
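The base `Regularizer.from_config` simply calls `cls(**config)`, so as long as the keys returned by `get_config` match the `__init__` arguments, a config round-trip works without overriding `from_config`. A minimal sketch:

```python
reg = MyRegularizer(strength=0.01)
config = reg.get_config()

# The default from_config calls cls(**config), so the config keys
# must match the __init__ signature.
restored = MyRegularizer.from_config(config)
assert restored.strength == reg.strength
```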