Layer weight initializers

RandomNormal class

tf_keras.initializers.RandomNormal(mean=0.0, stddev=0.05, seed=None)

Initializer that generates tensors with a normal distribution.

Also available via the shortcut function tf.keras.initializers.random_normal.

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.RandomNormal(mean=0., stddev=1.)
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.RandomNormal(mean=0., stddev=1.)
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • mean: A Python scalar or a scalar tensor. Mean of the random values to generate.
  • stddev: A Python scalar or a scalar tensor. Standard deviation of the random values to generate.
  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will produce the same random values across multiple calls.

RandomUniform class

tf_keras.initializers.RandomUniform(minval=-0.05, maxval=0.05, seed=None)

Initializer that generates tensors with a uniform distribution.

Also available via the shortcut function tf.keras.initializers.random_uniform.

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.RandomUniform(minval=0., maxval=1.)
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.RandomUniform(minval=0., maxval=1.)
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • minval: A Python scalar or a scalar tensor. Lower bound of the range of random values to generate (inclusive).
  • maxval: A Python scalar or a scalar tensor. Upper bound of the range of random values to generate (exclusive).
  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will produce the same random values across multiple calls.

TruncatedNormal class

tf_keras.initializers.TruncatedNormal(mean=0.0, stddev=0.05, seed=None)

Initializer that generates a truncated normal distribution.

Also available via the shortcut function tf.keras.initializers.truncated_normal.

The values generated are similar to values from a tf.keras.initializers.RandomNormal initializer except that values more than two standard deviations from the mean are discarded and re-drawn.
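
As a quick sanity check (an illustrative sketch assuming eager execution, not part of the documented API), every sampled value lands within two standard deviations of the mean:

>>> # Illustrative only: verify the two-standard-deviation bound.
>>> initializer = tf.keras.initializers.TruncatedNormal(mean=0., stddev=1.)
>>> values = initializer(shape=(100, 100))
>>> bool(tf.reduce_all(tf.abs(values) <= 2.0))
True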

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.TruncatedNormal(mean=0., stddev=1.)
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.TruncatedNormal(mean=0., stddev=1.)
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • mean: A Python scalar or a scalar tensor. Mean of the random values to generate.
  • stddev: A Python scalar or a scalar tensor. Standard deviation of the random values to generate before truncation.
  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will produce the same random values across multiple calls.

Zeros class

tf_keras.initializers.Zeros()

Initializer that generates tensors initialized to 0.

Also available via the shortcut function tf.keras.initializers.zeros.

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.Zeros()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.Zeros()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Ones class

tf_keras.initializers.Ones()

Initializer that generates tensors initialized to 1.

Also available via the shortcut function tf.keras.initializers.ones.

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.Ones()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.Ones()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

GlorotNormal class

tf_keras.initializers.GlorotNormal(seed=None)

The Glorot normal initializer, also called Xavier normal initializer.

Also available via the shortcut function tf.keras.initializers.glorot_normal.

Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)) where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.
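
As a worked illustration (not part of the API), a Dense kernel of shape (64, 32) has fan_in = 64 and fan_out = 32, so samples are drawn with stddev = sqrt(2 / 96) ≈ 0.144:

>>> import math
>>> round(math.sqrt(2 / (64 + 32)), 3)
0.144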

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.GlorotNormal()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.GlorotNormal()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will not produce the same random values across multiple calls, but multiple initializers will produce the same sequence when constructed with the same seed value.

References

  • Glorot & Bengio, 2010. Understanding the difficulty of training deep feedforward neural networks (AISTATS 2010).

GlorotUniform class

tf_keras.initializers.GlorotUniform(seed=None)

The Glorot uniform initializer, also called Xavier uniform initializer.

Also available via the shortcut function tf.keras.initializers.glorot_uniform.

Draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out)) (fan_in is the number of input units in the weight tensor and fan_out is the number of output units).
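
As a worked illustration (not part of the API), a kernel of shape (64, 32) gives limit = sqrt(6 / 96) = 0.25, so every weight is drawn from [-0.25, 0.25]:

>>> import math
>>> math.sqrt(6 / (64 + 32))
0.25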

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.GlorotUniform()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.GlorotUniform()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will not produce the same random values across multiple calls, but multiple initializers will produce the same sequence when constructed with the same seed value.

References

  • Glorot & Bengio, 2010. Understanding the difficulty of training deep feedforward neural networks (AISTATS 2010).

HeNormal class

tf_keras.initializers.HeNormal(seed=None)

He normal initializer.

Also available via the shortcut function tf.keras.initializers.he_normal.

It draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / fan_in) where fan_in is the number of input units in the weight tensor.
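
As a worked illustration (not part of the API), a kernel with fan_in = 8 is sampled with stddev = sqrt(2 / 8) = 0.5:

>>> import math
>>> math.sqrt(2 / 8)
0.5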

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.HeNormal()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.HeNormal()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will not produce the same random values across multiple calls, but multiple initializers will produce the same sequence when constructed with the same seed value.

References

  • He et al., 2015. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification (ICCV 2015).

HeUniform class

tf_keras.initializers.HeUniform(seed=None)

He uniform variance scaling initializer.

Also available via the shortcut function tf.keras.initializers.he_uniform.

Draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(6 / fan_in) (fan_in is the number of input units in the weight tensor).
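
As a worked illustration (not part of the API), a kernel with fan_in = 24 gives limit = sqrt(6 / 24) = 0.5:

>>> import math
>>> math.sqrt(6 / 24)
0.5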

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.HeUniform()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.HeUniform()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will not produce the same random values across multiple calls, but multiple initializers will produce the same sequence when constructed with the same seed value.

References

  • He et al., 2015. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification (ICCV 2015).

Identity class

tf_keras.initializers.Identity(gain=1.0)

Initializer that generates the identity matrix.

Also available via the shortcut function tf.keras.initializers.identity.

Only usable for generating 2D matrices.

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.Identity()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.Identity()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • gain: Multiplicative factor to apply to the identity matrix.
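
As a quick illustration of the gain argument (an illustrative sketch assuming eager execution):

>>> # Illustrative only: gain scales the identity matrix, here to 2 * I.
>>> initializer = tf.keras.initializers.Identity(gain=2.)
>>> values = initializer(shape=(2, 2))  # [[2., 0.], [0., 2.]]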

Orthogonal class

tf_keras.initializers.Orthogonal(gain=1.0, seed=None)

Initializer that generates an orthogonal matrix.

Also available via the shortcut function tf.keras.initializers.orthogonal.

If the shape of the tensor to initialize is two-dimensional, it is initialized with an orthogonal matrix obtained from the QR decomposition of a matrix of random numbers drawn from a normal distribution. If the matrix has fewer rows than columns then the output will have orthogonal rows. Otherwise, the output will have orthogonal columns.

If the shape of the tensor to initialize is more than two-dimensional, a matrix of shape (shape[0] * ... * shape[n - 2], shape[n - 1]) is initialized, where n is the length of the shape vector. The matrix is subsequently reshaped to give a tensor of the desired shape.
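
As a quick sanity check (an illustrative sketch assuming eager execution, not part of the documented API), the generated matrix Q satisfies Q^T Q ≈ I:

>>> # Illustrative only: columns of the generated matrix are orthonormal.
>>> q = tf.keras.initializers.Orthogonal()(shape=(3, 3))
>>> bool(tf.reduce_all(tf.abs(tf.matmul(q, q, transpose_a=True) - tf.eye(3)) < 1e-5))
True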

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.Orthogonal()
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.Orthogonal()
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • gain: Multiplicative factor to apply to the orthogonal matrix.
  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will produce the same random values across multiple calls.

References

  • Saxe et al., 2014. Exact solutions to the nonlinear dynamics of learning in deep linear neural networks (ICLR 2014).

Constant class

tf_keras.initializers.Constant(value=0)

Initializer that generates tensors with constant values.

Also available via the shortcut function tf.keras.initializers.constant.

Only scalar values are allowed. The constant value provided must be convertible to the dtype requested when calling the initializer.

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.Constant(3.)
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.Constant(3.)
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • value: A Python scalar.

VarianceScaling class

tf_keras.initializers.VarianceScaling(
    scale=1.0, mode="fan_in", distribution="truncated_normal", seed=None
)

Initializer that adapts its scale to the shape of its input tensors.

Also available via the shortcut function tf.keras.initializers.variance_scaling.

With distribution="truncated_normal" or "untruncated_normal", samples are drawn from a truncated/untruncated normal distribution with a mean of zero and a standard deviation (after truncation, if used) stddev = sqrt(scale / n), where n is:

  • number of input units in the weight tensor, if mode="fan_in"
  • number of output units, if mode="fan_out"
  • average of the numbers of input and output units, if mode="fan_avg"

With distribution="uniform", samples are drawn from a uniform distribution within [-limit, limit], where limit = sqrt(3 * scale / n).

Examples

>>> # Standalone usage:
>>> initializer = tf.keras.initializers.VarianceScaling(
...     scale=0.1, mode='fan_in', distribution='uniform')
>>> values = initializer(shape=(2, 2))
>>> # Usage in a TF-Keras layer:
>>> initializer = tf.keras.initializers.VarianceScaling(
...     scale=0.1, mode='fan_in', distribution='uniform')
>>> layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)

Arguments

  • scale: Scaling factor (positive float).
  • mode: One of "fan_in", "fan_out", "fan_avg".
  • distribution: Random distribution to use. One of "truncated_normal", "untruncated_normal", or "uniform".
  • seed: A Python integer. Used to make the behavior of the initializer deterministic. Note that a seeded initializer will produce the same random values across multiple calls.