tf.keras.initializers.VarianceScaling
Initializer capable of adapting its scale to the shape of weights tensors.
Inherits From: Initializer
```python
tf.keras.initializers.VarianceScaling(
    scale=1.0, mode='fan_in', distribution='truncated_normal',
    seed=None
)
```
Also available via the shortcut function `tf.keras.initializers.variance_scaling`.
With `distribution="truncated_normal"` or `"untruncated_normal"`, samples are drawn from a truncated/untruncated normal distribution with a mean of zero and a standard deviation (after truncation, if used) of `stddev = sqrt(scale / n)`, where `n` is:
- the number of input units in the weight tensor, if `mode="fan_in"`
- the number of output units, if `mode="fan_out"`
- the average of the numbers of input and output units, if `mode="fan_avg"`
With `distribution="uniform"`, samples are drawn from a uniform distribution within `[-limit, limit]`, where `limit = sqrt(3 * scale / n)`.
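As a rough illustration of the formulas above, here is a NumPy sketch (not the TensorFlow implementation) of the `"uniform"` branch for a 2-D weight matrix, where `fan_in` is the first dimension and `fan_out` the second:

```python
import numpy as np

def variance_scaling_uniform(shape, scale=1.0, mode="fan_in", seed=None):
    """Sketch of VarianceScaling with distribution='uniform'.

    For a 2-D weight matrix of shape (fan_in, fan_out), samples are
    drawn uniformly from [-limit, limit], limit = sqrt(3 * scale / n).
    """
    fan_in, fan_out = shape[0], shape[1]
    n = {"fan_in": fan_in,
         "fan_out": fan_out,
         "fan_avg": (fan_in + fan_out) / 2.0}[mode]
    limit = np.sqrt(3.0 * scale / n)
    rng = np.random.default_rng(seed)
    return rng.uniform(-limit, limit, size=shape)

# Variance of Uniform[-limit, limit] is limit**2 / 3 = scale / n,
# so the sample variance should be close to 0.1 / 64 here.
w = variance_scaling_uniform((64, 32), scale=0.1, mode="fan_in", seed=0)
```

The key design point is that `n` changes with `mode`, so the same `scale` yields different spreads depending on whether you normalize by the input size, the output size, or their average.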
Examples:
```python
# Standalone usage:
initializer = tf.keras.initializers.VarianceScaling(
    scale=0.1, mode='fan_in', distribution='uniform')
values = initializer(shape=(2, 2))
```

```python
# Usage in a Keras layer:
initializer = tf.keras.initializers.VarianceScaling(
    scale=0.1, mode='fan_in', distribution='uniform')
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
```
| Args | |
|---|---|
| `scale` | Scaling factor (positive float). |
| `mode` | One of `"fan_in"`, `"fan_out"`, `"fan_avg"`. |
| `distribution` | Random distribution to use. One of `"truncated_normal"`, `"untruncated_normal"`, or `"uniform"`. |
| `seed` | A Python integer. An initializer created with a given seed will always produce the same random tensor for a given shape and dtype. |
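The `seed` contract above can be illustrated with a NumPy stand-in (used here instead of TensorFlow so the sketch is self-contained): constructing a generator from the same seed must reproduce the same values for a given shape.

```python
import numpy as np

# Hypothetical stand-in for a seeded initializer: same seed and shape
# -> identical tensor, mirroring the behavior documented for `seed`.
def draw(shape, seed):
    return np.random.default_rng(seed).standard_normal(shape)

a = draw((2, 2), seed=42)
b = draw((2, 2), seed=42)
assert np.array_equal(a, b)  # deterministic for a fixed seed and shape
```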
Methods
from_config
View source
```python
@classmethod
from_config(
    config
)
```
Instantiates an initializer from a configuration dictionary.
Example:
```python
initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)
```
| Args | |
|---|---|
| `config` | A Python dictionary, the output of `get_config`. |

| Returns | |
|---|---|
| A `tf.keras.initializers.Initializer` instance. |
get_config
View source
```python
get_config()
```
Returns the configuration of the initializer as a JSON-serializable dict.
| Returns | |
|---|---|
| A JSON-serializable Python dict. |
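A minimal sketch of the `get_config` / `from_config` round trip that Keras initializers follow: `get_config` returns a JSON-serializable dict of constructor arguments, and `from_config` rebuilds an equivalent object from it. The class below is a hypothetical stand-in, not the real Keras implementation.

```python
import json

class VarianceScalingSketch:
    """Toy object mimicking the Keras config-serialization contract."""

    def __init__(self, scale=1.0, mode="fan_in",
                 distribution="truncated_normal", seed=None):
        self.scale, self.mode = scale, mode
        self.distribution, self.seed = distribution, seed

    def get_config(self):
        # Must be JSON-serializable so the model as a whole can be saved.
        return {"scale": self.scale, "mode": self.mode,
                "distribution": self.distribution, "seed": self.seed}

    @classmethod
    def from_config(cls, config):
        return cls(**config)

init = VarianceScalingSketch(scale=0.1, mode="fan_out")
config = init.get_config()
json.dumps(config)  # round-trips through JSON without error
clone = VarianceScalingSketch.from_config(config)
assert clone.get_config() == config  # clone is configured identically
```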
__call__
View source
```python
__call__(
    shape, dtype=None, **kwargs
)
```
Returns a tensor object initialized as specified by the initializer.
| Args | |
|---|---|
| `shape` | Shape of the tensor. |
| `dtype` | Optional dtype of the tensor. Only floating point types are supported. If not specified, `tf.keras.backend.floatx()` is used, which defaults to `float32` unless you configure it otherwise (via `tf.keras.backend.set_floatx(float_dtype)`). |
| `**kwargs` | Additional keyword arguments. |
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2021-08-16 UTC.