tf.keras.layers.experimental.preprocessing.RandomZoom
Randomly zoom each image during training.
Inherits From: PreprocessingLayer, Layer, Module
tf.keras.layers.experimental.preprocessing.RandomZoom(
height_factor, width_factor=None, fill_mode='reflect',
interpolation='bilinear', seed=None, name=None, fill_value=0.0,
**kwargs
)
Arguments

height_factor: A float represented as a fraction of the value, or a tuple of size 2 representing lower and upper bounds for zooming vertically. When represented as a single float, this value is used for both the upper and lower bound. A positive value means zooming out, while a negative value means zooming in. For instance, height_factor=(0.2, 0.3) results in an output zoomed out by a random amount in the range [20%, 30%]; height_factor=(-0.3, -0.2) results in an output zoomed in by a random amount in the range [20%, 30%].

width_factor: A float represented as a fraction of the value, or a tuple of size 2 representing lower and upper bounds for zooming horizontally. When represented as a single float, this value is used for both the upper and lower bound. For instance, width_factor=(0.2, 0.3) results in an output zoomed out by between 20% and 30%; width_factor=(-0.3, -0.2) results in an output zoomed in by between 20% and 30%. Defaults to None, in which case the image is zoomed by the same amount in both directions, preserving the aspect ratio.

fill_mode: Points outside the boundaries of the input are filled according to the given mode (one of {'constant', 'reflect', 'wrap', 'nearest'}).
- reflect: (d c b a | a b c d | d c b a) The input is extended by reflecting about the edge of the last pixel.
- constant: (k k k k | a b c d | k k k k) The input is extended by filling all values beyond the edge with the same constant value k, given by fill_value.
- wrap: (a b c d | a b c d | a b c d) The input is extended by wrapping around to the opposite edge.
- nearest: (a a a a | a b c d | d d d d) The input is extended by the nearest pixel.

interpolation: Interpolation mode. Supported values: "nearest", "bilinear".

seed: Integer. Used to create a random seed.

name: A string, the name of the layer.

fill_value: A float representing the value to be filled outside the boundaries when fill_mode is "constant".
Example:

import numpy as np
import tensorflow as tf

input_img = np.random.random((32, 224, 224, 3))
layer = tf.keras.layers.experimental.preprocessing.RandomZoom(.5, .2)
out_img = layer(input_img)
out_img.shape
# TensorShape([32, 224, 224, 3])
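The fill_mode and fill_value arguments control what appears in the border that zooming out reveals. A minimal sketch of this, assuming the signature above (newer TF releases expose the same layer as tf.keras.layers.RandomZoom, so the snippet falls back to that name):

```python
import numpy as np
import tensorflow as tf

# Sketch: zoom out by a random 30-50% and fill the revealed border with zeros.
try:
    RandomZoom = tf.keras.layers.experimental.preprocessing.RandomZoom
except AttributeError:
    RandomZoom = tf.keras.layers.RandomZoom  # non-experimental name on newer TF

zoom_out = RandomZoom(
    height_factor=(0.3, 0.5),  # positive range -> zoom out
    fill_mode='constant',      # pad the revealed border...
    fill_value=0.0,            # ...with this constant value
    seed=42,
)

images = np.random.random((4, 64, 64, 3)).astype('float32')
out = zoom_out(images, training=True)  # random transforms apply in training mode
print(out.shape)  # spatial size is unchanged; only the content is rescaled
```

Note that the output retains the input's spatial dimensions; zooming rescales the content within the same (height, width) canvas.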
Input shape:
4D tensor with shape: (samples, height, width, channels), with data_format='channels_last'.

Output shape:
4D tensor with shape: (samples, height, width, channels), with data_format='channels_last'.
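Because the random transform is only applied when the layer is called with training=True (as during Model.fit), RandomZoom can sit directly inside a model and becomes a no-op at inference. The surrounding model below is illustrative, not from this page:

```python
import numpy as np
import tensorflow as tf

try:
    RandomZoom = tf.keras.layers.experimental.preprocessing.RandomZoom
except AttributeError:
    RandomZoom = tf.keras.layers.RandomZoom  # non-experimental name on newer TF

# A toy classifier with augmentation built in: RandomZoom runs during fit()
# and acts as the identity at inference time.
model = tf.keras.Sequential([
    RandomZoom(height_factor=(-0.2, -0.1)),   # zoom in by 10-20% while training
    tf.keras.layers.Conv2D(8, 3, activation='relu'),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

x = np.random.random((2, 32, 32, 3)).astype('float32')
y = model(x, training=False)  # training=False: RandomZoom is the identity
print(y.shape)
```

Building augmentation into the model this way keeps the transform on the accelerator and guarantees it is skipped when exporting or evaluating the model.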
Raises

ValueError: if lower bound is not between [0, 1], or upper bound is negative.
Methods
adapt
View source
adapt(
data, reset_state=True
)
Fits the state of the preprocessing layer to the data being passed.
Arguments

data: The data to train on. It can be passed either as a tf.data Dataset or as a NumPy array.

reset_state: Optional argument specifying whether to clear the state of the layer at the start of the call to adapt, or whether to start from the existing state. This argument may not be relevant to all preprocessing layers: a subclass of PreprocessingLayer may choose to throw if 'reset_state' is set to False.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2021-02-18 UTC.