tf_agents.bandits.policies.ranking_policy.DescendingScoreSampler
Base neural network module class.
    tf_agents.bandits.policies.ranking_policy.DescendingScoreSampler(
        unused_features: tf_agents.typing.types.Tensor,
        num_slots: int,
        scores: tf_agents.typing.types.Tensor,
        unused_penalty_mixture_coefficient: float
    )
A module is a named container for tf.Variable objects, other tf.Module objects, and functions which apply to user input. For example, a dense layer in a neural network might be implemented as a tf.Module:
    class Dense(tf.Module):
      def __init__(self, input_dim, output_size, name=None):
        super().__init__(name=name)
        self.w = tf.Variable(
            tf.random.normal([input_dim, output_size]), name='w')
        self.b = tf.Variable(tf.zeros([output_size]), name='b')

      def __call__(self, x):
        y = tf.matmul(x, self.w) + self.b
        return tf.nn.relu(y)
You can use the Dense layer as you would expect:
    d = Dense(input_dim=3, output_size=2)
    d(tf.ones([1, 3]))
    <tf.Tensor: shape=(1, 2), dtype=float32, numpy=..., dtype=float32)>
By subclassing tf.Module instead of object, any tf.Variable or tf.Module instances assigned to object properties can be collected using the variables, trainable_variables or submodules property:

    d.variables
    (<tf.Variable 'b:0' shape=(2,) dtype=float32, numpy=..., dtype=float32)>,
     <tf.Variable 'w:0' shape=(3, 2) dtype=float32, numpy=..., dtype=float32)>)
Subclasses of tf.Module can also take advantage of the _flatten method, which can be used to implement tracking of any other types.
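For instance, here is a minimal sketch of what such tracking could look like, assuming only the documented predicate argument of _flatten; the Source class and attribute names below are made up for illustration and are not part of any API:

    # A minimal sketch: `Source`, `Pipeline`, and the attribute names are
    # hypothetical; only tf.Module._flatten's `predicate` argument is assumed.
    import tensorflow as tf

    class Source:
      def __init__(self, text):
        self.text = text

    class Pipeline(tf.Module):
      def __init__(self):
        super().__init__()
        self.first = Source('a')
        self.second = Source('b')

      @property
      def sources(self):
        # _flatten walks the module's attributes (including nested containers
        # and submodules); the predicate selects which leaves to yield.
        return tuple(self._flatten(predicate=lambda v: isinstance(v, Source)))

    p = Pipeline()
    len(p.sources)  # 2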
All tf.Module classes have an associated tf.name_scope which can be used to group operations in TensorBoard and create hierarchies for variable names, which can help with debugging. We suggest using the name scope when creating nested submodules/parameters or for forward methods whose graph you might want to inspect in TensorBoard. You can enter the name scope explicitly using with self.name_scope: or you can annotate methods (apart from __init__) with @tf.Module.with_name_scope.
    class MLP(tf.Module):
      def __init__(self, input_size, sizes, name=None):
        super().__init__(name=name)
        self.layers = []
        with self.name_scope:
          for size in sizes:
            self.layers.append(Dense(input_dim=input_size, output_size=size))
            input_size = size

      @tf.Module.with_name_scope
      def __call__(self, x):
        for layer in self.layers:
          x = layer(x)
        return x
    module = MLP(input_size=5, sizes=[5, 5])
    module.variables
    (<tf.Variable 'mlp/b:0' shape=(5,) dtype=float32, numpy=..., dtype=float32)>,
     <tf.Variable 'mlp/w:0' shape=(5, 5) dtype=float32, numpy=..., dtype=float32)>,
     <tf.Variable 'mlp/b:0' shape=(5,) dtype=float32, numpy=..., dtype=float32)>,
     <tf.Variable 'mlp/w:0' shape=(5, 5) dtype=float32, numpy=..., dtype=float32)>)
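The submodules property mentioned earlier collects the nested modules in the same way; a brief sketch for the MLP above, with the object reprs elided:

    module.submodules  # the two Dense layers created in __init__
    (<Dense object at ...>, <Dense object at ...>)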
Methods
sample
View source
    sample(
        shape=(), seed=None
    )
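The class-level summary above is inherited from tf.Module and says nothing about the sampler itself, so the following is only a rough usage sketch built from the documented signatures. Tensor shapes and the meaning of the returned value are assumptions: the argument names suggest the unused_* inputs are ignored and that items are ranked by descending score, but this page does not confirm that.

    # A hedged usage sketch. Shapes and the interpretation of the result are
    # assumptions inferred from the argument names, not documented behavior.
    import tensorflow as tf
    from tf_agents.bandits.policies import ranking_policy

    scores = tf.constant([0.1, 0.9, 0.4, 0.7])   # per-item scores (assumed 1-D here)
    sampler = ranking_policy.DescendingScoreSampler(
        unused_features=tf.zeros([4, 2]),        # presumably ignored, per its name
        num_slots=2,
        scores=scores,
        unused_penalty_mixture_coefficient=0.0)  # presumably ignored, per its name
    ranked = sampler.sample()  # presumably indices of the top-scoring items, descending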