# tf.tensor_scatter_nd_update

Scatter `updates` into an existing tensor according to `indices`.

    tf.tensor_scatter_nd_update(
        tensor, indices, updates, name=None
    )

This operation creates a new tensor by applying sparse `updates` to the
passed-in `tensor`. It is very similar to `tf.scatter_nd`, except that the
updates are scattered onto an existing tensor (as opposed to a zero-tensor).
If the memory for the existing tensor cannot be re-used, a copy is made and
updated.

If `indices` contains duplicates, their updates are accumulated (summed).

**Warning:** The order in which updates are applied is nondeterministic, so
the output will be nondeterministic if `indices` contains duplicates: because
of numerical approximation issues, numbers summed in a different order may
yield different results.
`indices` is an integer tensor containing indices into a new tensor of shape
`shape`. The last dimension of `indices` can be at most the rank of `shape`:

    indices.shape[-1] <= shape.rank

The last dimension of `indices` corresponds to indices into elements
(if `indices.shape[-1] = shape.rank`) or slices
(if `indices.shape[-1] < shape.rank`) along dimension `indices.shape[-1]` of
`shape`. `updates` is a tensor with shape

    indices.shape[:-1] + shape[indices.shape[-1]:]
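As a quick sanity check, the shape rule can be evaluated with plain Python tuples; the concrete shapes below are made-up values chosen for illustration:

```python
# Illustration of the shape rule: updates.shape must equal
# indices.shape[:-1] + shape[indices.shape[-1]:].
shape = (4, 4, 4)        # shape of the output tensor
indices_shape = (2, 1)   # two indices, each selecting one slice along axis 0

index_depth = indices_shape[-1]
assert index_depth <= len(shape)  # indices.shape[-1] <= shape.rank

# Each index picks a slice of shape shape[index_depth:], and there are
# indices_shape[:-1] of them, so:
updates_shape = indices_shape[:-1] + shape[index_depth:]
print(updates_shape)  # (2, 4, 4): two 4x4 slices of new values
```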
The simplest form of scatter is to insert individual elements in a tensor by
index. For example, say we want to insert 4 scattered elements in a rank-1
tensor with 8 elements.

In Python, this scatter operation would look like this:

    indices = tf.constant([[4], [3], [1], [7]])
    updates = tf.constant([9, 10, 11, 12])
    tensor = tf.ones([8], dtype=tf.int32)
    updated = tf.tensor_scatter_nd_update(tensor, indices, updates)
    print(updated)

The resulting tensor would look like this:

    [1, 11, 1, 10, 9, 1, 1, 12]

We can also insert entire slices of a higher-rank tensor all at once. For
example, we can insert two slices in the first dimension of a rank-3 tensor
with two matrices of new values.

In Python, this scatter operation would look like this:

    indices = tf.constant([[0], [2]])
    updates = tf.constant([[[5, 5, 5, 5], [6, 6, 6, 6],
                            [7, 7, 7, 7], [8, 8, 8, 8]],
                           [[5, 5, 5, 5], [6, 6, 6, 6],
                            [7, 7, 7, 7], [8, 8, 8, 8]]])
    tensor = tf.ones([4, 4, 4], dtype=tf.int32)
    updated = tf.tensor_scatter_nd_update(tensor, indices, updates)
    print(updated)

The resulting tensor would look like this:

    [[[5, 5, 5, 5], [6, 6, 6, 6], [7, 7, 7, 7], [8, 8, 8, 8]],
     [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]],
     [[5, 5, 5, 5], [6, 6, 6, 6], [7, 7, 7, 7], [8, 8, 8, 8]],
     [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]]]

Note that on CPU, if an out-of-bound index is found, an error is returned.
On GPU, if an out-of-bound index is found, the index is ignored.

#### Args

| Argument | Description |
|-----------|----------------------------------------------------------------------------------|
| `tensor` | A `Tensor`. Tensor to copy/update. |
| `indices` | A `Tensor`. Must be one of the following types: `int32`, `int64`. Index tensor. |
| `updates` | A `Tensor`. Must have the same type as `tensor`. Updates to scatter into output. |
| `name` | A name for the operation (optional). |

#### Returns

A `Tensor`. Has the same type as `tensor`.

Last updated 2020-10-01 UTC.
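For intuition, the rank-1 element-update case described above can be mimicked with a few lines of plain Python. This is an illustrative sketch of the semantics only, not TensorFlow's actual implementation (which runs as a fused op on the device):

```python
# Pure-Python sketch of rank-1 (element-wise) scatter-update semantics,
# mirroring the first example: each index picks one element to overwrite.
def scatter_nd_update_1d(tensor, indices, updates):
    out = list(tensor)            # the input is copied, never modified in place
    for (i,), value in zip(indices, updates):
        out[i] = value
    return out

tensor = [1] * 8                  # stands in for tf.ones([8], dtype=tf.int32)
indices = [[4], [3], [1], [7]]
updates = [9, 10, 11, 12]
result = scatter_nd_update_1d(tensor, indices, updates)
print(result)  # [1, 11, 1, 10, 9, 1, 1, 12]
```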