# tf.compat.v1.batch_scatter_update

[View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v2.0.0/tensorflow/python/ops/state_ops.py#L818-L915)

Generalization of [`tf.compat.v1.scatter_update`](../../../tf/compat/v1/scatter_update) to axis different than 0. (deprecated)

    tf.compat.v1.batch_scatter_update(
        ref, indices, updates, use_locking=True, name=None
    )

**Warning:** THIS FUNCTION IS DEPRECATED. It will be removed after 2018-11-29. Instructions for updating: Use the `batch_scatter_update` method of `Variable` instead.

Analogous to `batch_gather`. This assumes that `ref`, `indices`, and `updates` share the same series of leading dimensions, and that the updates are performed on the last dimension of `indices`. In other words, the dimensions must satisfy:

`num_prefix_dims = indices.ndims - 1`
`batch_dim = num_prefix_dims + 1`
`updates.shape = indices.shape + var.shape[batch_dim:]`

where

`updates.shape[:num_prefix_dims]`
`== indices.shape[:num_prefix_dims]`
`== var.shape[:num_prefix_dims]`

The operation performed can be expressed as:

`var[i_1, ..., i_n, indices[i_1, ..., i_n, j]] = updates[i_1, ..., i_n, j]`

When `indices` is a 1-D tensor, this operation is equivalent to
[`tf.compat.v1.scatter_update`](../../../tf/compat/v1/scatter_update).

There are two alternatives to this operation:

1) Reshape the variable by merging its first `ndims` dimensions. However, this is not possible because [`tf.reshape`](../../../tf/reshape) returns a `Tensor`, and [`tf.compat.v1.scatter_update`](../../../tf/compat/v1/scatter_update) cannot be applied to a `Tensor`.
2) Loop over the first `ndims` of the variable and use [`tf.compat.v1.scatter_update`](../../../tf/compat/v1/scatter_update) on the subtensors that result from slicing the first dimension. This is a valid option for `ndims = 1`, but it is less efficient than this implementation.

See also [`tf.compat.v1.scatter_update`](../../../tf/compat/v1/scatter_update) and [`tf.compat.v1.scatter_nd_update`](../../../tf/compat/v1/scatter_nd_update).

| Args ||
|---------------|-----------------------------------------------------------|
| `ref`         | `Variable` to scatter onto.                               |
| `indices`     | Tensor containing indices as described above.             |
| `updates`     | Tensor of updates to apply to `ref`.                      |
| `use_locking` | Boolean indicating whether to lock the writing operation. |
| `name`        | Optional scope name string.                               |

| Returns ||
|---|---|
| Ref to `variable` after it has been modified. ||

| Raises ||
|--------------|-----------------------------------------------------------------------------|
| `ValueError` | If the initial `ndims` of `ref`, `indices`, and `updates` are not the same. |

Last updated 2020-10-01 UTC.