Given a file pattern (or list of files), will set up a shared queue for file
names, set up a worker queue that pulls from the shared queue, read Example
protos using the provided reader, and use a batch queue to create batches of
examples of size batch_size. This provides at-most-once visit guarantees. Note
that this only holds if the parameter servers are not preempted or restarted
and the session is not restored from a checkpoint, since the state of a queue
is not checkpointed and we would end up restarting from the entire list of
files.
All queue runners are added to the queue runners collection, and may be
started via start_queue_runners.
All ops are added to the default graph.
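For illustration, a minimal sketch of the typical flow, assuming TensorFlow 1.x (tf.contrib is not available in 2.x), TFRecord input files, and a hypothetical file pattern and batch size:

```python
import tensorflow as tf  # TensorFlow 1.x; tf.contrib does not exist in TF 2.x

# Build the shared file-name queue, worker queue, and batch queue.
# tf.compat.v1.TFRecordReader satisfies the `reader` contract
# (a class whose instances have a `read` method).
keys, examples = tf.contrib.learn.read_keyed_batch_examples_shared_queue(
    file_pattern='/tmp/data/*.tfrecord',  # hypothetical path
    batch_size=32,
    reader=tf.compat.v1.TFRecordReader)

with tf.compat.v1.Session() as sess:
  coord = tf.compat.v1.train.Coordinator()
  # Start every queue runner registered in the queue-runners collection.
  threads = tf.compat.v1.train.start_queue_runners(sess=sess, coord=coord)
  try:
    # Each run call dequeues one batch of keys and serialized Example protos.
    key_batch, example_batch = sess.run([keys, examples])
  finally:
    coord.request_stop()
    coord.join(threads)
```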
Use parse_fn if you need to do parsing / processing on single examples.
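As a sketch of the parse_fn hook: the feature spec below is hypothetical, and the sketch assumes the default read_batch_size=1, so the function receives a single serialized Example tensor.

```python
import tensorflow as tf  # TensorFlow 1.x

def parse_fn(serialized):
  # Hypothetical feature spec; replace with the features your Example protos contain.
  features = {
      'label': tf.io.FixedLenFeature([], tf.int64),
      'text': tf.io.FixedLenFeature([], tf.string),
  }
  return tf.io.parse_single_example(serialized, features)

# When parse_fn is given, the second return value is the parsed (and batched)
# representation instead of raw serialized protos.
keys, parsed = tf.contrib.learn.read_keyed_batch_examples_shared_queue(
    file_pattern='/tmp/data/*.tfrecord',  # hypothetical path
    batch_size=32,
    reader=tf.compat.v1.TFRecordReader,
    parse_fn=parse_fn)
```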
Args
file_pattern
List of files or patterns of file paths containing
Example records. See tf.io.gfile.glob for pattern rules.
batch_size
An int or scalar Tensor specifying the batch size to use.
reader
A function or class that returns an object with
a read method, (filename tensor) -> (example tensor).
randomize_input
Whether the input should be randomized.
num_epochs
Integer specifying the number of times to read through the
dataset. If None, cycles through the dataset forever.
NOTE - If specified, creates a local variable that must be initialized, so call
tf.compat.v1.local_variables_initializer() and run the op in a session
(see the epoch-limited sketch after the Raises section below).
queue_capacity
Capacity for input queue.
num_threads
The number of threads enqueuing examples.
read_batch_size
An int or scalar Tensor specifying the number of
records to read at once.
parse_fn
Parsing function that takes an Example Tensor and returns a parsed
representation. If None, no parsing is done.
name
Name of resulting op.
seed
An integer (optional). Seed used if randomize_input == True.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2020-10-01 UTC."],[],[],null,["# tf.contrib.learn.read_keyed_batch_examples_shared_queue\n\n\u003cbr /\u003e\n\n|-----------------------------------------------------------------------------------------------------------------------------------------------------|\n| [View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v1.15.0/tensorflow/contrib/learn/python/learn/learn_io/graph_io.py#L186-L256) |\n\nAdds operations to read, queue, batch `Example` protos. (deprecated) \n\n tf.contrib.learn.read_keyed_batch_examples_shared_queue(\n file_pattern, batch_size, reader, randomize_input=True, num_epochs=None,\n queue_capacity=10000, num_threads=1, read_batch_size=1, parse_fn=None,\n name=None, seed=None\n )\n\n| **Warning:** THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Use tf.data.\n\nGiven file pattern (or list of files), will setup a shared queue for file\nnames, setup a worker queue that pulls from the shared queue, read `Example`\nprotos using provided `reader`, use batch queue to create batches of examples\nof size `batch_size`. This provides at most once visit guarantees. Note that\nthis only works if the parameter servers are not pre-empted or restarted or\nthe session is not restored from a checkpoint since the state of a queue\nis not checkpointed and we will end up restarting from the entire list of\nfiles.\n\nAll queue runners are added to the queue runners collection, and may be\nstarted via `start_queue_runners`.\n\nAll ops are added to the default graph.\n\nUse `parse_fn` if you need to do parsing / processing on single examples.\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Args ---- ||\n|-------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| `file_pattern` | List of files or patterns of file paths containing `Example` records. See [`tf.io.gfile.glob`](../../../tf/io/gfile/glob) for pattern rules. |\n| `batch_size` | An int or scalar `Tensor` specifying the batch size to use. |\n| `reader` | A function or class that returns an object with `read` method, (filename tensor) -\\\u003e (example tensor). |\n| `randomize_input` | Whether the input should be randomized. |\n| `num_epochs` | Integer specifying the number of times to read through the dataset. If `None`, cycles through the dataset forever. NOTE - If specified, creates a variable that must be initialized, so call [`tf.compat.v1.local_variables_initializer()`](../../../tf/initializers/local_variables) and run the op in a session. |\n| `queue_capacity` | Capacity for input queue. |\n| `num_threads` | The number of threads enqueuing examples. |\n| `read_batch_size` | An int or scalar `Tensor` specifying the number of records to read at once. |\n| `parse_fn` | Parsing function, takes `Example` Tensor returns parsed representation. 
If `None`, no parsing is done. |\n| `name` | Name of resulting op. |\n| `seed` | An integer (optional). Seed used if randomize_input == True. |\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Returns ------- ||\n|---|---|\n| Returns tuple of: \u003cbr /\u003e - `Tensor` of string keys. - String `Tensor` of batched `Example` proto. ||\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Raises ------ ||\n|--------------|---------------------|\n| `ValueError` | for invalid inputs. |\n\n\u003cbr /\u003e"]]
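As referenced in the num_epochs entry above, a sketch of epoch-limited reading, assuming TensorFlow 1.x and hypothetical file names. Setting num_epochs creates a local variable that must be initialized before the queue runners start, and the batching op raises tf.errors.OutOfRangeError once the input is exhausted.

```python
import tensorflow as tf  # TensorFlow 1.x

keys, examples = tf.contrib.learn.read_keyed_batch_examples_shared_queue(
    file_pattern=['/tmp/part-0.tfrecord', '/tmp/part-1.tfrecord'],  # hypothetical files
    batch_size=32,
    reader=tf.compat.v1.TFRecordReader,
    randomize_input=True,
    num_epochs=2,  # read through the file list twice, then stop
    seed=42)       # makes the shuffling reproducible

with tf.compat.v1.Session() as sess:
  # num_epochs creates an epoch-counting local variable; initialize it first.
  sess.run(tf.compat.v1.local_variables_initializer())
  coord = tf.compat.v1.train.Coordinator()
  threads = tf.compat.v1.train.start_queue_runners(sess=sess, coord=coord)
  try:
    while not coord.should_stop():
      key_batch, example_batch = sess.run([keys, examples])
  except tf.errors.OutOfRangeError:
    pass  # both epochs have been consumed
  finally:
    coord.request_stop()
    coord.join(threads)
```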