This function is deprecated in favor of
tf.config.experimental.get_memory_info. Calling this function is equivalent
to calling tf.config.experimental.get_memory_info()['current'].
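For example, a minimal sketch (assuming at least one visible GPU mapped to "GPU:0") comparing the deprecated call with its replacement:

```python
import tensorflow as tf

# Assumes a GPU is visible and mapped to "GPU:0".
usage = tf.config.experimental.get_memory_usage("GPU:0")   # deprecated
info = tf.config.experimental.get_memory_info("GPU:0")     # preferred

# info['current'] is the same bytes-in-use figure that get_memory_usage
# returns; the two can differ slightly if allocations happen between calls.
print(usage, info["current"])
```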
For GPUs, TensorFlow will allocate all the memory by default, unless changed
with tf.config.experimental.set_memory_growth. This function only returns
the memory that TensorFlow is actually using, not the memory that TensorFlow
has allocated on the GPU.
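As a sketch of that default, memory growth can be enabled before any GPU is initialized so TensorFlow allocates memory on demand instead of reserving the whole device up front:

```python
import tensorflow as tf

# Must run before any GPU has been initialized
# (i.e. before any tensors or ops are placed on it).
for gpu in tf.config.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)

# get_memory_usage then reports only the bytes actually in use,
# not the region TensorFlow has reserved on the device.
```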
Args
  device: Device string to get the bytes in use for, e.g. "GPU:0".