Module: tfp.optimizer
View source on GitHub: https://github.com/tensorflow/probability/blob/v0.23.0/tensorflow_probability/python/optimizer/__init__.py
TensorFlow Probability optimizer Python package.
Modules
convergence_criteria module: TensorFlow Probability convergence criteria for optimizations.
linesearch module: Line-search optimizers package.
Classes
class StochasticGradientLangevinDynamics: An optimizer module for stochastic gradient Langevin dynamics.
class VariationalSGD: An optimizer module for constant stochastic gradient descent.
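Both classes follow the tf.keras optimizer interface, so they are driven with apply_gradients (or minimize) like any other Keras optimizer. Below is a minimal sketch of a few SGLD steps on a toy loss; the learning rate and step count are illustrative assumptions, not recommendations, and VariationalSGD is used the same way once constructed with its own arguments.

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Toy quadratic loss over a single trainable variable.
x = tf.Variable([3.0, -2.0])
loss_fn = lambda: tf.reduce_sum((x - 1.0) ** 2)

# SGLD injects preconditioned Gaussian noise into each gradient step,
# so the iterates explore around the optimum rather than settling on
# a point estimate.
opt = tfp.optimizer.StochasticGradientLangevinDynamics(learning_rate=0.05)

for _ in range(100):
    with tf.GradientTape() as tape:
        loss = loss_fn()
    grads = tape.gradient(loss, [x])
    opt.apply_gradients(zip(grads, [x]))

print(x.numpy())  # hovers near [1., 1.], with noise
```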
Functions
bfgs_minimize(...): Applies the BFGS algorithm to minimize a differentiable function.
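For example, the sketch below minimizes a simple quadratic (the objective itself is illustrative). bfgs_minimize expects a function that returns both the objective value and its gradient, which tfp.math.value_and_gradient can supply.

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Package the objective and its gradient the way bfgs_minimize expects.
def quadratic_value_and_gradient(x):
    return tfp.math.value_and_gradient(
        lambda x: tf.reduce_sum((x - 1.0) ** 2, axis=-1), x)

results = tfp.optimizer.bfgs_minimize(
    quadratic_value_and_gradient,
    initial_position=tf.constant([0.6, 0.8]),
    tolerance=1e-8)

print(results.converged.numpy())  # True
print(results.position.numpy())   # ~[1., 1.]
```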
converged_all(...): Condition to stop when all batch members have converged or failed.
converged_any(...): Condition to stop when any batch member converges, or all have failed.
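These two conditions are intended to be passed as the stopping_condition argument of the quasi-Newton minimizers when the initial position is batched. A hedged sketch, reusing the quadratic objective from the bfgs_minimize example:

```python
import tensorflow as tf
import tensorflow_probability as tfp

def value_and_gradient(x):
    return tfp.math.value_and_gradient(
        lambda x: tf.reduce_sum((x - 1.0) ** 2, axis=-1), x)

# Two starting points form a batch; stop as soon as any member converges
# instead of waiting for the whole batch (the default, converged_all).
results = tfp.optimizer.bfgs_minimize(
    value_and_gradient,
    initial_position=tf.constant([[0.0, 0.0], [5.0, 5.0]]),
    stopping_condition=tfp.optimizer.converged_any,
    tolerance=1e-8)
```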
differential_evolution_minimize(...): Applies the differential evolution algorithm to minimize a function.
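Differential evolution is gradient-free, so the objective only needs to be evaluable on a whole population of candidate solutions at once. A minimal sketch; the population settings below are illustrative assumptions, not recommendations:

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Evaluated on the population in one call: one value per member.
def objective(x):
    return tf.reduce_sum(x ** 2, axis=-1)

results = tfp.optimizer.differential_evolution_minimize(
    objective,
    initial_position=tf.constant([5.0, 5.0]),  # population sampled around this point
    population_size=40,
    max_iterations=100,
    seed=42)

print(results.converged.numpy())
print(results.position.numpy())  # ~[0., 0.]
```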
differential_evolution_one_step(...): Performs one step of the differential evolution algorithm.
lbfgs_minimize(...): Applies the L-BFGS algorithm to minimize a differentiable function.
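lbfgs_minimize has the same calling convention as bfgs_minimize but keeps only a limited history of curvature pairs (num_correction_pairs) instead of a full inverse-Hessian estimate, which suits higher-dimensional problems. A sketch on a 100-dimensional quadratic; the dimension and settings are illustrative:

```python
import tensorflow as tf
import tensorflow_probability as tfp

dim = 100
minimum = tf.ones([dim])

def value_and_gradient(x):
    return tfp.math.value_and_gradient(
        lambda x: tf.reduce_sum((x - minimum) ** 2, axis=-1), x)

results = tfp.optimizer.lbfgs_minimize(
    value_and_gradient,
    initial_position=tf.zeros([dim]),
    num_correction_pairs=10,
    tolerance=1e-8)

print(results.converged.numpy())  # True
```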
nelder_mead_minimize(...): Minimizes the objective function using the Nelder-Mead simplex algorithm.
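Because Nelder-Mead is derivative-free, the objective function alone suffices, and the initial simplex can be built automatically from a single starting vertex. A sketch of typical usage; batch_evaluate_objective asks the optimizer to evaluate all simplex vertices in one vectorized call:

```python
import tensorflow as tf
import tensorflow_probability as tfp

# A non-smooth objective: Euclidean distance to the point (1, 1).
def objective(x):
    return tf.sqrt(tf.reduce_sum((x - 1.0) ** 2, axis=-1))

results = tfp.optimizer.nelder_mead_minimize(
    objective,
    initial_vertex=tf.constant([6.0, -21.0]),
    func_tolerance=1e-8,
    batch_evaluate_objective=True)

print(results.converged.numpy())  # True
print(results.position.numpy())   # ~[1., 1.]
```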
nelder_mead_one_step(...): A single iteration of the Nelder-Mead algorithm.
proximal_hessian_sparse_minimize(...): Minimizes a function using Hessian-informed proximal gradient descent.
proximal_hessian_sparse_one_step(...): One step of the outer loop of the Hessian-informed proximal gradient descent algorithm.