Gradient-Based Optimization Algorithms:
Starting point: [-1. 1.]
Starting function value: 4.000000
Momentum gradient descent converged in 3069 iterations
Adam optimizer converged in 1568 iterations

Final results:
Gradient Descent: x = [0.99248173 0.98498977], f(x) = 0.00005662
Momentum GD:      x = [0.99999888 0.99999776], f(x) = 0.00000000
Adam:             x = [0.99999889 0.99999778], f(x) = 0.00000000
True minimum:     x = [1, 1], f(x) = 0
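
A minimal sketch of a script that could produce output of this shape. The
stdout does not name the objective, but the starting value f(-1, 1) = 4 and
the true minimum at [1, 1] with f = 0 are consistent with the Rosenbrock
function, which is assumed below; the step sizes, momentum coefficient, Adam
hyperparameters, and stopping tolerance are illustrative choices, not the
original script's values.

import numpy as np

def f(x):
    # Rosenbrock function (assumed objective): f(-1, 1) = 4, minimum f(1, 1) = 0
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad_f(x):
    # Analytic gradient of the Rosenbrock function
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def gradient_descent(x0, lr=1e-3, tol=1e-8, max_iter=10_000):
    x = x0.copy()
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        x -= lr * g
    return x

def momentum_gd(x0, lr=1e-3, beta=0.9, tol=1e-8, max_iter=10_000):
    x, v = x0.copy(), np.zeros_like(x0)
    for k in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            print(f"Momentum gradient descent converged in {k} iterations")
            break
        v = beta * v + lr * g  # exponentially weighted velocity
        x -= v
    return x

def adam(x0, lr=1e-2, b1=0.9, b2=0.999, eps=1e-8, tol=1e-8, max_iter=10_000):
    x = x0.copy()
    m, v = np.zeros_like(x0), np.zeros_like(x0)
    for k in range(1, max_iter + 1):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            print(f"Adam optimizer converged in {k} iterations")
            break
        m = b1 * m + (1 - b1) * g      # first-moment estimate
        v = b2 * v + (1 - b2) * g**2   # second-moment estimate
        m_hat = m / (1 - b1**k)        # bias correction
        v_hat = v / (1 - b2**k)
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

x0 = np.array([-1.0, 1.0])
print("Gradient-Based Optimization Algorithms:")
print(f"Starting point: {x0}")
print(f"Starting function value: {f(x0):.6f}")
for name, xf in [("Gradient Descent", gradient_descent(x0)),
                 ("Momentum GD", momentum_gd(x0)),
                 ("Adam", adam(x0))]:
    print(f"{name}: x = {xf}, f(x) = {f(xf):.8f}")

The iteration counts and final iterates above depend strongly on these
hyperparameters; the plain gradient descent run in the original output stops
short of the minimum because, on the Rosenbrock valley, a fixed small step
size makes progress along the curved valley floor very slow, which is exactly
the behavior momentum and Adam are designed to mitigate.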