Lbfgs.m

Calculates an approximation to the Newton-Raphson search direction using past gradients to build a serviceable substitute for the Hessian. The Hessian matrix is never explicitly formed or inverted. This function implements the algorithm described in Section 4 of http://dx.doi.org/10.1090/S0025-5718-1980-0572855-7
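
The block below is a minimal, self-contained MATLAB sketch of that two-loop recursion, written against the argument conventions documented on this page (column stacks ordered from the latest to the earliest increment). It is an illustration of the method, not a copy of the Spinach source; in particular, the sign convention of the returned direction may differ from the actual lbfgs.m.

    % Illustrative two-loop recursion (Nocedal 1980, Section 4);
    % variable names follow this page, not the Spinach source
    function direction=lbfgs_sketch(dx_hist,dg_hist,g)

    % With no history, fall back to steepest descent
    if isempty(dx_hist), direction=-g; return; end

    % First loop: from the latest increment (column 1) to the earliest
    n_upd=size(dx_hist,2); q=g;
    alpha=zeros(n_upd,1); rho=zeros(n_upd,1);
    for k=1:n_upd
        rho(k)=1/(dg_hist(:,k)'*dx_hist(:,k));
        alpha(k)=rho(k)*(dx_hist(:,k)'*q);
        q=q-alpha(k)*dg_hist(:,k);
    end

    % Scale by an initial Hessian estimate built from the latest pair
    gamma=(dx_hist(:,1)'*dg_hist(:,1))/(dg_hist(:,1)'*dg_hist(:,1));
    r=gamma*q;

    % Second loop: from the earliest increment back to the latest
    for k=n_upd:-1:1
        beta=rho(k)*(dg_hist(:,k)'*r);
        r=r+dx_hist(:,k)*(alpha(k)-beta);
    end

    % Quasi-Newton descent direction (minimisation sign convention)
    direction=-r;

    end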

Syntax

    direction=lbfgs(dx_hist,dg_hist,g)

Arguments

   dx_hist         - history of x increments, a stack 
                     of column vectors, from the latest
                     to the earliest

   dg_hist         - history of gradient increments,
                     a stack of column vectors, from 
                     the latest to the earliest
 
   g               - current gradient

Output

   direction       - LBFGS approximation to the 
                     search direction

Notes

The L-BFGS algorithm is the default in fminnewton.m; it offers a good balance between computational efficiency and convergence speed.
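
A hypothetical usage sketch is given below: a bare-bones minimisation loop on a toy quadratic that assembles dx_hist and dg_hist in the latest-to-earliest order expected by lbfgs.m. The objective function, fixed step length, and history depth are illustrative choices only, and the step assumes the returned direction is applied as x+step*direction; in practice fminnewton.m handles all of this bookkeeping, including the line search.

    % Hypothetical driver loop on a toy quadratic; the objective,
    % fixed step length and history depth are illustrative only
    obj=@(x)deal(sum((x-1).^2),2*(x-1));   % returns [value, gradient]
    x=zeros(5,1); [~,g]=obj(x);
    dx_hist=[]; dg_hist=[]; max_hist=10;
    for iter=1:50

        % Steepest descent until a history is available
        if isempty(dx_hist)
            direction=-g;
        else
            direction=lbfgs(dx_hist,dg_hist,g);
        end

        % Take a fixed step (a line search would be used in practice)
        x_new=x+0.1*direction; [~,g_new]=obj(x_new);

        % Newest increments go in front: latest-to-earliest ordering
        dx_hist=[x_new-x dx_hist]; dg_hist=[g_new-g dg_hist];
        dx_hist=dx_hist(:,1:min(end,max_hist));
        dg_hist=dg_hist(:,1:min(end,max_hist));
        x=x_new; g=g_new;

    end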

See also

fminnewton.m, hessreg.m, linesearch.m


Version 2.2, authors: Ilya Kuprov, David Goodwin