2 .\" Copyright (c) 2007 Massachusetts Institute of Technology
4 .\" Copying and distribution of this file, with or without modification,
5 .\" are permitted in any medium without royalty provided the copyright
6 .\" notice and this notice are preserved.
8 .TH NLOPT_MINIMIZE 3 2007-08-23 "MIT" "NLopt programming manual"
.SH NAME
nlopt_minimize \- Minimize a multivariate nonlinear function
.SH SYNOPSIS
.nf
.B #include <nlopt.h>
.sp
.BI "nlopt_result nlopt_minimize(nlopt_algorithm " "algorithm" ,
.BI "                            int " "n" ,
.BI "                            nlopt_func " "f" ,
.BI "                            void* " "f_data" ,
.BI "                            const double* " "lb" ,
.BI "                            const double* " "ub" ,
.BI "                            double* " "x" ,
.BI "                            double* " "fmin" ,
.BI "                            double " "fmin_max" ,
.BI "                            double " "ftol_rel" ,
.BI "                            double " "ftol_abs" ,
.BI "                            double " "xtol_rel" ,
.BI "                            const double* " "xtol_abs" ,
.BI "                            int " "maxeval" ,
.BI "                            double " "maxtime" );
.fi
.SH DESCRIPTION
.BR nlopt_minimize ()
attempts to minimize a nonlinear function
.I f
of
.I n
design variables using the specified
.IR algorithm .
The minimum function value found is returned in
.IR fmin ,
with the corresponding design variable values stored in the array
.I x
of length
.IR n .
The inputs
.I lb
and
.I ub
are arrays of length
.I n
containing lower and upper bounds, respectively, on the design variables
.IR x .
The other parameters specify stopping criteria (tolerances, the maximum
number of function evaluations, etcetera) and other information as described
in more detail below.  The return value is an integer code indicating success
(positive) or failure (negative), as described below.
.sp
By changing the parameter
.I algorithm
among several predefined constants described below, one can switch easily
between a variety of minimization algorithms.  Some of these algorithms
require the gradient (derivatives) of the function to be supplied via
.IR f ,
and other algorithms do not require derivatives.  Some of the
algorithms attempt to find a global minimum within the given bounds,
and others find only a local minimum (with the initial value of
.I x
being used as a starting guess).
.sp
The
.BR nlopt_minimize ()
function is a wrapper around several free/open-source minimization packages.
You could, of course, compile and call these packages separately, and in
some cases this will provide greater flexibility than is available via the
.BR nlopt_minimize ()
interface.  However, depending upon the specific function being minimized,
the different algorithms will vary in effectiveness.  The intent of
.BR nlopt_minimize ()
is to allow you to quickly switch between algorithms in order to experiment
with them for your problem, by providing a simple unified interface to
them.
.SH OBJECTIVE FUNCTION
.BR nlopt_minimize ()
minimizes an objective function
.I f
of the form:
.sp
.BI "      double f(int " "n" ,
.br
.BI "               const double* " "x" ,
.br
.BI "               double* " "grad" ,
.br
.BI "               void* " "f_data" );
.sp
The return value should be the value of the function at the point
.IR x ,
where
.I x
points to an array of length
.I n
of the design variables.  The dimension
.I n
is identical to the one passed to
.BR nlopt_minimize ().
.sp
In addition, if the argument
.I grad
is not NULL, then
.I grad
points to an array of length
.I n
which should (upon return) be set to the gradient of the function with
respect to the design variables at
.IR x .
That is,
.IR grad[i]
should upon return contain the partial derivative df/dx[i]
for 0 <= i < n.
.sp
Not all of the optimization algorithms (below) use the gradient information:
for algorithms listed as "derivative-free," the
.I grad
argument will always be NULL and need never be computed.  (For
algorithms that do use gradient information, however,
.I grad
may still be NULL for some calls.)
.sp
The
.I f_data
argument is the same as the one passed to
.BR nlopt_minimize (),
and may be used to pass any additional data through to the function.
(That is, it may be a pointer to some caller-defined data
structure/type containing information your function needs, which you
convert from void* by a typecast.)
.SH ALGORITHMS
The
.I algorithm
parameter can take on any of the following values:
.TP
.B NLOPT_GLOBAL_DIRECT
Perform a global derivative-free optimization using the DIRECT search
algorithm by Jones et al., based on the free implementation by Gablonsky
et al.
.SH RETURN VALUE
The value returned is one of the following enumerated constants.
(Positive return values indicate successful termination, while negative
return values indicate an error condition.)
.SH BUGS
Currently the NLopt library is in pre-alpha stage.  Most algorithms
currently do not support all termination conditions: the only
termination condition that is consistently supported right now is
.IR maxeval .
.SH AUTHORS
Written by Steven G. Johnson.
.sp
Copyright (c) 2007 Massachusetts Institute of Technology.