>> [parameters, criterion, message, output] = fmin(objective, guess, options)
where the input arguments are described below. For example:
>> banana = @(x)100*(x(2)-x(1)^2)^2+(1-x(1))^2;
>> fmin(banana, [0 0])
ans =
    0.9359    0.8723
and, in order to get the whole optimisation information (uncertainty, ...) in 'output':
>> [parameters, fval, exitval, output] = fmin(banana, [0 0]);
>> [parameters, fval, exitval, output] = fminimfil(objective, starting_parameters, options, {constraints})
where the options and output arguments are structures. All of these methods support constraints on parameters during the optimization process (see below).
The objective function should have the syntax:
function y = objective(p)
  y = ...
In case the objective requires additional arguments, just pass them to the fits method as the 5th, 6th, ... arguments:
>> p=fminsce(objective, [ 0.5 1 0.01 0 ], options, constraints, additional_arguments);
which assumes the objective function has the syntax (see below for more details and an example):
objective(p, additional_arguments)
Last, the starting parameter 'guess' can also be specified as a structure whose fields contain numbers, or as a string with members separated by the ';' character, as in the following example:
>> p.par1=4; p.par2=5; % optimize two parameters with names par1 and par2
In this case, the returned optimised parameters are also given as named structures:
>> fminsce(objective, p) % The result is also returned as a structure.
>> fminsce(objective, 'par1=4; par2=5') % create the structure above and optimize...
>> banana = @(x)100*(x(2)-x(1)^2)^2+(1-x(1))^2;
>> fmin(banana, struct('A',0,'B',0))
ans =
A: 0.9499
B: 0.9026
>> fmin(objective, p)
When more than one optimizer is suitable, a random choice is performed, weighted by the success ratio and the number of function calls (optimization speed). Successive calls to fmin with the same problem to solve may thus select different optimizers and return different solutions. To use a specific optimizer, call it directly or specify it in the options:
>> fminpso(objective, p)
>> fmin(objective, p, 'optimizer=fminpso')
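Options are passed as a structure; as a sketch, the field names below follow the usual optimset conventions and are assumptions, not taken from this page:

```matlab
% Illustrative options structure; field names (TolFun, MaxIter, Display)
% follow the usual optimset conventions and are assumed here.
options.TolFun  = 1e-4;    % stop when the criterion change falls below this
options.MaxIter = 1000;    % maximum number of iterations
options.Display = 'iter';  % print progress at each iteration
p = fmin(objective, guess, options);
```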
Optimizer        Description (continuous problems)                      Mean success ratio (%)   Cost function calls (iterations)
fminpso [8]      Particle Swarm Optimization                            97.9                     2757
fminpowell [13]  Powell with Coggins line search                        99.2                     324
fminhooke [12]   Hooke-Jeeves direct search                             97.2                     1177
fminimfil [3]    Unconstrained implicit filtering                       93.5                     1424
fminralg [4]     Shor R-algorithm                                       88.9                     165
fminsimpsa [1]   simplex/simulated annealing                            97.1                     2795
Optimizer        Description (noisy problems)                           Mean success ratio (%)   Cost function calls (iterations)
fminpso [8]      Particle Swarm Optimization                            84.1                     3079
fminsimpsa [1]   simplex/simulated annealing                            84.8                     3672
fminsce [10]     Shuffled Complex Evolution                             85.2                     3184
fmincmaes [9]    Evolution Strategy with Covariance Matrix Adaptation   71.3                     2066
fminimfil [3]    Unconstrained implicit filtering                       56.8                     2805
fminhooke [12]   Hooke-Jeeves direct search                             56.3                     3067
The full list of available optimizers is obtained with:
>> fits(iData)
Function Name          Description                                                      Success ratio [%]   Success ratio (noisy) [%]   Comments
fminanneal [18]        Simulated annealing                                              83.0                21.1                        fast
fminbfgs [19]          Broyden-Fletcher-Goldfarb-Shanno                                 76.1                3.4                         fastest gradient
fmincgtrust [5]        Steihaug Newton-Conjugate-Gradient-Trust                         88.0                13.9                        rather fast
fmincmaes [9]          Evolution Strategy with Covariance Matrix Adaptation             88.9                71.2                        fastest swarm
fminga [15]            Genetic Algorithm (real coding)                                  97.2                77.6                        very slow
fmingradrand [6]       Random Gradient                                                  78.3                25.8                        slowest gradient
fminhooke [12]         Hooke-Jeeves direct search                                       97.1                56.3                        fast and successful
fminimfil [3]          Implicit filtering                                               93.5                53.8                        most successful gradient
fminkalman [20]        Unscented Kalman filter                                          75.9                36.9
fminlm [7]             Levenberg-Marquardt                                              80.2                4.7
fminmarkov [23]        Markov Chain Monte Carlo                                         not benchmarked, but usually very robust
fminnewton [21]        Newton gradient search                                           76.7                21.3
fminpowell [13]        Powell Search with Coggins (options.Hybrid='Coggins')            99.2                51.7                        fast and successful
fminpowell [13]        Powell Search with Golden rule                                                                                  slower than with Coggins
fminpso [8]            Particle Swarm Optimization                                      97.9                84.1
fminralg [4]           Shor R-algorithm                                                 88.9                16.2                        very fast
fminrand [22]          Adaptive random search                                           74.6                62.2
fminsce [10]           Shuffled Complex Evolution                                       95.7                85.2                        slow
fminsearchbnd [2]      Nelder-Mead simplex (fminsearch)                                 65.8                11.5                        Matlab default
fminsimplex [14]       Nelder-Mead simplex (alternate implementation to fminsearch)     82.5                39.9
fminsimpsa [1]         simplex/simulated annealing                                      97.1                84.8                        slow
fminswarm [11]         Particle Swarm Optimizer (alternate implementation to fminpso)   95.0                75.3                        slow
fminswarmhybrid [11]   Hybrid Particle Swarm Optimizer                                  91.6                47.1                        fast swarm
>> [parameters, criterion, message, output] = fminimfil(model, initial_parameters, options, constraints)
In short, constraints is a structure with the following members, each of which should have the same length as the model parameter vector. It can also be given compactly as a string, e.g.:
constraints='min=[0 0 0 0]; max=[1 10 3 0.1]; eval=p(4)=0';
>> p=fminimfil(objective, starting, options, [ 1 0 0 0 ])
will fix the first model parameter. A similar behavior is obtained when setting constraints as a structure with a 'fixed' member:
>> constraints.fixed = [ 1 0 0 0 ];
The constraints vector should have the same length as the model parameter vector.
To restrain parameters within bounds, give the lower and upper bound vectors as 4th and 5th arguments:
>> p=fminimfil(objective, starting, options, [ 0.5 0.8 0 0 ], [ 1 1.2 1 1 ])
A similar behavior is obtained by setting constraints as a structure with members min and max:
>> constraints.min = [ 0.5 0.8 0 0 ];
>> constraints.max = [ 1 1.2 1 1 ];
The constraints vectors should have the same length as the model parameter vector; NaN values can be used to leave the min/max of specified parameters unconstrained.
The constraints.eval member can be used to specify any other constraint/restraint by means of a Matlab expression using the parameter vector 'p', evaluated at each iteration. For instance one could use constraints.eval='p(3)=p(1);'.
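Putting these members together, a constraints structure for a 4-parameter model could look as follows (the parameter values are illustrative only):

```matlab
% Illustrative constraints for a 4-parameter model (values are made up)
constraints.fixed = [ 1 0 0 0 ];       % keep p(1) at its starting value
constraints.min   = [ NaN 0.8 0 0 ];   % NaN: no lower bound on p(1)
constraints.max   = [ NaN 1.2 1 1 ];
constraints.eval  = 'p(3)=p(1);';      % tie p(3) to p(1) at each iteration
p = fminimfil(objective, starting, options, constraints);
```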
When the objective has the syntax
objective(p, additional_arguments)
the syntax for the optimization is e.g.:
>> p=fminsce(objective, [ 0.5 1 0.01 0 ], options, constraints, additional_arguments);
that is, additional arguments are specified as the 5th, 6th, ... arguments. In this case, should you need to set the optimizer configuration or restraints/constraints, we recommend giving the 'options' and 'constraints' arguments (3rd and 4th) as structures, as explained above.
>> banana = @(x, cte) 100*(x(2)-x(1)^2)^2+(1-x(1))^2 + 0*fprintf(1,'%s', cte);
>> p=fminsce(banana, [ 0.5 1 0.01 0 ],'', '', '*');
.......................
ans =
0.1592 0
>> p_final=fminsce( @(p)least_square(y, sqrt(y), f(p,x)), p_start);
This is what is done in the 'fits' methods of the iData and iFunc classes.
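As a self-contained sketch of such a fit, assuming a Gaussian model and a hand-written weighted least-squares criterion (the model, data and least_square function below are illustrative, not the exact iFit implementations):

```matlab
% Illustrative weighted least-squares fit (sketch, not the exact iFit code)
x = linspace(-5, 5, 100);
f = @(p,x) p(1)*exp(-(x-p(2)).^2/(2*p(3)^2)) + p(4);   % Gaussian + background
y = f([1 0 1 0.1], x);                                 % synthetic 'measured' data
% weighted least-squares criterion: squared residuals over error bars
least_square = @(y, e, model) sum( ((y - model)./max(e, eps)).^2 );
p_start = [ 0.5 1 0.5 0 ];
p_final = fminsce( @(p)least_square(y, sqrt(abs(y)), f(p,x)), p_start);
```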