I have two algorithms. One minimizes the mean squared error of Δx, and the other maximizes the log-likelihood assuming that the deviations are exponentially distributed (the first is equivalent to assuming that they are normally distributed around the mean, which is clearly not the case, so the second one should work better).
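In rough terms, the two criteria look like this. This is only a sketch: it assumes desv and omega are the vectors of absolute deviations and total connection weights, and it uses a purely hypothetical mean-deviation model mu(omega) = b*omega^(-xi); the actual model inside omegadesv2bestfittingexponent and omegadesv2bestfittingexponent_exp may well be different.

% Sketch only. Hypothetical mean-deviation model mu(omega) = b*omega^(-xi);
% the real model inside omegadesv2bestfittingexponent* may differ.
% desv and omega are the vectors of absolute deviations and total weights.
mu = @(p,omega) p(2)*omega.^(-p(1));                 % p = [xi b]

% Criterion 1: minimize the mean squared error of the deviations.
mse = @(p) mean((desv - mu(p,omega)).^2);

% Criterion 2: maximize the log-likelihood assuming the deviations are
% exponentially distributed with mean mu(omega), i.e. minimize the
% negative log-likelihood.
negloglik = @(p) sum(log(mu(p,omega)) + desv./mu(p,omega));

p0 = [2 0.3];                  % initial guess for [xi b]
p_mse = fminsearch(mse, p0);   % unconstrained; a real version should keep mu > 0
p_ml  = fminsearch(negloglik, p0);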
I first calibrate the method on a synthetic C. elegans generated with exponent 2:
>> load('e:\hipertec\optimesto\PresentacionesPapers\PaperMolon\Figuras\elegans_sintetico_pot2_r06_todos.mat')
>> load datos_celegans_struct
>> pos_cuad=coste2pos_cuad(todas.A*.05,todas.M*1.5+todas.S,todas.f);
>> desv_sint=abs(pos_todos(:,1)-pos_cuad);
>> omega=sum([todas.A*.05 todas.M*1.5+todas.S],2);
>> [xi,b]=omegadesv2bestfittingexponent_exp(desv_sint,omega)
xi =
1.9973
b =
0.3085
>> [xi,b]=omegadesv2bestfittingexponent(desv_sint,omega)
xi =
2.4142
b =
0.2777
The algorithm that assumes an exponential distribution of the deviations and maximizes the log-likelihood works better, as expected.
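Just to eyeball the calibration, the fitted curve can be overlaid on the synthetic deviations. Same caveat as above: the b*omega^(-xi) curve is only my illustrative model form, not necessarily what the fitting function assumes.

% Visual check of the calibration fit (illustrative model form only).
loglog(omega, desv_sint, '.'); hold on
om = logspace(log10(min(omega)), log10(max(omega)), 200);
loglog(om, b*om.^(-xi), 'r-')      % xi, b from whichever fit is being checked
xlabel('\omega'); ylabel('|\Delta x|'); hold off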
Now, with the real data from C. elegans:
>> desv=abs(todas.pos_real-pos_cuad);
>> [xi,b]=omegadesv2bestfittingexponent_exp(desv,omega)
xi =
5.8572
b =
0.1184
>> [xi,b]=omegadesv2bestfittingexponent(desv,omega)
xi =
7.2291
b =
0.1122
Nearer to 2 with the exponential one, but still around 5.
Now I use the positions that are optimal for a linear cost:
>> pos_lin=coste2pos_num_ruido(todas.A*.05,todas.M*1.5+todas.S,todas.f,1,0,1,0);
>> desv_lin=abs(todas.pos_real-pos_lin);
>> clf
>> [xi,b]=omegadesv2bestfittingexponent_exp(desv_lin,omega)
xi =
5.2203
b =
0.1324
Still not working…
I remove the three outliers and try again (both with the optimum for quadratic cost and with the optimum for linear cost):
>> buenas=omega<20 | desv<.2;
>> sum(~buenas)
ans =
3
>> [xi,b]=omegadesv2bestfittingexponent_exp(desv(buenas),omega(buenas))
xi =
4.7116
b =
0.1235
>> [xi,b]=omegadesv2bestfittingexponent_exp(desv_lin(buenas),omega(buenas))
xi =
4.1978
b =
0.1388
So the best-fitting exponent is around 4.
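For reference, which points were dropped can be checked directly by marking them on the deviation-versus-omega scatter (just plotting the variables already defined above):

% Mark the three points excluded by the 'buenas' filter.
plot(omega(buenas), desv(buenas), 'b.'); hold on
plot(omega(~buenas), desv(~buenas), 'ro')   % the excluded outliers
xlabel('\omega'); ylabel('|\Delta x|'); hold off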
One possibility is to try other probability distributions that may describe the data better, and see whether that takes us nearer to 2.
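As a purely hypothetical sketch of that direction, the same maximum-likelihood machinery could be tried with, say, a gamma distribution for the deviations: an extra shape parameter k, with the mean still tied to the illustrative b*omega^(-xi) model from the sketch above, and k = 1 recovering the exponential case. None of this exists in the current code.

% Sketch: negative log-likelihood for gamma-distributed deviations with
% shape k and mean mu(omega), so scale theta = mu/k. Assumes desv > 0.
mu = @(p,omega) p(2)*omega.^(-p(1));          % same illustrative mean model
negloglik_gamma = @(p) sum( gammaln(p(3)) + p(3)*log(mu(p(1:2),omega)/p(3)) ...
    + p(3)*desv./mu(p(1:2),omega) - (p(3)-1)*log(desv) );
p0 = [2 0.3 1];                               % [xi b k]; k = 1 is the exponential case
p_gamma = fminsearch(negloglik_gamma, p0);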