Idea about Bayes

The problem with Bayes is that one must try all probability distributions with all possible parameters. I still think there should be a way to avoid this. What about this one?

What Bayes is actually doing is comparing how low the cost is in the actual configuration with how low it could have been. We might do this by taking the ratio of the cost in the actual configuration to the average cost over all configurations. This average cost might be computed along each neuron, or over a random sample of the whole space.
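As a toy sketch of this ratio idea (Python; the quadratic `cost` and the uniform sampling of configurations are placeholders, not the actual model):

```python
import random

def cost_ratio(cost, config, sample_configs):
    """Ratio of the cost of the actual configuration to the average
    cost over a random sample of configurations (illustrative sketch)."""
    actual = cost(config)
    average = sum(cost(c) for c in sample_configs) / len(sample_configs)
    return actual / average

# Toy example: quadratic cost, actual configuration near the optimum.
random.seed(0)
cost = lambda x: x * x
sample = [random.uniform(-1, 1) for _ in range(10000)]
r = cost_ratio(cost, 0.05, sample)
print(r < 1)  # the actual configuration is much cheaper than average
```

A ratio far below 1 would indicate that the actual configuration is much cheaper than a typical one, which is the comparison Bayes makes implicitly.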


New Figure 3 (Bayes)

>> fig_pap_bayes_09(1,zeros(100))

Study of the cost distributions found by Bayes

>> clear
>> load info_posreal_zonapeque
>> prob=infoarchivos2prob(infoarchivos,1:6);

Parameters of the most likely value:

>> [m,ind_max]=max(prob(:));
>> [pot_max,alfa_max,beta_max,lambda_max,mu_max,tipo_max]=ind2sub(size(prob),ind_max);

Now, we will study all values with a relatively high probability, but only those with alfa, beta and pot equal to the most likely ones:

>> ind=find(prob(:)>m/100);
>> [c_pot,c_alfa,c_beta,c_lambda,c_mu,c_tipo]=ind2sub(size(prob),ind);
>> ind=ind(c_pot==pot_max & c_alfa==alfa_max & c_beta==beta_max);
>> [loglambda,logmu,tipo]=infoarchivos2parametrosdistrib(infoarchivos,ind)
>> x=0:.001:4;
>> P=parametrosdistrib2prob(x,loglambda,logmu,tipo);
>> plot(x,P)
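The pattern above — take the maximum of `prob(:)`, convert the linear index back to subscripts, and keep the bins above `m/100` — can be sketched in Python over a small hypothetical probability grid (shape and values are made up for illustration; a dict over index tuples stands in for the multidimensional array):

```python
from itertools import product

# Hypothetical 3-D probability grid over parameter bins.
shape = (3, 4, 2)
prob = {idx: 0.001 for idx in product(*(range(n) for n in shape))}
prob[(1, 2, 0)] = 0.9    # planted maximum
prob[(1, 2, 1)] = 0.02   # two more bins above the m/100 cut
prob[(0, 1, 0)] = 0.05

# Like [m, ind_max] = max(prob(:)) followed by ind2sub.
ind_max = max(prob, key=prob.get)
m = prob[ind_max]

# Like find(prob(:) > m/100): keep bins with relatively high probability.
high = [idx for idx, p in prob.items() if p > m / 100]

print(ind_max)    # (1, 2, 0)
print(len(high))  # 3
```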

>> semilogy(x,P)

>> loglog(x,P)

Very similar, although they come from different distribution types:

>> tipo

tipo =

    1.0000   -1.0000
    2.1000    1.0000
    2.1000    2.0000
    2.1000    2.0000
    2.2000    1.0000
    2.2000    1.0000
    2.2000    1.0000
    2.2000    1.0000
    2.2000    1.0000

Carpenter plots of the Bayesian estimator

>> t_reaccion=bayestim_medias_nosave(10^5,10,1000,2,0,1,10^-3,.995);
>> hist(t_reaccion,0:1000)

>> t_reaccion2histog_carpenter(t_reaccion,20,'r.-');
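For reference, a Carpenter (reciprobit) plot puts -1/RT on the x-axis against the probit of the empirical cumulative probability, so that normally distributed reciprocal RTs (the LATER model) fall on a straight line. A minimal Python sketch of the coordinates, with toy data (this is an assumption about the plot's construction, not the `t_reaccion2histog_carpenter` implementation):

```python
import random
import statistics

def reciprobit_coords(rts):
    """Reciprobit coordinates: x = -1/RT, y = probit of the empirical
    cumulative probability (offset by 0.5/n to avoid probit(0) and probit(1))."""
    nd = statistics.NormalDist()
    rts = sorted(rts)
    n = len(rts)
    xs = [-1.0 / t for t in rts]
    ys = [nd.inv_cdf((i + 0.5) / n) for i in range(n)]
    return xs, ys

# Toy data: reciprocal RTs drawn from a normal distribution, LATER-style.
random.seed(1)
rts = [1.0 / max(random.gauss(5.0, 1.0), 0.1) for _ in range(500)]
xs, ys = reciprobit_coords(rts)
print(len(xs), len(ys))  # 500 500
```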

Study of the reaction times, depending on the parameters of the Bayesian estimator

Depending on x:

>> clear
>> x=10.^(-1:-1:-8);
>> umbral=1-10.^(0:-1:-8);
>> sigma=[.5 1 2 4 8];
>> for c_x=1:length(x)
[time{c_x},s,Ph]=tiempo_TH(10^3,10,1000,2,0,1,x(c_x),.99);
media(c_x)=mean(time{c_x});
desvest(c_x)=std(time{c_x});
end

I think the problem with high x is that the probability never passes the threshold. I repeat with a lower threshold:

>> for c_x=1:length(x)
[time{c_x},s,Ph]=tiempo_TH(10^3,10,1000,2,0,1,x(c_x),.9);
media(c_x)=mean(time{c_x});
desvest(c_x)=std(time{c_x});
end
>> plot(log10(x),media)

>> plot(log10(x),desvest)

>> plot(log10(x),desvest./media)
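The suspicion that with some parameter settings the probability never passes the threshold can be illustrated with a minimal accumulate-to-threshold sketch (hypothetical; not the actual `tiempo_TH`): a log-odds random walk with positive drift stops when the posterior probability exceeds the threshold, and stricter thresholds take longer to reach, or are never reached within the allotted time.

```python
import math
import random

def time_to_threshold(drift, noise, umbral, max_steps=10000, seed=0):
    """Accumulate log odds L until the posterior P = 1/(1+exp(-L)) exceeds
    umbral; return the stopping step, or None if never reached."""
    rng = random.Random(seed)
    L = 0.0
    for t in range(1, max_steps + 1):
        L += drift + rng.gauss(0.0, noise)
        if 1.0 / (1.0 + math.exp(-L)) > umbral:
            return t
    return None

# With the same noise realization, a stricter threshold stops later.
t_low = time_to_threshold(0.05, 0.5, 0.9)
t_high = time_to_threshold(0.05, 0.5, 0.999)
print(t_low, t_high)
```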

THRESHOLD:

>> for c=1:length(umbral)
[time{c},s,Ph]=tiempo_TH(10^3,10,1000,2,0,1,1/10^4,umbral(c));
media(c)=mean(time{c});
desvest(c)=std(time{c});
end
>> plot(log10(1-umbral),media)

>> plot(log10(1-umbral),desvest)

>> plot(log10(1-umbral),desvest./media)

SIGMA:

>> clear media desvest
>> for c=1:length(sigma)
[time{c},s,Ph]=tiempo_TH(10^3,10,1000,sigma(c),0,1,1/10^4,1-10^-2);
media(c)=mean(time{c});
desvest(c)=std(time{c});
end
>> plot(sigma,media)

>> plot(sigma,desvest)

>> plot(sigma,desvest./media)

The last point is not valid: it is an artifact of the simulation having only 1000 timepoints.

Calibration of Bayes

fig_pap_calibracionbayesglobal_trocitos_01(1,0)

It works fine for most cases (except simulation 3). I will repeat that simulation with better binning. Anyway, simulations with xi=1 always seem to work well, so we can still be sure that xi<1.

New calibration of Bayes: Exponent 0.5

Calculations made on remotón

I explore different values of the noise:

ruidos2=[.01 .05 .1 .5 1 5];
matlabpool open local 4
parfor (c=1:length(ruidos2))
[pos{c},coste_hist{c},error_hist{c}]=simulaelegans_saltopeque_ruidonorm(todas.A*10^-1.1,todas.M*10^-.9+todas.S,todas.f,.5,.001,ruidos2(c),5*10^6,0);
end
pos_opt=NaN(279,100);
parfor (c=1:100)
pos_opt(:,c)=coste2pos_num_ruido(todas.A*10^-1.1,todas.M*10^-.9+todas.S,todas.f,.5,0);
end
for c=1:100
costes(c)=pos2coste(todas.A*10^-1.1,todas.M*10^-.9+todas.S,todas.f,pos_opt(:,c),.5);
end
[m,ind]=min(costes);
for c=1:length(pos)
errores(c)=mean(abs(pos{c}-pos_reopt));
end

I would say that below noise 0.5 the system gets stuck in local minima, and above 0.5 the standard deviation increases due to the noise. I take the point with noise 0.5, which has a conveniently low mean error.
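The selection amounts to computing, for each noise level, the mean absolute error of the final positions against the best reoptimized configuration, and keeping the noise with the lowest error. A toy Python sketch (the numbers are made up):

```python
def mean_abs_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Hypothetical final positions per noise level, and a best reoptimized target.
pos_best = [0.0, 1.0, 2.0]
pos = {0.1: [0.0, 1.0, 2.2],
       0.5: [0.05, 1.0, 2.0],
       5.0: [0.5, 0.3, 2.9]}

errores = {ruido: mean_abs_error(p, pos_best) for ruido, p in pos.items()}
best_noise = min(errores, key=errores.get)
print(best_noise)  # 0.5
```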

I run Bayes:

infoarchivos=Bayes_alfabeta(todas.A,todas.M,todas.S,todas.f,pos{4},0:.02:1,0:.25:4,10.^(-2:.3:1),10.^(-2:.3:1),10.^(-2:6/6:4),10.^(-11:26/6:15),[],[2 2],[10 10],'Calibracionueva_predichoporbayes',[],4,1);

On my desktop in the lab:
>> load('c:\hipertec\optimesto\bayes\resultados\info_Calibracionueva_predichoporbayes.mat')
>> prob=infoarchivos2prob(infoarchivos,[1 2 3]);
>> plot(infoarchivos.pot,sum(sum(prob,2),3),'r')
>> imagesc(log10(infoarchivos.beta),log10(infoarchivos.alfa),squeeze(sum(prob)))
>> close all
>> imagesc(log10(infoarchivos.beta),log10(infoarchivos.alfa),squeeze(sum(prob)))
>> hold on
>> plot(-.9,-1.1,'w.')

This second figure looked different when plotted on remotón: there was a long diagonal towards the bottom-right corner. But the maximum was in the same place, and the correct value fell inside the high-probability area.

Anyway, it works fine. I run it with a binning that includes the exact values of alfa and beta…

>> load('c:\hipertec\optimesto\bayes\resultados\info_Calibracionueva_predichoporbayes_binbuenos.mat')
>> prob=infoarchivos2prob(infoarchivos,[1 2 3]);
>> close all
>> plot(infoarchivos.pot,sum(sum(prob,2),3),'r')
>> figure
>> imagesc(log10(infoarchivos.beta),log10(infoarchivos.alfa),squeeze(sum(prob)))
>> hold on
>> plot(-.9,-1.1,'w.')

Good enough. If we have time after running controls for other values of alpha, beta and the exponent, we should run a few more controls in this situation to see the dispersion. Also, the lower standard deviation observed in the real data might be because local minima give rise to very specific configurations. We might try with lower noise and see what happens.