New figures for E. coli (not definitive)

Maintenance:

>> fig_pap_coli_mantenimiento_02(1)

Figures for growth 0.1, 0.2, 0.3 and 0.4, respectively:

>> fig_pap_coli_13(1,0,.1)

>> fig_pap_coli_13(1,0,.2)

>> fig_pap_coli_13(1,0,.3)

>> fig_pap_coli_13(1,0,.4)


Calibration of the metric for E. coli: Preliminary results

I use the same growth model as in Figure 4.

fig_pap_coli_11(1,0)

(and use internal variables from now on)

New metric:

p=calibrametrica(model_growth,flujosexp_finales,1000,100,Delta);
K>> hist(p)

K>> hist(log10(p))

K>> sum(p<.05)
ans =
2

Looks good, but I will repeat it with more repetitions.
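A quick sanity check on why this looks good: under a calibrated metric, null p-values are uniform, so the count below 0.05 follows a binomial distribution. A short Python illustration (assuming, purely for illustration, that p holds 100 p-values; the real count is internal to calibrametrica):

```python
from math import comb

# Under a calibrated metric, each null p-value is below 0.05 with
# probability 0.05, so the count below 0.05 is Binomial(n, 0.05).
# n = 100 is an ASSUMPTION about the length of p, for illustration only.
n, alpha = 100, 0.05
p_le_2 = sum(comb(n, k) * alpha**k * (1 - alpha)**(n - k) for k in range(3))
# Probability of seeing 2 or fewer small p-values is about 0.12,
# so sum(p<.05) == 2 is unremarkable if the metric is calibrated.
```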

Now I test the old metric:

p2=calibrametrica(model_growth,flujosexp_finales,1000,100,Delta,2);
K>> hist(p2)

K>> hist(log10(p2))


K>> sum(p2<.05)
ans =
0

K>> sum(p2<.1)

ans =
3

It seems there was a bias. I will also repeat it with more repetitions.

New Figure 4

Growth 0.1. New maintenance. Old data. New metric.

>> fig_pap_coli_11(1,1)

 

Growth 0.4:

>> fig_pap_coli_11(1,1)

Warning: Perhaps all wrong in neuron branching

In principle, our way of finding the optimal diameter was wrong: Murray’s law is satisfied by optimal systems, but it is not a recipe for finding the optimum. To do that, one must know the cost parameters (the relation between the two terms and the relative ‘importance’ of each branch) and then calculate the cost. When we simply fix two branches and calculate the third with Murray’s law, we are implicitly changing the parameters of the cost (we move to a bifurcation whose ‘best-fitting’ parameters differ from the initial ones). If we calculate the new cost with the old parameters, it may in some cases increase instead of decrease. For example:

>> % L1, L2 and L3 (branch lengths) were already defined earlier in the session
>> d1=2; d2=1; d3=1;
>> C=L1*d1^2 + L1/d1 + L2*d2^2 + .5*L2/d2 + L3*d3^2 + .5*L3/d3
C =
12.7500
>> d1^3-d2^3-d3^3
ans =
6
>> d2=(d1^3-d3^3)^(1/3);
>> d1^3-d2^3-d3^3
ans =
1.7764e-015
>> C=L1*d1^2 + L1/d1 + L2*d2^2 + .5*L2/d2 + L3*d3^2 + .5*L3/d3
C =
16.3810
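The same demonstration in Python, with ASSUMED unit lengths (the session’s actual L1, L2, L3 are not shown); with unit lengths the qualitative outcome is the same: enforcing Murray’s law on one branch raises the cost under the original parameters.

```python
# Replication of the MATLAB demo above with ASSUMED unit lengths.
def cost(d1, d2, d3, L1=1.0, L2=1.0, L3=1.0):
    # volume term L*d^2 plus flow term ~L/d, weights 1, .5, .5 as above
    return (L1 * d1**2 + L1 / d1
            + L2 * d2**2 + 0.5 * L2 / d2
            + L3 * d3**2 + 0.5 * L3 / d3)

d1, d2, d3 = 2.0, 1.0, 1.0
C_before = cost(d1, d2, d3)          # 7.5 with unit lengths
d2 = (d1**3 - d3**3) ** (1 / 3)      # enforce d1^3 = d2^3 + d3^3
C_after = cost(d1, d2, d3)
assert C_after > C_before            # the cost INCREASES under the old parameters
```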

However, I think it can be done correctly. For example, one can first estimate reasonable cost parameters for the network, and then calculate the optimum for each branch with respect to those parameters. Also, the results found so far are not completely meaningless: although strictly incorrect, they do indicate a trend toward larger deviations for shorter branch lengths.
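The "do it correctly" idea amounts to minimizing each branch's own cost term once the parameters are known. For a branch with cost L*(d^2 + k/d), calculus gives the optimum in closed form; a sketch with an illustrative k (not fitted to any data):

```python
# Per-branch optimum of a cost term L*(d**2 + k/d): setting the
# derivative 2*d - k/d**2 to zero gives d_opt = (k/2)**(1/3).
# The k values here are illustrative, not fitted to any data.
def d_opt(k):
    return (k / 2) ** (1 / 3)

# A brute-force grid search agrees with the closed form.
k = 0.5
ds = [i / 1000 for i in range(1, 3000)]
d_grid = min(ds, key=lambda d: d**2 + k / d)
assert abs(d_grid - d_opt(k)) < 1e-2

# With the weights of the example above (k = 1, 0.5, 0.5), these
# independent optima happen to satisfy Murray's law d1^3 = d2^3 + d3^3.
d1, d2, d3 = d_opt(1.0), d_opt(0.5), d_opt(0.5)
assert abs(d1**3 - (d2**3 + d3**3)) < 1e-9
```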

New Figure 3 (Bayes)

>> fig_pap_bayes_09(1,zeros(100))

Correction in the neuron branching data (lengths)

Something trivial: for some reason, the lengths were in tenths of a millimeter. I divide by 10, so they are now in mm.

>> clear
>> load('c:\hipertec\BranchingNeurons\Datos\datos_branching.mat')
>> longitudes=longitudes/10;
>> save('c:\hipertec\BranchingNeurons\Datos\datos_branching.mat')

Branching in neurons: Looks good

I will calculate the diameter of the mother branches, instead of the diameters of the child branches.

Advantage: only one branch is used per calculation, whereas if we calculate the children, each branch is used twice.
Disadvantage: we get fewer data points (about half).
Note: it is unclear to me which is the correct way of doing it. It depends on what is actually tuned in the animal, and that is unknown (probably a combination of the three diameters).
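Schematically, the mother-diameter calculation is a Murray-type relation; a minimal Python sketch of what primarios_opt presumably computes (the exponent 3 matches the third argument of the call below; the function's real internals are not shown):

```python
# Murray-type optimal mother diameter from the two child diameters:
# d_mother^n = d_child1^n + d_child2^n, here with exponent n = 3.
def mother_opt(d_child1, d_child2, n=3):
    return (d_child1**n + d_child2**n) ** (1 / n)

# Two equal children of diameter 1 give a mother of 2**(1/3) ~ 1.26.
assert abs(mother_opt(1.0, 1.0) - 2 ** (1 / 3)) < 1e-12
```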

>> load datos_branching
>> diametros_opt=primarios_opt(diametros,bifurcaciones,3);
>> plot(abs(diametros-diametros_opt),longitudes,'.')
>> xlabel('Deviation (\mum)')
>> ylabel('Length (mm)')

Looks right, but statistics are lacking. I remove the absolute value:

>> plot((diametros-diametros_opt),longitudes,'.')

More positive deviations than negative. We see this better with the histogram:

>> hist(diametros-diametros_opt,20)
>> xlabel('Deviation (\mum)')

This agrees with the shape of the cost, which is much steeper for negative deviations.
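The asymmetry can be seen directly in a cost of the form d^2 + k/d (k is a hypothetical value): the 1/d term diverges as d approaches 0, so the cost climbs faster below the optimum than above it.

```python
# With a cost of the form d**2 + k/d, the 1/d term diverges as d -> 0,
# so the cost is steeper below the optimum than above it.
# k is a hypothetical value for illustration.
k = 0.5
d_star = (k / 2) ** (1 / 3)          # optimum of d**2 + k/d

def cost(d):
    return d**2 + k / d

delta = 0.3 * d_star
rise_below = cost(d_star - delta) - cost(d_star)   # negative deviation
rise_above = cost(d_star + delta) - cost(d_star)   # positive deviation
assert rise_below > rise_above       # negative deviations are costlier
```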

We compute the metric, which takes into account both the effect associated to length, and the effect associated to the sign.

>> [m,coste_real,coste_perm,errormedio]=metrica_branching_02(diametros,bifurcaciones,longitudes,10^5,3,1);
>> m
m =
0.0291
>> hist(coste_perm,200)
>> hold on
>> ejes=axis;
>> axis(axis)
>> plot(coste_real*[1 1],ejes(3:4),'r')

I’d say it is as good as could be expected, given the small amount of data we have.
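For the record, m is a permutation p-value: roughly, the fraction of permuted systems whose cost is at most the real cost. The internals of metrica_branching_02 are not shown, so this is only the generic recipe, with toy numbers and an assumed add-one correction so the p-value is never exactly zero:

```python
# Generic permutation p-value, in the spirit of metrica_branching_02
# (whose internals are not shown): the fraction of permuted systems
# whose cost is at most the real cost. The add-one correction is an
# ASSUMED convention; the toy costs below are illustrative.
def permutation_pvalue(real_cost, permuted_costs):
    hits = sum(1 for c in permuted_costs if c <= real_cost)
    return (hits + 1) / (len(permuted_costs) + 1)

perm = [1.0, 2.0, 3.0, 4.0]
assert permutation_pvalue(0.5, perm) == 1 / 5   # cheaper than all permutations
assert permutation_pvalue(5.0, perm) == 1.0     # worse than all permutations
```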

THE SAME, BUT OPTIMIZING THE CHILDREN:

Surprisingly, it works worse:

>> diametros_opt=secundarios_opt(diametros,bifurcaciones,3);
>> plot(abs(diametros-diametros_opt),longitudes,'.')

This looks reasonable, and the three possible outliers may be compensated by the better statistics. But the sign of the deviations does not help here:

>> plot((diametros-diametros_opt),longitudes,'.')

>> hist(diametros-diametros_opt,20)

Deviations are mostly negative. Now I see that this had to happen: if the mother is too thick (positive deviation), then the children are too thin (negative deviation).

The metric is not as good as in the other case:
>> [m,coste_real,coste_perm,errormedio]=metrica_branching_02(diametros,bifurcaciones,longitudes,10^5,3,2);
>> m
m =
0.0671

I compute the metric again, but using the old program, which does not take the sign of the deviations into account (or at least not fully: it keeps the signs and only randomizes their order, so the permuted systems contain the same number of deviations of each sign and the bias remains).

>> [m,coste_real,coste_perm,errormedio]=metrica_branching_02_03ago09(diametros,bifurcaciones,longitudes,10^5,3,2);
>> m
m =
0.0097

The results improve, and are better than those obtained for the mothers (probably due to the better statistics).