Warning: Perhaps everything is wrong in neuron branching

In principle, our way of finding the optimal diameter was wrong: Murray’s law is satisfied by optimal systems, but it is not a recipe for finding the optimum. To do so, one must know the cost parameters (the relation between the two terms and the relative ‘importance’ of each branch), and then calculate the cost. When we just fix two branches and calculate the third with Murray’s law, we are implicitly changing the parameters of the cost (we move to a bifurcation whose ‘best fitting’ parameters are different from the initial ones). If we calculate the new cost with the old parameters, in some cases it may increase instead of decrease. For example:
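(In the transcript below, L1, L2 and L3 are the branch lengths, presumably defined earlier in the session. The cost of each branch has the form Li*(di^2 + ai/di), with a = 1 for the mother and a = 0.5 for each child; the ai play the role of the relative ‘importances’ mentioned above.)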

>> d1=2; d2=1; d3=1;
>> C=L1*d1^2 + L1/d1 + L2*d2^2 + .5*L2/d2 + L3*d3^2 + .5*L3/d3
C =
12.7500
>> d1^3-d2^3-d3^3
ans =
6
>> d2=(d1^3-d3^3)^(1/3);
>> d1^3-d2^3-d3^3
ans =
1.7764e-015
>> C=L1*d1^2 + L1/d1 + L2*d2^2 + .5*L2/d2 + L3*d3^2 + .5*L3/d3
C =
16.3810

However, I think it is possible to do it correctly. For example, one can first estimate reasonable parameters for the network, and then calculate the optimum for each branch with respect to those parameters. Also, do not think that the results found so far are completely meaningless: Although they are strictly incorrect, they do indicate a trend toward larger deviations for shorter branch lengths.
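A minimal sketch of this direct approach, assuming the same form of cost as in the example above and purely illustrative lengths and importances (not the real data); I optimize log-diameters so that fminsearch cannot wander into negative diameters:

% Minimize the full cost directly, instead of applying Murray's law.
% L and a are illustrative values, NOT the real data.
L = [2 1.5 1];                         % branch lengths (mother, child 1, child 2)
a = [1 .5 .5];                         % relative 'importance' of each branch
coste = @(d) sum(L.*d.^2 + a.*L./d);   % same form of cost as in the example
d_opt = exp(fminsearch(@(x) coste(exp(x)), log([2 1 1])))

For this separable cost the optimum is also analytic, di = (ai/2)^(1/3). Note that whenever a1 = a2 + a3, the true optimum automatically satisfies d1^3 = d2^3 + d3^3: Murray’s law appears as a property of the optimum, not as a recipe for reaching it.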

Correction in the neuron branching data (lengths)

Something trivial: For some reason, the lengths were in tenths of a millimeter. I divide by 10, so that they are now in mm.

>> clear
>> load('c:\hipertec\BranchingNeurons\Datos\datos_branching.mat')
>> longitudes=longitudes/10;
>> save('c:\hipertec\BranchingNeurons\Datos\datos_branching.mat')

Branching in neurons: Looks good

I will calculate the diameter of mother branches, instead of the diameter of children branches.

Advantage: Only one branch is used per calculation, whereas if we calculate the children we use each branch twice.
Disadvantage: We get fewer data points (about half).
Note: It is unclear to me which is the correct way of doing it. It depends on what is actually tuned in the animal, and that is unknown (it will probably be a combination of the three diameters). A sketch of the mother-branch calculation follows below.
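primarios_opt is not listed here; a minimal sketch of what it presumably computes, assuming (this is my guess) that bifurcaciones is an N-by-3 matrix of branch indices [mother child1 child2] and that the third argument is the exponent of Murray’s law:

function d_opt = primarios_opt_sketch(diametros, bifurcaciones, eta)
% Murray-law prediction for each mother branch:
% d_mother^eta = d_child1^eta + d_child2^eta
d_opt = nan(size(diametros));
for k = 1:size(bifurcaciones,1)
    madre = bifurcaciones(k,1);       % index of the mother branch
    hijas = bifurcaciones(k,2:3);     % indices of the two children
    d_opt(madre) = sum(diametros(hijas).^eta)^(1/eta);
end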

>> load datos_branching
>> diametros_opt=primarios_opt(diametros,bifurcaciones,3);
>> plot(abs(diametros-diametros_opt),longitudes,'.')
>> xlabel('Deviation (\mum)')
>> ylabel('Length (mm)')

Looks right, but lacking statistics. I remove the absolute value:

>> plot((diametros-diametros_opt),longitudes,'.')

More positive deviations than negative. We see this better with the histogram:

>> hist(diametros-diametros_opt,20)
>> xlabel('Deviation (\mum)')

This agrees with the shape of the cost, which is much steeper for negative deviations.
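This can be checked on the single-branch cost L*(d^2 + a/d) used above: the a/d term diverges for thin branches, so the cost rises much faster below the optimum than above it. A quick illustration, with purely illustrative parameters:

% Cost of a single branch around its optimum (illustrative parameters).
a = 1; L = 1;
d_opt = (a/2)^(1/3);                 % analytic optimum of d^2 + a/d
d = linspace(.3*d_opt, 3*d_opt, 200);
plot(d - d_opt, L*(d.^2 + a./d))
xlabel('Deviation'), ylabel('Cost')  % steep rise on the negative side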

We compute the metric, which takes into account both the effect associated with length and the effect associated with the sign.
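metrica_branching_02 is not listed here; my guess (an assumption, not the author’s actual code) is a permutation test roughly along these lines, with coste_total a hypothetical helper returning the total cost of the tree. The real program additionally handles the sign of the deviations in a way that the old version, used further below, does not:

% Hypothetical sketch of the permutation test (my guess, not the real
% metrica_branching_02). coste_total is a hypothetical helper returning
% the total cost; for simplicity, assume every branch has a prediction.
dev = diametros - diametros_opt;              % measured deviations
coste_real = coste_total(diametros, bifurcaciones, longitudes);
n_perm = 10^5;
coste_perm = zeros(n_perm,1);
for k = 1:n_perm
    dev_perm = dev(randperm(numel(dev)));     % shuffle deviations across branches
    coste_perm(k) = coste_total(diametros_opt + dev_perm, ...
                                bifurcaciones, longitudes);
end
m = mean(coste_perm <= coste_real);           % fraction of shuffles at least as cheap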

>> [m,coste_real,coste_perm,errormedio]=metrica_branching_02(diametros,bifurcaciones,longitudes,10^5,3,1);
>> m
m =
0.0291
>> hist(coste_perm,200)
>> hold on
>> ejes=axis;
>> axis(axis)
>> plot(coste_real*[1 1],ejes(3:4),'r')

I’d say it is as good as could be expected, taking into account the small amount of data that we have.

THE SAME, BUT OPTIMIZING THE CHILDREN:
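Under the same assumptions as before, secundarios_opt (used below) presumably mirrors primarios_opt, predicting each child from its mother and sibling:

function d_opt = secundarios_opt_sketch(diametros, bifurcaciones, eta)
% Murray-law prediction for each child branch:
% d_child^eta = d_mother^eta - d_sibling^eta
d_opt = nan(size(diametros));
for k = 1:size(bifurcaciones,1)
    madre = bifurcaciones(k,1);
    hijas = bifurcaciones(k,2:3);
    d_opt(hijas(1)) = (diametros(madre)^eta - diametros(hijas(2))^eta)^(1/eta);
    d_opt(hijas(2)) = (diametros(madre)^eta - diametros(hijas(1))^eta)^(1/eta);
end
% Note: if a child is measured thicker than its mother, the difference is
% negative and the predicted diameter is not a real number.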

Surprisingly, it works worse:

>> diametros_opt=secundarios_opt(diametros,bifurcaciones,3);
>> plot(abs(diametros-diametros_opt),longitudes,'.')

This looks reasonable, and the three possible outliers may be compensated by the better statistics. But the sign of the deviations does not help here:

>> plot((diametros-diametros_opt),longitudes,'.')

>> hist(diametros-diametros_opt,20)

Deviations are mostly negative. Now I see that this had to happen: If the mother is too thick (positive deviation) then the children are too thin (negative deviation).
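The numbers of the first example show this coupling explicitly: with d1 = 2 and d2 = d3 = 1, the mother comes out too thick and, from the very same bifurcation, each child comes out too thin:

>> d1=2; d2=1; d3=1;
>> (d2^3+d3^3)^(1/3)   % optimal mother given the children: smaller than d1
ans =
    1.2599
>> (d1^3-d3^3)^(1/3)   % optimal child given mother and sibling: larger than d2
ans =
    1.9129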

The metric is not as good as in the other case:
>> [m,coste_real,coste_perm,errormedio]=metrica_branching_02(diametros,bifurcaciones,longitudes,10^5,3,2);
>> m
m =
0.0671

I compute the metric again, but using the old program, which does not take the sign of the deviation into account (or at least not fully: it keeps the sign of the deviations and only randomizes their order, so the permuted systems contain the same number of deviations of each sign and the bias remains).

>> [m,coste_real,coste_perm,errormedio]=metrica_branching_02_03ago09(diametros,bifurcaciones,longitudes,10^5,3,2);
>> m
m =
0.0097

Results improve, and are better than those obtained for the mothers (probably due to the better statistics).