Units and behavior of diffusive particles #2084
So, on the behaviour, the section you quoted says this:
and the NMODL docs teach how to manipulate the quantity X from a mechanism. The developer section talks about
It's a simple process, so I couldn't think of anything else of interest. Other than that, I'll add the units, good point.
Thanks @thorstenhater. Regarding the mathematical details, do you mean this document? And thanks for including the unit in the documentation. However, I think we should still have a discussion about its foundation (I'll write to you on Gitter).
Please note that I've edited the initial post once more. Here's a formal derivation of the units which may be of interest (note that this is for the current implementation, and there seems to be a flaw). If we want to compute the amount of particles in
Choosing
I am a bit confused by now. You mention
This is correct. However,
in the model we used here
As noted elsewhere, the diffusion coefficient is in SI units
That sentence is unclear to me. Now to my confusion: so far I was assuming the kind-of-mysterious factor
where
But now we are talking about surface densities? Back to the normalisation question. By definition
Is this broken? If not, please explain what else is wrong.
I'm not sure I understand this. The effect of a point process should be the injection of a total amount of ions, irrespective of the CV volume or area, the same way a synapse injects a total amount of charge. Therefore we would need to normalise to the CV volume.
@thorstenhater okay, that's good to know.
I personally wouldn't mind whether it'd be a shell or a volume model, but I think at the moment it is neither. For a volume model, which it's supposed to be at the moment if I understood the quote above correctly,
For now, however, it seems that the third dimension is given simply by the factor of
Ok, now I understand, our scaling is such that the actual concentration is lower than you would expect.
No, this is still the conversion from SI -> biophysical unit that's attached to setting the coefficient
I would disagree: the injection is a flux through a surface, and thus we should normalise by CV area, not volume. That said, I would still like to see what your experiment, observed outcome, and expected outcome are. Overall, I hope that at least clarifies the motivations.
Thanks for the elaboration, @thorstenhater, I think we're getting closer to a solution.
Almost. The total amount of particles is lower than we would expect, or alternatively, the concentration is too high.
Okay, it's
Here is a toy example of one compartment, which should demonstrate how the concentration and the amount of two types of diffusing particles depend on the normalization factor and the radius. In this example, we inject an amount of particles
The amount of
The gray-shaded dotted lines in the plots below show that the mechanisms and the normalization are working. From the plots, you can also see that the concentration is inversely proportional to the radius, while the total amount remains invariant with respect to the radius. This is as it should be for surface area normalization, but for volume normalization, the concentration should depend on the radius in a quadratic manner.
Output for r=1:
Output for r=2:
Output for r=4:
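The radius scaling described above can be checked with a short sketch. The function names, the injected amount, and the CV dimensions below are illustrative, not taken from Arbor; only the geometric formulas for a cylindrical CV are fixed.

```python
import math

# Toy check of the scaling laws for a cylindrical CV of fixed length:
# surface-area normalisation gives a concentration ~ 1/r, volume
# normalisation gives ~ 1/r^2. N, r, L are illustrative numbers.

def conc_area_norm(N, r, L=1.0):
    """Concentration if the injected amount N is normalised by CV surface area."""
    return N / (2.0 * math.pi * r * L)

def conc_volume_norm(N, r, L=1.0):
    """Concentration if the injected amount N is normalised by CV volume."""
    return N / (math.pi * r**2 * L)

for r in (1.0, 2.0, 4.0):
    print(f"r={r}: area-norm {conc_area_norm(1.0, r):.4f}, "
          f"volume-norm {conc_volume_norm(1.0, r):.4f}")
```

Doubling the radius halves the area-normalised concentration but quarters the volume-normalised one, matching the qualitative behaviour described for the plots.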
We normalise the flux to the area. Reason being -- lifted from a discussion with @schmitts last fall -- that we
is understood as a change to
We do not normalise
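The discretisation-independence argument behind normalising the flux to the area can be sketched with illustrative numbers (this is not Arbor code): splitting the same cylinder into more CVs must not change the total influx.

```python
import math

def total_influx(n_cv, flux_per_area, r=1.0, L=10.0):
    """Total influx into a cylinder of radius r and length L split into n_cv CVs,
    when the flux is specified per unit surface area."""
    area_per_cv = 2.0 * math.pi * r * (L / n_cv)  # lateral area of one CV
    return n_cv * flux_per_area * area_per_cv

# The total is the same for any discretisation:
for n in (1, 10, 100):
    print(n, total_influx(n, 0.5))
```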
Okay, that sounds sensible. So this normalization is not done in density processes, correct? What remains is the unit issue. As a suggestion, I think the concentration could either be re-defined as
Why would we redefine
So, again, this was a request from @schmitts wrt discretisation independence; if opinions have changed about this,
I will go into deep meditation about this. In case I resurface, I'll let you know.
🧘
I'd want to say 'I warned you', but I thought it was sensible, too ;)
Here is a newer summary of this issue (we've discussed this here).
Definitions
All variables are defined per CV.
Assumptions and Requirements
Expected Behavior
Given delta-function-like particle input
Current Behavior
The expected behavior of
Proposal
The solution to issue #2105 will help to solve the issue here, since it enables the normalization of concentration values – according to the current behavior described above – directly within the mechanisms. Next, the units of the concentration and of the amount of particles can be sorted out more easily.
I've just edited the previous post to include more recent 'findings' and to make it more accessible. And here is a new example with volume normalization (similar to the example with surface area normalization above). The new example (also see the plots below) should make clear how the diffusion in Arbor currently behaves (master version 235b291). First, we inject an amount of particles
With the mathematical formulation given in the previous post, the diffusion now almost behaves as expected; in particular, the concentration depends on the CV radius in a quadratic manner. It now appears that the amount of injected particles is given in units of
One might discuss whether to leave things like this or to adapt Arbor so as to get rid of this distinction between density and point mechanisms. Beyond this, I think we've reached a proper description of Arbor's particle diffusion mechanisms, which should, in condensed form, be transferred to the documentation. This should also include the unit of the diffusivity parameter; I remember that @thorstenhater once mentioned it'd be
Output for r=1:
Output for r=2:
Output for r=4:
While for a single-compartment neuron everything seems to work (if the factor
I'd suggest that we keep the issue here open until the description of the diffusion processes has been integrated into the documentation (I'm working on that).
I'll add here a note on the unit of the diffusivity constant, which poses an issue that also remains to be resolved. Based on the comparison of diffusion dynamics in Arbor and in Shirin's custom simulator (see this repo), and as also indicated by Thorsten's previous statement here, Arbor does not seem to consider diffusivity in units of
From Shirin's rather complex model, we reasoned that Arbor internally scales the diffusivity constant by a factor
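If the user-facing diffusivity is indeed given in SI units (m²/s) while the solver works on a µm/ms grid, a large scaling factor follows from plain unit arithmetic. This sketch is a guess at the factor's origin, not a statement about Arbor's source code; the function name is made up.

```python
# 1 m^2 = 1e12 µm^2 and 1 s = 1e3 ms, so 1 m^2/s = 1e9 µm^2/ms.
M2_PER_S_TO_UM2_PER_MS = 1e12 / 1e3  # = 1e9

def to_internal_units(D_m2_per_s):
    """Convert a diffusivity from SI units (m^2/s) to µm^2/ms."""
    return D_m2_per_s * M2_PER_S_TO_UM2_PER_MS

# A typical ionic diffusivity of 1e-9 m^2/s becomes roughly 1 µm^2/ms:
print(to_internal_units(1e-9))
```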
In the documentation, `mmol/L` is mentioned as the unit for `ion X diffusive concentration`, but it is not stated what it refers to (e.g., to point mechanisms, as is stated for other quantities on the same documentation page). More importantly, the unit of the amount of diffusive particles is neither stated in the documentation nor does it immediately become clear.
Here is an example to specify the issue. Consider a compartment with the surface area `A = 2⋅π⋅radius⋅length`. To get from a point measurement of the diffusive particle concentration `Xd` to the amount of particles on the whole surface of the compartment, we find that one needs to compute `Xd⋅A⋅10^9`. Then, with `Xd` in units of `mmol/L`, `A` in units of `m^2`, and `10^9` in units of `µm`, the unit of the amount of diffusive particles would be `µmol`, which seems fine. Nevertheless, the exact definition of the amount of particles (and thus its biophysical meaning) remains unclear, because the origin and interpretation of the huge factor `10^9 µm` is unclear. I guess the problem may originate from the fact that the concentration is defined 'per volume' while all the dynamics currently considered by Arbor happen on a surface.
Furthermore, it is not clear from the documentation how the particle diffusion behaves in general (but some information has already been provided in this PR).
To summarize, it would be nice to have a thorough description of the behavior of the particle diffusion in the documentation. Moreover, fixing the unit of the amount of particles (perhaps `µmol`) is likely necessary, and its definition should be mentioned in the documentation. (edited)
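The dimensional bookkeeping in the example above can be spelled out in a few lines. The helper name and the sample numbers are illustrative; only the unit conversions themselves are fixed facts.

```python
# Amount of particles N = Xd * A * (10^9 µm), with Xd in mmol/L and A in m^2.
# 1 mmol/L equals 1 mol/m^3, and 10^9 µm equals 10^3 m, so the product is in
# mol; expressing it in µmol reproduces the numeric factor 10^9.

UM_TO_M = 1e-6      # 1 µm in metres
MOL_TO_UMOL = 1e6   # 1 mol in µmol

def amount_umol(xd_mmol_per_l, area_m2):
    """Amount of diffusive particles in µmol for concentration Xd and area A."""
    xd_mol_per_m3 = xd_mmol_per_l          # 1 mmol/L = 1 mol/m^3
    length_m = 1e9 * UM_TO_M               # the 'huge factor': 10^9 µm = 10^3 m
    return xd_mol_per_m3 * area_m2 * length_m * MOL_TO_UMOL

# With Xd = 1 mmol/L and A = 1 m^2, this is numerically Xd*A*10^9:
print(amount_umol(1.0, 1.0))
```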