
WIP: Individual star particles and feedback #233

Open
wants to merge 957 commits into main
Conversation

kaleybrauer

Individual star particles for all Pop II stars > 2 Msun, additional feedback models for the star particles, handling of many different metal tracers, and bug fixes to handle hundreds of thousands of star particles.

Andrew Emerick and others added 30 commits January 3, 2020 19:43
…placed before, and PE heating was applied above 10^4 K
…lude radiating individual stars as requiring refinement to the must-refine level, even if there is no mass / energy feedback
…vior when mass is above max bounds (e.g., switch to extrapolation)
…r of 1.7 due to a normalization issue in how I computed the efficiency scaling
  1) Parameter for including an R-process model and associated
     passive scalar tracer field. Models do nothing at the moment
  2) Allow for AGB, SN, PopIII, and R-process tracer fields to be
     saved as particle attributes if output chemical abundances is
     turned off
  3) Allow GalaxySimulation to use the above species-type
     tracer fields
…d. Ensures the mapping of the sphere onto the grid has the correct volume
…del field initialization in nested cosmology
…rn to yields. This model is very simple: it assumes the yield table yields do not depend on surface abundances and just adds the surface abundances on top with no re-scaling
…d physical radius rather than grid cell size
…eedback. A little bit worried this will do very bad things if a particle is near the edge of a grid and the feedback sphere overlaps cells beyond the ghost zones. The feedback will be deposited correctly on the off-grid zones via the all-star-particle communication, but the routine may think it's missing this volume and try to blow up the values to account for it (and if both grids do this then this is bad; see the sketch after this commit list). Need to think about this some more. Still a problem even if the feedback zone radius is <= the number of ghost zones.
…ting to free memory that hasn't been allocated in the StellarAbundances arrays on each grid. Now checking whether IndividualStarOutputChemicalTags is in use every time these arrays need to be initialized or deleted, so this shouldn't be a problem?
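As a rough illustration of the volume-renormalization worry two commits above, here is a minimal, self-contained C++ sketch (hypothetical numbers and setup, not Enzo code) of how two grids that each rescale a partially covered feedback sphere end up double-counting:

```cpp
// Sketch (hypothetical, not Enzo code) of the renormalization hazard:
// two grids each cover part of the feedback sphere, and each
// independently rescales its local deposit by 1/fraction to account
// for the volume it believes is missing.
#include <cstdio>

int main() {
    const double E_total = 1.0;      // energy the sphere should deposit

    // Assume 60% of the sphere's volume lies in grid A's zones
    // (active + ghost) and 40% in grid B's.
    const double fracA = 0.6, fracB = 0.4;

    // Each grid deposits onto the volume it covers...
    double localA = E_total * fracA;
    double localB = E_total * fracB;

    // ...then (wrongly) blows its value up by 1/fraction, believing
    // the remainder of the sphere is off-grid and unaccounted for.
    double depositA = localA / fracA;   // = E_total
    double depositB = localB / fracB;   // = E_total

    // With the all-star-particle communication both deposits land on
    // the mesh, so the sphere receives twice the intended energy.
    printf("intended: %g  deposited: %g\n", E_total, depositA + depositB);
    return 0;
}
```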
aemerick and others added 27 commits July 5, 2020 22:42
…ot sure why this didn't throw an error earlier)
…PrintInfo. Optional verbose flag when computing popIII-ness
In commit 793c134 the TopGrid dimensions were forced to be
even. However, this was not applied to the routines that move
particles between top grids. This mismatch led to memory corruption.
It only caused problems with large top grids (>=512^3) and/or core
counts where odd numbers were present.
* MPI bugfix: When summing up the particle mass flagging field, MPI
  messages would be mixed up, causing an MPI truncation error. Now
  using a pseudo-unique tag instead of the fixed one defined in
  macros_and_parameters.h (see the sketch after this list).
* Fixed several minor memory errors and leaks found during debugging,
  since for a long time I thought this was a memory corruption issue.
* Improved MPI error reporting
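For the MPI bugfix above, here is a hedged sketch of the pseudo-unique-tag idea; the helper name, base tag, and exchange pattern are assumptions for illustration, not the actual Enzo change:

```cpp
// Sketch of the tag-collision fix described above (helper names are
// hypothetical, not Enzo's). When every per-grid exchange reuses one
// fixed tag, a receive can match a different grid's (differently
// sized) message and MPI reports a truncation error.
#include <mpi.h>

// Fold a grid ID into a tag. MPI guarantees usable tags up to at
// least 32767 (MPI_TAG_UB), so keep the derived tag in that range.
static int PseudoUniqueTag(int GridID, int BaseTag) {
    return BaseTag + (GridID % 16384);
}

// Hypothetical per-grid exchange of a flagging field to rank 0.
void SendFlaggingField(float *field, int count, int GridID, MPI_Comm comm) {
    int rank;
    MPI_Comm_rank(comm, &rank);
    int tag = PseudoUniqueTag(GridID, 100 /* assumed base tag */);
    if (rank != 0) {
        // The grid-specific tag labels this message...
        MPI_Send(field, count, MPI_FLOAT, 0, tag, comm);
    } else {
        // ...so this receive can only match the same grid's send,
        // never another grid's message of a different size.
        MPI_Recv(field, count, MPI_FLOAT, MPI_ANY_SOURCE, tag, comm,
                 MPI_STATUS_IGNORE);
    }
}
```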
@bwoshea bwoshea changed the title Individual star particles and feedback WIP: Individual star particles and feedback Feb 20, 2024
@bwoshea
Contributor

bwoshea commented Feb 20, 2024

@kaleybrauer Thank you! I'm renaming this to 'WIP' because there are a bunch of merge conflicts that need to be resolved, and all of the tests are failing. Would you mind fixing those issues before we start reviewing the code? Thank you!

@kaleybrauer
Author

@bwoshea Certainly! I just got back from traveling and will start addressing the tests and conflicts.
