diff --git a/active_projects.html b/active_projects.html
index 69399ed..224eefb 100644
--- a/active_projects.html
+++ b/active_projects.html
@@ -93,11 +93,20 @@

What is NeuroDiffHub

NeuroDiffEq

One-Shot Transfer Learning for Nonlinear Differential Equations with Neural Networks

The ability to rapidly adapt neural networks for solving various differential equations holds immense potential. Achieving "one-shot transfer learning" would pave the way for foundational models applicable to entire families of differential equations, encompassing both ordinary (ODEs) and partial differential equations (PDEs). Such models could efficiently handle diverse initial conditions, forcing functions, and other parameters, offering a universally reusable solution framework.

Background and Prior Work:

Our research has made significant strides in this direction. We previously demonstrated one-shot transfer learning for linear equations [1,2]. Subsequently, we built upon this success by employing perturbation methods to achieve iterative one-shot transfer learning for simple polynomial nonlinearities in differential equations [3].

Project Goals:

This project aims to extend our prior work by tackling non-polynomial nonlinearities in differential equations. While our prior work utilized the homotopy perturbation method, its limited convergence regions pose a challenge. Here, we propose exploring alternative expansion techniques, such as Padé approximations [4], as a means to effectively handle a broader range of nonlinearities.
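A toy illustration of why rational (Padé) approximants can widen convergence regions: build the [2/2] Padé approximant of log(1+x) from the same Taylor coefficients as the degree-4 Taylor polynomial, then evaluate both outside the Taylor series' convergence interval |x| < 1. This is a generic numerical sketch, not the project's method:

```python
import numpy as np

# Taylor coefficients of log(1+x): 0, 1, -1/2, 1/3, -1/4, ...
c = np.array([0.0, 1.0, -1.0 / 2.0, 1.0 / 3.0, -1.0 / 4.0])

# [2/2] Pade approximant P(x)/Q(x) with Q(0) = 1, matching x^0..x^4.
# Denominator coefficients q1, q2 satisfy, for k = 3, 4:
#   c_k + c_{k-1} q1 + c_{k-2} q2 = 0
M = np.array([[c[2], c[1]], [c[3], c[2]]])
q1, q2 = np.linalg.solve(M, -np.array([c[3], c[4]]))
p0 = c[0]
p1 = c[1] + c[0] * q1
p2 = c[2] + c[1] * q1 + c[0] * q2

def pade(x):
    return (p0 + p1 * x + p2 * x**2) / (1.0 + q1 * x + q2 * x**2)

def taylor(x):
    return np.polyval(c[::-1], x)   # degree-4 Taylor polynomial

# At x = 2, outside the Taylor convergence region, the rational form
# stays close to log(3) while the truncated polynomial is far off.
x = 2.0
print(abs(pade(x) - np.log(1 + x)), abs(taylor(x) - np.log(1 + x)))
```

The same coefficient-matching construction generalizes to higher orders, which is what makes it a candidate replacement for the homotopy perturbation series.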

Methodology:

  1. Exploration of Expansion Techniques: We will delve into Pade approximations and potentially other expansion methods suitable for representing diverse nonlinearities in differential equations.
  2.
@@ -124,39 +133,65 @@

    Improved Project Proposal Draft: Stiffness Regimes in ODEs with Multi-Head PINNs and Transfer Learning

    -

    Introduction

    -

    Ordinary differential equations (ODEs) are ubiquitous in scientific computing, modeling a vast range of phenomena from planetary motion to circuit design. However, solving these equations can be computationally expensive, particularly for stiff systems. Stiffness arises when the solution contains components with vastly different timescales. Capturing the rapid transient phases alongside slower variations becomes a challenge for traditional numerical methods, requiring very small timesteps and significant computational cost.

    -

    This project proposes a novel approach to tackle stiffness in ODEs using Physics-Informed Neural Networks (PINNs) with a multi-head architecture and transfer learning. PINNs integrate governing equations into neural network structures via automatic differentiation, offering a data-driven alternative to traditional methods. However, encoding solutions in stiff regimes remains challenging due to the difficulty in capturing rapid transients.

    -

    Proposed Methodology

    -

    This project extends previous PINN methodologies by leveraging transfer learning. We propose the following approach:

    -
      -
    1. Multi-Head Architecture: Train a neural network with multiple "heads," each specializing in capturing solutions for a specific stiffness regime (e.g., non-stiff or moderately stiff).
    2. Transfer Learning: Train the multi-head architecture on a non-stiff regime. Subsequently, for a stiff regime, reuse the pre-trained network weights and fine-tune them with limited training data from the stiff system. This leverages the network's existing knowledge to learn the solution in the new regime without extensive retraining.
    -
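The shared-body/per-head split can be sketched minimally: fixed random features stand in for the pre-trained hidden layers, plain least-squares fits stand in for head training, and targets exp(-λt) serve as proxies for solutions at different stiffness levels. This illustrates only the transfer step (refit a cheap head, keep the body), not the actual PINN training loop:

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared "body": fixed random tanh features, a stand-in for the
# pre-trained hidden layers of the multi-head network.
n_feat = 200
a = rng.uniform(-30.0, 30.0, n_feat)
b = rng.uniform(-30.0, 30.0, n_feat)
t = np.linspace(0.0, 1.0, 400)
H = np.tanh(np.outer(t, a) + b)          # body output, shared by all heads

def fit_head(target):
    """Transfer step: body stays frozen, only a linear head is refit."""
    w, *_ = np.linalg.lstsq(H, target, rcond=None)
    return w

# One head per regime; the decay rate lambda is a proxy for stiffness.
heads = {lam: fit_head(np.exp(-lam * t)) for lam in (1.0, 20.0)}

for lam, w in heads.items():
    err = np.max(np.abs(H @ w - np.exp(-lam * t)))
    print(f"lambda={lam}: max fit error {err:.2e}")
```

Because only the head changes between regimes, adapting to a new stiffness level costs one small linear solve rather than a full retraining run.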

    Advantages

    - -

    Validation and Testing

    -

    We will evaluate the proposed method on a set of benchmark linear and non-linear ODEs with varying stiffness ratios. The performance will be compared to vanilla PINNs and established numerical methods like RK45 and Radau in terms of accuracy and computational efficiency. Metrics such as average absolute error and execution time will be used for evaluation.

    -

    Future Work

    -

    Building upon the success of this project, future research directions include:

    - -

    References

    + +

    Future Directions in Stiffness Modeling: Expanding Multi-Head PINNs

    + +

    Ordinary differential equations (ODEs) are fundamental in modeling a vast range of physical, biological, and engineering systems. However, solving these equations, particularly for stiff systems, remains a significant computational challenge. Stiffness arises when solutions evolve on vastly different timescales, requiring specialized numerical methods to capture rapid transients and slow dynamics simultaneously. Traditional solvers like Runge-Kutta methods often struggle with efficiency and stability, necessitating extremely small time steps for stiff systems. This inefficiency is amplified when exploring varying initial conditions or forcing functions within stiff regimes.
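The small-timestep constraint can be seen on the scalar test problem u' = -λu: an explicit method diverges once the step exceeds roughly 2/λ, while an implicit method decays for any step size. A dependency-free sketch with forward/backward Euler (chosen for brevity; the RK45-vs-Radau contrast mentioned in the text behaves analogously):

```python
# Test problem u' = -lam * u, u(0) = 1, whose exact solution decays to zero.
lam, h, steps = 100.0, 0.05, 60   # h is far above the explicit limit 2/lam

u_explicit = 1.0
u_implicit = 1.0
for _ in range(steps):
    u_explicit = u_explicit * (1.0 - h * lam)   # forward Euler: |1 - h*lam| = 4, grows
    u_implicit = u_implicit / (1.0 + h * lam)   # backward Euler: factor 1/6, decays

print(abs(u_explicit), abs(u_implicit))   # explosion vs. decay
```

The explicit iterate blows up while the implicit one decays, which is exactly why stiff problems force explicit solvers onto tiny steps.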

    + +

    In this context, Physics-Informed Neural Networks (PINNs) offer a promising alternative. By integrating governing equations into neural network structures via automatic differentiation, PINNs can approximate solutions directly without traditional mesh-based discretization. Building on this foundation, this work introduces a novel multi-head PINN architecture and leverages transfer learning to address the unique challenges of stiffness. These methods aim to improve computational efficiency and broaden the applicability of PINNs across diverse stiffness regimes.

    + +

    Previous Work

    + +

    In our prior work [1], we proposed a novel approach to solving stiff ODEs using Physics-Informed Neural Networks (PINNs) with a multi-head architecture and transfer learning. By integrating governing equations directly into neural networks through automatic differentiation, PINNs provide an alternative to traditional numerical solvers.

    + +

    Our method introduced a multi-head architecture, where each "head" specializes in a specific stiffness regime. The network was first trained on non-stiff regimes, then fine-tuned for stiff systems using transfer learning to leverage pre-trained weights. This strategy significantly reduced computational costs compared to methods like RK45 and Radau, particularly when exploring varying initial conditions or forcing functions.

    + +

    We validated the approach on benchmark linear and nonlinear ODEs with varying stiffness ratios, demonstrating improvements in accuracy and execution time over vanilla PINNs and traditional solvers.

    + +

    Future Work

    + +

    Building on the success of this project, we aim to extend the applicability of the proposed approach in the following directions:

    +
    1. Extension to Stiff PDEs: Expand the use of the multi-head architecture and transfer learning to partial differential equations (PDEs) with stiff dynamics. This includes addressing complex problems like the one-dimensional advection-reaction system, a benchmark in atmospheric modeling [2, 3], and extending to systems relevant in fluid dynamics and materials science.
    2. Broadening Stiffness Regime Coverage: Investigate the effectiveness of the multi-head architecture across diverse stiffness types, such as boundary-layer stiffness, oscillatory stiffness, and thermal-runaway stiffness. This work aims to generalize the methodology for applicability to various domains.
    3. Applications in Astronomy and Physics: Explore the use of this framework for astrophysical simulations, such as modeling stellar interiors, planetary atmospheres, or accretion-disk dynamics, where stiffness arises from coupled thermodynamic and radiative processes. Similarly, in physics, apply the method to problems like plasma dynamics or high-energy particle interactions, where disparate timescales and sharp gradients are prevalent.
    4. Other High-Impact Domains: Extend the approach to industrial applications, including chemical reaction networks, biological systems, and climate modeling, which often involve stiff systems and require efficient, accurate solvers.
    +

    References

    + +
    1. Sellier, E., & Protopapas, P. Submitted to AISTATS.
    2. Wang, Z., Xiao, H., & Sun, J. (2023). Physics-informed neural networks for stiff partial differential equations with transfer learning. Journal of Computational Physics, 482, 108522.
    3. Brasseur, G. P., & Jacob, D. J. (2017). Atmospheric chemistry and global change. Oxford University Press.
diff --git a/index.html b/index.html
index 6a34934..7d14519 100644
--- a/index.html
+++ b/index.html
@@ -136,6 +136,10 @@

    StellarDNN

    the group aims to gain a deeper understanding of these phenomena and how they can be characterized and studied.

    + +

    +

    If you are seeking information on current research projects, please view the list of open projects here.

    @@ -339,18 +343,17 @@

    StellarDNN

    Latest Publications

    +

    Gravitational duals from equations of state
    Bea, Y., Jiménez, R., Mateos, D., Liu, S., Protopapas, P., Tarancón-Álvarez, P., Tejerina-Pérez, P.

    +

    Behavioral Malware Detection using a Language Model Classifier Trained on sys2vec Embeddings
    John Carter, Spiros Mancoridis, Pavlos Protopapas, Erick Galinkin

    +

    Faster Bayesian inference with neural network bundles and new results for ΛCDM models
    A.T. Chantada, S.J. Landau, P. Protopapas, C.G. Scóccola

    Transfer Learning with Physics-Informed Neural Networks for Efficient Simulation of Branched Flows
    R Pellegrin, B Bullwinkel, M Mattheakis, P Protopapas

    Error-Aware B-PINNs: Improving Uncertainty Quantification in Bayesian Physics-Informed Neural Networks
    O Graf, P Flores, P Protopapas, K Pichara

    -

    DEQGAN: Learning the Loss Function for PINNs with Generative Adversarial Networks
    Blake Bullwinkel, Dylan Randle, Pavlos Protopapas, David Sondak

    -

    Evaluating Error Bound for Physics-Informed Neural Networks on Linear Dynamical Systems -
    S Liu, X Huang, P Protopapas

    -

    RcTorch: a PyTorch Reservoir Computing Package with Automated Hyper-Parameter Optimization -
    Hayden Joy, Marios Mattheakis, Pavlos Protopapas

    +

     

diff --git a/projects.html b/projects.html
index 2e0b208..aefa3fe 100644
--- a/projects.html
+++ b/projects.html
@@ -51,14 +51,17 @@
    -


    Open Project Descriptions

    -

    There are several ongoing research projects. Detailed descriptions, additional information, and references for these projects are available below.

    To view details about our active projects, please click here.

diff --git a/projects/nneht.html b/projects/nneht.html
index 8b8b218..6035cfc 100644
--- a/projects/nneht.html
+++ b/projects/nneht.html
@@ -88,6 +88,27 @@

    NN-EHT projects

    +
    +
    +
    Real and Fake Black Holes
    +
    +

    Parameterization of the M87* black hole using Generative Adversarial Networks

    +

    People: Lily, Pavlos Protopapas

    +

    Accurate parameterization of the M87* black hole is challenging because the simulations are computationally expensive, resulting in sparse training datasets. To increase the size of the training grid, we propose a data augmentation methodology based on Conditional Progressive Generative Adversarial Networks that generates a variety of synthetic black hole images conditioned on spin and electron distribution parameters.

    Paper · Code
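The conditioning mechanism can be illustrated separately from the full Progressive GAN training: the physical parameters are appended to the latent noise, so the generator can be queried at grid points missing from the simulation library. A toy numpy forward pass with untrained random weights; the parameter names `spin` and `r_high` are illustrative placeholders, not necessarily the project's actual conditioning variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conditional generator: random weights stand in for a trained model.
latent_dim, n_params, img_size = 64, 2, 16
W1 = rng.normal(0, 0.1, (latent_dim + n_params, 128))
W2 = rng.normal(0, 0.1, (128, img_size * img_size))

def generate(spin, r_high, n_samples=4):
    z = rng.normal(size=(n_samples, latent_dim))
    cond = np.tile([spin, r_high], (n_samples, 1))    # conditioning vector
    h = np.tanh(np.concatenate([z, cond], axis=1) @ W1)
    x = np.tanh(h @ W2)                               # pixel values in [-1, 1]
    return x.reshape(n_samples, img_size, img_size)

imgs = generate(spin=0.9, r_high=20.0)
print(imgs.shape)   # (4, 16, 16)
```

In the actual pipeline the generator is trained adversarially and grown progressively to full image resolution; only the concatenation of parameters to the latent input is shown here.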

    +
    +
    +