From 654747d02d8f3f0e69de333b1b1989fb246cc517 Mon Sep 17 00:00:00 2001 From: cemachelen Date: Wed, 4 Aug 2021 16:59:43 +0100 Subject: [PATCH] :pencil2: fix bunch of typos before main adjustments --- PINNs_1DHeatEquation_nonML.ipynb | 25 +++++++++++++------------ 1 file changed, 13 insertions(+), 12 deletions(-) diff --git a/PINNs_1DHeatEquation_nonML.ipynb b/PINNs_1DHeatEquation_nonML.ipynb index b3a6671..2edf39c 100644 --- a/PINNs_1DHeatEquation_nonML.ipynb +++ b/PINNs_1DHeatEquation_nonML.ipynb @@ -21,7 +21,7 @@ "\n", "This notebook is based on two papers: *[Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations](https://www.sciencedirect.com/science/article/pii/S0021999118307125)* and *[Hidden Physics Models: Machine Learning of Nonlinear Partial Differential Equations](https://www.sciencedirect.com/science/article/pii/S0021999117309014)* with the help of Fergus Shone and Michael Macraild.\n", "\n", - "These tutorials will go through solving Partial Differential Equations using Physics Informed Neuaral Networks focusing on the Burgers Equation and a more complex example using the Navier Stokes Equation\n", + "These tutorials will go through solving Partial Differential Equations using Physics Informed Neural Networks, focusing on the Burgers Equation and a more complex example using the Navier Stokes Equation.\n", "\n", "**This introduction section is replicated in all PINN tutorial notebooks (please skip if you've already been through)** \n", "\n", @@ -45,20 +45,21 @@ "\n", "

Physics Informed Neural Networks

\n", "\n", - "For a typical Neural Network using algorithims like gradient descent to look for a hypothesis, data is the only guide, however if the data is noisy or sparse and we already have governing physical models we can use the knowledge we already know to optamize and inform the algoithms. This can be done via [feature enginnering]() or by adding a physicall inconsistency term to the loss function.\n", + "For a typical Neural Network using algorithms like gradient descent to look for a hypothesis, data is the only guide, however if the data is noisy or sparse and we already have governing physical models we can use the knowledge we already know to optimize and inform the algorithms. This can be done via [feature engineering]() or by adding a physical inconsistency term to the loss function.\n", "\n", "\n", " \n", - " \n", + " \n", " \n", "## The very basics\n", "\n", - "If you know nothing about neural networks there is a [toy neural network python code example](https://github.com/cemac/LIFD_ENV_ML_NOTEBOOKS/tree/main/ToyNeuralNetwork) included in the [LIFD ENV ML Notebooks Repository]( https://github.com/cemac/LIFD_ENV_ML_NOTEBOOKS). Creating a 2 layer neural network to illustrate the fundamentals of how Neural Networks work and the equivlent code using the python machine learning library [tensorflow](https://keras.io/). \n", + "If you know nothing about neural networks there is a [toy neural network python code example](https://github.com/cemac/LIFD_ENV_ML_NOTEBOOKS/tree/main/ToyNeuralNetwork) included in the [LIFD ENV ML Notebooks Repository]( https://github.com/cemac/LIFD_ENV_ML_NOTEBOOKS). 
It creates a 2 layer neural network to illustrate the fundamentals of how Neural Networks work, alongside the equivalent code using the python machine learning library [tensorflow](https://keras.io/).\n", "\n", " \n", - "## Recommended reading \n", + "## Recommended reading\n", " \n", - "The in-depth theory behind neural networks will not be covered here as this tutorial is focusing on application of machine learning methods. If you wish to learn more here are some great starting points. \n", + "The in-depth theory behind neural networks will not be covered here, as this tutorial focuses on the application of machine learning methods. If you wish to learn more, here are some great starting points. \n", + " \n", "\n", "* [All you need to know on Neural networks](https://towardsdatascience.com/nns-aynk-c34efe37f15a) \n", "* [Introduction to Neural Networks](https://victorzhou.com/blog/intro-to-neural-networks/)\n", @@ -86,16 +87,16 @@ " \n", "## Physics informed Neural Networks\n", "\n", - "Neural networks work by using lots of data to calculate weights and biases from data alone to minimise the loss function enabling them to act as universal fuction approximators. However these loose their robustness when data is limited. However by using know physical laws or empirical validated relationships the solutions from neural networks can be sufficiently constrianed by disregardins no realistic solutions.\n", + "Neural networks work by using lots of data to calculate weights and biases from data alone to minimise the loss function, enabling them to act as universal function approximators. However, they lose their robustness when data is limited. 
However, by using known physical laws or empirically validated relationships, the solutions from neural networks can be sufficiently constrained by disregarding unrealistic solutions.\n", " \n", - "A Physics Informed Nueral Network considers a parameterized and nonlinear partial differential equation in the genral form;\n", + "A Physics Informed Neural Network considers a parameterized and nonlinear partial differential equation in the general form:\n", "$$\n", "\begin{align}\n", " u_t + \mathcal{N}[u; \lambda] &= 0, && x \in \Omega, t \in [0,T],\\\n", "\end{align}\n", "$$\n", "\n", - "where $\mathcal{u(t,x)}$ denores the hidden solution, $\mathcal{N}$ is a nonlinear differential operator acting on $u$, $\mathcal{\lambda}$ and $\Omega$ is a \subset of \mathbb{R}^D$ (the perscribed data). This set up an encapuslate a wide range of problems such as diffusion processes, conservation laws, advection-diffusion-reaction systems, and kinetic equations and conservation laws. \n", + "where $u(t,x)$ denotes the hidden solution, $\mathcal{N}[\cdot; \lambda]$ is a nonlinear differential operator acting on $u$ and parameterized by $\lambda$, and $\Omega \subset \mathbb{R}^D$ is the prescribed domain. This setup encapsulates a wide range of problems such as diffusion processes, advection-diffusion-reaction systems, kinetic equations and conservation laws.\n", "\n", "Here we will go though this for the Burgers equation and Navier stokes equations\n", "\n", @@ -172,7 +173,7 @@ "metadata": {}, "source": [ "
\n", - "Load in all required modules (includig some auxillary code) and turn off warnings. Make sure Keras session is clear\n", + "Load in all required modules (including some auxiliary code) and turn off warnings. Make sure Keras session is clear\n", "
" ] }, @@ -609,7 +610,7 @@ "\n", "This implies that oscillatory components can not reliably be reconstructed from noisy data since they correspond to small eigenvalues.\n", "\n", - "This means we must take different approach even to a relatively simple looking problem which may require filtering or computationaly expensive calculations. This is with an even sample of data to use. So we might consider looking to PINNs" + "This means we must take a different approach even to a relatively simple looking problem which may require filtering or computationally expensive calculations. This is with an even sample of data to use. So we might consider looking to PINNs" ] }, { @@ -623,7 +624,7 @@ "\n", "## Next steps\n", "\n", - "Now we've gone through a Naive manial approach to solving a simple 1D Heat equation we look at the benefits of using neural networks to solve more complex equations starting with the next notebook linked below: \n", + "Now we've gone through a Naive manual approach to solving a simple 1D Heat equation we look at the benefits of using neural networks to solve more complex equations starting with the next notebook linked below: \n", " \n", "[Burgers Equation PINN Example](PINNs_BurgersEquationExample.ipynb)\n", " \n",