Engineering Computations Module 6

Engineering Computations is a collection of stackable learning modules, flexible for adoption in different situations. It aims to develop computational skills for students in engineering, but it can also be used by students in other science majors. The modules use the Python programming language and the Jupyter open-source tools for interactive computing.

Rather than "learning to code," our vision is "coding to learn."

Module 6: deep learning

A step-by-step introduction to deep learning (a.k.a. neural network) models, aimed at scientists and engineers with a background in calculus and linear algebra.


Prerequisites: learning modules EngComp 1 and EngComp 4 of our collection. Recommended: EngComp 2, or basic use of pandas for data manipulation.

Lesson 1: Linear regression by gradient descent

Find the minimum of a function by gradient descent. Play with SymPy. Key ingredients of building a linear model from data with a single independent variable. Optimize a loss function to find the model parameters.
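
For a flavor of that workflow, here is a minimal sketch: fit a line y = w*x + b by gradient descent on the mean-squared-error loss. The synthetic data, learning rate, and iteration count below are illustrative choices, not the lesson's:

```python
import numpy as np

# Synthetic data: points scattered around the line y = 2x + 1
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 1, 50)

w, b = 0.0, 0.0          # initial model parameters
lr = 0.01                # learning rate (step size)

for _ in range(1000):
    y_pred = w * x + b
    # Gradients of the mean-squared-error loss with respect to w and b
    grad_w = 2 * np.mean((y_pred - y) * x)
    grad_b = 2 * np.mean(y_pred - y)
    w -= lr * grad_w     # step downhill along the gradient
    b -= lr * grad_b

print(f"fitted: w = {w:.2f}, b = {b:.2f}")   # should land near w = 2, b = 1
```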

Lesson 2: Logistic regression

Composition of a linear model with the logistic function. Construct the logistic loss function by integration. Find the model parameters with autograd. Combine with a decision boundary to do classification.
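
A minimal sketch of the idea, using the autograd library to differentiate the loss; the toy data, step size, and iteration count are illustrative choices:

```python
import numpy as onp               # plain NumPy for data generation
import autograd.numpy as np       # autograd's NumPy wrapper, for the loss
from autograd import grad

# Toy 1-D data: the label is (noisily) 1 when x > 0
rng = onp.random.default_rng(0)
x = rng.uniform(-3, 3, 100)
y = (x + 0.5 * rng.normal(0, 1, 100) > 0).astype(float)

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(params):
    w, b = params[0], params[1]
    p = logistic(w * x + b)       # predicted probability of class 1
    # logistic (cross-entropy) loss
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

grad_loss = grad(loss)            # autograd builds the gradient function

params = np.array([0.0, 0.0])
for _ in range(2000):
    params = params - 0.1 * grad_loss(params)

w, b = params
print(f"w = {w:.2f}, b = {b:.2f}")
# Decision boundary: predict class 1 when logistic(w*x + b) > 0.5,
# i.e. when w*x + b > 0
```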

Lesson 3: Multiple linear regression

Use multiple independent variables to build a linear model. Express multiple linear regression in matrix form. Find the weights by gradient descent. Scale (normalize) the features to ensure convergence. Get acquainted with scikit-learn. Model accuracy. Linear regression with scikit-learn and with pseudo-inverse.
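
As a quick preview, here is the matrix form in action on made-up data, solving for the weights with the pseudo-inverse and checking against scikit-learn. (Feature scaling matters for the gradient-descent route covered in the lesson; the direct solve below does not need it.)

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data with two features: y = 3*x1 - 2*x2 + 5 + noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 2))
y = 3 * X[:, 0] - 2 * X[:, 1] + 5 + rng.normal(0, 0.5, 100)

# Matrix form: append a column of ones so the intercept is just another
# weight, then solve via the pseudo-inverse.
X1 = np.hstack([X, np.ones((100, 1))])
weights = np.linalg.pinv(X1) @ y
print(weights)                     # approximately [3, -2, 5]

# The same fit with scikit-learn
model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)
```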

Lesson 4: Polynomial regression

Fitting a polynomial to data is a special case of multiple linear regression. Build polynomial features, scale the data, and train the model as in Lesson 3. When making predictions with the model, apply the scaling learned from the training data to the new data. Observe underfitting and overfitting. Use regularization to avoid overfitting; this is also called ridge regression. Do it with scikit-learn's Ridge().
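
A compact sketch of that pipeline with scikit-learn, on made-up data; the polynomial degree and regularization strength here are illustrative:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Noisy samples of a smooth function
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-1, 1, 30)).reshape(-1, 1)
y = np.cos(2 * x).ravel() + rng.normal(0, 0.1, 30)

# PolynomialFeatures builds the powers of x, StandardScaler records the
# training-data scaling, and Ridge penalizes large weights (regularization)
model = make_pipeline(PolynomialFeatures(degree=12, include_bias=False),
                      StandardScaler(),
                      Ridge(alpha=1.0))
model.fit(x, y)

# At prediction time, the pipeline reapplies the training-data scaling
x_new = np.linspace(-1, 1, 5).reshape(-1, 1)
print(model.predict(x_new))
```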

Lesson 5: Multiple logistic regression

A taste of more practical machine-learning applications: multiple logistic regression for the problem of identifying defective metal-casting parts. Turn an image into a vector of grayscale values to use it as input data, and set up a classification problem from multi-dimensional feature vectors. Split the data into training, validation, and test sets to assess model performance. Normalize the data using the z-score. Evaluate the performance of a classification model using the F-score.
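
A skeleton of that workflow, with random stand-in "images" instead of the casting dataset; the data and labels below are synthetic and purely illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Stand-in for the casting images: tiny random grayscale "images" whose
# label depends on mean brightness (illustrative only)
rng = np.random.default_rng(2)
images = rng.uniform(0, 1, size=(500, 16, 16))
X = images.reshape(500, -1)            # each image -> vector of 256 values
y = (images.mean(axis=(1, 2)) > 0.5).astype(int)

# Split off a test set; a validation set would be carved out the same way
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# z-score normalization, using the training-set statistics only
mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
X_train = (X_train - mu) / sigma
X_test = (X_test - mu) / sigma

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("F-score:", f1_score(y_test, model.predict(X_test)))
```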

Lesson 6: Multivariate regression (coming soon)

Lesson 7: Neural network model (coming soon)

Copyright and License

(c) 2021 Lorena A. Barba, Pi-Yueh Chuang, Tingyu Wang. All content is under Creative Commons Attribution CC-BY 4.0, and all code is under the BSD 3-Clause license. We are happy if you re-use the content in any way!

