Introduction to Scientific Machine Learning
About this course
This course provides an introduction to data analytics for individuals with no prior knowledge of data science or machine learning. It starts with an extensive review of probability theory as the language of uncertainty, discusses Monte Carlo sampling for uncertainty propagation, and covers the basics of supervised learning (Bayesian generalized linear regression, logistic regression, Gaussian processes, deep neural networks, convolutional neural networks), unsupervised learning (k-means clustering, principal component analysis, Gaussian mixtures), and state-space models (Kalman filters). The course also reviews the state of the art in physics-informed deep learning and ends with a discussion of automated Bayesian inference using probabilistic programming (Markov chain Monte Carlo, sequential Monte Carlo, and variational inference). Throughout, the instructor follows a probabilistic perspective that highlights the first principles behind the presented methods, with the ultimate goal of teaching students how to create and fit their own models.
At a glance
- Institution: PurdueX
- Subject: Engineering
- Level: Advanced
- Prerequisites:
  - Working knowledge of multivariate calculus and basic linear algebra
  - Basic Python knowledge
  - Knowledge of probability and numerical methods for engineering would be helpful, but is not required
- Language: English
- Video Transcript: English
- Associated skills: Data Analysis, Sampling (Statistics), K-Means Clustering, Machine Learning, Markov Chain Monte Carlo, Logistic Regression, Gaussian Process, Data Science, State Space, Physics, Teaching, Artificial Neural Networks, Probability Theories, Deep Learning, Bayesian Inference, Convolutional Neural Networks, Unsupervised Learning, Propagation Of Uncertainty, Principal Component Analysis, Linear Regression
What you'll learn
After completing this course, you will be able to:
- Represent uncertainty in parameters in engineering or scientific models using probability theory
- Propagate uncertainty through physical models to quantify the induced uncertainty in quantities of interest
- Solve basic supervised learning tasks, such as regression, classification, and filtering
- Solve basic unsupervised learning tasks, such as clustering, dimensionality reduction, and density estimation
- Create new models that encode physical information and other causal assumptions
- Calibrate arbitrary models using data
- Apply various Python coding skills
- Load and visualize data sets in Jupyter notebooks
- Visualize uncertainty in Jupyter notebooks
- Recognize basic Python software (e.g., pandas, NumPy, SciPy, scikit-learn) and advanced Python software (e.g., PyMC3, PyTorch, Pyro, TensorFlow) commonly used in data analytics
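To give a feel for the first of these outcomes, here is a minimal sketch of loading and summarizing a small data set with pandas; the CSV contents and column names are invented for illustration and are not from the course's actual notebooks:

```python
import io

import pandas as pd

# A made-up CSV standing in for a real data set.
csv_text = """temperature,pressure
300.0,101.2
310.5,99.8
295.2,102.5
305.1,100.9
"""

# Load the data set (in a notebook you would typically pass a file path or URL).
df = pd.read_csv(io.StringIO(csv_text))

# Summarize each column: count, mean, std, quartiles.
print(df.describe())
print("mean temperature:", df["temperature"].mean())
```

In a Jupyter notebook the same `DataFrame` would usually be plotted as well (e.g., with `df.plot()`), which is how the course's hands-on activities visualize data and uncertainty.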
Please note: The summer 2022 session of this course will be a condensed 8-week course. The fall 2022 session will be the full 16 weeks.
Section 1: Introduction
- Introduction to Predictive Modeling
Section 2: Review of Probability Theory
- Basics of Probability Theory
- Discrete Random Variables
- Continuous Random Variables
- Collections of Random Variables
- Random Vectors
Section 3: Uncertainty Propagation
- Basic Sampling
- The Monte Carlo Method for Estimating Expectations
- Monte Carlo Estimates of Various Statistics
- Quantify Uncertainty in Monte Carlo Estimates
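The Monte Carlo topics listed above fit in a few lines of NumPy. The sketch below estimates an expectation and quantifies the uncertainty of the estimate via the central-limit-theorem standard error; the integrand, distribution, and sample size are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate E[g(X)] for X ~ N(0, 1) with g(x) = x**2 (true value: 1).
n = 100_000
x = rng.standard_normal(n)
g = x**2

estimate = g.mean()                     # Monte Carlo estimate of the expectation
std_error = g.std(ddof=1) / np.sqrt(n)  # CLT-based uncertainty in the estimate

print(f"estimate = {estimate:.4f} +/- {2 * std_error:.4f}")
```

The `+/- 2 * std_error` reported here is the usual approximate 95% confidence band on a Monte Carlo estimate; it shrinks like 1/sqrt(n) as more samples are drawn.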
Section 4: Principles of Bayesian Inference
- Selecting Prior Information
- Analytical Examples of Bayesian Inference
Section 5: Supervised Learning: Linear Regression and Logistic Regression
- Linear Regression Via Least Squares
- Bayesian Linear Regression
- Advanced Topics in Bayesian Linear Regression
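As a taste of Section 5, here is a minimal sketch of conjugate Bayesian linear regression with a Gaussian prior on the weights and a known noise variance; the synthetic data and hyperparameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data from y = 2x + noise.
x = np.linspace(0, 1, 20)
y = 2.0 * x + 0.1 * rng.standard_normal(x.size)

# Design matrix with an intercept column and a slope column.
X = np.column_stack([np.ones_like(x), x])

alpha = 1.0    # prior precision: w ~ N(0, I / alpha)
sigma2 = 0.01  # assumed (known) noise variance

# Posterior over the weights is Gaussian, N(mean, cov), in closed form.
cov = np.linalg.inv(alpha * np.eye(2) + X.T @ X / sigma2)
mean = cov @ (X.T @ y) / sigma2

print("posterior mean (intercept, slope):", mean)
```

With abundant, low-noise data the posterior mean approaches the least-squares fit; the prior precision `alpha` shrinks the weights toward zero when data are scarce.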
Section 6: Unsupervised Learning
- Clustering and Density Estimation
- Dimensionality Reduction
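The dimensionality-reduction topic can be previewed with principal component analysis written directly via the singular value decomposition; the toy data set below is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: 200 points that vary mostly along one direction in 2-D.
t = rng.standard_normal(200)
data = np.column_stack([t, 0.5 * t]) + 0.05 * rng.standard_normal((200, 2))

# PCA: center the data, then take singular vectors of the centered matrix.
centered = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

explained = s**2 / np.sum(s**2)  # fraction of variance per principal component
scores = centered @ Vt.T         # coordinates of each point in the principal basis

print("explained variance ratios:", explained)
```

Because the toy data lie almost on a line, nearly all of the variance is captured by the first principal component, which is exactly the situation in which projecting to fewer dimensions loses little information.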
Section 7: State-Space Models
- State-Space Models – Filtering Basics
- State-Space Models – Kalman Filters
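The filtering material can be previewed with the predict–update recursion of a scalar Kalman filter for a random-walk state; all the noise variances and observations below are invented for illustration:

```python
# Scalar random-walk model: x_t = x_{t-1} + process noise, y_t = x_t + obs noise.
Q, R = 0.1, 0.5  # process and observation noise variances (made up)
m, P = 0.0, 1.0  # prior mean and variance of the state

def kalman_step(m, P, y):
    # Predict: the random walk inflates the variance by the process noise.
    m_pred, P_pred = m, P + Q
    # Update: blend prediction and observation via the Kalman gain.
    K = P_pred / (P_pred + R)
    m_new = m_pred + K * (y - m_pred)
    P_new = (1 - K) * P_pred
    return m_new, P_new

for y in [1.2, 0.9, 1.1]:
    m, P = kalman_step(m, P, y)

print(f"filtered state: mean={m:.3f}, variance={P:.3f}")
```

Each observation pulls the mean toward the data and shrinks the variance; the same two-step structure generalizes to the vector-valued Kalman filter covered in this section.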
Section 8: Gaussian Process Regression
- Gaussian Process Regression – Priors on Function Spaces
- Gaussian Process Regression – Conditioning on Data
- Bayesian Global Optimization
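The core idea of Section 8, conditioning a Gaussian-process prior on observed data, can be sketched in a few lines; the kernel, length scale, and training points below are arbitrary choices for illustration:

```python
import numpy as np

def rbf(a, b, ell=0.3):
    # Squared-exponential kernel; the length scale ell is an arbitrary choice.
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# Invented noise-free training data and test inputs.
x = np.array([0.0, 0.5, 1.0])
y = np.sin(np.pi * x)
xs = np.linspace(0, 1, 5)

jitter = 1e-8  # small diagonal term for numerical stability
K = rbf(x, x) + jitter * np.eye(x.size)
Ks = rbf(xs, x)

# Posterior (conditional) mean and covariance of the GP at the test inputs.
mean = Ks @ np.linalg.solve(K, y)
cov = rbf(xs, xs) - Ks @ np.linalg.solve(K, Ks.T)

print("posterior mean:", np.round(mean, 3))
```

With noise-free data the posterior mean interpolates the training points exactly, and the posterior variance collapses to (nearly) zero there, growing back toward the prior between observations.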
Section 9: Neural Networks
- Deep Neural Networks
- Deep Neural Networks Continued
- Physics-Informed Deep Neural Networks
Section 10: Advanced Methods for Characterizing Posteriors
- Sampling Methods
- Variational Inference
Learner testimonials
Hands-on activities were fun and good way of teaching concepts.
Prof. Bilionis clearly explains the material. The lecture material is very useful for understanding how to apply data analysis techniques using built-in packages.
I appreciate the professor in creating such intense materials for the coursework.
Overall this class is a great addition to the ME curriculum. I wish that more courses were delivered in this method since it makes it very easy to fit this course in around an active research schedule. Specific qualities that are excellent:
- The edX implementation is excellent.
- Feedback on Piazza helps more students than just the ones who actually asked the questions.
- HW problems that were more like projects really helped to develop understanding.
- Hands-on problem demonstrations helped clarify minor coding difficulty.
- No exams is excellent and enables focus on more relevant skills than pure memorization.
Professor Bilionis did a great job explaining the concepts of the class. Also, he understands that students are struggling during this time and designed the class material for it. I can tell this class was the best one I have had online during the pandemic.
I thought the course was a lot of work, but I enjoyed it in the end, and I think I learned a lot. I liked the asynchronous format that allowed me to re-watch the lectures several times and complete the material when it was convenient. I liked that the hands-on activities gave good examples of how to call the functions we used. The homework was quite practical. As a result of the course, I have the sense that I would know the right place to start on a real data science problem.
The instructor has structured the course really well. The questions to be answered are listed/explained during the beginning of the lecture series, following which the questions are answered during the lectures itself.
Excellent mathematical foundations to understand the real application of data analytics
I found this course extremely helpful; it gives a very complete overview of the state-of-the-art machine learning techniques used for various applications. The hands-on activities were brilliantly designed in a way that keeps students motivated to complete them. The coding snippets in the activities are very helpful for developing efficient coding skills when dealing with very big data. I am thoroughly satisfied with the course and will recommend it to students.
I believe this was one of my favorite courses I have taken at Purdue University. This class was challenging, but doable. I learned so much about uncertainty analysis, which is directly pertinent to my research. I would say Dr. Bilionis truly cared about what we learned, and wanted us to grasp it on a fundamental level. Though I still have much more to learn, I have a great foundation. I wish Dr. Bilionis would continue to create such in depth lectures and notes. The aid of the jupyter notebooks with the lectures allowed for me to easily follow what I just learned as well as apply it. I also learned much more Python than expected from this class, and I have become a better programmer than before.
I enjoyed course layout, with the lectures, reading material with references, and the hands on activities coupled with the homework. The notebook activities were really excellent.
I think this was a good course for an introduction to machine learning. I also like that the course was mostly application-based in the homework assignments. I mostly deal with the theoretical aspects, but being able to apply them in Python helped me a lot. The material was organized very effectively. The Piazza page is also great. I can always look there for help or if I'm confused with something.
Frequently Asked Questions
Jupyter notebooks are interactive documents that can simultaneously contain text, mathematics, images, and executable code. The executable code can be in many programming languages (e.g., R, Matlab), but we are only going to use Python in this course.
The course uses Jupyter notebooks for the following content: Reading Activities, Hands-on Activities, and Homework Assignments. The rationale behind this choice is that it allows the student to focus on the mathematical methods rather than the programming, and it ensures the reproducibility of the course content. Of course, understanding the code in Jupyter notebooks does require knowledge of Python, although it does not require knowing how to structure and call Python code from the command line. Jupyter notebooks can be run either on the students’ personal computers (instructions vary with operating system and can be found here) or on several cloud computing resources. The recommended method for this class is to use Google Colab, which is available free of charge and requires only a standard Google account. The activity links included in the course will take you automatically to a copy of the latest version of the corresponding Jupyter notebook, which you can then save and edit on your Google Drive.
Access to the Jupyter Notebooks Repository
As stated earlier, the recommended method for using the Jupyter notebooks of this class is to use Google Colab. The links to all the activities will take you directly to a Google Colab copy of the Jupyter notebook. If you want to use any alternative method (e.g., your personal computer, Purdue’s Jupyter Hub, or anything else), you will need access to the Jupyter Notebook repository for the class. Select this link for a Git version control repository.
(If you have no idea what Git is, this is a good tutorial.) All the activities are inside the “activities” folder and they are named using the following convention.
"activity-type-lecture-number.count.ipynb"
For example, “hands-on-06.3.ipynb” is the third hands-on activity of Lecture 6.