Kalman and Bayesian Filters in Python
Table of Contents
-----
Preface
Motivation behind writing the book. How to download and read the book. Requirements for IPython Notebook and Python. GitHub links.
Chapter 1: The g-h Filter
Intuitive introduction to the g-h filter, also known as the α-β filter, which is a family of filters that includes the Kalman filter. Once you understand this chapter you will understand the concepts behind the Kalman filter.
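For a taste of the algorithm, here is a minimal g-h filter sketch for a scalar quantity; the gain values and weight readings are illustrative, not taken from the chapter.

```python
# A minimal g-h filter sketch: predict with an estimated rate of change,
# then correct the prediction with a fraction (g, h) of the residual.
def g_h_filter(measurements, x0, dx0, g, h, dt=1.0):
    x, dx = x0, dx0
    estimates = []
    for z in measurements:
        # predict step: assume the state keeps changing at rate dx
        x_pred = x + dx * dt
        # update step: blend the prediction with the measurement
        residual = z - x_pred
        dx = dx + h * residual / dt
        x = x_pred + g * residual
        estimates.append(x)
    return estimates

weights = [158.0, 164.2, 160.3, 159.9, 162.1, 164.6]   # noisy scale readings
print(g_h_filter(weights, x0=160.0, dx0=1.0, g=0.6, h=0.1))
```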
Chapter 2: The Discrete Bayes Filter
Introduces the discrete Bayes filter. From this you will learn the probabilistic (Bayesian) reasoning that underpins the Kalman filter in an easy to digest form.
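For a flavor of the predict/update cycle, here is a minimal sketch of a discrete Bayes filter over a small 1-D grid; the positions and sensor likelihoods are made up for illustration.

```python
import numpy as np

def predict(belief, move):
    # predict step: shift the belief to model the object's motion
    return np.roll(belief, move)

def update(belief, likelihood):
    # update step: scale by the measurement likelihood, then normalize
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = np.ones(10) / 10            # start with no idea where the object is
likelihood = np.array([.1, .1, .8, .8, .1, .1, .1, .1, .1, .1])  # sensor: "near cells 2-3"
belief = update(belief, likelihood)
belief = predict(belief, move=1)     # object moved one cell to the right
print(belief.round(3))
```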
Chapter 3: Gaussian Probabilities
Introduces using Gaussians to represent beliefs in the Bayesian sense. Gaussians allow us to implement the algorithms used in the discrete Bayes filter to work in continuous domains.
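As one concrete piece of that machinery, the product of two Gaussians has a simple closed form, which is what lets the discrete Bayes update carry over to a continuous domain; this small sketch just evaluates those formulas with made-up numbers.

```python
# Product of two Gaussians N(mu1, var1) * N(mu2, var2) is (up to
# normalization) another Gaussian -- this is what makes the update
# step tractable in a continuous domain.
def gaussian_multiply(mu1, var1, mu2, var2):
    mean = (var2 * mu1 + var1 * mu2) / (var1 + var2)
    var = (var1 * var2) / (var1 + var2)
    return mean, var

prior = (10.0, 4.0)         # belief: mean 10, variance 4
measurement = (12.0, 1.0)   # measurement: mean 12, variance 1
print(gaussian_multiply(*prior, *measurement))   # posterior is pulled toward the measurement
```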
Chapter 4: One Dimensional Kalman Filters
Implements a Kalman filter by modifying the discrete Bayes filter to use Gaussians. This is a full featured Kalman filter, albeit only useful for 1D problems.
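A minimal sketch of the idea, with made-up noise values: the predict step adds Gaussians and the update step multiplies them.

```python
# One-dimensional Kalman filter sketch. State is a Gaussian (x, P);
# process noise Q and measurement noise R are illustrative values.
def predict(x, P, dx, Q):
    return x + dx, P + Q            # adding Gaussians: means and variances add

def update(x, P, z, R):
    K = P / (P + R)                 # Kalman gain: how much to trust the measurement
    x = x + K * (z - x)             # blend prior and measurement
    P = (1 - K) * P
    return x, P

x, P = 0.0, 500.0                   # vague initial belief
for z in [1.3, 2.1, 2.9, 4.2, 5.1]: # noisy position measurements
    x, P = predict(x, P, dx=1.0, Q=0.1)
    x, P = update(x, P, z, R=1.0)
    print(round(x, 2), round(P, 2))
```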
Chapter 5: Multivariate Gaussians
Extends Gaussians to multiple dimensions, and demonstrates how 'triangulation' and hidden variables can vastly improve estimates.
Chapter 6: Multivariate Kalman Filter
We extend the Kalman filter developed in the univariate chapter to the full, generalized filter for linear problems. After reading this you will understand how a Kalman filter works and how to design and implement one for a (linear) problem of your choice.
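As a preview, a constant-velocity tracker using the companion FilterPy library looks roughly like the sketch below; the matrices and noise values are illustrative rather than taken from the chapter.

```python
import numpy as np
from filterpy.kalman import KalmanFilter

# Constant-velocity model: state is [position, velocity], we measure position.
kf = KalmanFilter(dim_x=2, dim_z=1)
dt = 1.0
kf.x = np.array([0., 0.])                    # initial state
kf.F = np.array([[1., dt], [0., 1.]])        # state transition
kf.H = np.array([[1., 0.]])                  # measurement function
kf.P *= 500.                                 # large initial uncertainty
kf.R = np.array([[5.]])                      # measurement noise
kf.Q = np.eye(2) * 0.01                      # process noise (illustrative)

for z in [1.1, 2.3, 2.9, 4.2]:
    kf.predict()
    kf.update(z)
    print(kf.x)
```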
Chapter 7: Kalman Filter Math
We have gotten about as far as we can without forming a strong mathematical foundation. This chapter is optional, especially on a first reading, but if you intend to write robust, numerically stable filters, or to read the literature, you will need to know the material in this chapter. Some sections will be required to understand the later chapters on nonlinear filtering.
Chapter 8: Designing Kalman Filters
Building on material in Chapters 5 and 6, walks you through the design of several Kalman filters. Only by seeing several different examples can you really grasp all of the theory. Examples are chosen to be realistic, not 'toy' problems, to give you a start towards implementing your own filters. Discusses, but does not solve, issues like numerical stability.
Chapter 9: Nonlinear Filtering
The Kalman filters covered so far only work for linear problems. Yet the world is nonlinear. Here I introduce the problems that nonlinear systems pose to the filter, and briefly discuss the various algorithms that we will be learning in subsequent chapters.
Chapter 10: Unscented Kalman Filters
Unscented Kalman filters (UKF) are a recent development in Kalman filter theory. They allow you to filter nonlinear problems without requiring a closed form solution like the Extended Kalman filter requires.
This topic is typically either not mentioned or glossed over in existing texts, with Extended Kalman filters receiving the bulk of the discussion. I put it first because the UKF is much simpler to understand and implement, and its filtering performance is usually as good as or better than the Extended Kalman filter's. I always try to implement the UKF first for real world problems, and I suggest you do the same.
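A rough sketch of what a UKF looks like with FilterPy is below. The range-only measurement model and all numbers are illustrative, and the constructor details are from memory, so treat them as approximate.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

# State is a 2-D position; the sensor reports only the range to the origin,
# which is nonlinear in the state. No Jacobians are needed -- fx and hx are
# plain Python functions.
def fx(x, dt):
    return x                                         # object assumed (nearly) stationary

def hx(x):
    return np.array([np.sqrt(x[0]**2 + x[1]**2)])    # range measurement

points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=1.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=1.0,
                            fx=fx, hx=hx, points=points)
ukf.x = np.array([4.0, 4.0])                         # rough initial position guess
ukf.P *= 10.0
ukf.R = np.array([[0.5]])
ukf.Q = np.eye(2) * 0.01

for z in [5.2, 5.0, 5.1, 4.9]:                       # noisy range readings
    ukf.predict()
    ukf.update([z])
    print(ukf.x)
```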
Chapter 11: Extended Kalman Filters
Extended Kalman filters (EKF) are the most common approach to linearizing nonlinear problems. A majority of real world Kalman filters are EKFs, so you will need to understand this material to understand existing code, papers, talks, etc.
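To illustrate the linearization idea without relying on FilterPy's ExtendedKalmanFilter class, here is a single EKF update step in plain NumPy for a range-only measurement; the numbers are illustrative.

```python
import numpy as np

# The measurement function h(x) is nonlinear, so we linearize it around the
# current estimate with its Jacobian H, then run the ordinary Kalman update.
def h(x):
    return np.sqrt(x[0]**2 + x[1]**2)        # range from the origin

def H_jacobian(x):
    r = np.sqrt(x[0]**2 + x[1]**2)
    return np.array([[x[0] / r, x[1] / r]])  # 1x2 Jacobian of h at x

x = np.array([10., 5.])                      # prior state estimate [x, y]
P = np.eye(2) * 4.                           # prior covariance
R = np.array([[1.0]])                        # measurement noise
z = 12.3                                     # measured range

H = H_jacobian(x)
S = H @ P @ H.T + R                          # innovation covariance
K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
x = x + (K * (z - h(x))).ravel()             # corrected estimate
P = (np.eye(2) - K @ H) @ P
print(x, P)
```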
Chapter 12: Particle Filters
Particle filters use Monte Carlo techniques to filter data. They easily handle highly nonlinear and non-Gaussian systems, as well as multimodal distributions (tracking multiple objects simultaneously), at the cost of high computational requirements.
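A bare-bones 1-D particle filter, to give a sense of the predict/weight/resample cycle; the motion model and noise levels are placeholders.

```python
import numpy as np

# Sample particles, move them with process noise, weight them by the
# measurement likelihood, then resample in proportion to the weights.
rng = np.random.default_rng(1)
N = 1000
particles = rng.uniform(0., 20., N)          # initial guess: anywhere in [0, 20]
weights = np.ones(N) / N

def gaussian_likelihood(z, predicted, std):
    return np.exp(-0.5 * ((z - predicted) / std) ** 2)

for z in [5.1, 6.2, 6.8, 8.1]:                            # noisy position measurements
    particles += 1.0 + rng.normal(0., 0.2, N)              # predict: move ~1 unit + noise
    weights *= gaussian_likelihood(z, particles, std=1.0)  # update: weight by likelihood
    weights /= weights.sum()
    idx = rng.choice(N, size=N, p=weights)                 # resample
    particles, weights = particles[idx], np.ones(N) / N
    print(np.average(particles))                           # state estimate
```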
Chapter 13: Smoothing
Kalman filters are recursive, and thus very suitable for real-time filtering. However, they also work extremely well for post-processing data. After all, Kalman filters are predictor-correctors, and it is easier to predict the past than the future! We discuss some common approaches.
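For example, FilterPy supports a forward filtering pass followed by a backward RTS smoothing pass; the sketch below is from memory, so check the batch_filter and rts_smoother return values against the FilterPy documentation.

```python
import numpy as np
from filterpy.kalman import KalmanFilter

# Forward Kalman filter pass over all the data, then a backward
# Rauch-Tung-Striebel smoothing pass. Model and noise values are illustrative.
kf = KalmanFilter(dim_x=2, dim_z=1)
kf.x = np.array([0., 0.])
kf.F = np.array([[1., 1.], [0., 1.]])
kf.H = np.array([[1., 0.]])
kf.P *= 100.
kf.R *= 4.
kf.Q = np.eye(2) * 0.02

zs = [1.2, 2.1, 2.8, 4.3, 5.1, 5.9]
mu, cov, _, _ = kf.batch_filter(zs)        # forward pass
xs, Ps, _, _ = kf.rts_smoother(mu, cov)    # backward smoothing pass
print(xs[:, 0])                            # smoothed position estimates
```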
Chapter 14: Adaptive Filtering
Kalman filters assume a single process model, but maneuvering targets typically need to be described by several different process models. Adaptive filtering uses several techniques to allow the Kalman filter to adapt to the changing behavior of the target.
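One such technique is to monitor the normalized residual and inflate the process noise when the target appears to maneuver; this toy 1-D sketch uses illustrative thresholds and is not the design developed in the chapter.

```python
# Watch the normalized residual and inflate the process noise Q when the
# target appears to be maneuvering. Thresholds and scale factors are made up.
def adaptive_kf(zs, R=1.0, Q_small=0.02, Q_large=5.0, threshold=3.0):
    x, P, dx = 0.0, 100.0, 1.0
    Q = Q_small
    estimates = []
    for z in zs:
        x, P = x + dx, P + Q                 # predict (fixed-velocity model)
        y = z - x                            # residual
        S = P + R                            # innovation variance
        if abs(y) / S ** 0.5 > threshold:    # residual is statistically too large
            Q = Q_large                      # assume a maneuver: trust the model less
        else:
            Q = Q_small
        K = P / S
        x, P = x + K * y, (1 - K) * P        # update
        estimates.append(x)
    return estimates

print(adaptive_kf([1., 2., 3., 4., 10., 16., 22.]))  # target suddenly speeds up
```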
Appendix A: Installation, Python, NumPy, and FilterPy
Brief introduction of Python and how it is used in this book. Description of the companion library FilterPy.
Appendix B: Symbols and Notations
Most books opt to use different notations and variable names for identical concepts. This is a large barrier to understanding when you are starting out. I have collected the symbols and notations used in this book, and built tables showing what notation and names are used by the major books in the field.
Still just a collection of notes at this point.
Appendix C: Walking through the Kalman Filter code
A brief walkthrough of the KalmanFilter class from FilterPy.
Appendix D: H-Infinity Filters
Describes the H-Infinity filter. I have code that implements the filter, but no supporting text yet.
Appendix E: Ensemble Kalman Filters
Discusses the ensemble Kalman Filter, which uses a Monte Carlo approach to deal with very large Kalman filter states in nonlinear systems.
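A toy sketch of a single perturbed-observation ensemble update in plain NumPy, just to show how sample statistics stand in for the covariance matrices; it is not FilterPy's implementation.

```python
import numpy as np

# The state covariance is never formed explicitly -- it is represented by an
# ensemble of samples, which is what makes the approach viable for very
# large states. A 1-D state keeps the sketch short.
rng = np.random.default_rng(0)
N = 500                                      # ensemble size
ensemble = rng.normal(10.0, 3.0, size=N)     # samples of the state
z, R = 12.0, 1.0                             # measurement and its variance

def h(x):
    return x                                 # measurement function (identity here)

Hx = h(ensemble)
P_xz = np.cov(ensemble, Hx)[0, 1]            # sample cross covariance
P_zz = np.var(Hx) + R                        # innovation variance
K = P_xz / P_zz                              # Kalman gain from sample statistics
# perturbed-observation update: each member gets its own noisy copy of z
ensemble = ensemble + K * (z + rng.normal(0., np.sqrt(R), N) - Hx)
print(ensemble.mean(), ensemble.var())
```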
Appendix F: FilterPy Source Code
Listings of important classes from FilterPy that are used in this book.
Appendix G: Designing Nonlinear Kalman Filters*
Works through some examples of the design of Kalman filters for nonlinear problems. This is still very much a work in progress.
Appendix H: Least Squares Filter
Not written yet.
Introduces the least squares filter in batch and recursive forms. Many authors develop their Kalman filter explanations by covering least squares first. I am not doing so, and so I may move this chapter deeper into the book, or remove it.
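For a flavor of what will eventually go here, this is a minimal recursive least squares sketch that fits a line one measurement at a time; the data and noise values are made up.

```python
import numpy as np

# Recursive least squares (RLS) for fitting y = a*t + b one measurement at a time.
theta = np.zeros(2)                     # parameters [a, b]
P = np.eye(2) * 1000.                   # parameter covariance (large = uninformed)
R = 1.0                                 # measurement noise variance

for t, y in [(0, 1.1), (1, 2.9), (2, 5.2), (3, 7.1)]:
    H = np.array([t, 1.0])              # regressor for this measurement
    K = P @ H / (H @ P @ H + R)         # gain
    theta = theta + K * (y - H @ theta) # correct the parameter estimate
    P = P - np.outer(K, H) @ P          # shrink the covariance

print(theta)                            # should approach roughly [2, 1]
```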
GitHub repository
http://github.com/rlabbe/Kalman-and-Bayesian-Filters-in-Python