Fellow, Director of Studies in Mathematics and Graduate Tutor at Selwyn College
JJ Thompson Avenue
Cambridge CB3 0HE
Anita Faul came to Cambridge after two years of study in Germany. She took Part II and Part III Mathematics at Churchill College, Cambridge. Since these amount to only two years, and three are required for a first degree, she does not hold one. She went on, however, to a PhD on the Faul-Powell Algorithm for Radial Basis Function Interpolation under the supervision of Professor Mike Powell, and then worked on the Relevance Vector Machine with Mike Tipping at Microsoft Research Cambridge. Ten years in industry followed, during which she worked on various algorithms for mobile phone networks, image processing and data visualization. Her current projects are on machine learning techniques. In teaching she enjoys bringing out the underlying, connecting principles of algorithms, which is the emphasis of her book on Numerical Analysis; she is now working on a book on machine learning. She has another life, and it definitely feels like another life, as mother of three.
There are several challenges with which data presents us nowadays. For one, there is the abundance of data and the necessity to extract the essential information from it. In tackling this task, a balance has to be struck between putting aside irrelevant information and keeping the relevant information without getting lost in detail, a failure known as overfitting. The law of parsimony, also known as Occam's razor, should be a guiding principle: keep models as simple as possible while still explaining the data.
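The trade-off between fitting the data and keeping the model simple can be illustrated with a small sketch (a hypothetical example with synthetic data, not taken from any of the projects mentioned): a high-degree polynomial fits noisy training points almost perfectly, yet generalizes worse than a simple line.

```python
import numpy as np

# Minimal illustration of overfitting: fit polynomials of low and
# high degree to noisy samples of a simple line, then compare the
# error on the training points with the error on held-out points.
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 10)
y_train = 2.0 * x_train + rng.normal(scale=0.1, size=x_train.size)
x_test = np.linspace(0.05, 0.95, 50)
y_test = 2.0 * x_test  # noise-free truth at held-out points

errs = {}
for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)  # least-squares fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    errs[degree] = (train_mse, test_mse)
    print(f"degree {degree}: train MSE {train_mse:.2e}, test MSE {test_mse:.2e}")
```

The degree-9 polynomial drives the training error towards zero by chasing the noise, while the degree-1 fit, in the spirit of Occam's razor, explains the data with far fewer parameters and does better on the held-out points.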
The next challenge is that the data is not static: new data arrives constantly through the pipeline. There is therefore a need for models which update themselves as new data becomes available, and which are flexible enough to become more complex should this be necessary. In addition, the models should inform us which data needs to be collected so that the collection process becomes most informative.
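A model that updates itself as data arrives can be sketched with the simplest conjugate Bayesian example: a Gaussian posterior over an unknown mean, refined one observation at a time (an illustrative sketch, not code from any of the projects mentioned):

```python
import numpy as np

def update(prior_mean, prior_var, obs, noise_var=1.0):
    """One conjugate Gaussian update: fold a single observation
    with known noise variance into the posterior over the mean."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)
    post_mean = post_var * (prior_mean / prior_var + obs / noise_var)
    return post_mean, post_var

rng = np.random.default_rng(1)
mean, var = 0.0, 100.0  # vague prior: we know little at the start
for obs in rng.normal(loc=3.0, scale=1.0, size=50):
    mean, var = update(mean, var, obs)  # model updated as data streams in

print(f"posterior mean {mean:.2f}, posterior variance {var:.4f}")
```

The posterior mean drifts towards the true value while the posterior variance shrinks; that same variance also hints at which future measurements would be most informative, namely those where the model remains most uncertain.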
The third challenge is the analysis. Can we build systems which inform us of the underlying structure and processes that gave rise to the data? Moreover, it is not enough to discover the structure and processes; we also need to attach meaning to them. Here different disciplines need to work together.
Another challenge is the conclusions we draw from the data. After all, as popularised by Mark Twain: "There are three kinds of lies: lies, damned lies, and statistics." An objective measure of confidence is needed before making generalized statements.
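One generic way to attach such an objective measure of confidence to an estimate is the bootstrap: resample the data with replacement and observe how much the statistic varies. A minimal sketch with synthetic data (hypothetical, for illustration only):

```python
import numpy as np

# Bootstrap confidence interval for the mean of a synthetic sample:
# resampling with replacement shows how much the estimate would vary
# had we drawn a different sample from the same source.
rng = np.random.default_rng(2)
data = rng.normal(loc=5.0, scale=2.0, size=200)

boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(2000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"95% bootstrap interval for the mean: [{lo:.2f}, {hi:.2f}]")
```

Instead of a single point estimate, the analysis now reports an interval, making explicit how far any generalized statement can be trusted.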
Statistics ; Signal Processing ; Neural Networks ; Probabilistic Machine Learning ; Variational Inference ; Big Data and the Environment ; Bayesian Inference ; Big Data Analytics ; Changing Data ; Bayesian Machine Learning ; Data Mining ; Mixture Models ; Self-Correcting Learners ; Statistical Signal Processing ; Data Fusion ; Decision-Making ; Variational Methods ; Time Series ; Modelling ; Computer Vision ; Gaussian Processes ; Imaging ; Application Independent ; Neuroscience ; Applications in Industry ; Incomplete Data ; Statistical Inference ; Cognitive and Computational Neuroscience ; Bayesian Methods ; Monte Carlo Methods ; Parameter Estimation ; Image Analysis ; Experimental Neuroscience ; Linking domain knowledge with ; Experimental Particle Physics ; Algorithms ; Independent Components Analysis ; Confidence Measures ; Big Data in Biology ; Simulation ; Machine Learning ; Inverse Problems ; Medical Imaging ; Statistical Physics ; Big Data ; Computational Neuroscience ; Bayesian Statistics ; Compressed Sensing ; Analytics ; Deep Learning ; Universal Techniques ; Data Visualisation and Understanding ; Education ; Image Processing ; Statistical Learning ; Artificial Intelligence ; Predictive Analytics
This textbook provides an accessible and concise introduction to numerical analysis for upper undergraduate and beginning graduate students from various backgrounds. It was developed from the lecture notes of four successful courses on numerical analysis taught within the MPhil in Scientific Computing at the University of Cambridge, and is accessible even to those with limited mathematical background.
Students will get a concise but thorough introduction to numerical analysis. In addition, algorithmic principles are emphasized to encourage a deeper understanding of why an algorithm is suitable, and sometimes unsuitable, for a particular problem.
- "A Krylov subspace algorithm for multiquadric interpolation in many dimensions", co-authors G. Goodsell and M.J.D. Powell, published in IMA Journal of Numerical Analysis (2005).
- "Fast marginal likelihood maximisation for sparse Bayesian models", co-author M. Tipping, published in Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics (2003).
- "Analysis of Sparse Bayesian Learning", co-author M. Tipping, published in Advances in Neural Information Processing Systems 14 (2002).
- "A variational approach to robust regression", co-author M. Tipping, published in the Proceedings of ICANN'01.
- "Proof of convergence of an iterative technique for thin plate spline interpolation in two dimensions", co-author M.J.D. Powell, published in Advances in Computational Mathematics, Vol. 11.
- "Krylov subspace methods for radial basis function interpolation", co-author M.J.D. Powell, published in Numerical Analysis (1999).
- "Iterative techniques for radial basis function interpolation", Ph.D. thesis.