Unpingco, José.
Python for Probability, Statistics, and Machine Learning [electronic resource] / by José Unpingco. - 2nd ed. 2019. - XIV, 384 p. 165 illus., 37 illus. in color. online resource.
Introduction -- Part 1 Getting Started with Scientific Python -- Installation and Setup -- Numpy -- Matplotlib -- Ipython -- Jupyter Notebook -- Scipy -- Pandas -- Sympy -- Interfacing with Compiled Libraries -- Integrated Development Environments -- Quick Guide to Performance and Parallel Programming -- Other Resources -- Part 2 Probability -- Introduction -- Projection Methods -- Conditional Expectation as Projection -- Conditional Expectation and Mean Squared Error -- Worked Examples of Conditional Expectation and Mean Square Error Optimization -- Useful Distributions -- Information Entropy -- Moment Generating Functions -- Monte Carlo Sampling Methods -- Useful Inequalities -- Part 3 Statistics -- Python Modules for Statistics -- Types of Convergence -- Estimation Using Maximum Likelihood -- Hypothesis Testing and P-Values -- Confidence Intervals -- Linear Regression -- Maximum A-Posteriori -- Robust Statistics -- Bootstrapping -- Gauss Markov -- Nonparametric Methods -- Survival Analysis -- Part 4 Machine Learning -- Introduction -- Python Machine Learning Modules -- Theory of Learning -- Decision Trees -- Boosting Trees -- Logistic Regression -- Generalized Linear Models -- Regularization -- Support Vector Machines -- Dimensionality Reduction -- Clustering -- Ensemble Methods -- Deep Learning -- Notation -- References -- Index.
This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning, illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python code provided. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python code, thereby connecting theoretical concepts to concrete implementations. Detailed proofs for certain important results are also provided. Modern Python modules like Pandas, Sympy, Scikit-learn, Tensorflow, and Keras are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This updated edition now includes the Fisher Exact Test and the Mann-Whitney-Wilcoxon Test. A new section on survival analysis has been included as well as substantial development of Generalized Linear Models. The new deep learning section for image processing includes an in-depth discussion of gradient descent methods that underpin all deep learning algorithms. As with the prior edition, there are new and updated *Programming Tips* that illustrate effective Python modules and methods for scientific programming and machine learning. There are 445 runnable code blocks with corresponding outputs that have been tested for accuracy. Over 158 graphical visualizations (almost all generated using Python) illustrate the concepts that are developed both in code and in mathematics. We also discuss and use key Python modules such as Numpy, Scikit-learn, Sympy, Scipy, Lifelines, CvxPy, Theano, Matplotlib, Pandas, Tensorflow, Statsmodels, and Keras. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge of Python programming.
9783030185459
10.1007/978-3-030-18545-9 doi
Telecommunication.
Computer science—Mathematics.
Mathematical statistics.
Engineering mathematics.
Engineering—Data processing.
Statistics.
Data mining.
Communications Engineering, Networks.
Probability and Statistics in Computer Science.
Mathematical and Computational Engineering Applications.
Statistics in Engineering, Physics, Computer Science, Chemistry and Earth Sciences.
Data Mining and Knowledge Discovery.
TK5101-5105.9
621.382