LEADER 05701nam a2200541 4500
001    978-3-030-18545-9
003    DE-He213
005    20200124101054.0
007    cr nn 008mamaa
008    190629s2019 gw | s |||| 0|eng d
020 __ |a 9783030185459 |9 978-3-030-18545-9
024 7_ |a 10.1007/978-3-030-18545-9 |2 doi
040 __ |d GrThAP
050 _4 |a TK1-9971
072 _7 |a TJK |2 bicssc
072 _7 |a TEC041000 |2 bisacsh
072 _7 |a TJK |2 thema
082 04 |a 621.382 |2 23
100 1_ |a Unpingco, José. |e author. |4 aut |4 http://id.loc.gov/vocabulary/relators/aut
245 10 |a Python for Probability, Statistics, and Machine Learning |h [electronic resource] / |c by José Unpingco.
250 __ |a 2nd ed. 2019.
264 _1 |a Cham : |b Springer International Publishing : |b Imprint: Springer, |c 2019.
300 __ |a XIV, 384 p. 165 illus., 37 illus. in color. |b online resource.
336 __ |a text |b txt |2 rdacontent
337 __ |a computer |b c |2 rdamedia
338 __ |a online resource |b cr |2 rdacarrier
347 __ |a text file |b PDF |2 rda
505 0_ |a Introduction -- Part 1 Getting Started with Scientific Python -- Installation and Setup -- Numpy -- Matplotlib -- Ipython -- Jupyter Notebook -- Scipy -- Pandas -- Sympy -- Interfacing with Compiled Libraries -- Integrated Development Environments -- Quick Guide to Performance and Parallel Programming -- Other Resources -- Part 2 Probability -- Introduction -- Projection Methods -- Conditional Expectation as Projection -- Conditional Expectation and Mean Squared Error -- Worked Examples of Conditional Expectation and Mean Square Error Optimization -- Useful Distributions -- Information Entropy -- Moment Generating Functions -- Monte Carlo Sampling Methods -- Useful Inequalities -- Part 3 Statistics -- Python Modules for Statistics -- Types of Convergence -- Estimation Using Maximum Likelihood -- Hypothesis Testing and P-Values -- Confidence Intervals -- Linear Regression -- Maximum A-Posteriori -- Robust Statistics -- Bootstrapping -- Gauss Markov -- Nonparametric Methods -- Survival Analysis -- Part 4 Machine Learning -- Introduction -- Python Machine Learning Modules -- Theory of Learning -- Decision Trees -- Boosting Trees -- Logistic Regression -- Generalized Linear Models -- Regularization -- Support Vector Machines -- Dimensionality Reduction -- Clustering -- Ensemble Methods -- Deep Learning -- Notation -- References -- Index.
520 __ |a This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning, illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python code provided. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python code, thereby connecting theoretical concepts to concrete implementations. Detailed proofs for certain important results are also provided. Modern Python modules like Pandas, Sympy, Scikit-learn, Tensorflow, and Keras are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This updated edition now includes the Fisher Exact Test and the Mann-Whitney-Wilcoxon Test. A new section on survival analysis has been included, as well as substantial development of Generalized Linear Models. The new deep learning section for image processing includes an in-depth discussion of gradient descent methods that underpin all deep learning algorithms. As with the prior edition, there are new and updated *Programming Tips* that illustrate effective Python modules and methods for scientific programming and machine learning. There are 445 runnable code blocks with corresponding outputs that have been tested for accuracy. Over 158 graphical visualizations (almost all generated using Python) illustrate the concepts that are developed both in code and in mathematics. We also discuss and use key Python modules such as Numpy, Scikit-learn, Sympy, Scipy, Lifelines, CvxPy, Theano, Matplotlib, Pandas, Tensorflow, Statsmodels, and Keras. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge of Python programming.
650 _0 |a Electrical engineering.
650 _0 |a Mathematical statistics.
650 _0 |a Applied mathematics.
650 _0 |a Engineering mathematics.
650 _0 |a Statistics.
650 _0 |a Data mining.
650 14 |a Communications Engineering, Networks. |0 http://scigraph.springernature.com/things/product-market-codes/T24035
650 24 |a Probability and Statistics in Computer Science. |0 http://scigraph.springernature.com/things/product-market-codes/I17036
650 24 |a Mathematical and Computational Engineering. |0 http://scigraph.springernature.com/things/product-market-codes/T11006
650 24 |a Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences. |0 http://scigraph.springernature.com/things/product-market-codes/S17020
650 24 |a Data Mining and Knowledge Discovery. |0 http://scigraph.springernature.com/things/product-market-codes/I18030
710 2_ |a SpringerLink (Online service)
773 0_ |t Springer eBooks
776 08 |i Printed edition: |z 9783030185442
776 08 |i Printed edition: |z 9783030185466
776 08 |i Printed edition: |z 9783030185473
856 40 |u https://doi.org/10.1007/978-3-030-18545-9 |z Full Text via HEAL-Link
912 __ |a ZDB-2-ENG
950 __ |a Engineering (Springer-11647)