About this book:
Machine learning is eating the software world. In Sebastian Raschka's bestselling book Python Machine Learning (Second Edition), you will learn about cutting-edge machine learning, neural networks, and deep learning. Written by Sebastian Raschka and Vahid Mirjalili, this edition has been updated and expanded to cover open-source technologies including scikit-learn, Keras, and TensorFlow, and provides the practical knowledge and techniques needed to build effective machine learning and deep learning applications with Python. Before turning to data-analysis topics, Sebastian Raschka and Vahid Mirjalili draw on their insight and expertise to introduce machine learning and deep learning algorithms. The book combines the theoretical principles of machine learning with a hands-on coding approach, aiming for a thorough grasp of both the theory and its implementation in Python.
Table of contents:
Chapter 1: Giving Computers the Ability to Learn from Data
Building intelligent machines to transform data into knowledge
The three different types of machine learning
Making predictions about the future with supervised learning
Classification for predicting class labels
Regression for predicting continuous outcomes
Solving interactive problems with reinforcement learning
Discovering hidden structures with unsupervised learning
Finding subgroups with clustering
Dimensionality reduction for data compression
Introduction to the basic terminology and notations
A roadmap for building machine learning systems
Preprocessing - getting data into shape
Training and selecting a predictive model
Evaluating models and predicting unseen data instances
Using Python for machine learning
Installing Python and packages from the Python Package Index
Using the Anaconda Python distribution and package manager
Packages for scientific computing, data science, and machine learning
Summary
Chapter 2: Training Simple Machine Learning Algorithms for Classification
Artificial neurons - a brief glimpse into the early history of machine learning
The formal definition of an artificial neuron
The perceptron learning rule
Implementing a perceptron learning algorithm in Python
An object-oriented perceptron API
Training a perceptron model on the Iris dataset
Adaptive linear neurons and the convergence of learning
Minimizing cost functions with gradient descent
Implementing Adaline in Python
Improving gradient descent through feature scaling
Large-scale machine learning and stochastic gradient descent
Summary
Chapter 3: A Tour of Machine Learning Classifiers Using scikit-learn
Choosing a classification algorithm
First steps with scikit-learn - training a perceptron
Modeling class probabilities via logistic regression
Logistic regression intuition and conditional probabilities
Learning the weights of the logistic cost function
Converting an Adaline implementation into an algorithm for logistic regression
Training a logistic regression model with scikit-learn
Tackling overfitting via regularization
Maximum margin classification with support vector machines
Maximum margin intuition
Dealing with a nonlinearly separable case using slack variables