Lecture 1 Introduction to Supervised Learning
(1) Expectation Maximization (EM) Algorithm
(2) Linear Regression Algorithm
(3) Locally Weighted Regression
(4) k-Nearest Neighbor Algorithm for Regression
(5) Linear Classifier
(6) Perceptron Algorithm
(7) Fisher Discriminant Analysis, also known as Linear Discriminant Analysis (LDA)
(8) k-NN Algorithm for Classification
(9) Bayesian Decision Method

Lecture 2 Feed-forward Neural Networks and BP Algorithm
(1) Multilayer Perceptron
(2) BP Algorithm

Lecture 3 Rudiments of Support Vector Machine
(1) Support Vector Machine (this algorithm is a key topic; a compulsory exam question comes from here)

Lecture 4 Introduction to Decision Rule Mining
(1) Decision Tree Algorithm
(2) ID3 Algorithm
(3) C4.5 Algorithm
(4) Rough Sets ……

Lecture 5 Classifier Assessment and Ensemble Methods
(1) Bagging
(2) Boosting
(3) AdaBoost

Lecture 6 Introduction to Association Rule Mining
(1) Apriori Algorithm
(2) FP-tree Algorithm

Lecture 7 Introduction to Clustering Analysis
(1) k-means Algorithm
(2) fuzzy c-means Algorithm
(3) k-modes Algorithm
(4) DBSCAN Algorithm

Lecture 8 Basics of Feature Selection
(1) Relief Algorithm
(2) ReliefF Algorithm
(3) mRMR Algorithm (minimum Redundancy Maximum Relevance)
(4) Attribute Reduction Algorithms

A comparison of the properties of several classification algorithms follows. (The two tables below are taken from two classic papers in this field.)

Lecture 1 Introduction to Supervised Learning

(1) Expectation Maximization (EM) Algorithm

① Algorithm idea:
The EM (Expectation Maximization) algorithm is an iterative optimization strategy for maximum likelihood estimation of parameters. It can perform maximum likelihood estimation from incomplete data sets, and applies to data with missing values, censored data, and noisy incomplete data.
The EM algorithm alternates between two steps:
Step 1, Expectation (E): treat the hidden variables as if they were observable and include them, so as to compute the maximum likelihood...
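The alternation between the E-step and M-step described above can be sketched on a concrete case. The sketch below is not from the lecture; it is a minimal, illustrative EM fit of a two-component 1-D Gaussian mixture (all parameter names and the initialization heuristic are my assumptions), using only the Python standard library:

```python
import math

def em_gmm_1d(data, n_iter=50):
    """Illustrative EM for a two-component 1-D Gaussian mixture.

    Returns (pi, mu1, sigma1, mu2, sigma2): the mixing weight of
    component 1 and the two components' means / std. deviations.
    """
    # Initialization heuristic (an assumption, not from the lecture):
    # put the two means at the data extremes, share a broad sigma.
    mu1, mu2 = min(data), max(data)
    sigma1 = sigma2 = max((max(data) - min(data)) / 4.0, 1.0)
    pi = 0.5  # initial mixing weight of component 1

    def pdf(x, mu, sigma):
        # Gaussian density N(x; mu, sigma^2)
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    for _ in range(n_iter):
        # E-step: "fill in" the hidden component labels by computing,
        # for each point, the posterior probability (responsibility)
        # that it came from component 1 under the current parameters.
        resp = []
        for x in data:
            p1 = pi * pdf(x, mu1, sigma1)
            p2 = (1 - pi) * pdf(x, mu2, sigma2)
            resp.append(p1 / (p1 + p2))

        # M-step: maximum likelihood re-estimation of the parameters,
        # with each point weighted by its responsibility.
        n1 = sum(resp)
        n2 = len(data) - n1
        mu1 = sum(r * x for r, x in zip(resp, data)) / n1
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / n2
        sigma1 = max(math.sqrt(sum(r * (x - mu1) ** 2 for r, x in zip(resp, data)) / n1), 1e-6)
        sigma2 = max(math.sqrt(sum((1 - r) * (x - mu2) ** 2 for r, x in zip(resp, data)) / n2), 1e-6)
        pi = n1 / len(data)

    return pi, mu1, sigma1, mu2, sigma2

# Usage: two well-separated clusters around 1.0 and 5.0; the
# recovered means should land near the true cluster centers.
data = [1.0, 1.2, 0.8, 1.1, 0.9, 5.0, 5.2, 4.8, 5.1, 4.9]
pi, mu1, s1, mu2, s2 = em_gmm_1d(data)
```

Each iteration is guaranteed not to decrease the data log-likelihood, which is why the alternation converges; with incomplete (here: unlabeled) data, the E-step plays exactly the role described above of treating the hidden variables as if they were observed.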