ML fundamentals, terminology, model evaluation, and classical algorithms
OLS, gradient descent, regularization techniques, and statistical foundations (see the gradient descent sketch below)
Binary/multi-class classification, LDA, and handling imbalanced data
ID3, C4.5, CART algorithms, information gain, and pruning techniques (see the information gain sketch below)
Perceptron, MLP, activation functions, and backpropagation (see the perceptron sketch below)
Convolutional networks, modern architectures, and transfer learning
Maximum margin, kernel trick, soft margin, and dual formulation
Probabilistic classification using Bayes' theorem and conditional independence
Probabilistic graphical models, DAGs, and inference algorithms
Bagging, boosting, Random Forest, and XGBoost for improved predictions
K-means, hierarchical clustering, and DBSCAN for unsupervised learning (see the K-means sketch below)
Spectral clustering, Gaussian mixture models, and ensemble clustering methods
PCA, t-SNE, LDA, and autoencoders for feature extraction (see the PCA sketch below)
Filter, wrapper, and embedded methods for optimal feature subsets
Markov Random Fields, Conditional Random Fields, and factor graphs
All courses include comprehensive theory, practical examples, and practice quizzes to test your understanding
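The sketches below illustrate a few of the techniques named in the course descriptions above. They are minimal, self-contained NumPy examples written for this overview, not excerpts from the course materials; every function name, parameter, and toy dataset in them is illustrative.

First, ordinary least squares fitted by batch gradient descent, with an optional L2 penalty standing in for the regularization techniques listed under the regression topics:

```python
# Illustrative sketch (not from the course materials): OLS via batch gradient
# descent, minimizing mean squared error with an optional L2 (ridge) penalty.
import numpy as np

def ols_gradient_descent(X, y, lr=0.01, l2=0.0, n_iters=1000):
    """Minimize (1/n) * ||X w + b - y||^2 + l2 * ||w||^2 by gradient descent."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(n_iters):
        residual = X @ w + b - y                          # prediction errors
        grad_w = (2.0 / n_samples) * (X.T @ residual) + 2.0 * l2 * w
        grad_b = (2.0 / n_samples) * residual.sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy usage: recover a known linear relationship from noisy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 + rng.normal(scale=0.1, size=200)
w, b = ols_gradient_descent(X, y, lr=0.1, n_iters=2000)
print(w, b)  # roughly [3.0, -2.0] and 0.5
```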
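Next, the information gain criterion behind ID3-style decision tree splits: the entropy of the parent node minus the size-weighted entropy of the child nodes produced by a candidate split.

```python
# Illustrative sketch: entropy and information gain for decision tree splits.
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a 1-D array of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(parent_labels, child_label_groups):
    """Entropy of the parent minus the size-weighted entropy of its children."""
    n = len(parent_labels)
    weighted_children = sum(len(g) / n * entropy(g) for g in child_label_groups)
    return entropy(parent_labels) - weighted_children

# A perfect split of a balanced binary node gains exactly 1 bit.
parent = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
print(information_gain(parent, [parent[:5], parent[5:]]))  # 1.0
```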
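The perceptron, trained with its classic mistake-driven update rule, is the building block behind the MLP and backpropagation topics:

```python
# Illustrative sketch: the perceptron learning rule for labels in {-1, +1}.
import numpy as np

def perceptron(X, y, lr=1.0, n_epochs=50):
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # Update only on misclassified points, nudging the boundary toward them.
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:          # converged: every point classified correctly
            break
    return w, b

# Toy usage on two linearly separable clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, size=(20, 2)), rng.normal(2, 0.5, size=(20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = perceptron(X, y)
print(np.all(np.sign(X @ w + b) == y))  # True once a separating line is found
```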
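K-means clustering via Lloyd's algorithm, alternating an assignment step (each point joins its nearest centroid) with an update step (each centroid moves to the mean of its points) until the centroids stop moving:

```python
# Illustrative sketch: Lloyd's algorithm for K-means clustering.
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # random initial centers
    for _ in range(n_iters):
        # Assignment step: index of the nearest centroid for every point.
        distances = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = distances.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids

# Toy usage: two well-separated blobs.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, size=(50, 2)), rng.normal(5, 0.3, size=(50, 2))])
labels, centroids = kmeans(X, k=2)
```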
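Finally, PCA computed from the singular value decomposition of the centered data matrix, projecting onto the directions of greatest variance:

```python
# Illustrative sketch: PCA via the SVD of the mean-centered data matrix.
import numpy as np

def pca(X, n_components):
    X_centered = X - X.mean(axis=0)
    # Rows of Vt are the principal directions, ordered by variance explained.
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components]
    explained_variance = S[:n_components] ** 2 / (len(X) - 1)
    return X_centered @ components.T, components, explained_variance

# Toy usage: 3-D data that mostly varies along a single direction.
rng = np.random.default_rng(3)
t = rng.normal(size=(100, 1))
X = np.hstack([t, 2 * t, 0.1 * rng.normal(size=(100, 1))])
Z, components, var = pca(X, n_components=2)
print(var)  # the first component carries almost all of the variance
```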