Basic Part
-
Math for AI
calculus:
function
function limit
logarithms
derivative
partial derivative
gradient
chain rule
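The calculus topics above (derivatives, partial derivatives, gradient) can be tied together with a short numerical check: estimating the gradient of a function of two variables by central differences and comparing it with the analytic answer. The function f(x, y) = x²y + ln y is a hypothetical example chosen only for illustration.

```python
import numpy as np

def f(x, y):
    # f(x, y) = x^2 * y + ln(y)
    return x**2 * y + np.log(y)

def numerical_gradient(f, x, y, h=1e-6):
    """Central-difference estimates of the partial derivatives."""
    df_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    df_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return df_dx, df_dy

x, y = 2.0, 3.0
num = numerical_gradient(f, x, y)
ana = (2 * x * y, x**2 + 1 / y)   # analytic gradient: (2xy, x^2 + 1/y)
print(num, ana)                    # the two pairs agree closely
```

The same finite-difference trick is a standard sanity check for hand-derived gradients before trusting them in training code.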
probability theory:
discrete random variables
continuous random variables
probability mass function
probability density function
mean
variance and standard deviation
independent events
joint probability
conditional probability
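Joint and conditional probability from the list above can be demonstrated with a small simulation, a toy sketch using two independent dice: event A = "first die is even", event B = "the sum is 8", and P(A | B) = P(A and B) / P(B).

```python
import random

random.seed(0)
n = 100_000
hits_a = hits_b = hits_ab = 0
for _ in range(n):
    d1, d2 = random.randint(1, 6), random.randint(1, 6)
    a = d1 % 2 == 0        # first die is even
    b = d1 + d2 == 8       # sum is 8
    hits_a += a
    hits_b += b
    hits_ab += a and b

p_b = hits_b / n
p_ab = hits_ab / n
p_a_given_b = p_ab / p_b   # conditional probability P(A | B)
print(p_a_given_b)
```

The exact answer is 3/5: of the five outcomes summing to 8 — (2,6), (3,5), (4,4), (5,3), (6,2) — three have an even first die, and the simulated estimate should land near 0.6.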
algebra:
vectors
matrices
operations with vectors and matrices
determinant
-
Python for AI
basics recap:
git flow
project structure
code styling
documentation standards
decorators
data manipulation:
efficient math operations (NumPy)
accelerating numerical computations (Numba)
exploratory data analysis (pandas)
performance optimization (Cython)
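A minimal sketch of the NumPy and pandas items above: vectorized math with no Python-level loop, a matrix determinant, and a one-line exploratory summary of the resulting data. The arrays and columns are illustrative, not from any real dataset.

```python
import numpy as np
import pandas as pd

# Vectorized math (NumPy): the whole array is processed at once
x = np.linspace(0.0, 1.0, 5)
y = np.exp(x) * np.sin(2 * np.pi * x)

# Matrix operations and the determinant
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
det = np.linalg.det(A)     # 2*3 - 1*1 = 5

# Quick exploratory data analysis (pandas)
df = pd.DataFrame({"x": x, "y": y})
print(df.describe())       # count, mean, std, min, quartiles, max
print(det)
```

Numba and Cython enter the picture when a computation cannot be expressed as such whole-array operations and the Python loop itself must be compiled.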
ML algorithms:
basic image/video processing (OpenCV)
high-level machine learning (scikit-learn)
deep dive into neural networks (PyTorch)
state-of-the-art NLP (Transformers)
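For the scikit-learn item above, a minimal end-to-end example: load a built-in dataset, split it, fit a classifier, and score it on held-out data. The choice of random forest and the iris dataset is illustrative only.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

# Fit on the training split, evaluate on the held-out split
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
acc = model.score(X_test, y_test)   # mean accuracy on unseen data
print(acc)
```

The same fit/predict/score interface carries over to nearly every scikit-learn estimator, which is what makes the library a natural first stop before PyTorch.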
data visualization:
visualization fundamentals (Matplotlib)
high-level plotting (Seaborn)
interactive visualizations (Plotly)
web-oriented tools (Bokeh)
geospatial data plotting (Vega-Altair)
-
AI: from Basics to Transformers. Part 1
supervised learning
regression
regularization
classification
metrics
logistic regression
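The Part 1 topics above — classification, regularization, logistic regression, and metrics — fit into one small sketch. The synthetic dataset and the particular regularization strength are assumptions for illustration; in scikit-learn, `C` is the inverse of the L2 penalty strength.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic binary classification problem
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Smaller C means stronger L2 regularization
clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X_tr, y_tr)
pred = clf.predict(X_te)

print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall   :", recall_score(y_te, pred))
print("f1       :", f1_score(y_te, pred))
```

Reporting precision and recall alongside accuracy matters most when the classes are imbalanced, which accuracy alone hides.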
Advanced Part
-
Decision making in AI
task formulation
task analysis
data collection
data annotation
exploratory data analysis
proof of concept development
analysis of PoC results
improvement strategy
getting to the final result
presenting results
-
MLOps
data:
data management (MongoDB, ClickHouse, PostgreSQL)
data processing and interactive EDA (Plotly Dash, Streamlit)
model:
model engineering: hyperparameter tuning (Optuna, Ray Tune)
preparing the model for production (tracing, quantization, ONNX, TorchServe)
ML pipeline:
training/validation/evaluation logging
experiment tracking (Neptune.ai/MLflow)
serving:
deployment tools (Docker, FastAPI)
memory management and optimization
testing in MLOps scenarios (PyTest, profilers)
-
AI: from Basics to Transformers. Part 2
intro to DL:
intro to deep learning
multilayer perceptron
backpropagation
deep learning mindset
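The intro-to-DL topics above can be compressed into one runnable sketch: a multilayer perceptron trained by backpropagation on the XOR problem, a classic task a single-layer model cannot solve. The architecture and hyperparameters are illustrative choices.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# XOR: not linearly separable, so a hidden layer is required
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

model = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))
opt = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.BCEWithLogitsLoss()

for _ in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()   # backpropagation fills .grad for every parameter
    opt.step()        # gradient-based parameter update

print(loss.item())
print(torch.sigmoid(model(X)).flatten())
```

The zero_grad/forward/backward/step rhythm in the loop is the same in essentially every PyTorch training script, however large the model.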
CNNs:
convolutional neural nets
building convolutional architectures
residual networks
CNN training best practices
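For the CNN block above, a minimal residual block in PyTorch: two convolutions with an identity shortcut, the building pattern behind residual networks. Channel counts and input shape are arbitrary illustration values.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """out = relu(F(x) + x), where F is two conv + batch-norm layers."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)   # identity shortcut connection

x = torch.randn(2, 16, 32, 32)       # (batch, channels, height, width)
block = ResidualBlock(16)
print(block(x).shape)                # spatial and channel dims are preserved
```

The shortcut lets gradients flow around the convolutions during backpropagation, which is what makes very deep stacks of such blocks trainable.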
transformers:
attention mechanism in the transformer
multi-head attention
properties of MHA
transformer encoder
transformer decoder
model interpretation
transformer training best practices
vision transformer
improving ViT
introduction to language models
large language models
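The attention mechanism at the heart of the transformer topics above can be written in a few lines: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. This sketch covers a single head on a hypothetical 4-token sequence; multi-head attention runs several such heads in parallel on projected slices of the input.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # Masked positions (e.g. future tokens in a decoder) get -inf
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)   # each row sums to 1
    return weights @ v, weights

torch.manual_seed(0)
x = torch.randn(1, 4, 8)          # (batch, sequence length, model dim)
out, w = scaled_dot_product_attention(x, x, x)   # self-attention: Q = K = V
print(out.shape)                  # same shape as the input
print(w.sum(-1))                  # attention weights are a distribution
```

Passing a lower-triangular mask turns this into the causal attention used by the decoder and by language models.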