INF ML AI
Cards on information theory, machine learning, and AI.
- Jacobian and Hessian
- Importance sampling
- Monte Carlo Methods
- ELBO via Jensen
- Covariance matrix and estimation
- Chi-squared distribution
- Binomial distribution: a perspective
- Random number generation in NumPy
- Characterizing a surface's "color" properties
- Color perception
- Correlated color temperature (CCT)
- Standard illuminants
- Accuracy, precision, recall
- Auto-encoder (VAE?) [stub]
- Latent variables in autoencoders
- Richard–Berry paradox
- The two types of compressors
- Covariance matrix
- Multivariate Gaussian distribution
- Gaussian distribution
- Belief Networks (by Koller)
- Compression and modularity of probabilistic models
- Motivation for graphical models
- Belief networks
- Belief networks: independence
- Belief networks: independence examples
- How many parameters are needed to describe this distribution?
- The urns
- Logistic regression LED exercise
- Kraft inequality
- Symbol codes
- Jensen's inequality
- Kullback-Leibler divergence and Gibbs' inequality
- Entropy of an ensemble
- Joint entropy of two random variables
- Shannon information content
- What is the derivative of the negative log-likelihood (the MLE objective used as a cost) when the outputs are passed through a softmax activation? (see the sketch after this list)
- Softmax
- Incremental average (estimate update)
- Cross-entropy and KL divergence
- Entropy
- Maximum likelihood estimation (MLE)
- Types of machine learning tasks
- Logistic sigmoid
- Rectifier and the softplus function
- Bernoulli distribution
- Covariance
- Independence & Conditional Independence
- Machine Learning Definition (Mitchell, 1997)
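As a quick illustration of the softmax / negative log-likelihood card above, here is a minimal NumPy sketch (not part of the deck itself) that checks the analytic gradient, softmax(z) minus the one-hot encoding of the true class, against finite differences:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def nll(z, y):
    # Negative log-likelihood of the true class y given logits z.
    return -np.log(softmax(z)[y])

rng = np.random.default_rng(0)
z = rng.normal(size=5)  # example logits
y = 2                   # example true class index

# Analytic gradient: softmax(z) - onehot(y).
grad = softmax(z)
grad[y] -= 1.0

# Finite-difference check of the same gradient.
eps = 1e-6
num = np.array([
    (nll(z + eps * np.eye(5)[i], y) - nll(z - eps * np.eye(5)[i], y)) / (2 * eps)
    for i in range(5)
])
assert np.allclose(grad, num, atol=1e-5)
print(grad)
```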
Download: deck package (import with Anki)