Concepts

Derivatives and Optimization

  • Perform gradient descent in neural networks with different activation and cost functions
  • Visually interpret differentiation of different types of functions commonly used in machine learning
  • Approximately optimize different types of functions commonly used in machine learning using first-order (gradient descent) and second-order (Newton’s method) iterative methods
  • Analytically optimize different types of functions commonly used in machine learning using properties of derivatives and gradients
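
The first-order iterative approach listed above can be sketched minimally. This is a hedged illustration, not a prescribed implementation: the objective f(w) = (w - 3)^2, the learning rate, and the iteration count are all arbitrary choices for demonstration.

```python
# Gradient descent on the one-dimensional objective f(w) = (w - 3)^2,
# whose gradient is f'(w) = 2 * (w - 3). The update rule is
#   w <- w - lr * f'(w)
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0    # arbitrary starting point
lr = 0.1   # learning rate (step size), chosen for illustration
for _ in range(100):
    w -= lr * grad(w)

# After enough steps, w approaches the analytic minimizer w = 3,
# which is also what setting f'(w) = 0 gives directly.
print(w)
```

The same loop generalizes to the multivariate case by replacing the scalar derivative with the gradient vector; the step-size choice then interacts with the curvature of the objective, which motivates the second-order methods listed later in this outline.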

Gradients and Gradient Descent

Optimization Techniques

- Gradient Descent and Backpropagation
- Newton's method
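
Newton's method, the second-order technique in the list above, uses curvature (the second derivative) to pick the step size rather than a fixed learning rate. A minimal one-dimensional sketch, using the arbitrary test objective f(x) = x^2 - log(x) on x > 0 purely for illustration:

```python
import math

# Newton's method for minimizing f(x) = x^2 - log(x), x > 0:
#   x <- x - f'(x) / f''(x)
# with f'(x) = 2x - 1/x and f''(x) = 2 + 1/x^2.
def fprime(x):
    return 2.0 * x - 1.0 / x

def fsecond(x):
    return 2.0 + 1.0 / x**2

x = 1.0  # arbitrary starting point in the domain x > 0
for _ in range(10):
    x -= fprime(x) / fsecond(x)

# The analytic minimizer solves f'(x) = 0, i.e. 2x = 1/x, so x = 1/sqrt(2).
print(x)
```

Compared with gradient descent, each Newton iteration is more expensive (in higher dimensions it requires the Hessian), but near the minimizer it converges quadratically, typically needing far fewer iterations.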