Core Concepts

Introduction

  • Recall the definition of differentiation
  • Apply differentiation to simple functions
  • Describe the utility of time saving rules
  • Apply sum, product and chain rules
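The rules listed above can be checked numerically. A minimal sketch, assuming a central-difference estimate (the `derivative` helper and the test point are illustrative, not part of the course materials):

```python
import math

def derivative(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x = 0.7  # arbitrary test point

# Product rule: d/dx [u(x) v(x)] = u'(x) v(x) + u(x) v'(x)
u, v = math.sin, math.exp
lhs = derivative(lambda t: u(t) * v(t), x)
rhs = derivative(u, x) * v(x) + u(x) * derivative(v, x)
assert abs(lhs - rhs) < 1e-5

# Chain rule: d/dx f(g(x)) = f'(g(x)) * g'(x)
f, g = math.sin, lambda t: t ** 2
lhs = derivative(lambda t: f(g(t)), x)
rhs = derivative(f, g(x)) * derivative(g, x)
assert abs(lhs - rhs) < 1e-5
```

Checking symbolic rules against finite differences like this is a common way to confirm a hand-derived result before relying on it.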

Multivariate Calculus

  • Recognise that differentiation can be applied to multiple variables in an equation
  • Use multivariate calculus tools on example equations
  • Recognise the utility of vector/matrix structures in multivariate calculus
  • Examine two-dimensional problems using the Jacobian
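As an illustration of the Jacobian objective, a short sketch that builds a numerical 2×2 Jacobian for the polar-to-Cartesian map (the `jacobian` helper is an assumed name, not from the course materials):

```python
import math

def jacobian(f, x, h=1e-6):
    """Numerical Jacobian of f: R^n -> R^m via central differences."""
    fx = f(x)
    J = [[0.0] * len(x) for _ in range(len(fx))]
    for j in range(len(x)):
        xp, xm = list(x), list(x)
        xp[j] += h
        xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(len(fx)):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

# Polar-to-Cartesian map: (r, theta) -> (r cos(theta), r sin(theta))
def polar(v):
    r, th = v
    return [r * math.cos(th), r * math.sin(th)]

# Analytic Jacobian is [[cos(theta), -r sin(theta)], [sin(theta), r cos(theta)]],
# whose determinant is r -- here 2.0.
J = jacobian(polar, [2.0, math.pi / 3])
```

The determinant of this Jacobian is the familiar `r` factor from polar integration, which makes it a convenient sanity check.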

Multivariate Chain Rule and Its Applications

  • Apply the multivariate chain rule to differentiate nested functions
  • Explain the structure and function of a neural net
  • Apply multivariate calculus tools to relate network parameters to outputs
  • Implement backpropagation on a small neural network
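The backpropagation objective can be sketched on the smallest possible network, one hidden unit, where each gradient line is a single application of the multivariate chain rule (network shape, parameter names, and the squared-error loss are illustrative choices, not the course's exact network):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, b1, w2, b2):
    """Two-layer scalar network: a1 = sigma(w1*x + b1), out = w2*a1 + b2."""
    a1 = sigmoid(w1 * x + b1)
    return a1, w2 * a1 + b2

def grad_loss(x, y, w1, b1, w2, b2):
    """Backpropagate L = 0.5*(out - y)^2 through the network."""
    a1, out = forward(x, w1, b1, w2, b2)
    d_out = out - y                  # dL/d(out)
    d_w2 = d_out * a1                # chain rule through out = w2*a1 + b2
    d_b2 = d_out
    d_a1 = d_out * w2
    d_z1 = d_a1 * a1 * (1 - a1)      # sigma'(z) = sigma(z) * (1 - sigma(z))
    d_w1 = d_z1 * x                  # chain rule through z1 = w1*x + b1
    d_b1 = d_z1
    return d_w1, d_b1, d_w2, d_b2
```

A training loop would repeatedly subtract a small multiple of these gradients from the parameters; comparing them against finite differences is the standard way to verify a backpropagation implementation.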

Taylor Series and Linearisation

The Taylor series is a method for re-expressing a function as a polynomial series. This approach is the rationale behind using simple linear approximations to complicated functions.

  • Recognise power series approximations to functions
  • Interpret the behaviour of power series approximations for ill-behaved functions
  • Explain the meaning and relevance of linearisation
  • Select an appropriate representation for multivariate approximations
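A short sketch of a truncated power series, using the standard Maclaurin expansion of e^x (the helper name is an assumption for illustration):

```python
import math

def taylor_exp(x, n_terms):
    """Truncated Maclaurin series for e^x: sum of x^k / k! for k < n_terms."""
    return sum(x ** k / math.factorial(k) for k in range(n_terms))

# Linearisation is the two-term case: e^x ~ 1 + x near x = 0
assert taylor_exp(0.5, 2) == 1.5

# Keeping more terms improves the approximation near the expansion point
errors = [abs(taylor_exp(1.0, n) - math.e) for n in (2, 4, 8)]
assert errors[0] > errors[1] > errors[2]
```

For an ill-behaved function (e.g. one with a nearby singularity), the same truncated series can diverge outside its radius of convergence, which is the behaviour the second objective above asks you to interpret.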

Introduction to Optimisation

  • Recognise the principles of gradient descent
  • Implement optimisation using multivariate calculus
  • Examine cases where the method fails to return the best solution
  • Solve gradient descent problems subject to constraints using Lagrange multipliers
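A minimal unconstrained gradient descent sketch on a quadratic bowl (function, learning rate, and step count are illustrative choices):

```python
def grad_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent: repeatedly step downhill, x <- x - lr * grad(x)."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Minimise f(x, y) = (x - 3)^2 + 2*(y + 1)^2, whose gradient is (2(x-3), 4(y+1))
grad_f = lambda v: [2 * (v[0] - 3), 4 * (v[1] + 1)]
x_min = grad_descent(grad_f, [0.0, 0.0])  # converges towards (3, -1)
```

On a convex bowl like this, descent reaches the global minimum; on a non-convex function the same loop can stall in a local minimum or saddle point, which is the failure mode the third objective above examines.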

Regression

  • Describe regression as a minimisation of errors problem
  • Distinguish appropriate from inappropriate models for particular data sets
  • Compute the multivariate calculus objects needed to perform a regression
  • Create code to fit a non-linear function to data using gradient descent
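The final objective can be sketched by fitting the non-linear model y = a·e^(bx) to noise-free synthetic data with gradient descent on the mean squared error (the model, data, and hyperparameters here are all illustrative assumptions):

```python
import math

# Synthetic data generated from y = 2 * exp(0.5 * x)
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]

def fit_exp(xs, ys, lr=0.05, steps=20000):
    """Fit y = a * exp(b * x) by gradient descent on mean squared error."""
    a, b = 1.0, 0.0  # initial guess
    n = len(xs)
    for _ in range(steps):
        da = db = 0.0
        for x, y in zip(xs, ys):
            pred = a * math.exp(b * x)
            resid = pred - y
            da += 2 * resid * math.exp(b * x) / n          # dL/da
            db += 2 * resid * a * x * math.exp(b * x) / n  # dL/db
        a -= lr * da
        b -= lr * db
    return a, b

a, b = fit_exp(xs, ys)  # should approach a = 2, b = 0.5
```

The two partial derivatives are exactly the "multivariate calculus objects" the third objective refers to: each is one chain-rule pass from the loss back to a parameter.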