Optimisation methods
1000-MS1MetOpt
Lecture
This course presents both basic and advanced optimization methods used in science, engineering, and data analysis. We begin by reviewing key mathematical concepts: derivatives, gradients, Hessians, Taylor expansions, and necessary and sufficient conditions for extrema.
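The necessary and sufficient conditions mentioned above can be checked numerically in a few lines. The sketch below (an illustration only; the quadratic function and finite-difference step sizes are arbitrary choices, not course material) verifies that the gradient vanishes and the Hessian is positive definite at a candidate point:

```python
import numpy as np

def f(x):
    # example function: f(x, y) = x^2 + 3y^2 - 2xy
    return x[0]**2 + 3*x[1]**2 - 2*x[0]*x[1]

def numeric_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient."""
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def numeric_hessian(f, x, h=1e-4):
    """Central-difference approximation of the Hessian."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h**2)
    return H

x0 = np.array([0.0, 0.0])
# necessary condition: gradient ≈ 0; sufficient: Hessian positive definite
print(np.allclose(numeric_gradient(f, x0), 0, atol=1e-6))
print(np.all(np.linalg.eigvalsh(numeric_hessian(f, x0)) > 0))
```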
We then discuss single-variable optimization methods, including bisection, golden section search, Newton’s method, and uniform search. The focus is on correctness, convergence, and stopping criteria.
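As a taste of the single-variable methods listed above, golden section search with an interval-width stopping criterion fits in a few lines (a minimal sketch, not the course's reference implementation; the test function is an arbitrary example):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b]; stop when the interval is shorter than tol."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi ≈ 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c               # minimum lies in [a, d]; old c becomes new d
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d               # minimum lies in [c, b]; old d becomes new c
            d = a + inv_phi * (b - a)
    return (a + b) / 2

x_min = golden_section_search(lambda x: (x - 2)**2 + 1, 0.0, 5.0)
print(round(x_min, 6))  # 2.0
```

For brevity the sketch re-evaluates f(c) and f(d) on each pass; caching one function value per iteration is the usual refinement.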
The next part covers multivariate optimization: gradient descent methods (including backtracking line search), Newton-Raphson, Levenberg-Marquardt, and conjugate gradient techniques. We also introduce stochastic methods, such as stochastic gradient descent, Monte Carlo methods, and simulated annealing, and demonstrate their use in settings where classical methods fall short.
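As one example of the multivariate methods listed, gradient descent with backtracking (Armijo) line search can be sketched as follows (an illustrative sketch; the quadratic objective and the parameter values are arbitrary choices, not part of the syllabus):

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=1000):
    """Gradient descent with backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stopping criterion on the gradient norm
            break
        alpha = alpha0
        # shrink the step until the Armijo sufficient-decrease condition holds
        while f(x - alpha * g) > f(x) - c * alpha * g @ g:
            alpha *= beta
        x = x - alpha * g
    return x

# example: minimize f(x, y) = (x - 1)^2 + 10 (y + 2)^2
f = lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
print(gradient_descent(f, grad, [0.0, 0.0]).round(4))  # [ 1. -2.]
```

The ill-conditioned example (curvature 10× larger along y) is exactly the situation where a fixed step size fails but backtracking still converges.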
Throughout the course, we emphasize the geometric interpretation of methods, convergence analysis, and selecting the right algorithm for specific optimization problems. Lecture materials are provided in the form of slides and code examples in Python.
Laboratory
Laboratory sessions provide hands-on experience with the methods discussed in lectures. Students:
* implement gradient-based, Newton, conjugate gradient, and selected stochastic algorithms,
* experiment with algorithm parameters and study their effect on convergence and runtime,
* analyze efficiency and quality of solutions,
* use numerical libraries (NumPy, SciPy, PyTorch),
* prepare reports and deliver short presentations summarizing their results.
Laboratory activities require active participation during classes and independent work outside of class. Assessment is based on the correctness of implementations, experimental analysis, and quality of reports.
Total student workload
* 30 hrs lecture
* 30 hrs laboratory classes
* 3 hrs exam
* 3 hrs presentation of laboratory assignments
* 30 hrs consultations with instructors
* 20 hrs self-study: preparing assignments
* 20 hrs ongoing preparation for classes, studying literature
* 20 hrs self-study: exam preparation
Total: 156 hours (5 ECTS credits)
Learning outcomes - knowledge
Upon completion of the course, the student:
* W01 - knows methods for designing and analyzing the computational complexity of optimization algorithms for sequential programs and, at an introductory level, distributed programs (K_W12),
* W02 - knows elements of statistical analysis of results (estimation, hypothesis testing, dimensionality reduction) required for evaluating algorithms (K_W03, K_W05),
* W03 - knows methods for solving computationally difficult problems: gradient-based methods, Newton’s method, conjugate gradient methods, and selected heuristics (Monte Carlo, simulated annealing) (K_W12),
* W04 - understands practical aspects of efficient algorithm implementation in Python using numerical libraries (K_W11).
Learning outcomes - skills
Upon completion of the course, the student is able to:
* U01 - select, design, and analyze optimization algorithms, justify their correctness, and assess their complexity (K_U16),
* U02 - implement optimization algorithms and utilize numerical libraries (NumPy, SciPy, PyTorch) (K_U16, K_U15),
* U03 - apply software engineering tools, including code repositories, testing, and performance reporting (K_U15, K_U17),
* U04 - search for and critically evaluate academic literature, documentation, and online resources (K_U02).
Learning outcomes - social competencies
Upon completion of the course, the student:
* K01 - thinks creatively when selecting and adapting optimization methods (K_K01),
* K02 - can independently analyze data and draw logical conclusions (K_K01),
* K03 - works systematically, pays attention to detail, and meets deadlines (K_K03),
* K04 - clearly communicates results using appropriate terminology and collaborates effectively across disciplines (K_K05, K_K06),
* K05 - works efficiently in project teams and contributes to solving complex problems (K_K03, K_K04).
Teaching methods
* Problem-based lectures with illustrative examples,
* Computer-based laboratory sessions with implementation tasks,
* Mini-projects and laboratory reports,
* Discussions and case studies,
* Individual consultations with the instructor.
Expository teaching methods
- informative (conventional) lecture
Exploratory teaching methods
- laboratory
- practical
Online teaching methods
- content-presentation-oriented methods
Type of course
elective course
Prerequisites
Students are expected to have:
* knowledge of basic mathematical analysis: derivatives, gradients, Hessians, Taylor expansions, and limits of functions,
* the ability to perform basic matrix operations: multiplication, inversion, determinants, eigenvalues, and eigenvectors,
* understanding of basic linear algebra, including vector orthogonality and vector norms,
* basic programming skills in Python and experience with numerical libraries,
* familiarity with elementary single-variable optimization methods (recommended): bisection, golden section search, Newton’s method, stopping criteria.
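The recommended single-variable background amounts to very little code; for instance, Newton's method for 1-D minimization with a step-size stopping criterion (an illustrative sketch; the example function and starting point are arbitrary):

```python
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    """Newton's method for 1-D minimization: iterate x <- x - f'(x) / f''(x)."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:   # stop when the Newton step becomes negligible
            break
    return x

# example: minimize f(x) = x^4 - 3x^2 + 2 starting from x = 2
df = lambda x: 4 * x**3 - 6 * x    # f'(x)
d2f = lambda x: 12 * x**2 - 6      # f''(x)
print(newton_minimize(df, d2f, 2.0))  # ≈ 1.2247 (sqrt(3/2))
```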
Course coordinators
Assessment criteria
The course equips students with both theoretical foundations and practical skills in optimization. Students learn to design, implement, and evaluate optimization algorithms, analyze their efficiency, and present results effectively through reports and presentations.
Practical placement
Bibliography
Core:
1. Boyd, S., Vandenberghe, L. - Convex Optimization.
2. Nocedal, J., Wright, S. - Numerical Optimization.
3. Bertsekas, D. - Nonlinear Programming.
Additional information
Additional information (registration calendar, instructors, location and schedule of classes) may be available in the USOSweb system.