
Optimisation Techniques for Computer Vision

Code: 44773
ECTS Credits: 6
Academic year: 2024/2025

Degree: 4318299 Computer Vision
Type: OB
Year: 0

Contact

Name: Maria Isabel Vanrell Martorell
Email: maria.vanrell@uab.cat

Teachers

Coloma Ballester Nicolau
Pablo Arias Martínez
Oriol Ramos Terrades

Teaching groups languages

You can view this information at the end of this document.


Prerequisites

Degree in Engineering, Mathematics, Physics or similar.


Objectives and Contextualisation

Module Coordinator: Dr. Coloma Ballester

The aim of this module is to learn about the optimization algorithms and inference techniques behind many tasks in computer vision. The main concepts include energy formulation and minimization, numerical techniques for variational problems, gradient descent optimization algorithms and tools useful for deep learning strategies, convex optimization, and graphical models. These techniques will be applied in the project in the context of image segmentation and inpainting.
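
As an informal illustration of the energy-minimization viewpoint (not part of the official course material), the sketch below minimizes a simple quadratic denoising energy E(u) = 0.5*||u - f||^2 + 0.5*lam*||grad u||^2 by gradient descent. It assumes only NumPy; the energy, the parameter values and the function names are illustrative choices made for this guide, not prescribed by the module.

  import numpy as np

  def denoise_gradient_descent(f, lam=2.0, tau=0.1, n_iters=200):
      """Minimize E(u) = 0.5*||u - f||^2 + 0.5*lam*||grad u||^2 by gradient descent.
      lam, tau and n_iters are illustrative assumptions, not course-prescribed settings."""
      u = f.copy()
      for _ in range(n_iters):
          # 5-point Laplacian with replicated (Neumann-like) boundary conditions
          up = np.pad(u, 1, mode="edge")
          lap = (up[:-2, 1:-1] + up[2:, 1:-1] +
                 up[1:-1, :-2] + up[1:-1, 2:] - 4.0 * u)
          grad_E = (u - f) - lam * lap   # gradient of the energy at u
          u = u - tau * grad_E           # one descent step
      return u

  if __name__ == "__main__":
      rng = np.random.default_rng(0)
      clean = np.zeros((64, 64))
      clean[16:48, 16:48] = 1.0
      noisy = clean + 0.2 * rng.standard_normal(clean.shape)
      denoised = denoise_gradient_descent(noisy)
      print("noisy MSE   :", np.mean((noisy - clean) ** 2))
      print("denoised MSE:", np.mean((denoised - clean) ** 2))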


Learning Outcomes

  1. CA06 (Competence) Achieve the objectives of a computer vision project carried out in a team.
  2. KA02 (Knowledge) Identify the functionals that should be optimised on images to obtain the solution to a vision problem.
  3. KA09 (Knowledge) Select the best algorithms that can be used to optimise the functionals involved in solving a vision problem.
  4. SA02 (Skill) Apply and evaluate optimisation techniques on images to solve a specific problem.
  5. SA09 (Skill) Select the best software tools to code optimisation techniques on images for solving a specific problem.
  6. SA15 (Skill) Prepare a report that describes, justifies and illustrates the development of a vision project.
  7. SA17 (Skill) Prepare oral presentations that allow debate of the results of a vision project.

Content

  1. Introduction to optimization problems and energy minimization methods. Examples and overview of a variational formulation.
  2. Review of numerical linear algebra: least squares methods, singular value decomposition, pseudoinverse, iterative methods. Applications. 
  3. Numerical techniques for variational problems: Gateaux derivative, Euler-Lagrange equation and gradient methods. Applications: denoising, image inpainting and Poisson editing. The Backpropagation strategy for gradient computation. Gradient descent optimization algorithms useful for deep learning strategies.
  4. Convex optimization. Constrained and unconstrained optimization. Duality principles and methods. Non-convex problems and convex relaxation. Applications: Total Variation restoration, disparity computation, optical flow computation.
  5. Segmentation with variational models. The Mumford-Shah functional. Explicit and implicit shape representations. Level-set formulation.
  6. Bayesian networks and Markov random fields (MRFs). Inference types. Main inference algorithms. Examples: stereo, denoising.
  7. Inference algorithms. Belief propagation: message passing, loopy belief propagation. Example: inference for segmentation.
  8. Sampling methods: particle-based methods, Markov Chain Monte Carlo, Gibbs sampling (see the illustrative sketch after this list).
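
As a small taste of topics 6 and 8 above, the following sketch runs Gibbs sampling on an Ising-style Markov random field for binary image denoising. It is a minimal illustration written for this guide, assuming only NumPy; the model, the parameters beta and eta, and the function names are hypothetical choices, not material supplied by the module.

  import numpy as np

  def gibbs_denoise_binary(y, beta=1.0, eta=1.5, n_sweeps=30, burn_in=10, seed=0):
      """Gibbs sampling on an Ising-style MRF with observations y in {-1, +1}.
      beta (pairwise coupling) and eta (data fidelity) are illustrative values.
      Returns the pixel-wise posterior mean estimated from post-burn-in samples."""
      rng = np.random.default_rng(seed)
      H, W = y.shape
      x = y.astype(float)
      accum = np.zeros_like(x)
      kept = 0
      for sweep in range(n_sweeps):
          for i in range(H):
              for j in range(W):
                  # sum over the 4-connected neighbours (missing neighbours contribute 0)
                  s = 0.0
                  if i > 0:
                      s += x[i - 1, j]
                  if i < H - 1:
                      s += x[i + 1, j]
                  if j > 0:
                      s += x[i, j - 1]
                  if j < W - 1:
                      s += x[i, j + 1]
                  # conditional probability of x[i, j] = +1 given its Markov blanket
                  p_plus = 1.0 / (1.0 + np.exp(-2.0 * (beta * s + eta * y[i, j])))
                  x[i, j] = 1.0 if rng.random() < p_plus else -1.0
          if sweep >= burn_in:
              accum += x
              kept += 1
      return accum / kept

  if __name__ == "__main__":
      rng = np.random.default_rng(1)
      clean = -np.ones((48, 48))
      clean[12:36, 12:36] = 1.0
      flip = rng.random(clean.shape) < 0.15          # flip 15% of the pixels
      noisy = np.where(flip, -clean, clean)
      restored = np.where(gibbs_denoise_binary(noisy) > 0, 1.0, -1.0)
      print("noisy error rate   :", np.mean(noisy != clean))
      print("restored error rate:", np.mean(restored != clean))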

Activities and Methodology

Title Hours ECTS Learning Outcomes
Type: Directed      
Lecture sessions 20 0.8 CA06, KA02, KA09, SA02, SA09, SA15, SA17
Type: Supervised      
Project follow-up sessions 8 0.32 CA06, KA02, KA09, SA02, SA09, SA15, SA17
Type: Autonomous      
Homework 113 4.52 CA06, KA02, KA09, SA02, SA09, SA15, SA17

Directed sessions: (some of these sessions may be held online synchronously)

  • Lecture sessions, where the lecturers explain the general contents of each topic. Some of these sessions will be used to solve problems.

Supervised sessions:

  • Project sessions, where the problems and goals of the projects are presented and discussed, and students interact with the project coordinator about problems and ideas for solving the project (approx. 1 hour/week).
  • Presentation session, where the students give an oral presentation on how they have solved the project, together with a demo of the results.
  • Exam session, where the students are evaluated individually on knowledge achievements and problem-solving skills.

Autonomous work:

  • Students will autonomously study and work with the materials derived from the lectures.
  • Students will work in groups to solve the problems of the projects, with the following deliverables:
    • Code
    • Reports
    • Oral presentations

Annotation: Within the schedule set by the centre or degree programme, 15 minutes of one class will be reserved for students to evaluate their lecturers and their courses or modules through questionnaires.


Assessment

Continuous Assessment Activities

Title Weighting Hours ECTS Learning Outcomes
Exam 0.4 2.5 0.1 CA06, KA02, KA09, SA02, SA09, SA15, SA17
Project 0.55 6 0.24 CA06, KA02, KA09, SA02, SA09, SA15, SA17
Session attendance 0.05 0.5 0.02 CA06, KA02, KA09, SA02, SA09, SA15, SA17

The final mark for this module will be computed with the following formula:

 Final Mark = 0.4 x Exam + 0.55 x Project + 0.05 x Attendance

 where

Exam: the mark obtained in the module exam (must be >= 3). This mark can be increased with extra points awarded for exercises delivered in specific lectures, but only if the exam mark is greater than 3.

Attendance: the mark derived from the record of attendance at lectures (minimum 70%).

Project: the mark given by the project coordinator based on the weekly follow-up of the project and the deliverables (must be >= 5), according to specific criteria such as:

  • Participation in discussion sessions and in team work (inter-member evaluations)
  • Delivery of mandatory and optional exercises.
  • Code development (style, comments, etc.)
  • Report (justification of the decisions in your project development)
  • Presentation (Talk and demonstrations on your project)
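
For illustration, with hypothetical marks of 6.0 in the exam, 7.0 in the project and full attendance (10), the final mark would be 0.4 x 6.0 + 0.55 x 7.0 + 0.05 x 10 = 2.4 + 3.85 + 0.5 = 6.75.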

Only those students who fail (Final Mark < 5.0) can take a retake exam.


Bibliography

Journal articles:

  1. Xavier Bresson and Tony F. Chan. "Fast Dual Minimization of the Vectorial Total Variation Norm and Applications to Color Image Processing". Inverse Problems and Imaging, American Institute of Mathematical Sciences, Vol. 2, No. 4, pp. 455-484, 2008.
  2. Tony F. Chan and Luminita A. Vese. "Active Contours Without Edges". IEEE Transactions on Image Processing, 10(2), pp. 266-277, 2001.
  3. Patrick Pérez, Michel Gangnet, and Andrew Blake. "Poisson Image Editing". ACM SIGGRAPH 2003 Papers (SIGGRAPH '03), ACM, New York, NY, USA, pp. 313-318, 2003.
  4. L.I. Rudin, S. Osher, and E. Fatemi. "Nonlinear Total Variation Based Noise Removal Algorithms". Physica D: Nonlinear Phenomena, 60, pp. 259-268, November 1992.
  5. Sebastian Ruder. "An Overview of Gradient Descent Optimization Algorithms". arXiv preprint arXiv:1609.04747, 2016.

 Books:

  1. S.P. Boyd and L. Vandenberghe. "Convex Optimization". Cambridge University Press, 2004.
  2. Tony F. Chan and Jianhong Shen. "Image Processing and Analysis: Variational, PDE, Wavelet and Stochastic Methods". Society for Industrial and Applied Mathematics, 2005.
  3. J. Nocedal and S.J. Wright. "Numerical Optimization". Springer-Verlag, 1999.
  4. Gilles Aubert and Pierre Kornprobst. "Mathematical Problems in Image Processing: Partial Differential Equations and the Calculus of Variations". Springer-Verlag New York.
  5. Joe D. Hoffman. "Numerical Methods for Engineers and Scientists".
  6. Daphne Koller and Nir Friedman. "Probabilistic Graphical Models: Principles and Techniques". MIT Press, 2009.
  7. Sebastian Nowozin and Christoph H. Lampert. "Structured Learning and Prediction in Computer Vision". Foundations and Trends in Computer Graphics and Vision, Vol. 6, No. 3-4, pp. 185-365, 2011.
  8. C. Pozrikidis. "Numerical Computation in Science and Engineering".

Software

Python programming, with special attention to image processing and optimization libraries.

Language list

Name Group Language Semester Turn
(PLABm) Practical laboratories (master) 1 English first semester morning-mixed
(TEm) Theory (master) 1 English first semester morning-mixed