2020/2021

Optimisation and Inference for Computer Vision

Code: 43086 ECTS Credits: 6
Degree Type Year Semester
4314099 Computer Vision OB 0 1
The proposed teaching and assessment methodology that appear in the guide may be subject to changes as a result of the restrictions to face-to-face class attendance imposed by the health authorities.

Contact

Name:
Maria Vanrell Martorell
Email:
Maria.Vanrell@uab.cat

Use of Languages

Principal working language:
English (eng)

Teachers

Coloma Ballester Nicolau
Juan Francisco Garamendi Bragado
Karim Lekadir
Oriol Ramos Terrades

External teachers

Adrián Martín

Prerequisites

Degree in Engineering, Mathematics, Physics or similar.

Objectives and Contextualisation

Module Coordinator: Dr. Coloma Ballester

The aim of this module is to learn about the optimization algorithms and inference techniques behind many tasks in computer vision. The main concepts include energy formulation and minimization, numerical techniques for variational problems, gradient descent optimization algorithms and tools useful for deep learning strategies, convex optimization, and graphical models. These techniques will be applied in the project in the context of image segmentation and inpainting.

Competences

  • Accept responsibilities for information and knowledge management.
  • Choose the most suitable software tools and training sets for developing solutions to problems in computer vision.
  • Conceptualise alternatives to complex solutions for vision problems and create prototypes to show the validity of the system proposed.
  • Continue the learning process, to a large extent autonomously.
  • Identify concepts and apply the most appropriate fundamental techniques for solving basic problems in computer vision.
  • Plan, develop, evaluate and manage solutions for projects in the different areas of computer vision.
  • Solve problems in new or little-known situations within broader (or multidisciplinary) contexts related to the field of study.
  • Understand, analyse and synthesise advanced knowledge in the area, and put forward innovative ideas.
  • Use acquired knowledge as a basis for originality in the application of ideas, often in a research context.
  • Work in multidisciplinary teams.

Learning Outcomes

  1. Accept responsibilities for information and knowledge management.
  2. Choose optimisation and inference techniques and train them to resolve a particular project.
  3. Continue the learning process, to a large extent autonomously.
  4. Identify the basic concepts of graphical models and inference algorithms.
  5. Identify the basic optimisation techniques and their associated algorithms.
  6. Identify the best representations that can be defined for solving both optimisation and inference problems with graphical models.
  7. Solve problems in new or little-known situations within broader (or multidisciplinary) contexts related to the field of study.
  8. Understand, analyse and synthesise advanced knowledge in the area, and put forward innovative ideas.
  9. Use acquired knowledge as a basis for originality in the application of ideas, often in a research context.
  10. Use optimisation and inference techniques to plan, develop, evaluate and manage a solution to a particular problem.
  11. Work in multidisciplinary teams.

Content

  1. Introduction to optimization problems and energy minimization methods. Examples and overview of a variational formulation.
  2. Review of numerical linear algebra: least squares methods, singular value decomposition, pseudoinverse, iterative methods. Applications. 
  3. Numerical techniques for variational problems: Gâteaux derivative, Euler-Lagrange equation and gradient methods. Applications: denoising, image inpainting and Poisson editing. The backpropagation strategy for gradient computation. Gradient descent optimization algorithms useful for deep learning strategies.
  4. Convex optimization. Constrained and unconstrained optimization. Duality principles and methods. Non-convex problems and convex relaxation. Applications: Total Variation restoration, disparity computation, optical flow computation.
  5. Segmentation with variational models. The Mumford-Shah functional. Explicit and implicit shape representations. Level-set formulation.
  6. Bayesian networks and MRFs. Inference types. Main inference algorithms. Examples: stereo, denoising.
  7. Inference algorithms. Belief propagation: message passing, loopy belief propagation. Example: inference for segmentation.
  8. Sampling methods: Particle-based methods, Markov Chain Monte Carlo, Gibbs Sampling.
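As a small illustration of the numerical linear algebra in item 2 (a sketch with made-up data, not part of the official course material), the least-squares solution of an overdetermined linear system can be computed via the Moore-Penrose pseudoinverse:

```python
import numpy as np

# Overdetermined system A x = b: fit a line y = c0 + c1 * t to noisy samples.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.1, 2.9, 5.2, 6.8])          # noisy observations of y = 1 + 2 t
A = np.column_stack([np.ones_like(t), t])   # design matrix

# Least-squares solution via the pseudoinverse A+ = (A^T A)^{-1} A^T,
# computed internally from the SVD of A.
x_pinv = np.linalg.pinv(A) @ b

# The same solution from the dedicated least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x_pinv)   # fitted coefficients, close to the true [1, 2]
```

Both routes give the same minimizer of ||Ax - b||²; `pinv` makes the SVD construction explicit, while `lstsq` is the solver one would use in practice.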
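The gradient-descent minimization of a variational energy in item 3 can be sketched on a simple quadratic (Tikhonov) denoising energy; the signal, smoothness weight and step size below are illustrative assumptions, not course material:

```python
import numpy as np

# Gradient descent on the quadratic denoising energy
#   E(u) = 1/2 * ||u - f||^2 + (lam/2) * ||grad u||^2,
# whose (discrete) Euler-Lagrange gradient is (u - f) - lam * laplacian(u).

def laplacian(u):
    # 1-D discrete Laplacian with replicated (Neumann) boundary values.
    up = np.pad(u, 1, mode="edge")
    return up[:-2] - 2.0 * u + up[2:]

def energy(u, f, lam):
    grad = np.diff(u)
    return 0.5 * np.sum((u - f) ** 2) + 0.5 * lam * np.sum(grad ** 2)

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0.0, 2.0 * np.pi, 100))
f = clean + 0.3 * rng.standard_normal(100)    # noisy input signal

u, lam, tau = f.copy(), 2.0, 0.2              # init, smoothness weight, step size
for _ in range(200):
    u = u - tau * ((u - f) - lam * laplacian(u))
```

The step size must satisfy tau < 2/L, where L = 1 + 4*lam bounds the gradient's Lipschitz constant; here 0.2 < 2/9, so the energy decreases monotonically and the iterates smooth the noise while keeping the data term small.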
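For items 6-8, a minimal sketch (with made-up image and parameters, not course material) of Gibbs sampling on an Ising MRF for binary image denoising:

```python
import numpy as np

# Posterior over labels x_i in {-1, +1} given noisy labels y:
#   p(x | y) proportional to exp(beta * sum_edges x_i x_j + eta * sum_i x_i y_i)

rng = np.random.default_rng(1)
clean = -np.ones((16, 16))
clean[4:12, 4:12] = 1.0                        # white square on black background
flip = rng.random(clean.shape) < 0.1           # flip 10% of the pixels
y = np.where(flip, -clean, clean)              # noisy observed labels

beta, eta = 1.5, 1.0                           # smoothness coupling, data weight
x = y.copy()
votes = np.zeros_like(x)
for sweep in range(30):
    for i in range(16):
        for j in range(16):
            # Sum over the 4-neighbourhood; missing neighbours contribute 0.
            s = 0.0
            if i > 0:
                s += x[i - 1, j]
            if i < 15:
                s += x[i + 1, j]
            if j > 0:
                s += x[i, j - 1]
            if j < 15:
                s += x[i, j + 1]
            # Conditional p(x_ij = +1 | rest) from the two local energies.
            a = beta * s + eta * y[i, j]
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * a))
            x[i, j] = 1.0 if rng.random() < p_plus else -1.0
    if sweep >= 10:                            # discard burn-in sweeps
        votes += x

# Approximate marginal posterior mode: per-pixel majority vote over samples.
x_mpm = np.where(votes >= 0.0, 1.0, -1.0)
```

Averaging the post-burn-in samples approximates the per-pixel posterior marginals, so the majority vote is a (maximum posterior marginal) estimate that is more stable than any single Gibbs sample.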

Methodology

Supervised sessions: (Synchronous on-line sessions)

  • Lecture sessions, where the lecturers explain the general contents of the topics. Some of these sessions are devoted to problem solving.

Directed sessions: (Synchronous on-line sessions)

  • Project sessions, where the problems and goals of the projects are presented and discussed; students interact with the project coordinator about problems and ideas for solving the project (approx. 1 hour/week).
  • Presentation session, where the students give an oral presentation on how they solved the project, together with a demo of the results.
  • Exam session, where the students are evaluated individually on their knowledge and problem-solving skills.

Autonomous work:

  • Students will autonomously study and work with the materials derived from the lectures.
  • Students will work in groups to solve the problems of the projects, with the following deliverables:
    • Code
    • Reports
    • Oral presentations

Activities

Title Hours ECTS Learning Outcomes
Type: Directed      
Lecture sessions 20 0.8 8, 4, 6, 5, 3, 2, 9, 10
Type: Supervised      
Project follow-up sessions 8 0.32 1, 8, 4, 6, 5, 7, 3, 2, 11, 10
Type: Autonomous      
Homework 113 4.52 1, 4, 6, 5, 7, 3, 2, 11

Assessment

The final marks for this module will be computed with the following formula:

Final Mark = 0.4 x Exam + 0.55 x Project + 0.05 x Attendance

where

Exam: the mark obtained in the Module Exam.

Attendance: the mark derived from recorded attendance at lectures (minimum 70%).

Project: the mark given by the project coordinator, based on the weekly follow-up of the project and its deliverables, in accordance with specific criteria such as:

  • Participation in discussion sessions and in team work (inter-member evaluations)
  • Delivery of mandatory and optional exercises.
  • Code development (style, comments, etc.)
  • Report (justification of the decisions in your project development)
  • Presentation (Talk and demonstrations on your project)

The Exam mark can be increased by extra points gained from delivered exercises proposed in certain lectures, but only if Exam Mark is greater than 3.

Only students who fail (Final Mark < 5.0) may sit the retake exam.
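As a quick numeric check of the weighting above (the marks below are made-up examples, not real data), the formula can be evaluated directly:

```python
# Illustrative sketch of the final-mark formula from this guide.
def final_mark(exam, project, attendance):
    # Weights from the module guide: 0.4 exam + 0.55 project + 0.05 attendance.
    return 0.4 * exam + 0.55 * project + 0.05 * attendance

# Example: exam 7.0, project 8.0, attendance 10.0 -> 2.8 + 4.4 + 0.5 = 7.7
print(round(final_mark(7.0, 8.0, 10.0), 2))
```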

Assessment Activities

Title Weighting Hours ECTS Learning Outcomes
Exam 0.4 2.5 0.1 8, 4, 6, 5, 7, 9
Project 0.55 6 0.24 1, 4, 6, 5, 7, 3, 2, 11, 10
Session attendance 0.05 0.5 0.02 1, 8, 3, 9, 11

Bibliography

Journal articles:

  1. Xavier Bresson and Tony F. Chan. “Fast Dual Minimization of the Vectorial Total Variation Norm and Applications to Color Image Processing”. Inverse Problems and Imaging, American Institute of Mathematical Sciences, Vol. 2, No. 4, pp. 455-484, 2008.
  2. T. F. Chan and L. A. Vese. “Active Contours Without Edges”. IEEE Transactions on Image Processing, 10(2), pp. 266-277, 2001.
  3. Patrick Pérez, Michel Gangnet, and Andrew Blake. “Poisson Image Editing”. In ACM SIGGRAPH 2003 Papers (SIGGRAPH '03), ACM, New York, NY, USA, pp. 313-318, 2003.
  4. L. I. Rudin, S. Osher, and E. Fatemi. “Nonlinear Total Variation Based Noise Removal Algorithms”. Physica D: Nonlinear Phenomena, 60, pp. 259-268, November 1992.
  5. Sebastian Ruder. “An Overview of Gradient Descent Optimization Algorithms”. arXiv preprint arXiv:1609.04747, 2016.

 Books:

  1. S.P. Boyd, L. Vandenberghe, "Convex optimization",  Cambridge University Press, 2004.
  2. Tony F. Chan and Jianhong Shen. “Image Processing and Analysis: Variational, PDE, Wavelet and Stochastic Methods”. Society for Industrial and Applied Mathematics, 2005.
  3. J. Nocedal, S.J. Wright, “Numerical optimization”, Springer Verlag, 1999.
  4. Aubert Gilles, Pierre Kornprobst.  “Mathematical Problems in Image Processing:  Partial Differential Equations and the Calculus of Variations”.  Springer-Verlag New York.
  5. Joe D. Hoffman. “Numerical Methods for Engineers and Scientists”.
  6. Daphne Koller and Nir Friedman, "Probabilistic Graphical Models: Principles and Techniques", 2009.
  7. Sebastian Nowozin and Christoph H. Lampert, "Structured Learning and Prediction in Computer Vision", Foundations and Trends in Computer Graphics and Vision: Vol. 6: No. 3-4, pp 185-365, 2011.
  8. C. Pozrikidis. “Numerical Computation in Science and Engineering”.