The 35th Jyväskylä Summer School (https://jyu.fi/jss) will be organized at the University of Jyväskylä, Finland, on August 3-14, 2026. It offers one-week intensive courses in various fields of science and IT. Teaching is provided in English by international top-level lecturers, and participation in the courses is free of charge. The application period ends on April 30, 2026.
Please note that Prof. Hisao Ishibuchi and Dr. Lie Meng Pang (Southern University of Science and Technology, China) will deliver the course "COM1: Evolutionary Multi-Objective Optimization" on August 10-14, 2026. It is well suited for master's and PhD students, as well as postdocs, interested in gaining a compact yet comprehensive understanding of evolutionary algorithms for solving multiobjective optimization problems.
Real-world optimization problems generally involve multiple objectives and can therefore be formulated as multiobjective optimization problems. Such problems do not have a single optimal solution but multiple tradeoff solutions, since no single solution can optimize all objectives simultaneously. In practice, however, multiobjective optimization problems are usually handled as single-objective problems to find a single solution, either by focusing only on a main objective or by combining the objectives into a scalarizing function. In this course, students will learn how to handle multiple objectives to find multiple candidate solutions by considering the tradeoff relation among the objectives. Emphasis will be placed on the evolutionary multiobjective optimization (EMO) approach, where a variety of solutions with different tradeoffs are evolved as a population to search for the entire tradeoff front of a multiobjective optimization problem. This course will address the following topics:
- Formulations of single-objective and multiobjective optimization problems with some examples
- Pareto optimality and its relation to the objective space dimensionality
- Scalarizing functions and their contour lines
- Decision maker's role and preference information
- The EMO approach, the MCDM (multiple criteria decision making) approach, and their hybrids
- Basic framework of single-objective evolutionary algorithms
- Basic framework of multiobjective evolutionary algorithms
- Search behavior analysis of NSGA-II and its modifications
- Related websites: PlatEMO and Pymoo
- Search behavior analysis of MOEA/D and its modifications
- Search behavior analysis of SMS-EMOA and its modifications
- Search behavior analysis of NSGA-III and its modifications
- Difficulties in performance comparison of EMO algorithms
- Performance indicators: Uniformity, s-energy, GD, IGD, IGD+ and HV
- Anytime performance analysis
- Population size specification for performance comparison
- Artificial test problems and real-world problems
- Performance improvement of EMO algorithms: Archiving and initialization
- Constraint handling in EMO algorithms
- Special multiobjective problems: Many-objective, large-scale, and sparse problems
- Use of machine learning techniques for EMO algorithms
- Use of large language models for EMO research
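The Pareto-dominance idea described above can be illustrated in a few lines of Python. The sketch below is not course material; the bi-objective point set is made up, and minimization of both objectives is assumed:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the tradeoff solutions: points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Toy objective vectors (f1, f2); both objectives are minimized.
points = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(nondominated(points))  # [(1, 5), (2, 3), (4, 1)]
```

Here (3, 4) is dominated by (2, 3), and (5, 5) by every other point; the three surviving points are mutually nondominated tradeoffs, which is exactly the kind of set an EMO algorithm evolves as its population.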
Learning outcomes: After completing the course, students will have a clear understanding of evolutionary algorithms and evolutionary multiobjective optimization. They will be familiar with basic concepts in multiobjective optimization such as Pareto dominance and Pareto fronts, representative multiobjective evolutionary algorithms such as NSGA-II, MOEA/D and SMS-EMOA, performance indicators such as GD, IGD and hypervolume, and some hot topics such as archiving and Pareto set learning. They will also understand the importance of fair performance comparison.
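For readers unfamiliar with the hypervolume indicator mentioned above, the following illustrative sketch (not course material) computes it for the simplest case: a two-objective minimization problem, given a nondominated front and a reference point, both of which are made-up values here:

```python
def hypervolume_2d(front, ref):
    """Hypervolume (minimization) of a 2-D nondominated front:
    the area dominated by the front and bounded by the reference point ref."""
    pts = sorted(front)  # ascending in f1; for a nondominated front, descending in f2
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)  # add the new horizontal slab
        prev_f2 = f2
    return hv

front = [(1, 5), (2, 3), (4, 1)]   # toy nondominated front
print(hypervolume_2d(front, ref=(6, 6)))  # 17.0
```

A larger hypervolume means the front dominates more of the objective space, which is why the indicator is widely used to compare EMO algorithms; efficient computation in higher dimensions is considerably more involved.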
Prerequisites: Participants are expected to have prior knowledge of the following concepts:
- Basics of probability theory
See the JSS website for more information about the application and other courses. Posted on 2026-04-16 by Sarah Fores