# Chapter 1: Introduction

## 1.1 Problem statement and examples

In this course we will deal with optimization problems. Such problems appear in many practical settings in almost all aspects of science and engineering. Mathematically, we can write the problem as

min f(x)                                          (1.1a)

subject to c_{i}(x) = 0,  i = 1, …, n_{E}         (1.1b)

           h_{i}(x) ≥ 0,  i = 1, …, n_{I}         (1.1c)

where x is an n-dimensional vector and the function f : R^{n} → R is called the objective function. The c_{i} are called equality constraints and the h_{i} are inequality constraints. Our goal is to find the vector x that solves (1.1), assuming that such a solution exists. For most applications we assume that f, the c_{i}, and the h_{i} are twice differentiable.
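As a concrete numerical illustration of a problem of the form (1.1), one can pose a small instance to a general-purpose solver. The following is a sketch using SciPy; the particular objective and constraints are hypothetical choices for illustration, not taken from the text:

```python
import numpy as np
from scipy.optimize import minimize

# A hypothetical instance of (1.1): minimize a smooth objective
# subject to one equality and one inequality constraint.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

# SciPy encodes constraints as c(x) = 0 ("eq") and h(x) >= 0 ("ineq"),
# matching the sign convention in (1.1b)-(1.1c).
constraints = [
    {"type": "eq", "fun": lambda x: x[0] + x[1] - 2.0},  # c(x) = x0 + x1 - 2 = 0
    {"type": "ineq", "fun": lambda x: x[0]},             # h(x) = x0 >= 0
]

res = minimize(f, x0=np.zeros(2), constraints=constraints, method="SLSQP")
print(res.x)  # minimizer of the constrained problem
```

For this instance the solution can be checked by hand via the equality constraint: substituting x1 = 2 − x0 into f and minimizing gives x = (0.5, 1.5), with the inequality constraint inactive.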

**Example 1 (Minimization of a function in 1D).** Let f(x) = x^{2}; then an obvious solution is x = 0. If we add the inequality constraint h(x) = x − 1 ≥ 0, then the solution is x = 1. If on the other hand we add the inequality constraint h(x) = x + 1 ≥ 0, then the solution is the same as the solution of the problem with no constraints, and we obtain x = 0. We see that a given inequality constraint can be either active or inactive. Finally, consider the case h_{1}(x) = x − 1 ≥ 0 and h_{2}(x) = −1 − x ≥ 0. In this case we see that the constraints are inconsistent and the problem has no solution.
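The active/inactive behavior in Example 1 can be checked numerically. A minimal sketch with SciPy (the choice of solver is ours, not the text's):

```python
from scipy.optimize import minimize

f = lambda x: x[0] ** 2  # f(x) = x^2 from Example 1

# Unconstrained problem: minimizer is x = 0.
x_free = minimize(f, x0=[3.0]).x[0]

# Active constraint h(x) = x - 1 >= 0: the minimizer moves to the
# constraint boundary, x = 1.
x_active = minimize(
    f, x0=[3.0], method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda x: x[0] - 1.0}],
).x[0]

# Inactive constraint h(x) = x + 1 >= 0: the unconstrained minimizer
# x = 0 already satisfies it, so the solution is unchanged.
x_inactive = minimize(
    f, x0=[3.0], method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda x: x[0] + 1.0}],
).x[0]

print(x_free, x_active, x_inactive)
```

Running the inconsistent pair x − 1 ≥ 0 and −1 − x ≥ 0 through the same solver would report failure to find a feasible point, matching the conclusion of the example.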

**Example 2 (Data fitting).**
