
Questions tagged [non-smooth-optimization]

For questions related to non-smooth optimization.

0 votes
0 answers
15 views

How to Understand the Resisting Oracle for Feasibility Problem (Yurii Nesterov, Lectures on Convex Optimization)

I am struggling to understand Section 3.2.7 (Complexity Bounds in Finite Dimension) as a whole in Yurii Nesterov's [Lectures on Convex Optimization]. I tried to search for similar results throughout the ...
asked by HansEtherious
0 votes
0 answers
24 views

Connection between minimal norm subgradient and steepest descent

In my optimization course it was mentioned that for a non-smooth convex function, the subgradient with smallest norm provides the direction of steepest descent of the function, so if $f: \mathbb{R}^n \...
asked by Len (123)
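For context, the result this question asks about can be summarized as follows (a standard statement for a convex function $f$ at a non-optimal point, not taken from the course in question):

```latex
% The directional derivative of convex f along a unit direction d is
%   f'(x; d) = max_{g in ∂f(x)} <g, d>,
% and minimizing this over ||d|| <= 1 yields the steepest descent direction
\[
  g^{\ast} = \operatorname*{arg\,min}_{g \in \partial f(x)} \|g\|,
  \qquad
  d^{\ast} = -\frac{g^{\ast}}{\|g^{\ast}\|},
  \qquad
  f'(x; d^{\ast}) = -\|g^{\ast}\|.
\]
```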
0 votes
1 answer
51 views

Why are piecewise linear functions semismooth?

I came across the following exercise: proving that for the minimum function $f(a,b) = \min(a,b)$ with $x = (a,b)^T \in \mathbb{R}^2$, it holds for every point that $\sup |f(x+s)-f(x) - Ms| = O(||s||...
asked by max_121 (779)
7 votes
1 answer
263 views

When do two functions have the same subdifferentials?

For two functions $f$ and $g$, if $\nabla f(x) = \nabla g(x)$ for all $x$, then $f = g + c$ for some constant $c$. Does the same hold if the gradient is replaced by the (convex) subdifferential, i.e. $\partial f(x) = \...
asked by P. Camilleri
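For reference, the classical affirmative answer is Rockafellar's integration theorem for convex subdifferentials (stated here from memory, so the precise hypotheses should be checked): for proper, convex, lower semicontinuous functions on $\mathbb{R}^n$,

```latex
\[
  \partial f(x) = \partial g(x) \quad \forall x
  \quad\Longrightarrow\quad
  f = g + c \ \text{ for some constant } c \in \mathbb{R}.
\]
```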
0 votes
1 answer
40 views

Directional derivative of maximum function

For convex functions $f_i: \mathbb{R}^n \to \mathbb{R}$, $1 \le i \le m$, let $f:\mathbb{R}^n \to \mathbb{R}$ be defined via $f(x) = \max_{i \in [m]} f_i(x)$. I want to prove that the directional ...
asked by Azgen (39)
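The standard formula this question is aiming at (for finitely many convex $f_i$, each finite on $\mathbb{R}^n$) uses the active index set at $x$:

```latex
\[
  f'(x; d) = \max_{i \in I(x)} f_i'(x; d),
  \qquad
  I(x) = \{\, i \in [m] : f_i(x) = f(x) \,\},
\]
% and correspondingly ∂f(x) = conv( ∪_{i ∈ I(x)} ∂f_i(x) ).
```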
0 votes
0 answers
53 views

Dynamics of Loss in Homogeneous, Non-Smooth Models Using Clarke Subdifferential

tl;dr: Seeking insights on the application of Clarke subdifferential for analyzing the optimization differential inclusion with smooth objective and homogeneous model. I'm interested in its validity, ...
asked by Zach466920 (8,361)
0 votes
0 answers
28 views

Question about Newton derivative of $\max(0,x)$.

From page 204 of this book: a function $f\colon X \to Y$ between Banach spaces is called Newton differentiable at $x$ if $$\lim_{h \to 0}\frac{\lVert f(x+h)-f(x)-f'(x+h)(h)\rVert}{\lVert h\rVert} =0$$ ...
asked by math_guy (465)
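A quick numerical sanity check of the definition (a sketch, not from the referenced book): take the candidate Newton derivative of $f(x)=\max(0,x)$ to be the step function $G(x)=\mathbf{1}_{x>0}$ evaluated at the *perturbed* point $x+h$, and verify that the Newton residual vanishes at $x=0$.

```python
# Candidate Newton derivative of f(x) = max(0, x): a step function
# (its value at 0 can be chosen arbitrarily). At x = 0 the residual
#   |f(x+h) - f(x) - G(x+h) * h| / |h|
# is exactly 0 for every h != 0, so the Newton-differentiability limit holds.

def f(x):
    return max(0.0, x)

def G(x):
    # candidate Newton derivative; the value at x = 0 is arbitrary
    return 1.0 if x > 0 else 0.0

def newton_residual(x, h):
    return abs(f(x + h) - f(x) - G(x + h) * h) / abs(h)

residuals = [newton_residual(0.0, h) for h in (1e-1, -1e-1, 1e-6, -1e-6)]
print(residuals)  # each residual is exactly 0.0 at x = 0
```

The key point the check illustrates: Newton differentiability evaluates the derivative candidate at $x+h$, not at $x$, which is why a step function works at the kink.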
1 vote
1 answer
88 views

Let $f$ be convex, and $g$ be a convex surrogate of $f$. Does a Lipschitz-smooth $g-f$ imply $f$ and $g$ have the same subgradients?

Let $f,g : \mathbb{R}^N \to \mathbb{R}$ be convex functions, but not necessarily differentiable. Suppose $g$ is a 'majorant' or 'surrogate' of $f$ at $\xi$ with the following properties: $g(x) \geq f(...
asked by Pentaki (369)
0 votes
0 answers
49 views

Same result in every iteration from subgradient and proximal gradient method

I'm trying to implement the subgradient method and the proximal gradient method with constant step size for the lasso problem, but the results for the subgradient and proximal gradient methods are almost ...
asked by Help me pls
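A minimal sketch (the data and names are illustrative, not from the post) showing why the two methods can produce near-identical iterates with a small constant step size: on coordinates where the iterate stays away from zero, the sign subgradient and the soft-thresholding step agree.

```python
import numpy as np

# One iteration of each method on the lasso objective
#   F(x) = 0.5 * ||A x - b||^2 + lam * ||x||_1

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam, t = 0.1, 1e-3                # regularization weight, step size
x = rng.standard_normal(5)

grad = A.T @ (A @ x - b)          # gradient of the smooth part

# Subgradient step: sign(x) is a subgradient of ||x||_1
x_sub = x - t * (grad + lam * np.sign(x))

# Proximal gradient (ISTA) step: soft-threshold the gradient step
z = x - t * grad
x_prox = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)

print(np.max(np.abs(x_sub - x_prox)))  # tiny: the updates differ only
                                       # near coordinates crossing zero
```

The two updates diverge only when a coordinate hits or crosses zero, where soft-thresholding pins it exactly to zero while the subgradient step does not; with a tiny constant step size that difference is of order `t * lam` per iteration, which explains the "almost identical" trajectories.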
0 votes
0 answers
49 views

Example of empty Clarke subdifferential for function lipschitz over a closed convex set

If $f:\mathbb{R}^n\rightarrow\mathbb{R}\cup\{+\infty\}$ is a proper lower semi-continuous function that is Lipschitz continuous over $\text{dom}(f)$, where $\text{dom}(f):=\{x\in\mathbb{R}^n:f(x)<+\...
asked by William (997)
1 vote
0 answers
37 views

Optimality check for non-differentiable convex function

I am unsure how to check whether a given point of a convex but nondifferentiable function is a minimum/maximum. For instance, let's say that I have the following function: $$ f \left( x \right) = ...
asked by TheLearner
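The standard test behind this question: for convex $f$, a point $x^\ast$ is a global minimizer iff $0 \in \partial f(x^\ast)$. A toy sketch for the illustrative choice $f(x) = |x|$ (not the function from the post), whose subdifferential at the kink is the interval $[-1, 1]$:

```python
def subdifferential_abs(x):
    """Subdifferential of f(x) = |x|, returned as an interval (lo, hi)."""
    if x > 0:
        return (1.0, 1.0)
    if x < 0:
        return (-1.0, -1.0)
    return (-1.0, 1.0)            # the whole interval [-1, 1] at the kink

def is_minimizer(x):
    """Optimality test for convex f: x minimizes f iff 0 is in ∂f(x)."""
    lo, hi = subdifferential_abs(x)
    return lo <= 0.0 <= hi

print(is_minimizer(0.0), is_minimizer(0.5))  # True False
```

Note there is no analogous first-order test for a *maximum* of a convex function; maxima of a convex function over a compact convex set are attained at extreme points, not detected via the subdifferential at interior points.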
0 votes
0 answers
33 views

Smoothness and convexity of an optimization problem

Suppose we have an optimization problem in high dimension. How can we detect whether it is smooth, convex, or neither, so that we can apply the appropriate tools to solve it? Thanks!
asked by fatemeh-g
1 vote
0 answers
166 views

Proximal point, Moreau envelope, grad. of Moreau env. when domain is constrained / subset of $\mathbb{R}^d$

Let $f:\mathcal{K}\rightarrow \mathbb{R}$ be a convex function, with $\mathcal{K}\subset \mathbb{R}^d$ a convex set, i.e. the domain of $f$ is constrained, a strict subset of $\mathbb{R}^d$. How do we ...
asked by shnnnms (313)
2 votes
1 answer
311 views

Subgradient method for nonconvex nonsmooth function

Gradient descent or stochastic gradient descent is frequently used to find stationary points (and in some cases even local minima) of a nonconvex function. I was wondering if the same can be said ...
asked by blueArrow
1 vote
1 answer
123 views

Algorithms/Solvers for Hard Constrained Non-Linear Optimization Problems - Model Predictive Control Example

I have an autonomous robotic swarm path planning/control problem where a set of "leader" robots have predefined (nontrivial) dynamics in the control set, and "follower" robots are ...
asked by Gabriel Souza
