
I am looking for a way to optimize the function $f$, defined below.

First, fix a positive integer $k \le n$ and let $c_1$ and $c_2$ be vectors in $\mathbb{R}^n$. Let $g : \mathbb{R} \to \mathbb{R}$ be an increasing concave function.

Now define $$f(S) = g(c_1 \cdot S) + g(c_2 \cdot S),$$

where $S$ is a vector in $\mathbb{R}^n$ and $\cdot$ denotes the dot product. I want to maximize $f$ over all 0–1 vectors $S$ with exactly $k$ coordinates equal to 1 and the rest 0, i.e., over indicator vectors of $k$-element subsets of $\{1, \dots, n\}$.
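
For concreteness, here is a minimal sketch of evaluating $f$ at a candidate $S$; the choice $g(x) = \log(1+x)$ and the vectors below are placeholders I made up, not part of the problem:

```python
import numpy as np

def f(S, c1, c2, g):
    """Objective g(c1 . S) + g(c2 . S), evaluated at a 0-1 vector S."""
    return g(c1 @ S) + g(c2 @ S)

# Placeholder data: g(x) = log(1 + x) is increasing and concave on [0, inf).
g = np.log1p
c1 = np.array([3.0, 1.0, 4.0, 1.0, 5.0])
c2 = np.array([2.0, 7.0, 1.0, 8.0, 2.0])
S = np.array([1.0, 0.0, 1.0, 0.0, 1.0])  # exactly k = 3 ones
print(f(S, c1, c2, g))  # g(12) + g(5)
```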

This is a combinatorial optimization problem, but I believe it may be solvable efficiently. By 'solve' I mean give an algorithm that finds a maximizer of $f$ and runs in a reasonable amount of time (polynomial in $n$?).

How would I do this?
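
For scale, the obvious brute-force baseline (also noted in the comments below) enumerates all $\binom{n}{k}$ possible supports, which takes $O(n^k)$ objective evaluations and so is polynomial in $n$ only for fixed $k$. A sketch, where `brute_force_max` is a name of my own invention:

```python
from itertools import combinations

import numpy as np

def brute_force_max(c1, c2, g, k):
    """Try every size-k support; O(C(n, k)) objective evaluations."""
    n = len(c1)
    best_val, best_S = -np.inf, None
    for support in combinations(range(n), k):
        S = np.zeros(n)
        S[list(support)] = 1.0
        val = g(c1 @ S) + g(c2 @ S)
        if val > best_val:
            best_val, best_S = val, S
    return best_S, best_val

# Example with the placeholder data from the sketch above:
# best_S, best_val = brute_force_max(c1, c2, np.log1p, k=3)
```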

Comments:

  • If you consider $k$ to be a constant then a simple brute-force search is polynomial in $n$. But I assume that is not what you are looking for... – sebastian, Feb 18, 2022 at 19:58
  • Yes, I'd like something which runs in $h(k) \cdot O(n^c)$ time for some constant $c$ independent of $k$; here $h$ can be an arbitrary function of $k$. – Feb 18, 2022 at 22:00
  • In $g(c_1 \cdot S) + g(c_2 \cdot S)$, does the $\cdot$ denote the dot product? (And hence $g$ is a real function from $\mathbb{R}$ to $\mathbb{R}$?) – VTand, Feb 19, 2022 at 1:37
  • @VTand yes, that's right. – Feb 19, 2022 at 19:02
