I am looking for a way to optimize the function $f$, defined below.
First, fix a positive integer $k \le n$ and let $c_1$ and $c_2$ be vectors in $\mathbb{R}^n$. Let $g$ be an increasing concave function of a single variable.
Now define $$f(S) = g(c_1 \cdot S) + g(c_2 \cdot S),$$
where $S$ is a vector in $\mathbb{R}^n$. I want to maximize $f$ over all vectors $S$ which have a 1 in exactly $k$ coordinates and 0 everywhere else.
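For concreteness, here is a brute-force baseline that enumerates all $\binom{n}{k}$ supports and evaluates $f$ on each; it is exponential in general, but useful as a correctness reference for any faster method. The choice $g = \sqrt{\cdot}$ and the sample vectors are my own illustrative assumptions (with nonnegative entries so that $\sqrt{\cdot}$ is defined and concave).

```python
from itertools import combinations
import math

def brute_force_max(c1, c2, k, g):
    """Enumerate all 0/1 vectors S with exactly k ones and return
    the best support set together with its objective value f(S)."""
    n = len(c1)
    best_set, best_val = None, float("-inf")
    for idx in combinations(range(n), k):
        s1 = sum(c1[i] for i in idx)  # c1 . S
        s2 = sum(c2[i] for i in idx)  # c2 . S
        val = g(s1) + g(s2)
        if val > best_val:
            best_set, best_val = set(idx), val
    return best_set, best_val

# Illustrative data: g = sqrt is increasing and concave on [0, inf).
c1 = [3.0, 1.0, 4.0, 1.0, 5.0]
c2 = [2.0, 7.0, 1.0, 8.0, 2.0]
best_set, best_val = brute_force_max(c1, c2, k=2, g=math.sqrt)
```

On this example the maximizer is the support $\{3, 4\}$, with value $\sqrt{6} + \sqrt{10}$.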
This is a combinatorial optimization problem, but I believe there may be a way to solve it efficiently. By 'solve' I mean give an algorithm that finds a maximizer of $f$ and runs in a reasonable amount of time (polynomial in $n$?).
How would I do this?