I have a resource allocation problem. There are $M$ users and $N$ resources (machines). A user can be assigned to multiple machines, but at most $B$ machines can be active at a time across all users.
The binary assignment vector is $x$, of length $M*N$, arranged user by user:
$x_1$: {user 1, machine 1}
$x_2$: {user 1, machine 2}
. . .
$x_N$: {user 1, machine $N$}
$x_{N+1}$: {user 2, machine 1}
$x_{N+2}$: {user 2, machine 2}
. . .
$x_{2*N}$: {user 2, machine $N$}
. . .
$x_{M*N}$: {user $M$, machine $N$}
I have modeled the constraint as
$$\|Ux\|_0 \le B$$
where $U$ is an $N \times (M*N)$ matrix designed to match the ordering of $x$: the 1's in row $j$ sit in the columns that pair the different users with machine $j$, so $(Ux)_j$ counts the users assigned to machine $j$.
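To make the setup concrete, here is a small sketch (with toy sizes I picked for illustration; the real instance has $M = N = 100$) of how $U$ is built from the ordering of $x$ and how the $L0$ constraint is evaluated:

```python
import numpy as np

# Toy instance for illustration (the real case has M = N = 100, B = 5).
M, N, B = 3, 4, 2  # 3 users, 4 machines, at most 2 machines active

# U has N rows and M*N columns. With x ordered user-major
# (user 1 on machines 1..N, then user 2, ...), the columns that
# refer to machine j are j, N+j, 2N+j, ...
U = np.zeros((N, M * N), dtype=int)
for j in range(N):
    U[j, j::N] = 1

# Example assignment: user 1 -> machine 1; user 2 -> machines 1 and 3.
x = np.zeros(M * N, dtype=int)
x[0] = 1          # {user 1, machine 1}
x[N + 0] = 1      # {user 2, machine 1}
x[N + 2] = 1      # {user 2, machine 3}

# (Ux)_j counts the users assigned to machine j, so the L0 norm of Ux
# is the number of activated machines.
active = np.count_nonzero(U @ x)
print(active)     # machines 1 and 3 are active, so this prints 2
assert active <= B
```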
I want a better model for this constraint without the $L0$ norm. The $L0$ norm can be linearized, but that introduces new variables and constraints, which I want to avoid.
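For reference, the standard linearization I am trying to avoid introduces a binary indicator $y_j$ for each machine (the $y_j$ are the new variables):

$$x_{(i-1)N+j} \le y_j \quad \forall\, i=1,\dots,M,\; j=1,\dots,N, \qquad \sum_{j=1}^{N} y_j \le B, \qquad y_j \in \{0,1\},$$

which adds $N$ binary variables and $M*N$ constraints (10,000 constraints at $M = N = 100$).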
I could also follow a different approach, as mentioned in How to linearize this L0 norm of a vector?, but that would introduce a large number of constraints, since $M$ and $N$ are very large, e.g., $M=100$, $N=100$, $B=5$.
Is there a better/easier modeling?