I'm working on a personal project and ran into an algorithm problem that stumped me. I need to know whether a user has given my program the correct parameters to create the number of random combinations they're asking for. Without getting too deep into the details, it essentially boils down to this:
An infinite number of children are arriving at Math StackExchange HQ for tours and meals.
Each child can choose one item from each category of Sandwiches, Fruits, and Chips (for a total of 3 items).
You have 50 PB&J, 40 turkey, and 10 bologna sandwiches. You also have 90 bananas and 10 apples. And finally, 25 each of ranch, nacho, vinegar, and plain chips.
If each child must take a unique combination of items, what is the greatest number of children you can feed before repeating a meal?
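Ignoring the inventory limits for a moment, the number of distinct meals is just the product of the subcategory counts: $3 \cdot 2 \cdot 4 = 24$. A quick sketch in Python (the item names are purely illustrative):

```python
from itertools import product

# Subcategories from the problem statement (counts set aside for now).
sandwiches = ["PB&J", "turkey", "bologna"]
fruits = ["banana", "apple"]
chips = ["ranch", "nacho", "vinegar", "plain"]

meals = list(product(sandwiches, fruits, chips))
print(len(meals))                             # 3 * 2 * 4 = 24 distinct meals
print(sum(1 for m in meals if "apple" in m))  # 3 * 1 * 4 = 12 involve an apple
```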
- The tricky part is, when a child removes an item, it can no longer be used in future combinations. For example, there are only 10 apples, but 12 possible combinations that involve an apple. Therefore, apples have become a limiting factor.
- What if there are multiple categories unbalanced in that way?
- What if there were $T$ categories, each having 100 items divided into $T_n$ subcategories, with each subcategory containing at least $m$ items?
- I need a general expression for the maximum number of combinations when the categories and subcategories are subject to change.
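For a concrete baseline, here is a greedy sketch of the capacity-limited count: walk every distinct combination and take it only while each of its items still has stock. For the instance above, this happens to be exact, because only one subcategory (apples: 10 items, 12 combinations) is scarce, so at least $12 - 10 = 2$ of the 24 combinations must be dropped, and the greedy pass attains $24 - 2 = 22$. With several scarce subcategories interacting, this is an integer packing problem and a single greedy pass gives only a lower bound, so treat this as a heuristic, not the general formula asked for:

```python
from itertools import product

def max_distinct_meals(categories):
    """Greedy lower bound on the number of distinct meals.

    'categories' is a list of dicts, one per category, mapping
    item name -> available count (item names assumed unique overall).
    """
    stock = {}
    for cat in categories:
        stock.update(cat)
    served = 0
    # product over dicts iterates their keys, i.e. the item names.
    for combo in product(*categories):
        if all(stock[item] > 0 for item in combo):
            for item in combo:
                stock[item] -= 1
            served += 1
    return served

categories = [
    {"PB&J": 50, "turkey": 40, "bologna": 10},               # sandwiches
    {"banana": 90, "apple": 10},                             # fruits
    {"ranch": 25, "nacho": 25, "vinegar": 25, "plain": 25},  # chips
]
print(max_distinct_meals(categories))  # 22: 24 combinations minus 2 blocked by apples
```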