Variations on this question are pretty common, but all my google-fu has left me stumped. I would like to calculate the odds of a roll of fair dice, and do it efficiently. There are lots of examples of how to do this, but every algorithm I've found runs in exponential time, which makes it useless for large numbers of dice with many sides.
The Simple Problem: Calculate the odds of rolling a total of n on x y-sided dice.
The Simple Solution: Generate the x-ary Cartesian product of the faces, sum each tuple, count how many sums hit the target, do a little division, and voila.
Example of a Simple Solution in Go: https://play.golang.org/p/KNUS4YBQC0g
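For reference, the brute-force idea above can be sketched like this (my own minimal version, not the code from the playground link; `naiveOdds` is a name I made up):

```go
package main

import "fmt"

// naiveOdds walks the x-ary Cartesian product of all faces
// (equivalent to materializing it), sums each outcome, and counts
// how many outcomes hit the target. Runs in O(sides^count) time.
func naiveOdds(count, sides, target int) float64 {
	total, hits := 0, 0
	var walk func(depth, sum int)
	walk = func(depth, sum int) {
		if depth == count {
			total++
			if sum == target {
				hits++
			}
			return
		}
		for face := 1; face <= sides; face++ {
			walk(depth+1, sum+face)
		}
	}
	walk(0, 0)
	return float64(hits) / float64(total)
}

func main() {
	// Odds of rolling a total of 7 on two six-sided dice: 6/36.
	fmt.Println(naiveOdds(2, 6, 7))
}
```

This is fine for a handful of small dice, but the `sides^count` recursion is exactly what blows up in the case below.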
The Simple Solution works perfectly. I extended it to allow for cases like dropping the highest/lowest n dice, and the results hold up to spot testing.
But consider {Count: 20, Sides: 20, DropHighest: 0, DropLowest: 0, Target: 200}.
If I evaluated that with the previous solution, my "table" would have some 104 septillion (20^20) cells and would max out the CPU pretty easily.
Is there a more efficient way to calculate probabilities for large numbers of dice with many sides? If so, can it account for more complex "success" conditions, like dropping some dice?
I'm convinced it's possible due to the existence of this beautiful website: https://anydice.com/program/969
EDIT:
The solution that worked best for me was David Eisenstat's answer, which I ported to go: https://play.golang.org/p/cpD51opQf5h