As far as references go, you're looking for big-O notation. It groups the "efficiency" of algorithms into classes like $O(\log n)$, $O(n^2)$, $O(n^5)$ and $O(2^n)$ (that last one being exponential time). Each class is asymptotically worse than the one before it, and exponential is way, way out there. The crazy thing is that there's no limit to how much worse. $O(n^2)$ might be only 10 times slower than $O(n \log n)$ for a smallish input, but 100 times slower for a larger one, a million times slower later on, and so on without bound as the input size increases. The full explanation is a bit tricky, but "big-O notation" is the search term you were asking for. It's covered in many 2nd-year algorithms classes.
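To make that gap concrete, here's a tiny, purely illustrative Python snippet (the input sizes are arbitrary, not from any real algorithm) that just compares the two growth functions directly. The ratio between them keeps increasing as $n$ grows:

```python
import math

# Illustrative only: compare the raw growth of n^2 versus n*log2(n).
# As n grows, the ratio between the two keeps increasing.
for n in [100, 10_000, 1_000_000]:
    n_squared = n * n
    n_log_n = n * math.log2(n)
    print(f"n = {n:>9,}: n^2 is about {n_squared / n_log_n:,.0f}x the cost of n*log n")
```

The constant factors of a real algorithm shift those numbers around, but they can't change the trend: an $O(n^2)$ algorithm eventually falls behind an $O(n \log n)$ one by any factor you care to name.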

As for the alien, suppose we find out what's different in their universe. Maybe their PCs can spawn an infinite number of pocket dimensions, each of which checks a different possible answer to see whether it's correct. When we tell them that can't be done in this universe, they might say, "Wow, so I guess solving exponential problems is very inefficient for you?"

Now back to references. Computer science actually has that idea of spawning unboundedly many parallel copies, which "solves" many exponential problems. Brute-force chess is an example: merely create one pocket dimension for each legal move, each pocket dimension will naturally repeat that for the next move, and so on. There will be gazillions of them, but each one only runs for a few hundred moves (which is nothing). This theoretical ability is the "nondeterministic" part of an NFA (nondeterministic finite automaton). NFAs are mostly thought experiments used in certain proofs, but -- and this may not be current -- there was talk that a quantum computer might be able to simulate that sort of nondeterministic machine (using quantum states?) and thus make a class of exponential problems easily solvable in this universe.
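In our universe, an ordinary program can only fake those parallel copies. Here's a minimal sketch of how that's usually done for an NFA, with a made-up three-symbol-lookahead automaton as the example (none of this comes from the paragraph above): instead of spawning one pocket dimension per choice, we track the set of states all the copies could be in.

```python
# Made-up example NFA: accepts binary strings whose 3rd symbol from the end is '1'.
# transitions[state][symbol] is a *set* of next states -- that's the nondeterminism.
transitions = {
    0: {"0": {0}, "1": {0, 1}},  # state 0 may "guess" this '1' is 3rd from the end
    1: {"0": {2}, "1": {2}},
    2: {"0": {3}, "1": {3}},
    3: {},                       # accepting state, no outgoing moves
}
accepting = {3}

def nfa_accepts(word: str) -> bool:
    current = {0}  # every "copy" starts in state 0
    for symbol in word:
        # Advance all copies at once by collecting every reachable next state.
        current = {nxt
                   for state in current
                   for nxt in transitions[state].get(symbol, set())}
    return bool(current & accepting)

print(nfa_accepts("0110"))  # True: 3rd symbol from the end is '1'
print(nfa_accepts("0011"))  # False
```

For plain NFAs this set-tracking trick stays cheap, which is exactly why the exciting (and hedged) part of the idea above is the bigger class of nondeterministic machines, where no such cheap simulation is known.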
