Amplitude estimation requires $O(1/\epsilon)$ queries (applications of the state-preparation unitary) to estimate an amplitude to absolute precision $\epsilon$. Is this optimal? Why can't we do better than this?
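To make the scaling I mean concrete, here is a rough numerical sketch (the amplitude value, trial counts, and helper names are my own choices, and it simulates only the textbook phase-estimation outcome distribution rather than a full circuit). It compares the error of phase-estimation-based amplitude estimation with $\sim 2^m$ Grover-operator applications against classical sampling with the same number of samples:

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.3                                  # true amplitude (success probability)
theta = np.arcsin(np.sqrt(a)) / np.pi    # a = sin^2(pi * theta)

def amplitude_estimation_error(m, trials=200):
    """Mean |a_hat - a| for phase-estimation-based amplitude estimation
    with an m-qubit phase register, i.e. M = 2**m grid points and O(M)
    applications of the Grover operator (single-eigenphase simplification)."""
    M = 2 ** m
    y = np.arange(M)
    # Standard phase-estimation outcome distribution for eigenphase theta.
    delta = theta - y / M
    with np.errstate(divide="ignore", invalid="ignore"):
        p = (np.sin(np.pi * M * delta) / (M * np.sin(np.pi * delta))) ** 2
    p[np.isnan(p)] = 1.0                 # exact grid hit (delta = 0)
    p /= p.sum()
    samples = rng.choice(y, size=trials, p=p)
    a_hat = np.sin(np.pi * samples / M) ** 2
    return np.abs(a_hat - a).mean()

def classical_sampling_error(n, trials=200):
    """Mean |a_hat - a| when averaging n Bernoulli(a) samples (classical baseline)."""
    hits = rng.binomial(n, a, size=trials)
    return np.abs(hits / n - a).mean()

for m in range(4, 11):
    queries = 2 ** m
    print(f"queries ~ {queries:5d}:  AE error ~ {amplitude_estimation_error(m):.5f}   "
          f"classical error ~ {classical_sampling_error(queries):.5f}")
```

The amplitude-estimation error shrinks roughly like $1/\text{queries}$ while the classical error shrinks like $1/\sqrt{\text{queries}}$, which is exactly the $O(1/\epsilon)$ cost I'm asking about: my question is why this $1/\epsilon$ scaling cannot be beaten.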

I'm trying to find an explanation in the literature, but I'm having a hard time. If you have any references, I'd appreciate it!
