
In the QECC literature, I often come across the "combined amplitude and phase damping channel" presented as a representative, realistic noise model, which makes sense: amplitude damping and dephasing are the two fundamental error processes for a coherent single-qubit state. It is often twirled so that it becomes a Pauli channel that can be simulated efficiently, with the Pauli coefficients related back to the decoherence times $T_1$ and $T_2$.
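To make the twirling step concrete, here is a minimal numpy sketch (my own construction, not taken from any particular paper) using the usual parameterization $\gamma = 1 - e^{-t/T_1}$, with the pure-dephasing strength chosen so that coherences decay as $e^{-t/T_2}$ (which assumes $T_2 \le 2T_1$):

```python
import numpy as np

# Single-qubit Pauli matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def twirled_pauli_probs(t, T1, T2):
    """Pauli-twirl probabilities of the combined amplitude and phase
    damping channel for idle time t (requires T2 <= 2*T1)."""
    gamma = 1 - np.exp(-t / T1)              # amplitude-damping strength
    lam = 1 - np.exp(-t * (2 / T2 - 1 / T1)) # pure-dephasing strength
    # Kraus operators: amplitude damping followed by pure dephasing
    A0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]])
    A1 = np.array([[0, np.sqrt(gamma)], [0, 0]])
    D0 = np.array([[1, 0], [0, np.sqrt(1 - lam)]])
    D1 = np.array([[0, 0], [0, np.sqrt(lam)]])
    kraus = [D @ A for D in (D0, D1) for A in (A0, A1)]
    # Twirling over the Pauli group keeps only the diagonal of the
    # process (chi) matrix: p_sigma = sum_k |tr(sigma K_k) / 2|^2
    return {name: sum(abs(np.trace(P @ K) / 2) ** 2 for K in kraus)
            for name, P in (("I", I), ("X", X), ("Y", Y), ("Z", Z))}

# Matches the closed forms p_X = p_Y = (1 - e^{-t/T1})/4 and
# p_Z = (1 + e^{-t/T1} - 2 e^{-t/T2})/4
print(twirled_pauli_probs(t=1e-6, T1=50e-6, T2=70e-6))
```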

My question, however, is not about the twirling but about the starting point: the assumption that the combined amplitude and phase damping channel is representative of circuit noise.

How can this be a sufficient description when noisy gates are in operation? Said differently, how can a noise model that acts only on single qubits be sufficient when the entangling gates themselves are noisy?

If the combined amplitude and phase damping channel is indeed insufficient for modeling errors, how can I model them better, in particular the impact of noisy CNOT gates?
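For instance, is the standard circuit-level recipe, appending a two-qubit depolarizing channel after each CNOT, the right direction, and when is it justified? A sketch of the kind of model I mean (the rate `p` is just a placeholder, and I use the convention of a uniform mixture over the 15 non-identity two-qubit Paulis):

```python
import numpy as np
from itertools import product

PAULIS = [np.eye(2, dtype=complex),
          np.array([[0, 1], [1, 0]], dtype=complex),    # X
          np.array([[0, -1j], [1j, 0]], dtype=complex), # Y
          np.array([[1, 0], [0, -1]], dtype=complex)]   # Z

def noisy_cnot(rho, p):
    """Ideal CNOT followed by a two-qubit depolarizing channel: with
    probability p, one of the 15 non-identity two-qubit Paulis is
    applied uniformly at random (p is a placeholder error rate)."""
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    rho = CNOT @ rho @ CNOT.conj().T
    two_qubit_paulis = [np.kron(P, Q) for P, Q in product(PAULIS, repeat=2)]
    kicked = sum(P @ rho @ P.conj().T for P in two_qubit_paulis[1:])  # skip I⊗I
    return (1 - p) * rho + (p / 15) * kicked
```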
