Is there a way to calculate what fraction of errors will be caught (or missed) by SECDED when there are more than two bit errors? I'm thinking of a fault such as a badly-timed data strobe, where data setup or hold time isn't met and any or all of the bits covered by that strobe could be affected.
One paper I was reading indicated (if I understood correctly) that an uncorrectable-error detection implies an even number of bits were in error. If that's true, then SECDED would detect slightly more than 50% of all error patterns (since single-bit errors are also detected and corrected).
Do I have this right? For actual numbers, assume a 64-bit data bus (72 bits total), with a strobe for each byte lane (à la DDRx).
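To get a feel for the numbers, you can enumerate error patterns directly. The sketch below assumes a hypothetical Hsiao-style SECDED code (8 check bits, 72 distinct odd-weight syndrome columns) and a hypothetical assignment of the 8 smallest columns to one byte lane; a real DDRx ECC code's column-to-lane mapping will differ, and the counts depend on that mapping. It exhaustively classifies all 255 nonzero error patterns confined to one 8-bit lane:

```python
from itertools import combinations

# Hypothetical Hsiao-style SECDED: each code bit gets a distinct odd-weight
# 8-bit syndrome column. Odd-weight columns guarantee that any 2-bit error
# produces an even-weight (hence non-column) syndrome, so it is detected.
COLS = [v for v in range(256) if bin(v).count("1") % 2 == 1][:72]
COLSET = set(COLS)

def syndrome(error_bits):
    """XOR the syndrome columns of the flipped bit positions."""
    s = 0
    for i in error_bits:
        s ^= COLS[i]
    return s

# Exhaustively classify every nonzero error pattern confined to one 8-bit
# byte lane (here, hypothetically, code bits 0..7 of the 72-bit word).
counts = {"corrected": 0, "detected": 0, "miscorrected": 0, "missed": 0}
lane = range(8)
for k in range(1, 9):
    for bits in combinations(lane, k):
        s = syndrome(bits)
        if s == 0:
            counts["missed"] += 1          # indistinguishable from no error
        elif s in COLSET:
            # decoder "corrects" one bit; that's right only if k == 1
            counts["corrected" if k == 1 else "miscorrected"] += 1
        else:
            counts["detected"] += 1        # flagged as uncorrectable
print(counts)
```

With this particular (assumed) column assignment the 255 lane patterns split into 8 corrected, 112 detected, 120 miscorrected, and 15 missed — i.e. every odd-count multi-bit error aliases to a single-bit correction, and some even-count ones vanish entirely, which is roughly the "slightly more than 50% detected" behavior described above. Different column assignments shift the detected/missed split but not the fate of odd-count errors.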