I am trying to measure the absolute time between two 1 Hz clocks (GPS PPS events) to within 50 nanoseconds. I do not necessarily care which clock occurs first; I am only looking for the delta-T between the two pulses. The pulse width is on the order of 100s of milliseconds, so it is not a constraint for me here.
My plan is to use a microcontroller to read in these two 1 Hz clocks as two different interrupt events. The rising edge of the leading PPS signal will trigger a counter, which will count cycles until the arrival of the second PPS signal. I assume the clock/crystal for this counter will need to be 20 MHz or greater due to the 50 ns resolution I need.
My question is: how do I go about determining how fast a dedicated microcontroller needs to be to perform such a task?
I highly doubt an MCU with a core speed of 20 MHz will suffice, since the code to answer interrupts and hold a count will consist of more than one instruction. I understand that doing anything else with this microcontroller, such as sending this time delay via UART, will consume more clock cycles as well. For simplicity, let's assume the controller is just being used to count rising edges from a 20 MHz clock between two interrupt events.
I am looking for a guideline on how to calculate/ballpark the bare minimum MCU core speed needed to handle a 20 MHz internal/external timer for the measurement portion of this project. I am asking to get a better idea of how to source appropriate hardware instead of simply throwing a >200 MHz ARM at the problem and hoping I overshoot my requirements. Thank you.