
A device's trigger input requires a 1 V pulse (it can withstand up to 6 V) and has a 0.5 V threshold voltage. If I apply a 5 V pulse with a 10 ns rise time, how can I estimate the time it takes the rising edge to trigger the input?

Does this mean the effective rise time seen at the input will be shorter?


1 Answer


A good way to estimate this is to assume the driver's output voltage rises linearly during the edge. With a 5 V swing and a 10 ns rise time, the slew rate is 5 V / 10 ns = 0.5 V/ns, so V(t) ≈ 0.5 V/ns × t.

Solving V(t) = 1 V gives t = 2 ns, so the input should see 1 V approximately 2 ns after the output starts to rise.
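As a quick sanity check, here is a minimal sketch of that linear-ramp estimate in Python. The voltage levels and rise time are taken from the question; the function name is just for illustration:

```python
# Minimal sketch of the linear-ramp estimate described above.
# Values come from the question; adjust them for your own driver.
V_SWING = 5.0    # V, pulse amplitude at the driver output
T_RISE = 10e-9   # s, rise time of the driver output

def time_to_cross(v_level, v_swing=V_SWING, t_rise=T_RISE):
    """Time for a linear ramp of v_swing over t_rise to reach v_level."""
    slew_rate = v_swing / t_rise   # V/s, assumed constant during the edge
    return v_level / slew_rate     # s

print(time_to_cross(0.5))  # 0.5 V threshold  -> 1e-09 s (about 1 ns)
print(time_to_cross(1.0))  # required 1 V     -> 2e-09 s (about 2 ns)
```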

For a more precise estimate, you need to extract more information from both the driver's and the receiver's datasheets. The rise time of the signal depends mostly on the following (a rough lumped-RC sketch follows the figure below):

  • The driver's output-current vs. output-voltage curve
  • The decoupling of the supply voltage at your IC
  • The resistive and capacitive load of the receiver and the transmission line

[Figure: model of a line used to find the rise time]
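To get a rough feel for how the load stretches the edge, here is a sketch that lumps the trace and the receiver input into a single RC. The R and C values are placeholders, not from any datasheet, and the root-sum-of-squares combination is a common rule of thumb rather than an exact result:

```python
import math

# Rough sketch: lump the trace and the receiver input into one RC load.
# R_SOURCE and C_LOAD are placeholder values, not taken from any datasheet.
R_SOURCE = 50.0        # ohm, driver output resistance + series termination
C_LOAD = 10e-12        # F, receiver input capacitance + trace capacitance
T_RISE_DRIVER = 10e-9  # s, driver rise time from the question

# 10-90% rise time of a first-order RC step response is roughly 2.2*R*C.
t_rise_rc = 2.2 * R_SOURCE * C_LOAD

# Rule of thumb: independent rise times combine roughly root-sum-of-squares.
t_rise_total = math.sqrt(T_RISE_DRIVER**2 + t_rise_rc**2)

print(f"RC contribution:      {t_rise_rc * 1e9:.2f} ns")
print(f"Estimated total rise: {t_rise_total * 1e9:.2f} ns")
```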

