I think a good way to estimate this is to assume your output is linear (a constant slew rate) during the voltage rise. From the driver's datasheet you can compute the output's rate of change, then solve V(t) = 1 V for t. With typical values, you should see 1 V on the input approximately 2 ns after the output starts switching.
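As a minimal sketch of that linear-ramp estimate, the slew rate below is a hypothetical example value (0.5 V/ns, which is what the ~2 ns figure implies); take the real number from your driver's datasheet:

```python
# Linear-ramp estimate: V(t) = slew_rate * t during the rise.
SLEW_RATE_V_PER_NS = 0.5   # assumed driver slew rate (V/ns), not a datasheet value
V_TARGET = 1.0             # receiver threshold we want to reach (V)

# Solving V(t) = V_TARGET for t:
t_ns = V_TARGET / SLEW_RATE_V_PER_NS
print(f"Input reaches {V_TARGET} V after ~{t_ns:.1f} ns")
```

With these assumed numbers this prints ~2.0 ns, matching the estimate above.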
For a more precise approach, you need to extract more information from both the driver's and the receiver's datasheets. The rise time of a signal mostly depends on:
- The driver's Output current vs Output voltage curve
- The decoupling of the supply voltage at your IC
- The resistive and capacitive load of the receiver and the transmission line
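If you model the last two items as a simple RC network (driver output resistance charging the total load capacitance), you can refine the estimate with the exponential charging law. The R, C, and supply values below are hypothetical placeholders, not datasheet figures:

```python
import math

# Rough RC estimate: driver modeled as a resistive source charging
# the combined receiver + trace capacitance. All values are assumed
# examples -- substitute numbers from your datasheets and layout.
R_OHM = 50.0      # assumed driver output resistance (ohms)
C_F = 10e-12      # assumed receiver + trace capacitance (10 pF)
VDD = 3.3         # supply voltage (V)
V_TARGET = 1.0    # threshold of interest (V)

# V(t) = VDD * (1 - exp(-t / (R*C)))  =>  t = -R*C * ln(1 - V_TARGET/VDD)
t_s = -R_OHM * C_F * math.log(1.0 - V_TARGET / VDD)
print(f"Input reaches {V_TARGET} V after ~{t_s * 1e9:.2f} ns")
```

This model ignores transmission-line effects (reflections, propagation delay), which matter once the trace is electrically long compared to the rise time.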