
I had my first very exciting and successful go with an oscilloscope today. The question that arose at the end of this first experience has to do with the accuracy I was expecting. My microcontroller was programmed to output signals such as HIGH for 80 µs, LOW for 50 µs, HIGH for 25 ms, and so on. I used a 16 MHz crystal with my ATmega8.

When I recorded the output with the scope, I noticed that the measurements were not exact. For example, where I expected an 80 µs HIGH pulse, the scope would show something like 77 µs. The signals that lasted longer (e.g. 25 ms) seemed more accurate.

My question is simply: is this what I should expect? The device I hooked up to my microcontroller requires specific signals for specific amounts of time, and I programmed my microcontroller to deliver those signals for the lengths of time it requires. Everything works great; I just can't understand why this discrepancy exists. For reference, the device is the DHT-11 temperature/humidity sensor. If the microcontroller is actually not delivering precise signals, why does my sensor respond correctly?

Here are some of my guesses:

  1. The sensor can tolerate some amount of timing error, which is why it works. I can't find a stated tolerance in the data sheet that would put the error I'm seeing into perspective, though.
  2. The crystal is not perfectly accurate. We are dealing with microsecond resolution here, so it is possible.
  3. Other components like resistors and capacitors introduce error.
  4. The clock on the oscilloscope is not perfect either - albeit much better than the crystal used for my microcontroller.

Perhaps there is something else going on which I am not aware of.

Note that the timings I used above are just examples, and may differ from the data sheet and what I actually used in my code.


2 Answers


It's hard to be specific without seeing your code, but I'll take a stab at it.

Your guess #1 is definitely true; any one-wire bus protocol has to have a lot of tolerance for timing variations, since it partly depends on the resistive pullup of a wire with unpredictable capacitance.

Your guesses #2 through #4 are unlikely. A typical crystal is accurate to within roughly ±100 ppm, which works out to only nanoseconds of error over an 80 µs interval; the load capacitors around it pull the frequency by a similarly tiny amount, and the oscilloscope's timebase is at least as good as the crystal.

What's probably happening is that the pulses you're generating with your firmware are not exactly the widths you intended. A fixed overhead of a few dozen instruction cycles (each cycle is 62.5 ns at 16 MHz, so 3 µs is only about 48 cycles) is easy to accumulate around a software delay. Because this error does not scale with the timing interval, longer intervals will seem more accurate, as you observed.
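As a rough illustration (this is a generic avr-gcc sketch, not your code; the pin and function name are made up), here is how an 80 µs pulse is typically produced with a busy-wait delay, and where the fixed overhead comes from:

```
#define F_CPU 16000000UL       /* must be defined before <util/delay.h> */
#include <avr/io.h>
#include <util/delay.h>

/* Hypothetical 80 us pulse on PB0 (pin and name chosen for illustration).
 * The pin writes typically compile to single sbi/cbi instructions
 * (2 cycles each on an ATmega8), and the call/return plus whatever the
 * compiler emits around _delay_us() add a fixed number of cycles on top
 * of the delay itself. At 16 MHz one cycle is 62.5 ns, so ~48 stray
 * cycles is already 3 us: noticeable on an 80 us pulse, invisible on 25 ms. */
void pulse_high_80us(void)
{
    DDRB  |= (1 << PB0);   /* pin as output */
    PORTB |= (1 << PB0);   /* rising edge */
    _delay_us(80);         /* busy-wait; accurate only with optimization on
                              and F_CPU matching the real crystal */
    PORTB &= ~(1 << PB0);  /* falling edge */
}
```

If you ever need edges accurate to within a cycle or two, a hardware timer (e.g. Timer1 in CTC or PWM mode) is usually a better tool than software delays, since the pin toggles without any instruction overhead.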

If you post the code you're using to generate the pulses, we could give more specific advice.

  • You might also mention what percentage of error your scope claims, what its advertised bandwidth is, if it has been calibrated since 1973, etc, etc. This may be as good as it gets. (Commented Oct 18, 2012 at 3:48)

Please provide more information about (or pictures of!) your scope. Many scopes have an "uncalibrated" mode that lets you arbitrarily adjust the X-axis timing (this can be very useful for certain measurements), and you may have accidentally put your scope into it. Also, if this scope is new to you (particularly if it's a used, older scope), it may indeed need calibration.

Furthermore, how are you deriving your timing? Are you using a pre-written delay function? Are you sure you're not at the edge of its range? I know the Arduino delayMicroseconds() function starts to behave strangely when the values passed to it are below about 5 µs. Something similar may be happening here.
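To make that concrete, here is a deliberately naive sketch of a busy-wait microsecond delay (not the actual Arduino or avr-libc implementation; the name is invented and the cycle counts are approximate):

```
#include <stdint.h>

/* Toy delay for illustration only. */
static void my_delay_us(uint16_t us)
{
    while (us--) {
        /* ~12 cycles here plus roughly 4 for the loop's decrement and
           branch comes to about 16 cycles, i.e. ~1 us at 16 MHz */
        __builtin_avr_delay_cycles(12);
    }
}
```

The call itself (loading the argument, rcall, ret) costs additional cycles before the loop even starts. For an 80 µs request that overhead is negligible; for a 2-5 µs request it rivals the delay being asked for, which is exactly the kind of small-value misbehaviour described above.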

Lastly, the reason you are seeing valid data from your sensor is simply that the sensor uses a bus protocol that is extremely tolerant of timing variances, and you are within the range of timing error that it can accommodate.
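For a sense of how tolerant it is: per the DHT-11 datasheet, the host's start pulse only has to stay low for at least 18 ms, and each data bit from the sensor is signalled by a high time of roughly 26-28 µs for a '0' versus about 70 µs for a '1'. Decoding then only has to decide which side of a threshold a pulse falls on, something like this hypothetical helper (not a full driver):

```
#include <stdint.h>

/* Illustrative DHT-11-style bit decoder: a high time well under 50 us
   reads as 0, well over it reads as 1, so a few microseconds of error
   on either end changes nothing. */
static uint8_t decode_dht_bit(uint16_t high_time_us)
{
    return (high_time_us > 50) ? 1u : 0u;
}
```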


Anyway, if you post more information (either your code, or pictures/a description of your setup), we can try to help you further.
