This page covers a few additional questions and topics related to the Davis Temp/Hum sensors:

Humidity accuracy

Accuracy for relative humidity (RH) tends to be best over the extended mid-range, e.g. 10-90% RH, and declines towards either extreme. The performance of affordable electronic RH sensors is improving all the time, even compared to a few years ago, and the SHT31 used in current VP2 ISS units is significantly better than the SHT11 sensor of Vue and pre-2016 VP2 stations. But it’s still relatively common to find that the maximum RH value a sensor will read may be 97-99% and not the full 100% that one might expect. This really isn’t a cause for concern, for several reasons:

  • The sensor will still likely be operating within its accuracy specification, which is typically ±3% RH and may be wider still close to 100%;
  • Conditions of genuine 100% RH are relatively uncommon except in continuous thick fog – a rain shower or even longer periods of rain won’t necessarily give RH approaching 100%;
  • Even if the sensor won’t quite achieve a 100% reading, the chances are that it is operating within its specified accuracy at all lower RH values – any error in the reading close to 100% RH is not representative of accuracy over 10-90%;

Note: It is possible to enter a humidity offset into the console to force the maximum RH reading to 100%, but this really isn’t recommended – it will create very significant errors in RH readings in the mid-range of humidity.
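
As a rough illustration of why a blanket offset is a bad idea, here is a minimal Python sketch. The numbers are hypothetical: it assumes a sensor that tops out at 97% RH but reads accurately at lower values, and a +3% offset entered to force the top end to 100%:

```python
# Hypothetical illustration: forcing the maximum RH reading to 100% with a
# flat console offset also shifts mid-range readings that were already accurate.

SENSOR_MAX = 97            # assumed ceiling of the sensor's reported RH (%)
OFFSET = 100 - SENSOR_MAX  # offset entered on the console to "fix" the top end

for true_rh in (30, 50, 70, 90, 100):   # hypothetical true conditions (% RH)
    raw = min(true_rh, SENSOR_MAX)      # what the sensor actually reports
    adjusted = min(raw + OFFSET, 100)   # reading after the console offset
    print(f"true {true_rh:3d}%  raw {raw:3d}%  "
          f"offset-adjusted {adjusted:3d}%  error {adjusted - true_rh:+d}%")

# The 100% case now reads 100%, but every mid-range value is pushed 3% high,
# i.e. to the edge of (or beyond) a typical +/-3% specification.
```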

Console inside temperature and humidity readings

The console inside temperature readings are an estimate only and may often be an overestimate, for the reasons below. The apparent inside temperature will also affect the inside humidity reading.

  1. The console T/H sensor is on the circuit board well inside the console and hence rather constrained by the case. As a result, its response time to any changes in room temperature is relatively slow and any extra heat generated by circuitry within the case will be slow to dissipate. (To be frank, this aspect of the console’s design could easily be improved – the T/H sensor really ought to be located well away from any other internal heat sources, e.g. behind a small grille in the case, to allow faster equilibration with the external air.)
  2. Having the console display light on can potentially raise the inside temperature reading by 2-3°C.
  3. An installed logger generates some heat and will inevitably increase the temperature within the console case to some extent. This varies with the logger type and is more pronounced with network loggers, especially those with a microprocessor onboard like Nano or WFL2, but it is also influenced by the demands placed on the logger: for example, a lot of frequent downloads makes the logger work harder and tends to push the console’s internal temperature up.
  4. While there are no hard details, there is speculation that the T/H sensor inside the console is a lower-spec part and, if true, it may have somewhat lower temperature accuracy.

Comparing inside and outside temperatures

It’s not uncommon for new users to set up console and ISS side-by-side before installation to try and cross-check temperature/humidity readings and calibration. This is actually not an easy exercise to do well, not least because of the relatively slow response time of the console inside temperature as outlined above, and it is best done with no logger installed in the console and with the console backlight off.

If you are able to leave console and ISS side by side in a constant-temperature environment for several hours then they should agree reasonably well, e.g. to within the published specs for each sensor. (Though remember that it’s commonplace for one sensor to read high and the other low, with each still within its published spec.) But actually doing this in ordinary domestic circumstances is difficult. Most rooms are not at constant temperature and there’s nearly always some sort of draught or source of heat (e.g. a nearby radiator) which heats or cools one or other component differentially, so the two components are not experiencing stable and constant temperature/humidity conditions. Also, the radiation screen on the ISS is designed on the assumption that at least a small breeze is present, which helps the sensor inside the screen to equilibrate more quickly. (You can obviously simulate the breeze with a fan, which will help, but it’s still not ideal.)
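
As a rough worked example of how far two in-spec sensors can legitimately disagree (the accuracy figures below are placeholders, not the published Davis specs, which you should check for your own units):

```python
# Placeholder accuracy figures for illustration only.
ISS_TEMP_SPEC_C = 0.3      # assumed outside (ISS) temperature accuracy, +/- degC
CONSOLE_TEMP_SPEC_C = 0.5  # assumed console inside temperature accuracy, +/- degC

# One sensor reading at the top of its band and the other at the bottom:
worst_case_gap = ISS_TEMP_SPEC_C + CONSOLE_TEMP_SPEC_C
print(f"Readings can differ by up to {worst_case_gap:.1f} degC "
      "with both sensors still within spec")
```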

So, for all of these reasons, unless this experiment is done under very stable and controlled conditions and over a significant period, the results may not be very meaningful, and it is debatable whether this check serves any useful purpose beyond a rough sanity check. In general, our experience is that the intrinsic calibration of the ISS temperature sensor is usually excellent, whereas the inside temperature reading on the console is likely to be somewhat less precise and more vulnerable to external influences.

For all of the reasons here and in the previous section, it is difficult to compare inside and outside temperature or RH values meaningfully. Probably the best parameter to compare is the dew point temperature, since this factors in both temperature and RH jointly. But do bear in mind that errors on dew point estimates are larger, because errors in both temperature and RH contribute to the overall error.
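
For reference, dew point can be estimated from temperature and RH with the Magnus approximation. The sketch below uses one commonly quoted set of coefficients, which may differ slightly from the formula the console itself applies:

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Estimate dew point (degC) from air temperature (degC) and RH (%)
    using the Magnus approximation with one common coefficient set."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# Hypothetical readings: if inside and outside dew points broadly agree,
# the combined temperature/RH calibration of the two sensors is consistent.
print(round(dew_point_c(21.0, 55.0), 1))   # console inside reading
print(round(dew_point_c(12.0, 95.0), 1))   # ISS outside reading
```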

Resolution of supplementary temperature and temp/hum sensors

Vue and VP2 stations show two main temperature values – the outside air temperature (from the ISS) and the inside temperature from a sensor (usually) inside the console. Both of these temperature values are available for display and logging to a resolution of 0.1°.

But VP2 stations can of course have additional (‘supplementary’) temperature or temperature/humidity sensors added to various supplementary transmitters, such as the 6372 temp or 6382 temp/hum transmitters or the multi-sensor 6345 soil moisture/temperature & leaf wetness transmitter. In most station configurations, console or logged readings from these supplementary temperature sensors are limited to a resolution of 1°F (approx 0.5°C). For users viewing console readings in °C this has the slightly deceptive effect that readings appear to be shown to a resolution of 0.1°C, but e.g. in a rising temperature there will be a gap of 0.5 or 0.6°C (depending on exact rounding) to the next temperature value displayed.
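
To illustrate the stepping effect, the sketch below (an illustration of the rounding involved, not of the console’s actual internal code) quantises a smoothly rising temperature to whole °F before converting it for display in °C:

```python
def f_to_c(temp_f: float) -> float:
    return (temp_f - 32.0) * 5.0 / 9.0

# A smoothly rising temperature as the supplementary sensor might measure it:
for true_f in (68.0, 68.4, 68.8, 69.2, 69.6, 70.0):
    console_f = round(true_f)                # console handles only whole degF
    console_c = round(f_to_c(console_f), 1)  # then displays to 0.1 degC
    print(f"true {true_f:.1f} degF -> console {console_f} degF -> {console_c} degC")

# Displayed degC values jump in steps of roughly 0.5-0.6 degC
# (20.0 -> 20.6 -> 21.1) even though 0.1 degC resolution is shown.
```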

This happens because of limitations in the console design – the standard VP2 console is able to process 0.1°F resolution data from the ISS, but data from supplementary temperature stations can only be processed at the lower 1°F resolution. This is a limitation at the console firmware level and cannot be worked around. It seems that rewriting the firmware to overcome this limitation would be a major task and maybe not even possible because of the restricted console memory size.

To reiterate, supplementary transmitters always transmit full 0.1°F data, but the console cannot receive this at full resolution. However, supplementary stations received by Weatherlink Live or Envoy8X units (or by other types of receiver such as Meteobridge Pro Red) are able to see the 0.1°F data as transmitted. One other special case is that an external 6475/6477 sensor connected directly to a recent 6316 Envoy unit will also show inside temperature to 0.1°F resolution (but this is as expected since inside temp always has 0.1°F resolution).

Last modified: Nov 13, 2021
