Post by Jason on Feb 7, 2021 6:23:54 GMT -8
I'm going to preface this thread by sharing that I have not been in an undergraduate or graduate level mathematics course since 1996, just in case anything that follows is obviously incorrect. :-)
Based on my review of the code, barometric trend is determined by comparing the most recent reading with its predecessor. Most recent greater than its predecessor indicates an upward trend (true) and the opposite indicates a downward trend (false). Because the sample rate is 30s, I wasn't comfortable that the comparison was really indicative of a trend.
As a result, I decided to look for an alternative. Fortunately, PostgreSQL has a built in linear regression function that seemed like a good option. After some reading, I gave it a go. The SQL query is quite simple:
SELECT
  REGR_SLOPE(
    (tr.reading->'reading'->'pressure'->>'value')::DECIMAL * 10,
    EXTRACT(EPOCH FROM tr.reading_ts)
  ) * 60 * 60 AS barometric_trend
FROM weather.t_readings tr
WHERE tr.reading_device_channel = 21
  AND tr.reading_ts > CURRENT_TIMESTAMP - INTERVAL '1 HOUR';
The query regresses the pressure reading in hPa ((tr.reading->'reading'->'pressure'->>'value')::DECIMAL * 10) against seconds since 1/1/1970 00:00:00 UTC (EXTRACT(EPOCH FROM tr.reading_ts)) over the last hour (tr.reading_ts > CURRENT_TIMESTAMP - INTERVAL '1 HOUR').
Here's where my uncertainty lies. Since the dependent variable is in hPa and the independent variable is in seconds, I believe the slope returned by REGR_SLOPE() is in hPa/s. As a result, I'm converting the resulting slope to hPa/h by multiplying the output of REGR_SLOPE(...) by 60 * 60.
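As a sanity check on the unit conversion, here is a small Python sketch (not part of the actual system) that fits synthetic pressure readings with a known trend of +0.5 hPa/h, sampled every 30 seconds for an hour, using the same least-squares slope that REGR_SLOPE computes. Multiplying the per-second slope by 60 * 60 should recover the known hourly trend:

```python
# Least-squares slope of y on x, the same quantity REGR_SLOPE returns:
# covariance(x, y) / variance(x).
def slope(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# One hour of readings at the 30 s sample rate: x in seconds, y in hPa,
# rising at exactly 0.5 hPa per hour.
ts = [i * 30 for i in range(121)]                   # seconds since start
pressures = [1013.0 + 0.5 * t / 3600 for t in ts]  # hPa

hpa_per_second = slope(ts, pressures)
hpa_per_hour = hpa_per_second * 60 * 60
print(round(hpa_per_hour, 6))  # 0.5
```

Since the fitted data is exactly linear, the conversion recovers the 0.5 hPa/h trend exactly, which matches the reasoning above: a slope in hPa per second times 3600 seconds per hour gives hPa per hour.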
Does my math make sense? Or am I misunderstanding the output of linear regression?
Thanks,
Jason