
In glaucoma management, we study the trend of a global index that summarizes a patient's entire visual field: the Mean Defect of visual sensitivity (MD), expressed in decibels and plotted against time (years). The trend is estimated by a regression of MD on time.

For practical purposes, trends are generally treated as linear in clinical practice, but I want to show that this is not true. We know from studies that age is a risk factor for progression (MD worsening), so the older the patient, the more strongly age "pulls down" the trend, challenging linearity.

The same happens with the so-called "level of damage": the more advanced the glaucoma (the worse the MD level), the faster the disease progresses, again "pulling down" the trend over time. Patients get both older and worse over time.
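One simple way to formalize why this rules out a straight line: if (as a deliberately simplified assumption) the rate of MD worsening is proportional to the current level of damage, the trajectory satisfies

$$\frac{d\,\mathrm{MD}}{dt} = k\,\mathrm{MD}(t), \quad k > 0 \;\Longrightarrow\; \mathrm{MD}(t) = \mathrm{MD}_0\, e^{kt},$$

an exponential curve rather than a line; letting $k$ grow with age adds further curvature.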

These two variables enter the model as risk factors for progression. How can I show that a linear approach is not the right one for glaucoma progression?
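For illustration, here is a minimal sketch of one standard check, in Python with statsmodels; the column names and the simulated trajectory are hypothetical stand-ins for a real patient record. The idea: fit the conventional straight-line trend and a model that allows curvature, then compare them with a nested-model F-test.

```python
# Sketch: testing whether a linear trend is adequate for MD-vs-time data.
# The data frame below is a simulated stand-in for one patient's series,
# with columns "time" (years of follow-up) and "md" (Mean Defect in dB).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)

# Simulate an accelerating (non-linear) decline: the loss rate grows with
# the current level of damage, as described in the question.
time = np.linspace(0, 10, 30)              # 10 years, ~3 visits per year
md = 2.0 * np.exp(0.15 * time)             # worsening speeds up over time
md += rng.normal(scale=0.5, size=md.size)  # test-retest noise
df = pd.DataFrame({"time": time, "md": md})

# Fit the straight-line trend and a model that allows curvature.
linear = smf.ols("md ~ time", data=df).fit()
curved = smf.ols("md ~ time + I(time**2)", data=df).fit()

# F-test of the nested models: a small p-value means the quadratic term
# improves the fit, i.e. the linear trend is not adequate.
print(anova_lm(linear, curved))
```

With series from many patients, the same comparison could be carried out inside a mixed-effects model (e.g. statsmodels' MixedLM), giving each patient their own intercept and slope; a systematic pattern in the residuals of the linear fit plotted against time points to the same conclusion.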

  • Why don't you post an example of your data set? You also might want to look at http://stats.stackexchange.com/questions/251336/statistics-for-time-series-trend-in-r/251354#251354 as it discusses regression analysis using time as a predictor. If you use time (the counting #'s) you are assuming a deterministic trend, which might be inappropriate. – IrishStat Dec 19 '16 at 12:42

0 Answers