I fitted a simple linear regression model to my experimental measurements in order to make predictions. I have read that you should not calculate predictions for points that lie too far from the available data, but I could not find any guidance on how far I can safely extrapolate. For example, if I predict the reading speed for a disk size of 50GB, I expect the result to be close to reality. What about a disk size of 100GB, or 500GB? How do I know whether my predictions are close to reality?
The details of my experiment are:
I am measuring the reading speed of a piece of software with different disk sizes. So far I have measured sizes from 5GB to 30GB, increasing the disk size by 5GB between experiments (6 measurements in total).
The results look linear and, in my opinion, the standard errors are small.
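
To make the setup concrete, here is a minimal sketch of what I am doing. The numbers are made-up placeholders, not my actual measurements, and I use statsmodels only to illustrate fitting the line and predicting both inside and well outside the 5-30GB range I measured:

```python
# Minimal sketch of the setup (placeholder data, not my real measurements).
import numpy as np
import statsmodels.api as sm

disk_gb = np.array([5, 10, 15, 20, 25, 30])        # disk sizes measured (GB)
speed = np.array([110, 118, 131, 139, 152, 160])   # placeholder reading speeds (MB/s)

X = sm.add_constant(disk_gb)      # design matrix with an intercept
model = sm.OLS(speed, X).fit()    # simple linear regression

# Predict at one point inside the measured range and at the sizes I asked about.
new_sizes = np.array([25, 50, 100, 500])
X_new = sm.add_constant(new_sizes)
pred = model.get_prediction(X_new).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```

The prediction intervals it prints get wider the further the disk size is from my measured range, but I do not know at what point the prediction itself should no longer be trusted.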
