It's been a while since I was in school, and I can't remember how to calculate an uncertainty value (or even what this value would be called!). Sorry in advance if this is something that should be easy to figure out; I am but a lowly technician.

I have a multiple linear regression model that calculates an output from a number of inputs. The model was fit using ~1,500 input/output data points.

I have back-calculated predicted y-values for the entire data set and determined the error between the predicted and actual values. Using this data, I would like to come up with a generic statement that reflects the accuracy of any future predictions.

For example, if the model predicts a value of 43.912, the accuracy statement might look something like this:

43.912 +/- 0.15%

Where 95% of the time (roughly ±2 standard deviations), the actual value will fall between (43.912 × 0.9985 =) 43.846 and (43.912 × 1.0015 =) 43.978.
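To show where I'm at, here is a minimal sketch of how I imagine the calculation might go, using made-up stand-in arrays for my actual and predicted values (the real numbers come from my regression model): compute the residuals as a fraction of reading, then take two standard deviations of those relative errors as the ± percentage.

```python
import numpy as np

# Stand-in data: in reality these come from my model and measurements.
# Here I simulate ~1,500 readings with a small relative error.
rng = np.random.default_rng(0)
actual = rng.normal(50.0, 5.0, 1500)
predicted = actual * (1 + rng.normal(0.0, 0.00075, 1500))

# Relative (percent-of-reading) residuals
rel_err = (actual - predicted) / predicted

# Two-sided ~95% band: 2 sample standard deviations of the relative error
half_width_pct = 2 * np.std(rel_err, ddof=1) * 100
print(f"+/- {half_width_pct:.3f}% of reading")
```

Is this roughly the right idea, or is there a more rigorous way (a proper prediction interval, maybe) to get that ± percentage?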

Can anybody help me figure out how to get this ± error value in terms of % of reading, or point me to a resource that clearly explains it?

All help is appreciated!