Both the Root Mean Square Error (RMSE) and the coefficient of determination (R2) offer different, yet complementary, information that should be assessed when evaluating your physical model. Neither is "better"; some reports simply focus more on one metric than the other, depending on the particular application.
I would use the following as a very general guide to understanding the difference between the two metrics:
The RMSE gives you a sense of how close (or far) your predicted values are from the actual data you are attempting to model. This is useful in a variety of applications where you wish to understand the accuracy and precision of your model's predictions (e.g., modelling tree height).
Pros
- It is relatively easy to understand and communicate since reported values are in the same units as the dependent variable being modelled.
Cons
- It is sensitive to large errors (penalizes large prediction errors more than smaller prediction errors).
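To make the RMSE concrete, here is a minimal Python sketch (NumPy assumed; the tree-height values are made-up, purely illustrative numbers) showing how it is computed from observed and predicted values:

```python
import numpy as np

# Hypothetical observed and predicted tree heights (metres); illustrative values only.
observed = np.array([12.3, 15.1, 9.8, 20.4, 17.6])
predicted = np.array([11.9, 16.0, 10.5, 18.7, 17.1])

# RMSE: square root of the mean squared prediction error,
# reported in the same units as the dependent variable (metres here).
rmse = np.sqrt(np.mean((observed - predicted) ** 2))
print(f"RMSE = {rmse:.2f} m")
```

Because the errors are squared before averaging, a single large miss inflates the RMSE far more than several small ones, which is exactly the "sensitive to large errors" point above.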
The coefficient of determination (R2) is useful when you are attempting to understand how well your selected independent variable(s) explain the variability in your dependent variable. This is useful when you are attempting to explain what factors might be driving the underlying process of interest (e.g., climatic variables and soil conditions related to tree height).
Pros
- Gives an overall sense of how well your selected variables fit the data.
Cons
- As more independent variables are added to your model, R2 never decreases (and typically increases), even if the added variables have little real explanatory power (see adjusted R2 or Akaike's Information Criterion as potential alternatives).
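As a rough sketch of the same idea in Python (again with NumPy and the same made-up tree-height values; the number of predictors `p` is an assumption for illustration), R2 and its adjusted counterpart can be computed as follows:

```python
import numpy as np

# Same hypothetical observed/predicted tree heights as above (illustrative only).
observed = np.array([12.3, 15.1, 9.8, 20.4, 17.6])
predicted = np.array([11.9, 16.0, 10.5, 18.7, 17.1])

# R2 = 1 - SS_residual / SS_total: the fraction of variance
# in the dependent variable explained by the model.
ss_res = np.sum((observed - predicted) ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Adjusted R2 penalizes additional predictors; p is the (assumed)
# number of independent variables used to produce the predictions.
n, p = len(observed), 2
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"R2 = {r2:.3f}, adjusted R2 = {adj_r2:.3f}")
```

The adjustment term shrinks R2 whenever a new predictor adds less explanatory power than would be expected by chance, which is why adjusted R2 (or AIC) is the safer choice when comparing models with different numbers of variables.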
Of course, both metrics are subject to sample size and sampling design, and to the general caveat that correlation does not imply causation.
"This value shows how well future outcomes can be predicted by the model"
- this is highly misleading and verges on being plainly wrong. A high coefficient of determination for a given model is no guarantee of how well future outcomes can be predicted.