Imagine that you have trained a regression model to predict apartment rental prices (in $1000) from apartment descriptions. The table below lists the real prices alongside the prices predicted by your model:
| Real price (in $1000) | Predicted price (in $1000) |
|---|---|
| 1 | 0.9 |
| 4 | 4.1 |
| 3 | 2.9 |
| 2 | 2.1 |
Compute the Normalized Root Mean Squared Error (nRMSE) of the prediction, i.e., the Root Mean Squared Error (RMSE) divided by the mean of the target.
Do not apply the normalization if the mean of the target is 0. Round the final answer to two decimal places.
Hint
Real and predicted prices as Python lists:
```python
y = [1, 4, 3, 2]
y_hat = [0.9, 4.1, 2.9, 2.1]
```
As a reminder, the nRMSE formula is given as follows:

$$\text{nRMSE} = \frac{\text{RMSE}}{\bar{y}}, \qquad \text{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}, \qquad \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i$$
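A minimal Python sketch of the computation, following the definition above (the helper name `nrmse` is just for illustration):

```python
import math

def nrmse(y, y_hat):
    """Normalized RMSE: RMSE divided by the mean of the true values.

    Falls back to the plain RMSE when the mean of the target is 0.
    """
    n = len(y)
    rmse = math.sqrt(sum((yi - yhi) ** 2 for yi, yhi in zip(y, y_hat)) / n)
    mean_y = sum(y) / n
    return rmse if mean_y == 0 else rmse / mean_y

y = [1, 4, 3, 2]
y_hat = [0.9, 4.1, 2.9, 2.1]
print(round(nrmse(y, y_hat), 2))  # rounded to two decimal places
```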