Steadily decrease. As we increase s, we are lifting the restrictions on \beta_j, increasing the flexibility of the model and allowing a continuing reduction in training error.
Decrease initially, and then eventually start increasing in a U shape. As we increase s, we are lifting the restrictions on \beta_j, steadily increasing the flexibility of the model, which leads to a bias-variance trade-off. At first, the decrease in bias is larger than the increase in variance, so test error decreases. Eventually, however, the model becomes too flexible: the increase in variance outweighs the decrease in bias, the model overfits, and test error increases.
Steadily increase. As we increase s, we are lifting the restrictions on \beta_j and steadily increasing the flexibility of the model. A more flexible model tracks the training data more closely, so its fit varies more from one training set to another, and its variance increases.
Remain constant. By definition, the irreducible error is the part of the error no model can capture; it does not depend on the model and is therefore independent of the value of s.
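The behaviour described above can be illustrated numerically. The sketch below (an assumption for illustration, not part of the exercise) uses ridge regression in its Lagrangian form on synthetic data, where a smaller penalty \alpha plays the role of a larger budget s: as \alpha shrinks, training error falls steadily, while test error traces the U shape.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 29  # p close to n so the unconstrained fit overfits badly
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]          # sparse true coefficients
y = X @ beta + rng.normal(scale=2.0, size=n)
X_test = rng.normal(size=(500, p))
y_test = X_test @ beta + rng.normal(scale=2.0, size=500)

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution: (X'X + alpha*I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

# Decreasing alpha corresponds to increasing the budget s.
alphas = np.logspace(3, -6, 40)
train_mse, test_mse = [], []
for a in alphas:
    b = ridge_fit(X, y, a)
    train_mse.append(np.mean((y - X @ b) ** 2))
    test_mse.append(np.mean((y_test - X_test @ b) ** 2))
```

Plotting train_mse and test_mse against the index shows training error decreasing monotonically as the constraint is relaxed, while test error first falls and then rises once the near-unconstrained fit starts chasing noise.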