2016 - Round 20 Results - Happy With That

The Dees' surprise victory was very much the engine for the weekend's healthy profit though other results also helped ensure that this was the first black-ink round since Round 15.

In the end, Investors finished with the Head-to-Head Fund up 12c on a 2 and 1 performance, the Line Fund up almost 2c on a 3 and 1 performance, and the Overs/Unders Fund up by almost 1.5c on a 2 and 1 performance. That meant the Overall Portfolio rose by 5.8c, its largest single-round increase since Round 4, and its third-highest single-round increase of the season.

That leaves the Overall Portfolio now up by 12.6% on the season, with the Line Fund (+28%) and Overs/Unders Fund (+25%) the only positive contributors (yes, I'm looking at you Head-to-Head Fund ...).


MoSSBODS carried its wager-informing success into the Head-to-Head Tipster arena by being one of only three Tipsters to select the winner in seven of the weekend's games. The others were C_Marg and Consult The Ladder.

Notably absent from that list is BKB, whose six-from-nine score allowed MoSSBODS_Marg to draw level with it at the head of the table. They're now both on 128 from 171 (75%).

The all-Tipster average for the week was 6.3 from 9.

(BTW I suspect I've had an out-by-one error in the calculations for this table in recent weeks, so if you've been trying to reconcile the numbers you might have found some discrepancies. I would just hate to be a spreadsheet auditor.)

RSMP_Weighted was the outstanding Margin Predictor for the week, recording a mean absolute error (MAE) of 34.7 points per game against an all-Predictor average of 36.5 points per game. Bookie_9 had the round's worst MAE of 37.7 points per game.

Bookie_LPSO stumbled a little but still leads the league, though now by just under 5 goals from RSMP_Simple and just over 6 goals from RSMP_Weighted.

C_Marg had a somewhat poor week and slipped back into 5th position. It and Bookie_LPSO remain the only non-ensemble forecasters in the current Top 6.

In fact, as I discovered only this weekend, an ensemble formed by taking a simple average of all MoS Margin Predictors currently has a better MAE than any one of them. Such an ensemble would currently have an MAE of 28.86 points per game, slightly better than Bookie_LPSO's 28.92 points per game.
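The mechanics behind that finding are straightforward: average every Predictor's margin forecast game by game, then score the averaged forecasts with the same mean absolute error (MAE) metric used for the individual Predictors. The sketch below illustrates the idea with entirely made-up predictor names and margins (not actual MoS data); the point it demonstrates is that when forecasters err on opposite sides of the true margin, their errors partly cancel in the average, so the ensemble's MAE can beat every individual MAE.

```python
def mae(predictions, actuals):
    """Mean absolute error in points per game."""
    return sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(actuals)

# Hypothetical predicted margins (home team by N points) for three games.
# Names and numbers are illustrative only.
predictors = {
    "Predictor_A": [14.0, -8.0, 20.0],
    "Predictor_B": [26.0, 0.0, 10.0],
    "Predictor_C": [23.0, -7.0, 18.0],
}
actual_margins = [20.0, -4.0, 15.0]

# Simple average ensemble: mean of all predictors, game by game
n_games = len(actual_margins)
ensemble = [sum(p[i] for p in predictors.values()) / len(predictors)
            for i in range(n_games)]

for name, preds in predictors.items():
    print(f"{name}: MAE = {mae(preds, actual_margins):.2f}")
print(f"Ensemble: MAE = {mae(ensemble, actual_margins):.2f}")
```

Here A and B each overshoot and undershoot in opposite directions, so the ensemble's per-game errors are much smaller than any single predictor's, which is the same effect driving the 28.86 versus 28.92 comparison above.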

On probability prediction, Bookie_LPSO was best and remains atop the leaderboard, though the two other bookmaker-based Predictors and MoSSBODS_Prob returned only very slightly lower scores. C_Prob had the round's worst probability score but remains in 4th, ahead of MoSSBODS_Prob.