
solidity in AIIDE 2020 - part 1

My proposed daring/solid metric turned out to draw a surprising degree of attention. Well, I was going to try it out anyway, but now I have reason to report in detail. Today I did only the first step, finding the elo values.

I chose to rate the players relative to a fictional opponent that scored exactly 50%, giving the fictional player elo 0, because it was easy that way. The game is zero-sum, so that’s the average player of the tournament, in a sense. We only care about elo differences, so the base is arbitrary.

bot            %       elo
Stardust       93.22   455
PurpleWave     79.44   235
BananaBrain    69.61   144
Dragon         62.38    88
McRave         57.22    51
Microwave      54.47    31
Steamhammer    54       28
DaQin          50.14     1
ZZZKBot        39.89   -71
UAlbertaBot    31.14  -138
WillyT         29.44  -152
Ecgberht       24.28  -198
EggBot          4.72  -522
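The elo column follows from the winning percentage by the standard logistic Elo formula, d = 400 · log10(p / (1 − p)), where p is the winning rate as a fraction. A minimal sketch (the function name is mine, not from the post):

```python
import math

def elo_diff(win_pct):
    # Elo difference versus the average (elo 0) opponent,
    # using the standard logistic formula: 400 * log10(p / (1 - p)).
    p = win_pct / 100.0
    return 400.0 * math.log10(p / (1.0 - p))

# Spot-checking a few table rows:
# round(elo_diff(93.22)) -> 455
# round(elo_diff(39.89)) -> -71
# round(elo_diff(4.72))  -> -522
```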

I calculated the table by hand, which seemed easier for a first cut: I simply printed a big elo table and read it backwards, from winning percentage to elo difference. If the solidity metric works out, I'll have to automate it. That doesn't seem hard (maybe invert the function by binary search). In fact, the main reason it was easier to do by hand the first time is that I'd have to do it by hand anyway to verify that my code is correct.
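The binary search idea mentioned above can be sketched as follows: compute the standard Elo expected score for a given rating difference, then bisect until the expectation matches the observed winning rate. Function names and search bounds here are my assumptions, not from the post:

```python
def expected_score(elo_diff):
    # Standard Elo expectation: probability of scoring, given a
    # rating advantage of elo_diff points over the opponent.
    return 1.0 / (1.0 + 10.0 ** (-elo_diff / 400.0))

def elo_from_score(p, lo=-1000.0, hi=1000.0, tol=1e-6):
    # Invert expected_score by binary search: find the rating
    # difference whose expected score equals p (a fraction in (0, 1)).
    # expected_score is monotonically increasing, so bisection works.
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if expected_score(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

For example, `elo_from_score(0.9322)` comes out near 455, matching Stardust's row in the table, and `elo_from_score(0.5)` gives 0, the fictional average player.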

Next: I want to draw graphs for each player showing the expected and actual scores against each opponent. That will give a visual indication of how well the metric will work out. If it works well, I’ll choose a way to turn it into a number.
