
AIIDE 2017 the learning bots

In March 2016 I analyzed which bots learned during the AIIDE 2015 tournament by looking at the data files. Here’s a similar analysis for AIIDE 2017.

I looked at the “write” directory for each bot to see whether it wrote files there and, if so, what the files looked like. Writing data doesn’t mean that the bot actually learned anything; it may not have used the data. Bots not listed in the table did not write anything interesting (log files at most). The table includes 15 of the 28 entrants, over half.

# | bot | info
1 | ZZZKBot | varied info about each game, including tidbits like the time zone and the processor it was played on
2 | PurpleWave | for each opponent, a log of games with info including the sequence of strategies followed
5 | Microwave | same format as UAlbertaBot (Microwave has more strategies)
6 | CherryPi | opening data for each opponent
9 | Tyr | for each opponent, seems to save info only about the previous game: win or loss, and a flag or two like "FLYERS" or "CANNONS"
11 | AILien | 10 values per opponent: scores for zerg unit types, a few odds and ends like macroHeavyness and supplyISawAir
12 | LetaBot | one file which records a log of games, with opponent, race, map, and 3 numbers per game
14 | UAlbertaBot | a file for each opponent, giving for each strategy the count of wins and losses; learning was turned on this year
15 | Aiur | 91 values per opponent: strategy x map size
17 | Skynet | a file for each opponent, with lines like "build_0_2 14 12"
19 | MegaBot | many files; the important ones seem to be "MegaBot-vs-[bot name].xml" which give scores for each bot MegaBot can imitate: Skynet, NUSBot, Xelnaga
20 | Xelnaga | a file for each opponent with a single number: -1, 0, 2, or 3
21 | Overkill | many files with neural network data, reinforcement learning data, and opening learning data for each opponent (more than I thought!)
24 | Myscbot | same format as UAlbertaBot, but only 1 strategy was played for each opponent; nothing was learned
25 | HannesBredberg | 2 numbers per opponent, approximately (not exactly) the win and loss counts

LetaBot seems worth looking into, to see whether its log is learning data and, if so, how it is used. PurpleWave also recorded data essentially as a log, which could be used for a wide range of purposes. And AILien has a unique learning method that I should spend some time on.

UAlbertaBot had learning turned on this year. It has sometimes left learning off because its default strategies were dominant. It’s also notable that Ziabot skipped learning this year, though it has learned in the past. Ziabot also finished last.

Next: What AIUR learned.


Comments

Dave Churchill:

Pretty sure we had a bug in UAB learning this year; it flatlined pretty quickly.

MicroDK:

I made some changes to the learning code (which is unchanged in Steamhammer). In the sWinRate calculation for each strategy I divide by sGamesPlayed instead of strategyGamesPlayed. Also, I do not consider strategies with sGamesPlayed equal to zero when calculating ucbVal, since they have not been tested.
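A minimal sketch of the UCB1-style selection MicroDK is describing. The names sWinRate, sGamesPlayed, and ucbVal come from his comment; the record format, the exploration constant, and the rule of playing untried strategies first are my assumptions, not UAlbertaBot's actual code (which is C++):

```python
import math

def choose_strategy(records, exploration=math.sqrt(2)):
    """Pick a strategy by a UCB1-style rule.

    records maps strategy name -> (wins, losses) for one opponent,
    as in UAlbertaBot's per-opponent learning files.
    """
    # Play any untried strategy first instead of including it in the
    # ucbVal computation, per MicroDK's fix for sGamesPlayed == 0.
    for name, (wins, losses) in records.items():
        if wins + losses == 0:
            return name

    total_games = sum(w + l for w, l in records.values())
    best_name, best_val = None, -1.0
    for name, (wins, losses) in records.items():
        s_games_played = wins + losses
        # Divide by this strategy's own game count, not the total.
        s_win_rate = wins / s_games_played
        ucb_val = s_win_rate + exploration * math.sqrt(
            math.log(total_games) / s_games_played)
        if ucb_val > best_val:
            best_name, best_val = name, ucb_val
    return best_name
```

The exploration term shrinks as a strategy accumulates games, so a strategy with a mediocre record still gets occasional retries against the same opponent.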

LetaBot:

The data recorded by LetaBot is for testing how good it is at combat. It is currently there to check how much MCTS will improve things.

It might be used for learning, but my bot didn't use learning this CIG or AIIDE.
