The result of the discussion on the WikiTargeting page.
To take full advantage of BeeWT 1.1's data saving, clear the data file before running a long match (100+ rounds).
BeeWT 1.0 and 1.1 can be found in the Repository. 1.2 can be found at http://www.ginnezeek.nl/robocode/wiki.rrgc.BeeRRGC_WT1.2.jar because the file is too big for the repository :-)
My first tests suggest that it works. I ran lots of 35-round matches without data, and the same number with saved data (data gathered in 100-round matches), and on average there was an increase in score of 2 to 3%. I am curious what it will do in the "real world" of the RR. But that means BeeWT first has to be equipped with data for all RR participants..... volunteers? ;-) --Vic
I can do it. Will take a day or two to do 100 rounds against each. -- PEZ
Hmmm. 100 rounds' worth of data for all RR bots won't fit within the 200K limit, will it? The files are 1K each.. -- PEZ
Well, as you should easily beat all the HOT bots (due to your movement), an ugly hack to use for testing would be to manually populate a Vector with all the HOT bots (or enough of them to get beneath the 200K limit you have) and not save any data on those bots. Not good in the long run, but it should work for testing purposes... --Brainfade
What CassiusClay does is to only save movement data on bots that are easily tricked. Maybe BeeWT could do something similar, but in reverse (like Brain suggests, only a bit less hard-coded). Welcome back, btw, Brain! -- PEZ
Gah! After 60 bots I did something to the repository (maybe it was packaging a new CC) that made all the data files in BeeWT's data dir get reset to the ones shipped with the jar file... Well, since it won't fit data for all participants anyway, I'll wait to start it all over again until we have figured out how to best use those 200K. -- PEZ
Are there already so many participants?? Maybe we should consider thinning out that population.... Anyway, there are now about 300 participants in 1-v-1, so the file size has to shrink even further, to about 667 bytes. That's a f@#$ing small file size! As a quick fix I have decreased the number of SuperNodes saved to 200. I'm running RoboLeague right now to get the data files.
In the meantime we could try to find a solution for saving the data more efficiently. Currently I'm using Mue's suggestion of saving index-value pairs, where the index gets 16 bits and the value gets 8 bits. I'm using DataOutputStream and DataInputStream, and of course the streams get (un)zipped, although I don't think zip does much to make these files smaller. The mentioned 16+8=24 bits are actually too much; as I suggested before on the WikiTargeting page, 14+5=19 bits are enough.
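A minimal sketch of the scheme just described, under the assumption that names like ZippedPairs are purely illustrative (this is not BeeWT's actual code): index-value pairs written with writeShort()/writeByte() through a gzipped stream.

```java
import java.io.*;
import java.util.zip.*;

// Illustrative sketch, not BeeWT code: 16-bit index + 8-bit value pairs,
// written through a GZIP stream as described above.
public class ZippedPairs {
    public static byte[] save(int[] indexes, int[] values) {
        try {
            ByteArrayOutputStream raw = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(new GZIPOutputStream(raw));
            out.writeShort(indexes.length);       // record count up front
            for (int i = 0; i < indexes.length; i++) {
                out.writeShort(indexes[i]);       // 16 bits for the index
                out.writeByte(values[i]);         // 8 bits for the value
            }
            out.close();                          // finishes the gzip stream
            return raw.toByteArray();
        } catch (IOException e) { throw new RuntimeException(e); }
    }

    public static int[][] load(byte[] data) {
        try {
            DataInputStream in = new DataInputStream(
                    new GZIPInputStream(new ByteArrayInputStream(data)));
            int n = in.readUnsignedShort();
            int[][] pairs = new int[2][n];        // row 0: indexes, row 1: values
            for (int i = 0; i < n; i++) {
                pairs[0][i] = in.readUnsignedShort();
                pairs[1][i] = in.readUnsignedByte();
            }
            return pairs;
        } catch (IOException e) { throw new RuntimeException(e); }
    }
}
```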
So, does anyone know how to save/load an int in 14 or 5 bits? What in/outputstream should be used? etc..
Any other ideas to save SuperNodes in tiny files?
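One possible answer to the 14/5-bit question, as a hedged sketch (all names are illustrative): accumulate bits in a buffer and flush whole bytes, so each index-value record costs 19 bits instead of 24.

```java
import java.io.*;

// Hypothetical sketch: pack a 14-bit index plus a 5-bit value into 19 bits,
// streaming the bits into bytes instead of using writeShort()/writeByte().
public class BitPacker {
    private final ByteArrayOutputStream out = new ByteArrayOutputStream();
    private long buffer = 0; // pending bits, newest at the low end
    private int bits = 0;    // number of pending bits

    public void writeBits(int value, int width) {
        buffer = (buffer << width) | (value & ((1L << width) - 1));
        bits += width;
        while (bits >= 8) {                     // flush full bytes, high bits first
            bits -= 8;
            out.write((int) (buffer >>> bits) & 0xFF);
        }
    }

    public void writeRecord(int index, int value) {
        writeBits(index, 14);                   // index: 0..16383
        writeBits(value, 5);                    // value: 0..31
    }

    public byte[] toByteArray() {
        if (bits > 0) {                         // pad the final partial byte
            out.write((int) (buffer << (8 - bits)) & 0xFF);
            bits = 0;
        }
        return out.toByteArray();
    }
}

// Matching reader: pulls arbitrary-width fields back out of the byte stream.
class BitReader {
    private final byte[] data;
    private int pos = 0, bits = 0;
    private long buffer = 0;
    BitReader(byte[] data) { this.data = data; }
    int readBits(int width) {
        while (bits < width) {
            buffer = (buffer << 8) | (data[pos++] & 0xFF);
            bits += 8;
        }
        bits -= width;
        return (int) ((buffer >>> bits) & ((1L << width) - 1));
    }
}
```

With 14 bits an index can address 16384 segments, and a 5-bit value tops out at 31, so visit counts would need normalizing into that range first.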
Thinning out the RR population? No way! Doubling it, yes. Tripling it, yes. Shrinking it, no. =) Maybe if we place all files in the same zip file things will get smaller. Zip will have a chance to shrink things when two bots have the same movement. And there'll not be that zip overhead on each file (not that I have a clue how big/small that overhead is). Normalizing all int data down to byte size might help too? -- PEZ
So how do you find the right data file in one zip file? The ZipInputStream does not work in a random-access manner, afaik. I don't quite understand what you mean by normalizing int down to byte size. Maybe I already do that: I save the index using writeShort(int i), and the value using writeByte(int i). Is that what you mean? If so, that's still a total of 24 bits, where we only need 19. --Vic
I'm not completely sure how to write multiple files in one zip file; I would think it's about getNextEntry() and such. I'm not sure what writeShort() and writeByte() do. I thought maybe dividing all values by the ratio of sizeof_int / sizeof_byte before writing them. That should make all values fit in a byte (shouldn't it?). A value needs only 5 bits, doesn't it? Using some bit-shifting tricks it should be possible to write out all the indexes in a row, and then all the values in a row. This is just random babbling; you might have thought about this and disregarded it long ago. =) -- PEZ
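The one-zip-file idea could look something like this sketch (illustrative names, not actual bot code). Since ZipInputStream has no random access, the reader scans entries with getNextEntry() until the wanted bot's entry turns up:

```java
import java.io.*;
import java.util.zip.*;

// Illustrative sketch: all enemies' data in one zip, one entry per bot name.
public class OneZip {
    public static byte[] writeEntries(String[] names, byte[][] contents) {
        try {
            ByteArrayOutputStream raw = new ByteArrayOutputStream();
            ZipOutputStream zip = new ZipOutputStream(raw);
            for (int i = 0; i < names.length; i++) {
                zip.putNextEntry(new ZipEntry(names[i]));
                zip.write(contents[i]);
                zip.closeEntry();
            }
            zip.close();
            return raw.toByteArray();
        } catch (IOException e) { throw new RuntimeException(e); }
    }

    // Scan the entries sequentially until the wanted bot's entry is found.
    public static byte[] readEntry(byte[] zipData, String name) {
        try {
            ZipInputStream zip = new ZipInputStream(new ByteArrayInputStream(zipData));
            for (ZipEntry e; (e = zip.getNextEntry()) != null; ) {
                if (e.getName().equals(name)) {
                    ByteArrayOutputStream out = new ByteArrayOutputStream();
                    byte[] buf = new byte[512];
                    for (int n; (n = zip.read(buf)) > 0; ) out.write(buf, 0, n);
                    return out.toByteArray();
                }
            }
            return null; // no data saved for this bot
        } catch (IOException e) { throw new RuntimeException(e); }
    }
}
```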
I think the functions (from DataOutputStream) I use do something similar to what you say. Your other solution should work, I think, though it will be quite a lot of work; I envision numerous new classes to be written :-) Maybe we should implement something like that when BeeWT 1.2 proves that this data-saving thing works at all. --Vic
I've managed to cram data on all participants (and a few more) into WT's data directory. Unfortunately the repository does not accept BeeWT because it is now too big. :-( It is available here: http://www.ginnezeek.nl/robocode/wiki.rrgc.BeeRRGC_WT1.2.jar
Version 1.2 does not save data for stability reasons. You'll have to use version 1.1 to create the data files.
Thanx! I didn't expect it to do THAT well with only 200 SuperNodes saved. I guess this definitely proves that SuperNodes exist. If you need any help porting the serializing code, just shout. I'm right here behind my computer (designing my new movement scheme, actually). --Vic
I'm off to bed! We may have a new king tomorrow..... I'll leave my RR client running. --Vic
A disappointing result... there must be a bug in there. Or maybe you can't just use the same data files with a different movement: in a way, enemies react to your movement too. But I'm not sure about this. Any clues, PEZ? --Vic
That WT data wasn't acquired with CCWT? If so, I think Vic has a good point here... Most bots move in reaction to the enemy's movement, not to the enemy's gun. Except, maybe (and that's a great maybe), WaveSurfers, which react mostly to the enemy's gun.... -- Axe
I think I know what it is. I'm using the latest Bee and loading data from an older version. I checked the segment widths and they were the same, which means the data loads without complaints. But I also drew the conclusion that this meant I used the same segmentations, which I don't: I have changed the velocity segmentation in a way that I think can explain the whole difference between 2063 (what I hoped for) and 2047 points. What we need to do is either:
I can't believe that both of you, Vic and Axe, are starting that crap movement-influences-targeting thingy at the first little setback! I don't buy that at all. Especially not when the Raiko movement and Butterfly are so similar from a distancing / orbiting / WallSmoothing point of view.
Well, your last phrase admits that u at least are considering that possibility, right? Taking an extreme (VERY EXTREME, i know!) case: imagine if the data acquired came from a rammer bot...
Anyway, i ain't a WT expert (not even close, u know), but i'm not getting this: why don't u use CCWT to acquire that data? -- Axe
Ummm, because I don't want to spend four weeks on figuring out how? =) -- PEZ
Ah. That changed segmentation is probably the bug. If you don't want to reset your current segmentation then, as Axe suggested, you should acquire the data with CCWT. Which is basically the same as your own second solution, because that will result in a new BeeWT. As I said earlier, I'm more than willing to help you with that. I am VERY curious where that would take CC, and also SilverfistWT! Those bots will ROCK! --Vic
Actually the most interesting bot right now is SilverFist WT. That bot could use the BeeWT 1.2 gun and data as it is, and I won't have to maintain two branches of CC guns. =) Axe, check the BeeWT source and you'll see how to plug in that gun instead of the current Bee. It's a pretty similar setup, just a little bit more entangled. -- PEZ
Where is that serialize() method? ...in BeeWT? Anyway, you'd need to have the same exact data structure (same segments in the same array) as that version of Bee had, then taking Bee's code for loading the data should be pretty straightforward. You'd probably want to generate your own data unless you want to be stuck with Bee's segments. Dookious also has pretty clean code for saving / loading WikiTargeting data in its gun - that might be easier to use to build your own. I can give you more specific pointers on finding, using, and/or just understanding that code, if you'd like. =) Let me know. -- Voidious
Ah... The code on the WikiTargeting can't be read back in, unless I'm missing something. I think its only purpose was for diagnostics - to see how many nodes were being used and print some data about them. It also uses a very lengthy ASCII format that you don't need; in Dooki, I use 3 or 4 bytes (can't remember) for each record and write out in binary, while each ASCII character would be a whole byte. To read / write WikiTargeting data, you need to:
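The binary-record approach described here might be sketched like this (illustrative code, not Dookious's actual format): each record costs three bytes (a 2-byte index and a 1-byte count) and, unlike a print-only ASCII dump, can be read back in.

```java
import java.io.*;

// Illustrative sketch, not Dookious code: fixed-width binary records that
// round-trip, instead of a diagnostics-only ASCII printout.
public class RecordIO {
    public static byte[] write(int[][] records) {
        try {
            ByteArrayOutputStream raw = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(raw);
            out.writeShort(records.length);   // record count header
            for (int[] r : records) {
                out.writeShort(r[0]);         // segment index: 2 bytes
                out.writeByte(r[1]);          // visit count: 1 byte
            }
            out.close();
            return raw.toByteArray();
        } catch (IOException e) { throw new RuntimeException(e); }
    }

    public static int[][] read(byte[] data) {
        try {
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
            int n = in.readUnsignedShort();
            int[][] records = new int[n][2];
            for (int i = 0; i < n; i++) {
                records[i][0] = in.readUnsignedShort();
                records[i][1] = in.readUnsignedByte();
            }
            return records;
        } catch (IOException e) { throw new RuntimeException(e); }
    }
}
```

Writing the same record as ASCII text would cost one whole byte per digit plus separators, which is the size difference the comment above is pointing at.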