Saturday 10th November 2018, it’s time for AstonHack. I’ve been to a couple of hackathons before but I’ve never really done anything other than eat pizza or watch Netflix… which is pretty much what I do with my spare time anyway. My team and I decided to make up for this by spending the entire 24 hours doing a LOT:
– We used machine learning to read your mind and tell whether you were watching Pokémon or not (note: we were most definitely actually classifying a positive emotional response from the frontal-lobe AF7 and AF8 sensors). See this paper for the science behind it.
– We did the same thing as above, but for people thinking about Spartans, Dinosaurs, or Unicorns (it was for a sponsor’s challenge!). Again, check out this paper if you’d like the technical details.
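For the curious, the general shape of that kind of classification pipeline can be sketched as: extract band-power features from the two frontal channels, then feed them to a simple classifier. This is a toy sketch on synthetic data, not our actual hackathon code — the 256 Hz sample rate, the alpha/beta feature bands, and the logistic-regression model are all assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
FS = 256  # assumed EEG sample rate in Hz

def band_power(signal, fs, lo, hi):
    """Mean power of `signal` within the [lo, hi] Hz band, via FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

def features(window):
    """Alpha (8-13 Hz) and beta (13-30 Hz) power for each channel (AF7, AF8)."""
    return [band_power(ch, FS, 8, 13) for ch in window] + \
           [band_power(ch, FS, 13, 30) for ch in window]

def fake_window(positive):
    """Synthetic 1-second, 2-channel stand-in for a labelled EEG window:
    'positive' windows get a stronger 10 Hz (alpha) component."""
    t = np.arange(FS) / FS
    amp = 3.0 if positive else 1.0
    return [amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)
            for _ in range(2)]

X = np.array([features(fake_window(i % 2 == 0)) for i in range(200)])
y = np.array([i % 2 == 0 for i in range(200)])

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.score(X, y))  # near-perfect on this easy synthetic data
```

On the toy data the classes are trivially separable; real EEG is far noisier, which is why the papers linked above matter.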
– We trained two ML models: one would draw new Pokémon based on images of existing ones, and the other would try to learn to tell the fakes from the real ones. They competed against one another for a few hours until the generator learned to produce convincing new artwork. For this, we used a Deep Convolutional Generative Adversarial Network (DCGAN).
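The generator/discriminator pair at the heart of a DCGAN can be sketched in PyTorch roughly as below. The layer sizes and 64×64 output are illustrative assumptions, not our exact architecture; during training, the discriminator’s real/fake logit drives the losses for both networks.

```python
import torch
import torch.nn as nn

NZ = 100  # latent noise dimension (a common DCGAN default)

# Generator: 100-dim noise -> 64x64 RGB image via transposed convolutions.
generator = nn.Sequential(
    nn.ConvTranspose2d(NZ, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(True),   # -> 4x4
    nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(True),  # -> 8x8
    nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),    # -> 16x16
    nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),     # -> 32x32
    nn.ConvTranspose2d(32, 3, 4, 2, 1), nn.Tanh(),                              # -> 64x64
)

# Discriminator: 64x64 RGB image -> single real/fake logit.
discriminator = nn.Sequential(
    nn.Conv2d(3, 64, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),    # -> 32x32
    nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),  # -> 16x16
    nn.Conv2d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True), # -> 8x8
    nn.Conv2d(256, 1, 8), nn.Flatten(),                            # -> 1 logit
)

noise = torch.randn(16, NZ, 1, 1)
fakes = generator(noise)       # shape: (16, 3, 64, 64)
logits = discriminator(fakes)  # shape: (16, 1)
```

The adversarial training loop alternates: the discriminator learns to push real images’ logits up and fakes’ down, while the generator learns to fool it.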
– We used ML to write names and descriptions for the generated Pokémon, based on a dataset of text ripped from the games (e.g. “Mowirup: They flock to the stars and mountains. This Pokemon glows, it does not construct silk.”)… Again, don’t ask.
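The name generation works like any character-level language model: predict the next character from the last few. Here’s a much simpler stand-in that captures the flavour, a character-level Markov chain on a tiny hypothetical seed list — our actual model and dataset were different.

```python
import random
from collections import defaultdict

# Tiny seed list standing in for the real scraped name dataset.
NAMES = ["pikachu", "bulbasaur", "charmander", "squirtle", "jigglypuff",
         "meowth", "psyduck", "snorlax", "gengar", "eevee"]

ORDER = 2  # characters of context to condition on

def build_model(names, order=ORDER):
    """Map each `order`-character context to the list of characters that follow it."""
    model = defaultdict(list)
    for name in names:
        padded = "^" * order + name + "$"  # ^ = start padding, $ = end marker
        for i in range(len(padded) - order):
            model[padded[i:i + order]].append(padded[i + order])
    return model

def generate(model, order=ORDER, max_len=12, rng=random.Random(42)):
    """Walk the chain from the start context until the end marker or max_len."""
    context, out = "^" * order, []
    while len(out) < max_len:
        ch = rng.choice(model[context])
        if ch == "$":
            break
        out.append(ch)
        context = context[1:] + ch
    return "".join(out).capitalize()

model = build_model(NAMES)
print(generate(model))
```

A higher-order chain (or an RNN, as in many hackathon projects) sticks closer to plausible Pokémon-ish names at the cost of sometimes regurgitating training examples verbatim.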
Not shabby at all for 24 hours’ work! The best reward for all of it was the laughter and roaring applause during our presentation. We also won a Google Home Assistant and a bunch of swag from here.com, courtesy of MLH.
Pokémon Hacks presentation. Too small? Click here to view the PDF fullscreen.
The hacks presented at AstonHack 2018 were absolutely mind-blowing. My personal highlight was the guy who spent all night rewriting a Linux driver from scratch so he could use his Nintendo Switch controllers as a virtual drum kit… and theremin. Click here to take a look at what went on!