Over on my group’s blog, I’ve put up a writeup of some procrastination I did whilst working with word embeddings. Basically I took the Star Trek, Star Wars and Doctor Who wikias and embedded all of their words into the same vector space, so we can ask questions like “Who is most like Han Solo in Doctor Who?” (for some reason, the answer is Rory).
The post has a link to the embedding so people can download it and poke around themselves.
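If you do download the embedding, queries like the one above boil down to a cosine-similarity nearest-neighbour search over the word vectors. Here’s a minimal sketch of the idea — the words and vectors below are toy values I made up for illustration, not the real embedding:

```python
import math

# Toy stand-in for the downloaded embedding: word -> vector.
embedding = {
    "han_solo": [0.9, 0.1, 0.3],
    "rory":     [0.8, 0.2, 0.35],
    "dalek":    [-0.7, 0.9, 0.1],
    "tardis":   [-0.5, 0.8, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def most_similar(query, vocab):
    """Return the word in vocab (other than the query) whose
    vector is closest to the query's vector."""
    qv = embedding[query]
    return max((w for w in vocab if w != query),
               key=lambda w: cosine(embedding[w], qv))

print(most_similar("han_solo", embedding))  # with these toy vectors: rory
```

In the real embedding you’d restrict `vocab` to words from one wikia to get the cross-franchise “who is most like X in Y?” questions from the post.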
I’ve started up a blog for the Information Retrieval and Machine Learning group at Oracle Labs.
At the moment there is a writeup for our paper on multilingual word embeddings, which Mike presented at AAAI 2016. I should have a new post up on something significantly nerdier using word embeddings in a little while, which I’ll post a link to here.
In other news, I updated my academic website with up-to-date versions of FEAST & MIToolbox, and put a couple of new papers up on my publications page.
Well this probably won’t get updated that often. Still marginally better than an empty html page though.