Just the other day, I listened to a podcast about mobile sensing, where professor Andrew Campbell from the mobile sensing group at Dartmouth College talked about a lot of interesting things. In particular, I was captivated by his description of the NeuroPhone. Apparently, the idea for it was born when Campbell was out jogging and wanted to be able to phone his wife (or a friend, I don’t remember) without touching the phone. Eventually, he and his group managed to put together an iPhone with a cheap EEG headset so that a particular type of “brain wave”, a so-called P300 potential, could be detected by the headset and used to control the phone. In an app called “Dial Tim”, they demonstrated that you could dial an iPhone contact just by thinking (that is, by producing a P300 potential) when that person was shown on the phone. A lightweight classifier detects the signal corresponding to the desire to call a certain person. It should be noted that, according to this interesting paper, the classifier is still pretty sensitive to the person’s physical state (sitting, standing, etc.).
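To make the idea a bit more concrete, here is a toy sketch of the basic principle behind P300 detection (this is my own illustration, not the NeuroPhone classifier): the P300 is a positive voltage deflection roughly 300 ms after a stimulus the subject attends to, so a crude detector averages stimulus-locked EEG epochs and checks whether the mean amplitude in a window around 300 ms exceeds a threshold. The sample rate and threshold below are made-up values.

```python
def detect_p300(epochs, sample_rate_hz=128, threshold_uv=2.0):
    """Crude P300 detector (toy illustration, not the NeuroPhone method).

    epochs: list of equal-length EEG voltage traces (microvolts),
    each starting at stimulus onset. Averaging across epochs cancels
    background noise, leaving the stimulus-locked response.
    """
    n = len(epochs)
    length = len(epochs[0])
    # Grand average across epochs, sample by sample.
    avg = [sum(e[i] for e in epochs) / n for i in range(length)]
    # Look at a window from 250 ms to 450 ms after stimulus onset,
    # where a P300 deflection would show up.
    start = int(0.250 * sample_rate_hz)
    stop = min(length, int(0.450 * sample_rate_hz))
    window_mean = sum(avg[start:stop]) / (stop - start)
    return window_mean > threshold_uv

# Synthetic demo: flat traces vs. traces with a positive bump
# in the second half of a 500 ms epoch (64 samples at 128 Hz).
flat = [[0.0] * 64 for _ in range(10)]
bump = [[0.0] * 32 + [5.0] * 32 for _ in range(10)]
```

A real detector would of course band-pass filter the signal and use a trained classifier rather than a fixed threshold, which is presumably part of why the paper reports sensitivity to the person’s physical state.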
This opens up possibilities for really wild “mind reading” applications; the authors mention the possibility of sensing the aggregate mood in a room from these kinds of neural signals, and also how a foreign language teacher could get real-time statistics on the students’ comprehension from EEG data and thus always know how many of them actually understood a question.
Of course, there is a sinister aspect to this, namely that stray neural signals “in the wild” could be picked up by malicious parties. The authors call this scenario, which arises from the way the neural information is transmitted in unencrypted IP packets between iPhones, “neural packets everywhere.”
Intrigued by this, I looked up some other work that Campbell and his group have done on mobile phones and classification. This paper deals with “Darwin phones”: a framework for “collaborative sensing” and classification using mobile phones. The authors state that to the best of their knowledge, “Darwin is the first system that applies distributed machine learning techniques and collaborative inference concepts to mobile phones.” The paper contains a number of cool ideas, like “model pooling”, where phones that are close to each other can “borrow” trained classification models from each other, and “collaborative inference”, where a group of phones combine the predictions from their respective models into a potentially more robust overall prediction, one that is less sensitive to particularities such as background noise specific to each phone’s location. This way of boosting predictions by combining different models to reduce noise is reminiscent of how ensemble methods are used in machine learning. The concepts of model pooling and collaborative inference are very useful, because it is typically quite time-consuming to train a classification model on a mobile phone; people don’t like to be bothered to provide labels for training examples.
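The collaborative inference idea can be sketched in a few lines of code. This is a simplified illustration of the general principle, not the actual Darwin algorithm: each phone reports a confidence score per class for the same event, and the group prediction averages them, so that one phone’s locally noisy view is smoothed out by the others (the phone data below is made up).

```python
def collaborative_inference(per_phone_confidences):
    """Combine per-phone classifier outputs into one group prediction.

    per_phone_confidences: list of dicts mapping class label -> confidence,
    one dict per phone. Returns (best_label, averaged_scores).
    """
    combined = {}
    for confidences in per_phone_confidences:
        for label, score in confidences.items():
            combined[label] = combined.get(label, 0.0) + score
    n = len(per_phone_confidences)
    averaged = {label: total / n for label, total in combined.items()}
    return max(averaged, key=averaged.get), averaged

# Three hypothetical phones classifying the same speaker; the second
# phone is confused by local background noise, but the pooled estimate
# still favors "alice".
phones = [
    {"alice": 0.7, "bob": 0.3},
    {"alice": 0.4, "bob": 0.6},
    {"alice": 0.8, "bob": 0.2},
]
best, scores = collaborative_inference(phones)
```

Averaging is the simplest possible combination rule; one could equally imagine multiplying likelihoods or weighting phones by how much they trust their own local conditions, which is closer in spirit to what a real system would need.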
In short, classification on (clusters of) mobile phones seems to be a really interesting problem!
Completely unrelated but still interesting was a recent Economist article about Alibaba, the Chinese site that matches up producers in China with foreign buyers, eliminating middlemen. I had been vaguely aware of this site and sometimes idly wondered about its business model, and here the Economist suggests that the company is sitting on some really valuable data about how creditworthy small companies are, which companies know each other, and in general how Chinese middle-class people spend their money. It must be an interesting data set for sure.