"How can we use wearable and ubiquitous computers to help people better understand and improve their own minds?"
So affective computing is obviously a big part of this. The cognitive level of computing has been studied a lot (learning research, for example), but that's less interesting to me, because it's just logic. But what's happening at the emotional level? How can we detect that, and if we can't actually tell what's happening*, how can we at least work with it and use it to help you?
*because it's really just a lot of neurons firing; what do you mean by "what's happening"?
Persuasive technology is a big part too. I see it as the more applied end of the spectrum. We don't really know how you decide to eat a burger or a salad, but we can apply little tweaks that actually work. And because we don't really know the whole process by which your mind makes decisions, we can't do big overhauls. But we can apply little nudges here and there, and this technology could (and does) ship today.
There's more to it than this: lots of psychology and neuroscience, which I haven't delved too deeply into yet.
And the "how" is often really interesting. The wearable computers, new input/output methods, that are in the state of "really cool but not used in practice yet"... it would be great to put them to good use! Brain-computer interfaces are on my radar, but I'm not as excited about them because they're not as far along yet. As I understand it, it'd be really tough to fit an EEG into your everyday life right now. (but it would be great if you could!)
These goals will keep evolving, I'm sure. But I'm finding lots of good stuff that people are doing in these areas, so it looks very promising.
On a side note, I'm psyched about a new series that Cal Newport is starting up: "The Romantic Scholar." His blog has been a big influence on how I think about what I want to do with my life for the last year or so.