There’s a project going on at Microsoft’s tiny Bay Area Research Center called “My Life Bits.” The idea of this project (and, believe me, this is a dramatic over-simplification) is to ask (and hopefully answer) the following question…
<blockquote>What would happen if all the information– all the conversations, all the interactions, etc.– that I encounter throughout the day were captured digitally? How can we make it possible to consume this mountain of data in a meaningful way?</blockquote>
So this might include videos, photos, documents, emails, web pages… everything. All kinds of media. Better yet, we can correlate these diverse digital bits with interesting metadata: where was I (GPS coordinates)? When was it (clock/calendar)? Who was I interacting with (contacts)?
Cross-referencing these data points would allow you to consume something akin to a “story” about a trip without having to manually pull together all the different constituent pieces of media. If you then add a dimension of intelligence to the software (bubble quality photos above blurry ones; have the camera track information like temperature and light levels) suddenly you have something that is powerful beyond description.
The thing that gets me excited about this is the inherent power of all this cross-referencing. The key is finding the hook that lets you locate a critical piece of data. “I was talking on the phone with Bob, and he mentioned that one web site.” Well, you bring up Bob’s contact, look through all the phone calls you’ve had with him (OK, so now we have to assume that your computer tracks and records all your phone conversations, including caller-ID data), and ask to see the web sites you browsed while on the phone with him. Bang, you’ve found it. The idea of full capture with cross-referencing is that you can start modeling your searching after how the brain really works, by association. It’s kind of like comparing how a book works to how the web works; hyperlinks are a great analogy.
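The “Bob mentioned a web site” lookup above boils down to a join on time: intersect your call log with your browsing history. Here is a minimal sketch of that idea; the record types, field names, and sample data are all assumptions invented for illustration, not anything from the actual MyLifeBits system.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical capture records -- these shapes are assumptions for illustration.

@dataclass
class PhoneCall:
    contact: str        # resolved from caller-ID against the address book
    start: datetime
    end: datetime

@dataclass
class PageVisit:
    url: str
    when: datetime

def sites_visited_during_calls_with(contact, calls, visits):
    """Cross-reference: pages browsed while on the phone with a given contact."""
    windows = [(c.start, c.end) for c in calls if c.contact == contact]
    return [v.url for v in visits
            if any(start <= v.when <= end for start, end in windows)]

# Tiny invented dataset: one call with Bob, two page visits.
calls = [PhoneCall("Bob", datetime(2004, 5, 3, 14, 0), datetime(2004, 5, 3, 14, 30))]
visits = [
    PageVisit("http://example.com/that-one-site", datetime(2004, 5, 3, 14, 10)),
    PageVisit("http://example.com/unrelated", datetime(2004, 5, 3, 16, 0)),
]

print(sites_visited_during_calls_with("Bob", calls, visits))
# -> ['http://example.com/that-one-site']
```

The interesting part is that neither record type knows about the other; the timestamp is the hook that ties them together, which is exactly the associative style of search described above.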
Anyway, long story short, this is why I’m excited about WinFS, the database-driven file system that will ship sometime after Longhorn. When the computer “thinks” about the data I create based on what it IS rather than where I PUT it, that’s when this kind of searching (contextual rather than content-based) can be possible.
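The “what it IS rather than where I PUT it” idea can be sketched as querying a metadata index instead of walking folder paths. This is not how WinFS actually works, just a toy illustration under assumed attribute names:

```python
# Hypothetical metadata index: each item is described by attributes,
# not by its location in a folder hierarchy. All fields are invented.
items = [
    {"kind": "photo", "taken": "2004-05-03", "place": "Napa", "contact": "Bob"},
    {"kind": "email", "sent": "2004-05-04", "contact": "Bob"},
    {"kind": "photo", "taken": "2004-06-01", "place": "Seattle", "contact": "Alice"},
]

def query(items, **criteria):
    """Return every item whose attributes match all the given criteria."""
    return [i for i in items
            if all(i.get(k) == v for k, v in criteria.items())]

# "Photos involving Bob" -- no folder path needed, just what the item IS.
print(query(items, kind="photo", contact="Bob"))
# -> [{'kind': 'photo', 'taken': '2004-05-03', 'place': 'Napa', 'contact': 'Bob'}]
```

With a folder-based file system you’d have to remember whether that photo lived under “Trips” or “Bob”; with an attribute query it can live in both answers at once.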
If you want to know more, check out the MS Research page for the My Life Bits project, or watch the Channel 9 video on the subject. If you’re like me, you’ll find it fascinating.