I just listened to TWiT podcast 161, where Jason Calacanis described two of the most interesting demos shown at the recent TechCrunch50 conference – Tonchidot and Swype. I also briefly mention the contest winner – Yammer.
Tonchidot:
This demo, by the Japanese company Tonchidot, got the best audience response at the recent TechCrunch50 show. Watch the demo here.
It starts off slow, and it’s hard to understand the Japanese presenter. About four minutes in, an English-speaking presenter takes over and describes the technology. The idea is really cool.
It’s an iPhone app that interacts with the world around it. As you walk around the real world looking at the iPhone screen, tags (text/audio) about your surroundings, posted earlier by others, appear on the screen in real time. It uses the iPhone’s built-in GPS to know where you are and the iPhone’s accelerometer to know which angle and direction you are looking. Anyone can add tags about anything they are looking at, and anyone can read or listen to them later. Ultimately, if this catches on, the world around us could be tagged full of information left by users who went before.
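As a rough illustration of how such an overlay might work (my own sketch, not Tonchidot’s actual implementation – the Tag structure, radius, and field-of-view values are all assumptions), the app essentially has to filter the stored tags down to those near the user’s position and within the direction the phone is pointing:

```python
import math
from dataclasses import dataclass

@dataclass
class Tag:
    lat: float   # latitude of the tagged spot, in degrees
    lon: float   # longitude of the tagged spot, in degrees
    text: str    # the note someone left there

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula)."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, degrees clockwise from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

def visible_tags(tags, user_lat, user_lon, heading_deg, radius_m=200, fov_deg=60):
    """Return tags within radius_m of the user and inside the phone's field of view."""
    visible = []
    for t in tags:
        if distance_m(user_lat, user_lon, t.lat, t.lon) > radius_m:
            continue
        # Smallest angular difference between the tag's bearing and where the phone points
        diff = abs((bearing_deg(user_lat, user_lon, t.lat, t.lon) - heading_deg + 180) % 360 - 180)
        if diff <= fov_deg / 2:
            visible.append(t)
    return visible
```

The GPS supplies the user’s latitude/longitude and the accelerometer (plus compass) the heading; whatever survives the filter gets drawn on screen.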
Suggested uses include restaurant/store reviews, sightseeing guides, museum/painting guides, and tourist destination information. Restaurants could post their menus for users to read just by pointing their iPhone at the restaurant. Theatre-goers could get reviews of the current show by pointing their iPhone at the theatre.
The question-and-answer period was hilarious because the presenters clearly could not speak English. “We have a puppet” was the answer to one technical question.
The obvious answer to the question left unanswered during the Q&A (what happens when surroundings change over time?) is that the tags could be organized by date. The most recent tags would be presented first (to reflect the world as it is now – or most recently was), with the option to dig down to older tags to read or hear about what the thing or place you are looking at was like in the past.
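A minimal sketch of that idea (again just my own assumption, using a hypothetical posted_at timestamp on each tag):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Tag:
    text: str
    posted_at: datetime  # when the tag was posted

def tags_by_recency(tags):
    """Newest tags first, so the overlay shows the world as it most recently was."""
    return sorted(tags, key=lambda t: t.posted_at, reverse=True)

def tags_before(tags, cutoff: datetime):
    """Dig down into history: only the tags posted before the given cutoff date."""
    return [t for t in tags_by_recency(tags) if t.posted_at < cutoff]
```

By default you would see the output of tags_by_recency; asking for tags_before some earlier date would show you how the place used to be described.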