Hard, hard user interfaces.

I started out doing ActionScript in Macromedia Flash and got into Human-Computer Interaction after watching Jeff Han's multi-touch work. That's the power of a really cool demo: it inspired the birth of the NUIGroup community, which spawned thousands of makers around the world building their own projector-camera multi-touch systems. All of it became mainstream once Apple released the iPhone, and it also gave birth to my interest in human-centered engineering.

I spent some ten years after that playing with sensors, haptics, and gestural interfaces across multiple input modalities, but nothing stuck the way multi-touch did, thanks to Apple's execution. Everyone's phone and tablet interaction defaults to multi-touch (unless you have a visual impairment). PrimeSense evolved into the Kinect, then came the Wiimote and the Leap, but nothing stuck. For speech it was Amazon's Alexa, and the latest news is that it's going to lose Amazon $10B. Even now, I am writing this post through my keyboard a...

Dilemma

Which of these would be the best approach?
1. A PyGE_touch-like implementation inside SketchUp, doable if SketchUp can support WIN32COM interfacing: http://code.google.com/p/pymt/source/browse/examples/kaswy/PyGE_touch.py
2. Going with Aberrant's TUIO client within SketchUp. We won't continue with this, since thread handling within SketchUp's scripting is quite a pain.
3. Writing TUIO events to a text/XML file and having SketchUp read them. That's what we're busy doing now. Once this works, we'll refine and optimize the code before moving on to gestures.
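To make option 3 concrete, here is a minimal sketch of the file hand-off, assuming the TUIO client side (outside SketchUp) already gives us a list of cursor positions. The function and file names are hypothetical, not part of any existing library: the sender writes `id x y` lines and swaps the file in atomically so SketchUp never polls a half-written file; the reader shows the parsing that the Ruby side inside SketchUp would mirror.

```python
import os
import tempfile

def write_cursors(path, cursors):
    """Atomically write TUIO cursor states as 'id x y' lines.

    `cursors` is a list of (session_id, x, y) tuples, with x/y as
    normalized TUIO coordinates in [0, 1]. Writing to a temp file
    and renaming it avoids the reader seeing a partial file.
    """
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        for sid, x, y in cursors:
            f.write(f"{sid} {x:.4f} {y:.4f}\n")
    os.replace(tmp, path)  # atomic rename on POSIX and modern Windows

def read_cursors(path):
    """Parse the file back into (session_id, x, y) tuples.

    The Ruby code inside SketchUp would do the equivalent of this,
    for example from a UI timer that polls the file.
    """
    out = []
    with open(path) as f:
        for line in f:
            sid, x, y = line.split()
            out.append((int(sid), float(x), float(y)))
    return out
```

The atomic-rename trick matters here because SketchUp's polling loop and the TUIO writer run as separate processes with no shared lock.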
