Hard hard user interfaces.

I started out writing ActionScript in Macromedia Flash, and got into Human-Computer Interaction after watching Jeff Han's multi-touch work. That's the power of a really cool demo: it inspired the birth of the NUIGroup community, which spawned thousands of makers all around the world building their own projector-camera multi-touch systems. It all became mainstream once Apple released the iPhone to the world, and it also gave birth to my interest in human-centered engineering.

I spent some ten years after that playing with sensors, haptics, and gestural interfaces across multiple input modalities, but nothing stuck as much as multi-touch did, thanks to Apple's execution. Everyone's default phone/tablet interaction is now multi-touch (unless you have a visual impairment). PrimeSense's technology evolved into the Kinect; the Wiimote and Leap Motion had their moments, but nothing stuck. For speech, it was Amazon's Alexa, and the latest news is that it's set to lose Amazon $10 billion. Even now, I am writing this post through my keyboard a...

Post Midterms Update

We've now been able to trick SketchUp into treating TUIO data as SketchUp data by posting the TUIO data as a SketchUp "User" message type. It is then processed in sync with SketchUp's events rather than as asynchronous TUIO events.
This way we can sub-class SketchUp's window (like EventRelay does) and pass the data off to SketchUp's embedded Ruby as a synchronous event. Here's the result of the DLL successfully "puts"-ing cursors to SketchUp's embedded Ruby Console via the Ruby API's puts statement.
The implication is that we can collect all the TUIO messages this way and use the SketchUp Ruby API for the remaining work, as suggested by my mentor.
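As a minimal sketch of what the Ruby side could look like, assuming the DLL forwards each cursor as a simple "cursor <session_id> <x> <y>" string (the real payload format is whatever the DLL packs into the User message, and the handler name here is hypothetical):

```ruby
# Hypothetical handler for TUIO cursor messages relayed from the DLL.
# Assumes each message arrives as a string "cursor <session_id> <x> <y>"
# with x and y as normalized TUIO coordinates in [0, 1].
def handle_tuio_message(msg)
  kind, id, x, y = msg.split
  return nil unless kind == "cursor"
  { id: id.to_i, x: x.to_f, y: y.to_f }
end

# In the Ruby Console this would echo each relayed cursor:
cursor = handle_tuio_message("cursor 3 0.25 0.75")
puts "cursor #{cursor[:id]} at (#{cursor[:x]}, #{cursor[:y]})"
# prints: cursor 3 at (0.25, 0.75)
```

From here the parsed coordinates could be mapped into model space with the regular SketchUp Ruby API calls, which is the "remaining work" mentioned above.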
