Hard, hard user interfaces.

I started out writing ActionScript in Macromedia Flash and got into Human-Computer Interaction after watching Jeff Han's multi-touch work. That's the power of a really cool demo: it inspired the birth of the NUIGroup community, which spawned thousands of makers all around the world building their own projector-camera multi-touch systems. All of it became mainstream once Apple released the iPhone to the world, and it also gave birth to my interest in human-centered engineering.

I spent some ten years after that playing with sensors, haptics, and gestural interfaces across multiple input modalities, but nothing stuck the way multi-touch did, thanks to Apple's execution. Everyone's default phone/tablet interaction is multi-touch (unless you have a visual impairment). The Wiimote, PrimeSense (whose technology evolved into the Kinect), and the Leap Motion all came along, but nothing stuck. For speech, it was Amazon's Alexa; the latest news is that it's going to lose Amazon $10B. Like I am writing this post through my keyboard a...

Updates

Got back home after 5 days, probably a long gap, and resumed the GSoC work now:

1. The TUIO client now appends cursor coordinates (x_pos and y_pos) to a text file, e.g. 0.535937488079071, 0.410416662693024.
2. I am writing a Ruby SketchUp plugin that reads the TUIO cursor information from that text file.
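A minimal sketch of this file-based bridge might look like the following. The file name and the comma-separated line format are assumptions for illustration, not the actual GSoC code:

```ruby
# Assumed file name and "x,y" line format -- purely illustrative.
CURSOR_FILE = "tuio_cursors.txt"

# Writer side: the TUIO client appends one line per cursor update.
def append_cursor(x_pos, y_pos)
  File.open(CURSOR_FILE, "a") { |f| f.puts("#{x_pos},#{y_pos}") }
end

# Reader side: the SketchUp plugin polls the file and parses each line
# back into normalized coordinates in the [0.0, 1.0] range.
def read_cursors
  return [] unless File.exist?(CURSOR_FILE)
  File.readlines(CURSOR_FILE).map do |line|
    x, y = line.strip.split(",").map(&:to_f)
    [x, y]
  end
end

append_cursor(0.535937488079071, 0.410416662693024)
p read_cursors.last  # => [0.535937488079071, 0.410416662693024]
```

The serialization here is exactly the by-hand kind mentioned below: every value has to be formatted on write and parsed on read.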

The downsides to this, after discussing with Jay Morrisson:
1. It's subject to slowdown from writing to/reading from disk.
2. The serialization has to be done by oneself.

After this works, I'll consider one more option: dRb (Distributed Ruby, part of Ruby's standard library), which handles serialization and transport for us, so we can just focus on the SketchUp integration.
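As a sketch of what the dRb route could look like, the snippet below exposes a cursor object over a local socket; the port number and the `CursorService` API are my assumptions, not the eventual design. dRb marshals the Ruby objects itself, removing both downsides of the file approach:

```ruby
require "drb/drb"

DRB_URI = "druby://localhost:8787"  # port is an arbitrary assumption

# Server side (would live in the TUIO client process): holds the
# latest cursor position as a plain Ruby object.
class CursorService
  attr_reader :last
  def update(x_pos, y_pos)
    @last = [x_pos, y_pos]
  end
end

DRb.start_service(DRB_URI, CursorService.new)

# Client side (would live in the SketchUp plugin): calls methods on
# the remote object as if it were local; dRb does the serialization.
remote = DRbObject.new_with_uri(DRB_URI)
remote.update(0.5359, 0.4104)
p remote.last  # => [0.5359, 0.4104]
```

Both halves are shown in one process here only to keep the sketch runnable; in practice the TUIO client and SketchUp would each run one side.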
