Hard hard user interfaces.

I started out writing ActionScript in Macromedia Flash and got into Human-Computer Interaction after watching Jeff Han's multi-touch work. That's the power of a really good demo: it inspired the birth of the NUIGroup community, which spawned thousands of makers around the world building their own projector-camera multi-touch systems. All of it became mainstream once Apple released the iPhone, and it also sparked my interest in human-centered engineering. I spent some ten years after that playing with sensors, haptics, and gestural interfaces across multiple input modalities, but nothing stuck the way multi-touch did, thanks to Apple's execution: multi-touch is now the default interaction on everyone's phone and tablet (unless you have a visual impairment). PrimeSense evolved into the Kinect, then came the Wiimote and the Leap, but nothing stuck. For speech it was Amazon's Alexa, and the latest news is that it's going to lose Amazon $10B. And here I am, writing this post through my keyboard a...

Approaching Final evals

The "pencils down" date for GSoC is approaching fast. This is the core of the work during this time.

1# Code cleanup. I decided to keep the previous Ruby TUIO client experiments (pre-midterm GSoC work) safe in the repository; they might be useful in future development, where better and more refined gestures could be incorporated.
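For context on what those TUIO experiments involve: TUIO messages travel over OSC (usually UDP port 3333), so any client ultimately decodes OSC's binary framing. Below is a minimal, self-contained sketch of that decoding step in Ruby — a hypothetical helper for illustration, not the project's actual client code, and it handles only the `s`/`i`/`f` argument types that TUIO cursor messages use.

```ruby
# Read a null-terminated OSC string starting at `offset`.
# OSC pads every string with nulls out to a 4-byte boundary.
def read_osc_string(data, offset)
  nul = data.index("\x00", offset)
  str = data[offset...nul]
  [str, offset + ((str.bytesize / 4) + 1) * 4]
end

# Decode one OSC message (address pattern, type tags, arguments).
# Only the 's' (string), 'i' (int32), and 'f' (float32) tags are
# handled here; OSC integers and floats are big-endian.
def parse_osc_message(data)
  address, offset = read_osc_string(data, 0)
  tags, offset = read_osc_string(data, offset)
  args = []
  tags[1..-1].each_char do |tag|   # skip the leading ','
    case tag
    when 'i'
      args << data[offset, 4].unpack1('l>')
      offset += 4
    when 'f'
      args << data[offset, 4].unpack1('g')
      offset += 4
    when 's'
      s, offset = read_osc_string(data, offset)
      args << s
    end
  end
  [address, args]
end
```

Feeding it the bytes of a `/tuio/2Dcur` message with a `,sif` tag string yields the address plus the decoded `["set", session_id, x]`-style argument list; a real client would then dispatch on the first string argument ("set", "alive", "fseq").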

2# Code commenting wherever necessary

3# Documentation skeleton done:

3.a) Compile HowTos
3.b) READMEs
3.c) Changelog
3.d) Etc.
