Hard, hard user interfaces.

Having started out writing ActionScript in Macromedia Flash, I got into Human-Computer Interaction after watching Jeff Han's multi-touch work. That is the power of a really good demo: it inspired the birth of the NUIGroup community, which in turn spawned thousands of makers around the world building their own projector-camera multitouch systems. It all became mainstream once Apple released the iPhone, and it also gave birth to my interest in human-centered engineering.

I spent some ten years after that playing with sensors, haptics, and gestural interfaces across multiple input modalities, but nothing stuck the way multi-touch did, thanks to Apple's execution. Everyone's phone and tablet interaction now defaults to multitouch (unless you have a visual impairment). The Wiimote, PrimeSense (whose depth-sensing technology went into the Kinect), and the Leap Motion all came along, but none of them stuck. For speech, it was Amazon's Alexa; the latest news is that it is reportedly set to lose Amazon $10B. Even now, I am writing this post through my keyboard a...

Future work

Post midterm tasks

1. We were working with the mouse_event function (http://msdn.microsoft.com/en-us/library/ms646260(VS.85).aspx) to cover the events that the MultitouchSU DLL can inject into the SketchUp window simply by specifying {LBUTTON UP|DOWN} {MBUTTON UP|DOWN} {RBUTTON UP|DOWN} {MOUSEMOVE x,y}.

This is admittedly pretty raw, but I think it is good enough for a first version. We can get more sophisticated later with something like "Click(Left|Middle|Right count)" and "MouseMove( 0|Left|Middle|Right, x, y, rate)", etc.
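To make the token scheme concrete, here is a minimal sketch (in Ruby, purely illustrative; the real injection happens in the C++ DLL) that maps the raw macro tokens above to the dwFlags values the Win32 mouse_event function expects. The flag constants are the real WinUser.h values; the parser function and token spellings are my own assumption about how the strings would be normalized.

```ruby
# Win32 MOUSEEVENTF_* flag values from WinUser.h.
MOUSEEVENTF = {
  "LBUTTONDOWN" => 0x0002, # MOUSEEVENTF_LEFTDOWN
  "LBUTTONUP"   => 0x0004, # MOUSEEVENTF_LEFTUP
  "RBUTTONDOWN" => 0x0008, # MOUSEEVENTF_RIGHTDOWN
  "RBUTTONUP"   => 0x0010, # MOUSEEVENTF_RIGHTUP
  "MBUTTONDOWN" => 0x0020, # MOUSEEVENTF_MIDDLEDOWN
  "MBUTTONUP"   => 0x0040, # MOUSEEVENTF_MIDDLEUP
}
MOUSEEVENTF_MOVE = 0x0001

# Hypothetical helper: parse a token like "LBUTTONDOWN" or
# "MOUSEMOVE 100,200" into the [flags, dx, dy] triple that
# mouse_event(dwFlags, dx, dy, ...) would be called with.
def parse_macro_token(token)
  if token =~ /\AMOUSEMOVE\s+(\d+)\s*,\s*(\d+)\z/
    [MOUSEEVENTF_MOVE, $1.to_i, $2.to_i]
  elsif (flag = MOUSEEVENTF[token])
    [flag, 0, 0]
  else
    raise ArgumentError, "unknown macro token: #{token}"
  end
end
```

A whole click would then just be the two-token sequence LBUTTONDOWN, LBUTTONUP, which is what makes the "Click(...)" shorthand above attractive later.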

This is an example of how it could be done (http://code.google.com/p/eventrelay/source/browse/trunk/devel/MSWrunmacro.cpp?r=113), in the function void MSW_ProcessMacro::InjectNamedMouseMacro(string& namedMacroStr).
A MOVEMOUSETO(x,y) function would have to be added.
For now, following a discussion with my mentor, we will not go with this strategy. The alternative is to use require in the DLL loader script to invoke a gesture-recognizer script.
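A rough sketch of what that recognizer script might expose (class and method names are my assumptions, not existing project code): the loader requires it, SketchUp-side code registers callbacks per gesture, and the loader dispatches into it instead of injecting raw mouse events.

```ruby
# Hypothetical gesture-recognizer module that the DLL loader script
# would pull in with `require`, e.g.:
#   require 'multitouchsu/gesture_recognizer'
class GestureRecognizer
  def initialize
    @handlers = {}
  end

  # Register a callback for a named gesture, e.g. :pinch or :pan.
  def on(gesture, &block)
    @handlers[gesture] = block
  end

  # The loader calls this for each gesture recognized from TUIO input;
  # returns the handler's result, or nil if no handler is registered.
  def dispatch(gesture, *args)
    handler = @handlers[gesture]
    handler ? handler.call(*args) : nil
  end
end
```

The point of the indirection is that the DLL stays dumb (it only forwards events), while all gesture policy lives in reloadable Ruby.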
I have some exams to catch up on, so the coding work will take place after the 13th.


A TUIO gesture recognition module based on the Ruby API will sit between the current DLL and SketchUp's UI. At this stage, before we try writing the functional code, a medium-level view has to be taken of SketchUp's Camera classes, Geometry classes, and Entity class.
e.g. Entity.entityID: the entityID method retrieves a unique ID assigned to an entity. The TUIO events SketchUp receives will then be interpreted as gestures.
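As one example of that interpretation step, here is a minimal sketch (my own illustration, not SketchUp API code) of turning two received TUIO cursors into a pinch gesture: compare the finger distance between two frames, and the ratio becomes a zoom factor to feed into the Camera classes (> 1 zoom in, < 1 zoom out).

```ruby
# Euclidean distance between two [x, y] points.
def distance(a, b)
  Math.sqrt((a[0] - b[0])**2 + (a[1] - b[1])**2)
end

# prev and curr are each [[x1, y1], [x2, y2]] finger positions in
# normalized TUIO coordinates (0.0..1.0). Returns the zoom factor
# implied by the change in finger separation between the two frames.
def pinch_zoom_factor(prev, curr)
  d0 = distance(prev[0], prev[1])
  d1 = distance(curr[0], curr[1])
  return 1.0 if d0.zero? # degenerate: fingers started coincident
  d1 / d0
end
```

Pan and rotate fall out the same way from the centroid and the angle of the two-cursor segment.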
Other SketchUp classes I am looking at:
1. http://code.google.com/apis/sketchup/docs/ourdoc/inputpoint.html
2. The nextFrame method of the Animation interface.
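For the Animation interface, SketchUp calls nextFrame(view) once per frame; returning true continues the animation, false stops it, and View#show_frame flushes the frame. The sketch below shows how queued gesture output could drive it; GestureAnimation, apply_zoom, and the step queue are my assumptions, and the view is duck-typed so the logic runs outside SketchUp too.

```ruby
# Hypothetical animation that consumes a queue of camera steps
# (e.g. zoom factors produced by a gesture recognizer).
class GestureAnimation
  def initialize(steps)
    @steps = steps
  end

  # SketchUp's Animation contract: called once per frame;
  # return true to keep animating, false to stop.
  def nextFrame(view)
    step = @steps.shift
    return false unless step # queue empty: stop the animation
    view.apply_zoom(step)    # stand-in for real camera manipulation
    view.show_frame          # real Sketchup::View method: flush the frame
    true
  end
end
```

Inside SketchUp it would be installed with Sketchup.active_model.active_view.animation = GestureAnimation.new(steps).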
