

Showing posts from July, 2009

ConceptS: Our new Multitouch Wall

It's been quite a while since we last showed the work we did at IIT Delhi's Design Lab in the pre-summer-holidays time. Here's a sneak peek at the multitouch wall built by the Sparsh team during their stay at IIT Delhi. Hail NUI :) The upcoming video will show some custom applications with object marking.

I am trying to:

- Make a toolbar and user interface for the plugin
- Figure out how the overlay circular menu will work in SketchUp. This could pop open after someone makes a NUI swish (~) gesture.
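As a rough sketch of what the toolbar half of this could look like in the SketchUp Ruby API (the command names, labels, and `send_action` targets here are placeholders, not the plugin's final design):

```ruby
# Sketch of a minimal multitouch toolbar for the plugin.
# Assumes it runs inside SketchUp's embedded Ruby; the registration
# call is guarded so the file also loads outside SketchUp.
def register_mt_toolbar
  toolbar = UI::Toolbar.new("Multitouch SU")

  camera_cmd = UI::Command.new("MT Camera") do
    # Button1: switch into multitouch camera mode (Zoom/Pan/Orbit).
    Sketchup.send_action("selectOrbitTool:")
  end
  camera_cmd.tooltip = "Multitouch camera mode"

  draw_cmd = UI::Command.new("MT Draw") do
    # Button2: fall back to single-touch (mouse-emulated) drawing.
    Sketchup.send_action("selectLineTool:")
  end
  draw_cmd.tooltip = "Multitouch drawing mode"

  toolbar.add_item(camera_cmd)
  toolbar.add_item(draw_cmd)
  toolbar.show
end

register_mt_toolbar if defined?(Sketchup)
```

The overlay circular menu would hang off a gesture callback rather than a toolbar button, so it isn't shown here.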

Gestures Plan

My thoughts were triggered after I read the Bill Buxton argument. We must focus on streamlining the interaction: keep the number of gestures to a minimum, using them only where they're required and naturally usable. That gives a simpler and better SketchUp design and modeling experience. This implies that we should use multitouch for only one thing: camera motions in SketchUp, i.e. Zoom, Pan and Orbit.

There are two modes I am thinking about:

MODE 1: Camera, initialized using Button1 from a new SketchUp MT toolbar
1. Zoom: drag two fingers/MT cursors apart
2. Orbit: keep two fingers constant and move a third vertically to the axis of the two ( X X O ^ )
3. Pan: drag in any direction with a single finger ( O-> )

MODE 2: Drawing, initialized using Button2 from the SketchUp MT toolbar
- SketchUp's drawing interface is meant to be driven with single touch (the mouse), so we plan not to mess around with that
- Therefore we're drawing with the help of mouse emulation.
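The MODE 1 mapping above can be sketched as a toy classifier over per-frame cursor positions. This is plain Ruby with made-up names and thresholds (the real TUIO wiring lives elsewhere), just to make the finger-count rules concrete: one cursor means Pan, two cursors whose separation changes means Zoom, three cursors means Orbit.

```ruby
# Toy classifier for the camera-mode gestures: cursors are [x, y]
# pairs in normalized coordinates, prev/curr are consecutive frames.
def classify_gesture(prev, curr)
  return :none if curr.empty? || prev.size != curr.size
  case curr.size
  when 1 then :pan                       # single finger drag
  when 2                                 # two fingers: zoom if spreading
    d0 = distance(prev[0], prev[1])
    d1 = distance(curr[0], curr[1])
    (d1 - d0).abs > 0.01 ? :zoom : :none # 0.01 is an arbitrary threshold
  when 3 then :orbit                     # two anchors plus a moving finger
  else :none
  end
end

def distance(a, b)
  Math.hypot(a[0] - b[0], a[1] - b[1])
end
```

A real version would also check that two of the three Orbit fingers stay put, and smooth over several frames.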

A new inspiration

Just happened to see this video off Richard Monsol Haefel's blog. The way this concept model has been done is awesome; at least it shows what the future has in store for CAD modelers. I'm glad to see someone in the CAD world doing something innovative. I must admit that the guys doing the modeling are creative and all set to change the scenario :-) I've some ideas running around my head now, and I feel re-fuelled with energy. The GSoC project we're working on is also something similar. Let's take a step ahead! I am yet to think about how I'll go about implementing the bimanual gesture that'll support modeling using PEN + HAND + Fingers. I just wish Google SketchUp's API was a little more developer friendly...

Updates: Post Midterm

Oh, the EventRelay was so wonderful, we just knew :-) On the 24th and 25th of July I was busy with my Google Summer of Code, doing:

1. The wiki page, making information about the project readily accessible
2. Some code cleanup
3. Adding protected bins to the repository, so that mentors could try out each
4. Change-log maintenance

Apart from this, I am currently working out how gestures will work. A separate blog post should explain how our team is going to go about it. :-) Add to this, I have a deadly series of exams coming up a few hours from now!! Right after this post I've an Operating Systems test to write. Studying deadlocks, making farras, FIFO, SSTF etc. :-/ hehe. Why is our desi education system so much about cramming (commit to memory and vomit on paper)?

Post Midterms Update

We've now been able to trick SketchUp into treating TUIO data as SketchUp data, by posting the TUIO data as a SketchUp "User" message type. It is then in sync with SketchUp events rather than arriving as asynchronous TUIO events. This way we can sub-class SketchUp (like EventRelay) and pass the data to SketchUp's embedded Ruby as a synchronous event. Here's the result of the DLL "puts"-ing cursors successfully to SketchUp's embedded Ruby Console via a Ruby API puts statement. The implication is that we can collect all the TUIO messages and use the SketchUp Ruby API for the remaining work, as suggested by our mentor.
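To illustrate the Ruby side of this, here is a small sketch of unpacking a TUIO 1.1 /tuio/2Dcur "set" payload (session id, x, y, x-velocity, y-velocity, acceleration, per the TUIO 1.1 spec) the way the embedded Ruby could consume what the DLL forwards. The method names and the payload shape are assumptions for illustration, not our actual code:

```ruby
# A cursor as described by a TUIO 1.1 /tuio/2Dcur "set" message.
Cursor = Struct.new(:session_id, :x, :y, :vx, :vy, :accel)

def parse_2dcur_set(args)
  # args is the argument list after "set",
  # e.g. [12, 0.5, 0.25, 0.0, 0.0, 0.0]
  sid, x, y, vx, vy, a = args
  Cursor.new(sid.to_i, x.to_f, y.to_f, vx.to_f, vy.to_f, a.to_f)
end

def on_user_message(payload)
  # payload mimics what the DLL forwards as a "User" message
  cmd, *rest = payload
  return unless cmd == "set"
  cursor = parse_2dcur_set(rest)
  puts "cursor #{cursor.session_id} at (#{cursor.x}, #{cursor.y})"
  cursor
end
```

Because the DLL re-posts these as SketchUp messages, a handler like `on_user_message` runs synchronously with the rest of the event loop.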

Pre Mids Update

1. Added a new blog post (the last one), regarding my gesture implementation thoughts.
2. The repo had gone erroneous due to cleanup, and I faced some issues while using the Merge command. Things are fine now, fixed by Pawel. The intended commit message was: "Remove non-required components that came while freezing required filesets. Code commenting and cleaning updations. Adding a Readme for current status and project details."

That's the research and work I did yesterday; I've got an exam later today.

Future work

Post-midterm tasks:

1. We were working on the mouse_event function ( us/library/ms646260(VS.85).aspx ) for the possible events that the MultitouchSU DLL can inject into the SketchUp windows, simply by specifying {LBUTTON UP|DOWN} {MBUTTON UP|DOWN} {RBUTTON UP|DOWN} {MOUSEMOVE x,y}. This is admittedly pretty raw, but I thought it good enough for the first version. We could get more sophisticated later with something like "Click(Left|Middle|Right, count)" and "MouseMove(0|Left|Middle|Right, x, y, rate)" etc. Here is an example of how it would be done ( eventrelay/source/browse/trunk/devel/MSWrunmacro.cpp?r=113 ), in the function void MSW_ProcessMacro::InjectNamedMouseMacro(string& namedMacroStr). A MOVEMOUSETO(x,y) function will have to be added. For now, per discussion with the mentor, we'll not go with this strategy. Another method is using require in the DLL loader script to invoke a gesture r
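To make the macro idea concrete, here is a hypothetical Ruby helper that turns a recognized drag into the kind of token string an InjectNamedMouseMacro-style function could consume. The token names follow the post's {LBUTTON UP|DOWN} / {MOUSEMOVE x,y} sketch; this is not a finalized format:

```ruby
# Build a click-and-drag macro string out of the raw tokens described
# in the post: move to the start point, press, move, release.
def click_drag_macro(x0, y0, x1, y1)
  ["{MOUSEMOVE #{x0},#{y0}}",
   "{LBUTTONDOWN}",
   "{MOUSEMOVE #{x1},#{y1}}",
   "{LBUTTONUP}"].join
end
```

The DLL side would then tokenize this string and replay it with mouse_event calls, which is exactly the "raw but good enough" part we decided to defer.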


EventRelay is the SketchUp plugin (which embeds a DLL within); DLLs are not a direct and conventional method to extend SketchUp. The following now works:

1. The TUIO messages are being received by the Debug window within SketchUp when we run the TUIO simulator (see screenshot).
2. The gestures are still to be implemented. I'm still trying to find out the best strategy for this: whether to emulate the mouse, or to go with macros.

Special thanks to Pecan and his plugin EventRelay, which really proved to be the way to implement the DLL within.