Posts

Showing posts from June, 2019

Hard, hard user interfaces.

Having started out writing ActionScript in Macromedia Flash, I got into Human-Computer Interaction after watching Jeff Han's multi-touch work. That's the power of a really cool demo: it inspired the birth of the NUIGroup community, which spawned thousands of makers all around the world building their own projector-camera multi-touch systems. All of it became mainstream once Apple released the iPhone to the world, and it also gave birth to my interest in human-centered engineering. I spent some ten years after that playing with sensors, haptics, and gestural interfaces across multiple input modalities, but nothing stuck as much as multi-touch did, thanks to Apple's execution. Everyone's phone/tablet interaction defaults to multi-touch (unless you have a visual impairment). PrimeSense evolved into the Kinect, then came the Wiimote and the Leap Motion, but nothing stuck. For speech, it was Amazon's Alexa; the latest news is that it's going to lose Amazon $10B. As I am writing this post through my keyboard a...

Project Amphibia

Project Amphibia. Keywords: carbon linings, electronics, deep water, cell-laden hydrogels, directly printed wearable systems. I'm leaving a set of these images here as inspiration, and as a diary entry on adaptive and embodied interfaces. Most research in Organic User Interfaces has centered on tattooing on the skin, which is still in its infancy due to the physical limitations of electronic components. Using carbon layering under special materials, we could possibly fabricate electro-actuated geometries with various sensorial and conductive properties. Possible trials: stretchable micro-tectonic sensors fused with Amphibia, with a full use case.