Note: Going through old drafts, I found I had never posted this one. Oops!
Here’s a taste of the papers and demos I found intriguing – these are gently-edited versions of my live notes, in places, hence the terseness. For access to the papers, demo info, and official photos, you’ll probably have to go through IEEE – those registered for the conference got a USB copy of the proceedings.
Oral Session 1
- 3D Force Prediction Using Fingernail Imaging with Automated Calibration, Grieve et al. – this one seemed interesting to me primarily because it is closely related to the HCI 575 (Computational Perception) course I am currently taking. It seems like it could have come out of a similar course project – and there are clear ways to build on this work: making it less sensitive to lighting and position, and getting it to run in real time.
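The core idea – predicting fingertip force from how the fingernail’s appearance changes under load – can be sketched as a calibrated regression problem. The code below is purely illustrative, not the authors’ pipeline: it assumes image features have already been extracted and fits a simple linear least-squares map from features to a 3D force vector, calibrated against a force sensor.

```python
import numpy as np

def calibrate(features, forces):
    """Fit W so that features @ W approximates forces (least squares).

    features: (n_samples, n_features) image-derived feature vectors
    forces:   (n_samples, 3) ground-truth forces from a force sensor
    """
    W, *_ = np.linalg.lstsq(features, forces, rcond=None)
    return W

def predict(W, feature_vec):
    """Predict a 3D force from one feature vector."""
    return feature_vec @ W

# Synthetic calibration data: a known linear relation (no noise),
# standing in for real fingernail-image features and sensor readings.
rng = np.random.default_rng(0)
true_W = rng.normal(size=(16, 3))
X = rng.normal(size=(200, 16))
F = X @ true_W

W = calibrate(X, F)
max_err = np.abs(predict(W, X[0]) - F[0]).max()
```

A real system would of course need robust features (the paper’s sensitivity to lighting and position lives in that step), but the calibration stage is conceptually this small.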
- Design of a Vibrotactile Display via a Rigid Surface, McMahan et al. – This is the paper on the “haptic floor tiles” I mentioned earlier as a teaser. They generate high-frequency haptic content when you step on one of the specially designed floor tiles – made out of aircraft panels, with load cells for force measurement and voice coils for rendering. This lets them better simulate walking on natural, structured surfaces (crunchy leaves, snow, etc). See the demos, below, for photos of me walking on these. They have installed them as a CAVE floor, which seems like a great idea – simulate walking on thin ice, and help train users to avoid falling through. Can we get some for the MIRAGE?
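The load-cell-in, voice-coil-out loop suggests an event-based rendering scheme: detect a sharp rise in measured force (a footstep) and play a short high-frequency transient scaled by it. The sketch below is my own guess at the flavor of such a controller – all parameter values and the transient shape are invented for illustration, not taken from the paper.

```python
import numpy as np

FS = 2000        # assumed actuator sample rate (Hz)
F_VIB = 150.0    # assumed vibration frequency (Hz)
DECAY = 30.0     # assumed exponential decay rate (1/s)

def transient(delta_force, duration=0.05):
    """A decaying sinusoid whose amplitude scales with the force step."""
    t = np.arange(0, duration, 1.0 / FS)
    return delta_force * np.exp(-DECAY * t) * np.sin(2 * np.pi * F_VIB * t)

def render(force_samples, threshold=5.0):
    """Emit one transient each time force rises faster than threshold.

    force_samples: successive load-cell readings (N), one per tick.
    Returns a list of waveforms to send to the voice coil.
    """
    bursts = []
    for prev, cur in zip(force_samples, force_samples[1:]):
        if cur - prev > threshold:
            bursts.append(transient(cur - prev))
    return bursts

# One simulated heel strike: force jumps from ~0 N to ~40 N.
bursts = render([0.0, 0.5, 40.0, 42.0, 41.0])
```

Varying the transient’s frequency content per tile would be one way to get the “crunchy leaves vs. snow” distinction the authors demonstrate.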
- Emulating Human Attention-Getting Practices with Wearable Haptics, Baumann et al. – A servo-controlled wrist squeezer as a more natural, less intrusive way to get your attention. This research team is very inventive, producing lots of low-cost prototype haptic devices for wearable, expressive feedback. Not sure how we can connect their work to virtual assembly, but they’ve got some great ideas.