Monthly Archives: March 2010

IEEE Haptics – Papers and Demos

Note: Going through old drafts, I found I had never posted this one.  Oops!

Here’s a taste of the papers and demos I found intriguing – these are gently-edited versions of my live notes, hence the terseness in places. For access to the papers, demo info, and official photos, you’ll probably have to go through IEEE – those registered for the conference got a USB copy of the proceedings.

Papers

Oral Session 1

  • 3D Force Prediction Using Fingernail Imaging with Automated Calibration, Grieve et al. – this one seemed interesting to me, primarily because it is closely related to the HCI 575 (Computational Perception) course that I am currently taking.  It seems like it could have come out of a similar course project, and there are clear ways to build on this work to make it less sensitive to lighting and position and to allow it to run in real time.
  • Design of a Vibrotactile Display via a Rigid Surface, McMahan et al. – This is the paper on the “haptic floor tiles” I mentioned earlier as a teaser.  They generate high-frequency haptic content when you step on one of the specially designed floor tiles – built from aircraft panels, load cells for force measurement, and voice coils for rendering. This lets them better simulate walking on natural, structured surfaces (crunchy leaves, snow, etc.). See the demos, below, for photos of me walking on these.  They installed them as a CAVE floor, which seems like a great idea – simulate walking on thin ice, and help train users to avoid falling through. Can we get some for the MIRAGE?
  • Emulating Human Attention-Getting Practices with Wearable Haptics, Baumann et al. – A servo-controlled wrist squeezer as a more natural, less intrusive way to get your attention.  This research team is very inventive, producing lots of low-cost prototype haptic devices for wearable, expressive feedback.  Not sure how we can connect their work to virtual assembly, but they’ve got some great ideas.

IEEE Haptics – General impressions, Vendor demo highlights

Alright, well, with the combination of data corruption, poor battery life, poor usability (seriously, I have to transfer each photo off of my camera one at a time?), and an intense single-track conference, the well-intentioned “liveblogging” idea didn’t really come to pass. Nevertheless, I’ll share my general notes, vendor demo impressions, and photos with you here – summaries of the papers and demos will follow in the next post.

General impressions

  • Medical and dental applications of haptics seem to far outnumber immersive engineering applications. I don’t believe I saw a single CAD model during the whole conference. My hunch is that this is because spare parts for practice or testing are more feasible than spare people.
  • It seems I would be well-served to read a textbook or take a course in control theory: it turns out that not only is most of the technical side of haptics based on it, but I’ve also been tiptoeing around it in my own work for a while.  Anyone have good suggestions on introductory texts (preferably written with a computer science perspective), now that I’ve finished Wikipedia’s content on the subject?
  • HAPI and H3D (open-source) from SenseGraphics seem to be the closest thing to a universal haptics software API out there at the moment, so I’ll definitely be looking into them.
  • There are a lot of neat techniques that caught my eye – it will be difficult to narrow my own research scope enough to have a doable Ph.D.
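As a footnote on that control-theory point: the canonical introductory example in haptic rendering is a virtual wall modeled as a spring-damper, updated in a fast servo loop. Here’s a minimal sketch in Python – the function name, gains, and numbers are all my own illustration, not from any conference material:

```python
def wall_force(x, v, k=500.0, b=2.0, wall=0.0):
    """Force the haptic device should exert when the proxy at position x
    (meters), moving with velocity v (m/s), penetrates a wall at x = wall.

    Hooke spring plus damping: F = -k * penetration - b * v.
    Gains k (N/m) and b (N*s/m) are made-up example values."""
    penetration = x - wall
    if penetration >= 0.0:
        return 0.0  # not touching the wall: render no force
    return -k * penetration - b * v

# A servo loop (commonly ~1 kHz in haptics) would call this every tick:
print(wall_force(-0.01, 0.0))  # 1 cm into the wall at rest -> 5.0 N outward
```

Higher k feels stiffer, but push it too far for a given update rate and the loop goes unstable and the device buzzes – which, as I understand it, is exactly the kind of trade-off control theory makes precise.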

From the vendors…

  • SenseGraphics was showing off a haptic application written with their open source haptics scenegraph H3D: the demo was written in Python and was running on an LCD-based stereo coincident workspace display.
    LCD-based stereo haptic coincident workspace from SenseGraphics

    A bit of an improvement over the CRT-based ReachIn systems, though apparently the conference exhibit room kept the nVidia glasses from behaving nicely.

    SenseGraphics demo, from a user's perspective

  • Another exhibit from SenseGraphics – some kind of training app I didn’t get to try out.

    SenseGraphics H3D app - note the icons from the Tango Project

  • Sensable brought its display: one of many dental applications, an improved version of the Phantom Desktop, and a demo using a Phantom Omni and a 1.5 for teleoperation, showing different control algorithms.
    Sensable's Haptic bite articulation demo

    One of the many dental (and medical) applications on display - Sensable

    New Phantoms - variant of Phantom Desktop with versatile mounting

    Sensable's "next big thing" - Phantom Desktops with screw holes for versatile mounting and an external control unit

  • Moog Inc. (not to be confused with the synthesizer company Moog Music) showed off their extremely stiff HapticMaster haptic device. They said that their force-control style is the difference from the Sensable devices that lets them simulate such stiffness. It felt pretty good, though the demo system’s lack of manipulator rotation dimmed the glow.

    The wonderfully-stiff Moog HapticMaster device
  • I tried out the ButterflyHaptics device. While it might be useful for precision medical simulations, its extremely limited working volume and rotational freedom (almost none – and it gets rather upset if you exceed it) seem to make it less applicable for haptic assembly applications.

    ButterflyHaptics maglev device

    ButterflyHaptics maglev device - sensitive, light, but restricted: promising with some polish

Greetings from the east coast!

I’m here in Waltham, MA (almost but not quite Boston), for the IEEE Haptics Symposium, which is being held as the final sub-conference of IEEE VR 2010. I arrived last night, seemingly well-coordinated with the departure of the other ISU attendees of the conference. I’ll be updating this blog regularly during the two-day event, with brief notes and perhaps some photos, if I can get my cell phone to transfer photos without beeping loudly.

My first impressions are that the conference environment seems very welcoming.  The first presentations are on fingertip and tactile haptic systems, an area I’ve never really looked into, so it’s all new to me.  There’s quite a bit of technical interest in the physiology and psychophysics of interaction with the cutaneous receptors. Some of this research seems to involve slick new hardware that would be cool to have.  (Example: how about haptic floor tiles in the C4 or its replacement? Simulate the effects of walking on natural surfaces like gravel, snow, etc…)

It’s reassuring that the questions that come to my mind while listening to the presentations are generally either (a) technical/terminology details a quick Google can answer, or (b) also occurring to other, presumably more experienced conference attendees, who bravely stand up and actually ask questions after the presentations.

I’ve been to academic conferences before, and have even presented.  That said, this research area is relatively new to me, so I imagine that this experience will teach me quite a bit about what to do and what not to do in conference presentations in this field.  After the first few presentations, I already have a few notes.

I’m looking forward to the rest of the presentations, as well as the demos – there’s nothing like a haptics conference to literally get in touch with new developments!  I’m resisting the urge to pull out the USB proceedings and browse the papers from the rest of IEEE VR and 3DUI that interested me but that I wasn’t present for – I’ll take a look at those later.  I think I will probably soon have to come up with a better system for archiving papers than keeping them all on my Dropbox, or I will run out of space pretty quickly. I’ve got the CHI 2010 conference (and thus its proceedings) coming up in early April, WINVR2010 in May, and I imagine those won’t be my last conferences…

More later!