Alright, well, with the combination of data corruption, poor battery life, poor usability (seriously, I have to transfer each photo off of my camera one at a time?), and an intense single-track conference, the well-intentioned “liveblogging” idea didn’t really come to pass. Nevertheless, I’ll share my general notes, vendor demo impressions, and photos with you here – summaries of the papers and demos will follow in the next post.
- Medical and dental applications of haptics seem to far outnumber immersive engineering applications. I don’t believe I saw a single CAD model during the whole conference. My hunch is that this is because spare parts for practice or testing are more feasible than spare people.
- It seems I would be well-served to read a textbook or take a course in control theory: it turns out that not only is most of the technical side of haptics based on it, I’ve been tiptoeing around it in my own work for a while. Anyone have good suggestions on introductory texts (preferably written from a computer science perspective), now that I’ve finished Wikipedia’s content on the subject?
- HAPI and H3D (open-source) from SenseGraphics seem to be the closest thing to a universal haptics software API out there at the moment, so I’ll definitely be looking into them.
- There are a lot of neat techniques that caught my eye – it will be difficult trying to narrow my own research scope enough to have a doable Ph.D.
From the vendors…
- SenseGraphics was showing off a haptic application written with their open-source haptics scenegraph H3D: the demo was written in Python, and was running on an LCD-based stereo coincident workspace display.
A bit of an improvement over the CRT-based ReachIn systems, though apparently the conference exhibit room was not amenable to getting the nVidia glasses to behave nicely.
- Another exhibit from SenseGraphics – some kind of training app I didn’t get to try out.
SenseGraphics H3D app - note the icons from the Tango Project
- Sensable brought its display: one of many dental applications, an improved version of the Phantom Desktop, and a teleoperation demo using a Phantom Omni and a Phantom 1.5 to show different control algorithms.
One of the many dental (and medical) applications on display - Sensable
Sensable's "next big thing" - Phantom Desktops with screw holes for versatile mounting and an external control unit
- Moog Inc. (not to be confused with synthesizer company Moog Music) showed off their extremely stiff HapticMaster haptic device. They said that their force-control approach is what sets them apart from the Sensable devices and lets them simulate such stiffness. It felt pretty good, though the demo system’s lack of manipulator rotation dimmed the glow.
- I tried out the ButterflyHaptics device. While it might be useful for precision medical simulations, its extremely limited working volume and rotational freedom (almost none – and it gets rather upset if you exceed it) seem to make it less applicable for haptic assembly applications.
ButterflyHaptics maglev device - sensitive, light, but restricted: promising with some polish
I’m here in Waltham, MA (almost but not quite Boston), for the IEEE Haptics Symposium, which is being held as the final sub-conference of IEEE VR 2010. I arrived last night, seemingly well-coordinated with the departure of the other ISU attendees of the conference. I’ll be updating this blog regularly during the two-day event, with brief notes and perhaps some photos, if I can get my cell phone to transfer photos without beeping loudly.
My first impressions are that the conference environment seems very welcoming. The first presentations are on fingertip and tactile haptic systems, an area I’ve never really looked into, so it’s all new to me. There’s quite a bit of technical interest in the physiological and psychophysical interaction and the cutaneous receptors. Some of this research seems to involve slick new hardware that would be cool to have. (Example: how about haptic floor tiles in the C4 or its replacement? Simulate the effects of walking on natural surfaces like gravel, snow, etc…)
It’s reassuring that the questions that come to my mind while listening to the presentations are generally either (a) technical/terminology details a quick Google can answer, or (b) also occurring to other, presumably more experienced conference attendees who bravely stand up and actually ask questions after the presentations.
I’ve been to academic conferences before, and have even presented. That said, this research area is relatively new to me, so I imagine that this experience will teach me quite a bit about what to do and what not to do in conference presentations in this field. After the first few presentations, I already have a few notes.
I’m looking forward to the rest of the presentations, as well as the demos – there’s nothing like a haptics conference to literally get in touch with new developments! I’m resisting the urge to pull out the USB proceedings and browse the papers from the rest of IEEE VR and 3DUI that interested me but that I wasn’t present for – I’ll take a look at those later. I think I will probably soon have to come up with a better system for archiving papers than keeping them all on my Dropbox, or I will run out of space pretty quickly. I’ve got the CHI 2010 conference (and thus its proceedings) coming up in early April, WINVR2010 in May, and I imagine those won’t be my last conferences…