Wednesday 20 February 2008

Current community discussions

I've reviewed the literature and have identified major themes that are relevant to my work:

Audio for Navigation
Researchers at Helsinki University of Technology have produced some interesting work on using audio in virtual environments. Their experiments suggest that audio is useful for locating objects at a distance, but is a poor substitute for vision at close range.

@article{1101558,
author = {Matti Gr\"{o}hn and Tapio Lokki and Tapio Takala},
title = {Comparison of auditory, visual, and audiovisual navigation in a 3D space},
journal = {ACM Trans. Appl. Percept.},
volume = {2},
number = {4},
year = {2005},
issn = {1544-3558},
pages = {564--570},
doi = {http://doi.acm.org/10.1145/1101530.1101558},
publisher = {ACM},
address = {New York, NY, USA},
}

@article{1058321,
author = {Tapio Lokki and Matti Gr\"{o}hn},
title = {Navigation with Auditory Cues in a Virtual Environment},
journal = {IEEE MultiMedia},
volume = {12},
number = {2},
year = {2005},
issn = {1070-986X},
pages = {80--86},
doi = {http://dx.doi.org/10.1109/MMUL.2005.33},
publisher = {IEEE Computer Society Press},
address = {Los Alamitos, CA, USA},
}

@misc{lokki00case,
author = {T. Lokki and M. Gr\"{o}hn and L. Savioja and T. Takala},
title = {A case study of auditory navigation in virtual acoustic environments},
note = {Proc. ICAD 2000, Atlanta, GA, Apr. 2000},
year = {2000},
url = {citeseer.ist.psu.edu/lokki00case.html},
}

@misc{hn-utilizing,
author = {Matti Gr\"{o}hn},
title = {Utilizing Audio in Immersive Visualization},
url = {citeseer.ist.psu.edu/452324.html},
}

@article{1101559,
author = {Matti Gr\"{o}hn and Tapio Lokki and Tapio Takala},
title = {Author's comments on Gr\"{o}hn, Lokki, and Takala, ICAD 2003},
journal = {ACM Trans. Appl. Percept.},
volume = {2},
number = {4},
year = {2005},
issn = {1544-3558},
pages = {571--573},
doi = {http://doi.acm.org/10.1145/1101530.1101559},
publisher = {ACM},
address = {New York, NY, USA},
}

Audio for real world navigation
There are a couple of applications that seem interesting, both concerned with maps of the real world. One approach presents 3D audio to the user of a GPS device as a guide to their exploration, a technique that would be easy to implement in SL. The other is the audible representation of a map that the user can explore in order to build up a mental model of a real-world location before visiting it. This too should be relatively easy to implement in SL.

@inproceedings{1182492,
author = {Wilko Heuten and Daniel Wichmann and Susanne Boll},
title = {Interactive 3D sonification for the exploration of city maps},
booktitle = {NordiCHI '06: Proceedings of the 4th Nordic conference on Human-computer interaction},
year = {2006},
isbn = {1-59593-325-5},
pages = {155--164},
location = {Oslo, Norway},
doi = {http://doi.acm.org/10.1145/1182475.1182492},
publisher = {ACM},
address = {New York, NY, USA},
}
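The 3D-audio guidance idea above is simple enough to sketch: given the listener's position and heading and a target waypoint, compute the bearing to the target and pan a guide sound accordingly. The following Python sketch is my own illustration, not taken from the cited papers; the compass convention, clamping behaviour and equal-power pan law are all assumptions.

```python
import math

def stereo_pan(listener_pos, listener_heading, target_pos):
    """Return (left_gain, right_gain) for a guide sound at target_pos.

    Positions are (x, y) tuples; listener_heading is in radians,
    0 = facing +y, increasing clockwise (compass convention).
    """
    dx = target_pos[0] - listener_pos[0]
    dy = target_pos[1] - listener_pos[1]
    bearing = math.atan2(dx, dy)               # compass bearing to target
    rel = bearing - listener_heading           # bearing relative to heading
    rel = (rel + math.pi) % (2 * math.pi) - math.pi   # normalise to [-pi, pi]
    # Map relative bearing to a pan position in [-1, 1]; anything beyond
    # +/-90 degrees is clamped hard left/right (plain stereo cannot
    # distinguish front from back).
    pan = max(-1.0, min(1.0, rel / (math.pi / 2)))
    angle = (pan + 1) * math.pi / 4            # 0 .. pi/2
    return math.cos(angle), math.sin(angle)    # equal-power pan law
```

A target straight ahead gives equal gains in both channels; as the listener turns away the sound migrates towards one ear, which is the cue the GPS-guidance work relies on.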


Real World GPS / Audio Maps
Games
Physical properties in the virtual world

Second Life Accessibility
A number of people are already discussing the issues and the following trends seem to come up:
Visual impairment.
Most people who are registered as legally blind retain some degree of sight, so the most common issues are less about navigating without sight and more about adapting the display to individual needs. This means being able to modify colours and contrast for colour-blind users, and to adjust magnification or screen resolution for low-vision users. These are simple tasks that could be accomplished with relatively little technical work; my research project takes on the larger and more theoretical task of enabling access without the use of vision at all.

Alternate clients
This theme largely concerns making SL accessible to screen readers, with SLeek the leading example. These projects expose only a small portion of SL, typically limiting the information conveyed to lists of local objects and avatars, a chat interface and a teleportation facility. This approach is the fastest way to get blind users onto Second Life, but gives them little more than a new instant messaging client.

GuideBot
In the real world it is not uncommon for a sighted person to guide the blind, especially in new environments. This theme takes the principle of assistance and tries to automate it by using software-controlled avatars (or "bots"). Josh Markwordt's project is one example, though I believe the complexity of SL, and the ability of users to search for and teleport to specific locations, will limit the usefulness of this approach.

Virtual Representation
Disabled people have a presence in SL already, but these are mostly people who have restricted mobility. Simon Walsh is the owner of Wheelies, a disability-themed nightclub. One of the issues that comes up is the representation of physical disabilities in the virtual world. Avatars can be customised to represent oneself in any way possible, and there is a trend amongst wheelchair users to incorporate representations of chairs in their virtual identities.

For a comprehensive review of the literature, see my other posts:
SLED Accessibility Threads
Disability in SL
Mailing List Fora
Accessibility Analysis


Haptics
Jeff VanDrimmelen, an Academic Computing Expert in the Office of Arts and Sciences Information Services, University of North Carolina at Chapel Hill publishes research on a site called Haptic Education - Adding the Tactile Sensation to Virtual Learning.


VanDrimmelen's team have focused on another virtual environment called Croquet, but have also considered Second Life, and they make some interesting observations:
The creators of Second Life actually started their project out with a large haptic device, but soon abandoned it for more financially appealing options.

In Second Life the only way to navigate with a mouse is to bring up an on-screen navigation menu and click it to move the avatar. It works well enough when the avatar is flying, but otherwise you just end up using the buttons on the handle to move around.
In Linden's default client movement is controlled with the keyboard, but in my own research I have recently been able to control walking and flying using a force-feedback joystick (a Logitech WingMan Strike Force 3D). This was made possible by a free third-party tool called GlovePIE, which VanDrimmelen's team also employed. The tool intercepts output from the joystick and injects the corresponding keyboard signals: moving the joystick left and right turns the Second Life avatar left and right, and moving it forward and backwards moves the avatar forward and back. VanDrimmelen's team use the same technique to make the Novint Falcon an input device for Croquet. This approach appears to offer a very quick and easy way to prototype haptics in Second Life. VanDrimmelen continues, however,
It should be noted that about the same time we found the GlovePIE software Novint announced they are working on drivers that will work with not only Second Life, but World of Warcraft as well.
Currently both of these drivers are "in exploration phase" with no estimated completion date. In their (busy!) release schedule Novint also describe another interesting product, "Feelin' It: Blind Games™":
Novint will release a number of games that can be played entirely without sight. For example, in a bowling game, you will be able to feel the extents of the lane, feel the weight of the ball as it is thrown, and hear the pins crash down. After throwing the ball and hitting the pins, the game will bring up a touchable representation of how the ball traveled down the lane to guide the user's muscle memory for future shots, and the user will be able to feel with a 3D cursor which pins are still standing. All the information needed to play the game and become a true master, will be available without any graphics.
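The GlovePIE technique described above amounts to thresholding joystick axes and emitting key presses. GlovePIE has its own scripting language, but the core mapping is simple enough to sketch in Python; the key names, axis conventions and dead-zone value here are illustrative assumptions, not GlovePIE's actual syntax.

```python
def axes_to_keys(x, y, dead_zone=0.2):
    """Map joystick axes to the arrow keys the Second Life client expects.

    x and y are in [-1, 1]; in this sketch positive y means "stick pushed
    forward" (real joysticks often invert that axis). The dead zone stops
    a slightly off-centre stick from generating spurious movement.
    """
    keys = set()
    if x > dead_zone:
        keys.add("Right")      # turn right
    elif x < -dead_zone:
        keys.add("Left")       # turn left
    if y > dead_zone:
        keys.add("Up")         # walk / fly forward
    elif y < -dead_zone:
        keys.add("Down")       # walk / fly backward
    return keys
```

A tool like GlovePIE effectively evaluates this mapping every frame, synthesising key-down and key-up events as the returned set changes between polls.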
Further haptic research in Second Life is being conducted by Maurizio de Pascale, Sara Mulatto and Domenico Prattichizzo of the Haptics Group at the Siena Robotics and Systems Lab, Dipartimento di Ingegneria Informatica, University of Siena. In particular they have a paper, "Bringing Haptics to Second Life: A Haptics-enabled Second Life Viewer for Blind Users", due for publication at the "Haptics in Ambient Systems" conference in Quebec City, Canada, on February 11-14, 2008.



Judging from the screenshot, I would imagine that the Siena team are not using the Novint Falcon but a different haptic device with a stylus, perhaps one of SensAble Technologies' PHANTOM range, which seems popular in academic research.

Another research project that is of interest as inspiration for our Second Life work is the Haptic Torch from the Interactive Systems Research Group at the University of Reading.


"The unique design of the torch allows users to range from sighted individuals in low-light conditions to people who are both deaf and blind. The torch provides a method of alerting users to presence of potentiol hazards using non-contact measurement techniques. An subtle tactile (touch) interface conveys relevent information to the user while not interfering with other senses." [sic]
Whereas the Haptic Torch can only signify the presence of objects, the Falcon could be used to reach out and feel their shape, and this immediate physical stimulus will assist the user's construction of a mental map of the virtual space.
