"In a room with large-screen walls, where animated virtual players hold different musical instruments, the visitor, wearing data gloves, conducts a musical performance, leading the tempo with one hand and, with the other,
directing aspects of the performance (a string crescendo, for example). The players show features of human behavior: they pay attention when the conductor begins, and they keep playing for a while if the conductor stops, but they soon lapse into playing nonsense. Over loudspeakers, the visitor can also experience the acoustics of the surrounding virtual concert hall. Alternative acoustic environments (open space, concert hall, church) and pieces in different musical styles can be selected from a menu.
Various techniques are used to produce this fully synthetic experience: rule-based agents for the players' behavior, neural networks for the response to the conductor, physical instrument modeling for sound synthesis, real-time reverberation simulation for the hall acoustics, and auralization filters for 3D sound."