I was commissioned to create a fully responsive, interactive 3D virtual tour of the AlloSphere, a three-story facility at the University of California, Santa Barbara. The AlloSphere uses multiple modalities to represent large and complex data, including immersive visualization, sonification, and interactivity.
My goal was to showcase the AlloSphere's capabilities through an engaging web experience, combining 3D graphics, interactive elements, and dynamic audio. I worked directly with the facility's founder, Dr. JoAnn Kuchera-Morin, to accurately represent its features.
Key Features and Technical Implementation
1. Immersive 3D Visualization of the AlloSphere
I began by creating a detailed 3D model of the AlloSphere in Blender. To bring the model to life in the browser, I used Three.js, a powerful JavaScript library for 3D graphics.
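In condensed form, the loading step looks something like the sketch below; the model path, colors, and shimmer math are illustrative placeholders rather than the production values, and the usual scene/camera/renderer setup is assumed.

```javascript
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Placeholder shimmer shader: brightness oscillates along the surface normal over time.
const shimmerMaterial = new THREE.ShaderMaterial({
  uniforms: { time: { value: 0 } },
  vertexShader: /* glsl */ `
    varying vec3 vNormal;
    void main() {
      vNormal = normalize(normalMatrix * normal);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform float time;
    varying vec3 vNormal;
    void main() {
      float shimmer = 0.5 + 0.5 * sin(time + vNormal.y * 10.0);
      gl_FragColor = vec4(vec3(0.2, 0.6, 0.9) * shimmer, 1.0);
    }
  `,
});

// Load the Blender export (path is a placeholder) and swap in the shimmer material.
new GLTFLoader().load('models/allosphere.glb', (gltf) => {
  gltf.scene.traverse((child) => {
    if (child.isMesh) child.material = shimmerMaterial;
  });
  scene.add(gltf.scene); // `scene` is the existing Three.js scene
});
```

Advancing shimmerMaterial.uniforms.time.value each frame in the render loop is what drives the animation.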
This implementation not only loads the 3D model but also applies a custom shader material to enhance its visual appearance. The shader creates a dynamic, shimmering effect that helps convey the high-tech nature of the AlloSphere.
2. Interactive 3D Sound Visualization
To showcase the AlloSphere's 54-speaker surround sound system, I created a dynamic visualization that demonstrates spatial audio capabilities. This involved creating a 3D light fixture that moves in space, with corresponding audio panning.
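Stripped to its essentials, the idea looks like this; audioCtx, speakerMeshes, and lightFixture stand in for objects created during setup, and the orbit parameters are placeholders.

```javascript
// The demo's sound source is routed through a PannerNode so its position can be spatialized.
const panner = audioCtx.createPanner();
panner.panningModel = 'HRTF';
panner.connect(audioCtx.destination);

const ORBIT_RADIUS = 5;

function updateSoundSource(time) {
  // Orbit the light fixture around the center of the sphere, with a gentle vertical drift.
  const angle = time * 0.0005;
  lightFixture.position.set(
    Math.cos(angle) * ORBIT_RADIUS,
    Math.sin(angle * 0.7) * 2,
    Math.sin(angle) * ORBIT_RADIUS
  );

  // Keep the audio panner in sync with the fixture's visual position.
  panner.positionX.value = lightFixture.position.x;
  panner.positionY.value = lightFixture.position.y;
  panner.positionZ.value = lightFixture.position.z;

  // Brighten speakers near the moving source and dim the distant ones.
  for (const speaker of speakerMeshes) {
    const distance = speaker.position.distanceTo(lightFixture.position);
    speaker.material.emissiveIntensity = Math.max(0, 1 - distance / ORBIT_RADIUS);
  }
}
```

Calling updateSoundSource from the render loop keeps the light, the audio, and the speaker glow locked together.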
This code creates a moving light fixture that orbits the center of the AlloSphere. As it moves, the audio panner is updated to match its position, creating a realistic spatial audio effect. Additionally, the speakers' visual intensity is modulated based on their proximity to the moving sound source.
3. Topographical Visualization using Mapbox Data
To showcase the AlloSphere's location in Santa Barbara, I used topographical data from Mapbox to create a detailed 3D terrain visualization. This involved processing the raw data and mapping it to a 3D mesh with dynamic height and color properties.
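The core of the pipeline is sketched below, using Mapbox's published Terrain-RGB decoding formula (height = -10000 + (R × 256² + G × 256 + B) × 0.1 meters); the tile URL, mesh dimensions, and vertical scale are placeholders, and the height animation and color ramp described below are omitted for brevity.

```javascript
// Decode a Mapbox Terrain-RGB tile into a heightfield and displace a plane with it.
async function buildTerrain(tileUrl, size = 256) {
  const img = await new Promise((resolve, reject) => {
    const image = new Image();
    image.crossOrigin = 'anonymous'; // required so getImageData can read the pixels
    image.onload = () => resolve(image);
    image.onerror = reject;
    image.src = tileUrl;
  });

  const canvas = document.createElement('canvas');
  canvas.width = canvas.height = size;
  const ctx = canvas.getContext('2d');
  ctx.drawImage(img, 0, 0, size, size);
  const { data } = ctx.getImageData(0, 0, size, size);

  // One vertex per pixel: decode RGB to meters and write it into the Z coordinate.
  const geometry = new THREE.PlaneGeometry(100, 100, size - 1, size - 1);
  const positions = geometry.attributes.position;
  for (let i = 0; i < positions.count; i++) {
    const p = i * 4;
    const meters = -10000 + (data[p] * 65536 + data[p + 1] * 256 + data[p + 2]) * 0.1;
    positions.setZ(i, meters * 0.01); // vertical exaggeration is a placeholder
  }
  geometry.rotateX(-Math.PI / 2); // lay the plane flat so height points up
  geometry.computeVertexNormals();
  return new THREE.Mesh(geometry, new THREE.MeshStandardMaterial({ color: 0x88aa66 }));
}
```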
This code fetches elevation data from Mapbox, decodes it, and uses it to create a detailed 3D terrain mesh. The terrain's height is dynamically animated, and its color is updated in real time based on the current height, creating a visually engaging representation of the Santa Barbara landscape.
4. Custom Neuron Growth Algorithm
For the Growth Algorithm Demo, I implemented a custom neuron growth simulation that dynamically generates and visualizes neuronal structures.
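The simulation boils down to a recursive growth function along these lines; the tunables and rendering choices here are placeholders for the tuned production values, and the Three.js objects come from the setup shown earlier.

```javascript
// Placeholder tunables standing in for the production parameters.
const GROWTH_RATE = 0.3;         // length of each growth step
const BRANCH_PROBABILITY = 0.15; // chance that a step forks a new branch
const MAX_BRANCHES = 24;         // cap on the total number of branches
const MAX_STEPS = 30;            // steps before a branch stops growing

let branchCount = 0;

function growBranch(points, origin, direction, step) {
  if (step >= MAX_STEPS) return;

  // Perturb the heading a little each step so the branch wanders organically.
  const jitter = new THREE.Vector3(
    Math.random() - 0.5, Math.random() - 0.5, Math.random() - 0.5
  ).multiplyScalar(0.4);
  const heading = direction.clone().add(jitter).normalize();
  const next = origin.clone().addScaledVector(heading, GROWTH_RATE);
  points.push(origin.clone(), next); // one line segment per growth step

  // Occasionally fork a new branch off in the perturbed direction.
  if (branchCount < MAX_BRANCHES && Math.random() < BRANCH_PROBABILITY) {
    branchCount++;
    growBranch(points, next, jitter.clone().normalize(), step + 1);
  }
  growBranch(points, next, heading, step + 1); // continue the current branch
}

// Grow upward from the origin and render the result as line segments.
const segmentPoints = [];
growBranch(segmentPoints, new THREE.Vector3(), new THREE.Vector3(0, 1, 0), 0);
const neuronGeometry = new THREE.BufferGeometry().setFromPoints(segmentPoints);
scene.add(new THREE.LineSegments(
  neuronGeometry,
  new THREE.LineBasicMaterial({ color: 0x66ccff })
));
```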
This implementation creates a dynamic, branching structure that mimics neuronal growth. The algorithm considers factors like growth rate, branching probability, and maximum branches to create a realistic and visually interesting simulation.
5. Audio-Reactive Kaleidoscope Shader
For the Audio-Visual Demo, I created a custom kaleidoscope shader that reacts to the music. This involved creating an audio analyzer and linking its output to shader parameters.
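First, the audio side: an audio context and analyzer expose the frequency content of the current frame. In simplified form (the element id and FFT size are placeholders):

```javascript
// Route the tour's audio through an AnalyserNode so FFT frames can be read each tick.
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256; // yields 128 frequency bins
const frequencyData = new Uint8Array(analyser.frequencyBinCount);

const audioElement = document.getElementById('tour-audio'); // hypothetical element id
const source = audioCtx.createMediaElementSource(audioElement);
source.connect(analyser);
analyser.connect(audioCtx.destination); // keep the audio audible while analyzing
```

With that in place, a couple of helpers average the bins into bands and feed them to the shader every frame: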
```javascript
// Average the FFT bins in [start, end) to get a single value for one band.
function getAverageFrequency(array, start, end) {
  let sum = 0;
  for (let i = start; i < end; i++) {
    sum += array[i];
  }
  return sum / (end - start);
}

// Feed normalized band averages into the shader uniforms (the bin ranges are illustrative).
function updateAudio() {
  analyser.getByteFrequencyData(frequencyData);
  kaleidoscopePass.uniforms.bass.value = getAverageFrequency(frequencyData, 0, 8) / 255;
  kaleidoscopePass.uniforms.mid.value = getAverageFrequency(frequencyData, 8, 64) / 255;
  kaleidoscopePass.uniforms.treble.value = getAverageFrequency(frequencyData, 64, 128) / 255;
}

function animate() {
  requestAnimationFrame(animate);
  updateAudio();
  kaleidoscopePass.uniforms.time.value += 0.01;
  composer.render();
}
animate();
```
This implementation creates a dynamic kaleidoscope effect that responds to different frequency ranges of the audio input. Here's a breakdown of how it works:
- We set up an audio context and analyzer to process the audio data in real time.
- The kaleidoscope shader is defined with uniforms for time and for the bass, mid, and treble frequency ranges; these uniforms let us pass data from JavaScript to the shader.
- In the fragment shader, we use these audio-derived values to shape the kaleidoscope effect (a pared-down version of the shader appears after this list):
  - The number of sides in the kaleidoscope pattern is driven by the bass frequency.
  - The output color is modulated by the bass, mid, and treble values.
- The updateAudio function analyzes the current audio frame and calculates average values for each frequency range.
- In the animation loop, we continuously update the audio analysis and the shader uniforms, producing a dynamic, music-reactive visual effect.
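For reference, a pared-down version of the kaleidoscope pass might look like the following; the fold math and color mapping are simplified stand-ins for the production shader, with ShaderPass coming from Three.js's postprocessing examples.

```javascript
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';

const KaleidoscopeShader = {
  uniforms: {
    tDiffuse: { value: null }, // filled by ShaderPass with the rendered scene
    time: { value: 0 },
    bass: { value: 0 },
    mid: { value: 0 },
    treble: { value: 0 },
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform sampler2D tDiffuse;
    uniform float time, bass, mid, treble;
    varying vec2 vUv;
    void main() {
      vec2 p = vUv - 0.5;
      // Bass pushes the fold count up, changing the number of sides in the pattern.
      float sides = 4.0 + floor(bass * 8.0);
      float angle = mod(atan(p.y, p.x) + time * 0.2, 6.2831853 / sides);
      vec2 folded = vec2(cos(angle), sin(angle)) * length(p) + 0.5;
      vec4 color = texture2D(tDiffuse, folded);
      // Each frequency band tints its own channel, so the palette follows the mix.
      gl_FragColor = vec4(color.rgb * vec3(1.0 + bass, 1.0 + mid, 1.0 + treble), 1.0);
    }
  `,
};

const kaleidoscopePass = new ShaderPass(KaleidoscopeShader);
composer.addPass(kaleidoscopePass); // `composer` is the existing EffectComposer
```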
This audio-reactive shader creates a mesmerizing visual experience that's tightly coupled with the audio, reinforcing the AlloSphere's capacity for multi-modal data representation.
6. Dynamic Music Progression System
For the virtual tour, I implemented a dynamic music system that evolves as users progress through different stages of the AlloSphere experience. This system triggers new audio elements and adjusts existing ones based on the user's current step in the tour.
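A condensed sketch of the system appears below; the file paths, mix levels, fade times, and step schedule are placeholders, and one liberty is taken: buffer sources are created at trigger time rather than at load time, since Web Audio source nodes are single-use.

```javascript
// Condensed sketch of the stem system; paths, levels, and fade times are placeholders.
const STEM_NAMES = ['ambient', 'bass', 'melody', 'percussion', 'stepFx'];
const stems = {}; // name -> { buffer, gain, playing }
const audioCtx = new AudioContext();

async function loadAudioStems() {
  for (const name of STEM_NAMES) {
    const response = await fetch(`audio/${name}.mp3`);
    const buffer = await audioCtx.decodeAudioData(await response.arrayBuffer());
    const gain = audioCtx.createGain();
    gain.gain.value = 0; // stems start silent and fade in later
    gain.connect(audioCtx.destination);
    stems[name] = { buffer, gain, playing: false };
  }
}

function triggerStem(name) {
  const stem = stems[name];
  if (stem.playing) return; // already sounding; leave it running
  const source = audioCtx.createBufferSource();
  source.buffer = stem.buffer;
  source.loop = name !== 'stepFx'; // musical stems loop; the step cue is one-shot
  source.connect(stem.gain);
  source.start();
  stem.playing = source.loop;
}

function adjustStemVolume(name, target, seconds = 2) {
  // Ramp smoothly from the current level instead of jumping.
  const gainParam = stems[name].gain.gain;
  gainParam.setValueAtTime(gainParam.value, audioCtx.currentTime);
  gainParam.linearRampToValueAtTime(target, audioCtx.currentTime + seconds);
}

function handleTourStep(step) {
  const schedule = ['ambient', 'bass', 'melody', 'percussion']; // one new stem per step
  schedule.slice(0, step + 1).forEach((name, i) => {
    if (i === step) triggerStem(name);              // introduce this step's stem
    adjustStemVolume(name, i === step ? 0.8 : 0.6); // rebalance the ones already playing
  });
  stems.stepFx.gain.gain.value = 1; // transition cue plays at full level
  triggerStem('stepFx');
}

// Usage: advance the soundscape each time the user clicks 'Next'.
let currentStep = 0;
document.getElementById('next-button').addEventListener('click', () => {
  handleTourStep(currentStep++);
});
```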
- We define multiple audio stems (ambient, bass, melody, percussion, and a sound effect) that are triggered and mixed at different points in the tour.
- The loadAudioStems function asynchronously loads all the audio files, creates buffer sources, and sets up gain nodes for volume control.
- The triggerStem function starts playing a specific stem if it isn't already playing.
- The adjustStemVolume function allows smooth volume transitions for individual stems.
- The handleTourStep function is the core of the progression system and is called each time the user moves to a new step in the tour:
  - It triggers new stems and adjusts volumes based on the current step.
  - Each step builds on the previous one, gradually introducing new elements to the soundscape.
  - A sound effect plays at each step change to give the user auditory feedback.
- In the usage example, we attach handleTourStep to a 'Next' button click event, incrementing the tour step each time.
This system creates a dynamic, evolving soundscape that enhances the user's journey through the virtual AlloSphere. As they progress through the tour, the music becomes richer and more complex, mirroring their deepening understanding of the AlloSphere's capabilities. The sound effects provide clear auditory cues for transitions between stages of the tour, contributing to a more engaging and immersive user experience.
Together, these features, from complex 3D visualizations and custom shaders to dynamic audio systems and interactive elements, push the boundaries of what's possible in web-based 3D experiences.