A full-stack web platform that reimagines music promotion. SoundDrop converts album art into interactive, audio-reactive 3D visuals, offering artists a unique way to showcase their work.
1. Overview
SoundDrop is a web application designed to revolutionize the way musicians and artists promote their music online. It combines cutting-edge web technologies to create interactive, audio-reactive visual experiences that serve as unique landing pages for music promotion. SoundDrop allows artists to upload their tracks and cover art, which are then transformed into dynamic 3D visualizations that react in real time to the music, complete with links to their music on all major platforms.
The project aims to provide a platform where artists can create engaging, shareable content that stands out in the crowded digital music landscape. By leveraging the power of modern web browsers, SoundDrop delivers a high-fidelity audio-visual experience without the need for any plugins or additional software.
Implementation
As the sole creator and developer of SoundDrop, my role encompassed the entire development lifecycle, from conceptualization to implementation. This included:
Architecting the overall structure of the application
Developing the front-end user interface using React
Implementing 3D graphics and novel audio-reactive visualizations using React Three Fiber/Three.js and custom WebGL shaders
Setting up the backend infrastructure using Firebase for authentication and data storage
Integrating Stripe's API for payment processing and subscription tracking
Optimizing performance for smooth rendering across various devices
Implementing responsive design for both desktop and mobile experiences
This project showcases my ability to work with complex, interdisciplinary technologies and create a cohesive, user-friendly application that pushes the boundaries of what's possible in web development.
Technologies Used
SoundDrop leverages a modern tech stack to deliver its unique features:
React: The core framework used for building the user interface and managing component state.
Three.js and React Three Fiber: Used for creating and rendering 3D graphics in the browser.
Custom WebGL/GLSL shaders: Power the audio-reactive visual effects.
Zustand: Lightweight global state management, from audio settings to user preferences.
Web Audio API: Real-time frequency analysis that drives the visualizations.
Firebase: Authentication, Firestore data storage, and Cloud Functions.
Stripe: Payment processing and subscription tracking.
This combination of technologies allows SoundDrop to deliver a seamless, interactive experience that bridges the gap between audio and visual art forms. The project demonstrates the potential of web technologies to create rich, immersive applications that were once only possible with native software.
2. Key Features
Interactive 3D Graphics
SoundDrop transforms album cover art into an interactive 3D point cloud using React Three Fiber. This creates a visually stunning representation of the artist's work.
import { useRef } from 'react'
import { Points, Point } from '@react-three/drei'
import { useFrame } from '@react-three/fiber'

function PointCloud({ imageData, audioData }) {
  const pointsRef = useRef()

  useFrame(() => {
    if (!pointsRef.current) return
    // Displace point positions from the audio data (see updatePointCloudWithAudio
    // below), then flag the attribute so Three.js re-uploads it to the GPU
    pointsRef.current.geometry.attributes.position.needsUpdate = true
  })

  return (
    <Points ref={pointsRef}>
      {imageData.map((pixel, index) => (
        <Point key={index} position={[pixel.x, pixel.y, pixel.z]} color={pixel.color} />
      ))}
    </Points>
  )
}
I wrote a routine that processes the image data to build the vertex array for the point cloud:
// Assumes weights (per-channel luminance weights) and zRange (total depth
// of the cloud) are defined elsewhere in the module
var imageData = context.getImageData(0, 0, imageWidth, imageHeight).data;
var vertices2 = [];
var c = 0; // read offset into the flat RGBA pixel buffer
var x = imageWidth * -0.5;
var y = imageHeight * 0.5;

for (var i = 0; i < imageHeight; i++) {
  for (var j = 0; j < imageWidth; j++) {
    var color = new THREE.Color();
    color.setRGB(
      imageData[c] / 255,
      imageData[c + 1] / 255,
      imageData[c + 2] / 255
    );
    // Perceived brightness of the pixel, weighted per channel
    var weight =
      color.r * weights[0] + color.g * weights[1] + color.b * weights[2];
    var vertex = new THREE.Vector3();
    vertex.x = x;
    vertex.y = y;
    vertex.z = zRange * -0.5 + zRange * weight;
    vertices2.push(vertex);
    c += 4; // skip past R, G, B, A
    x++;
  }
  x = imageWidth * -0.5;
  y--;
}
This code walks every pixel of the image, builds a color from its RGB values, derives a brightness weight from that color, and creates a vertex whose x and y coordinates come from the pixel position and whose z coordinate comes from the weight. The result is a 3D point cloud in which the "depth" of each point is determined by the brightness of the corresponding pixel.
I then convert the array of vertices into a Float32Array for use with Three.js:
const vertices3 = [];
for (var i = 0; i < vertices2.length; i++) {
  vertices3.push(vertices2[i].x);
  vertices3.push(vertices2[i].y);
  vertices3.push(vertices2[i].z);
}
const verticesFloat = new Float32Array(vertices3);
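That flat array can then back the position attribute of a BufferGeometry. A minimal sketch of the hookup:
// Feed the flattened vertices into a Three.js point cloud
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(verticesFloat, 3));
const pointCloud = new THREE.Points(geometry, new THREE.PointsMaterial({ size: 2 }));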
Audio Reactivity
One of SoundDrop's most distinctive features is its real-time audio reactivity. The point cloud responds dynamically to the music, creating a unique visual experience for each track.
function updatePointCloudWithAudio(pointCloud, audioData) {
  const positions = pointCloud.geometry.attributes.position.array
  for (let i = 0; i < positions.length; i += 3) {
    const audioIndex = Math.floor(i / 3) % audioData.length
    positions[i + 2] = (audioData[audioIndex] / 256) * 5 // Scale Z-axis based on audio
  }
  pointCloud.geometry.attributes.position.needsUpdate = true
}

useFrame(() => {
  if (audioAnalyser && pointCloudRef.current) {
    const audioData = new Uint8Array(audioAnalyser.frequencyBinCount)
    audioAnalyser.getByteFrequencyData(audioData)
    updatePointCloudWithAudio(pointCloudRef.current, audioData)
  }
})
The FFT data drives the movement of the point cloud, displacing each point along the Z-axis according to the energy in its assigned frequency bin.
User Authentication
Firebase is used for secure user authentication, allowing artists to create and manage their accounts.
import { getAuth, createUserWithEmailAndPassword, signInWithEmailAndPassword } from "firebase/auth";

const auth = getAuth();

// Sign Up
createUserWithEmailAndPassword(auth, email, password)
  .then((userCredential) => {
    // Signed in
    const user = userCredential.user;
    // ...
  })
  .catch((error) => {
    const errorCode = error.code;
    const errorMessage = error.message;
    // ...
  });

// Sign In
signInWithEmailAndPassword(auth, email, password)
  .then((userCredential) => {
    // Signed in
    const user = userCredential.user;
    // ...
  })
  .catch((error) => {
    const errorCode = error.code;
    const errorMessage = error.message;
  });
Customizable Pages
Artists can create personalized pages with custom URLs, adding links to their various streaming platforms.
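As a sketch of how such a page can be resolved (the pages collection, slug field, and Visualizer component are illustrative, not the exact schema), a React Router route can look up the custom URL in Firestore:
import { useState, useEffect } from 'react'
import { useParams } from 'react-router-dom'
import { getFirestore, collection, query, where, getDocs } from 'firebase/firestore'

function ArtistPage() {
  const { slug } = useParams() // the custom URL segment
  const [page, setPage] = useState(null)

  useEffect(() => {
    const db = getFirestore()
    // Find the page document whose custom slug matches the URL
    const q = query(collection(db, 'pages'), where('slug', '==', slug))
    getDocs(q).then((snapshot) => {
      if (!snapshot.empty) setPage(snapshot.docs[0].data())
    })
  }, [slug])

  if (!page) return null
  return <Visualizer track={page.trackUrl} cover={page.coverUrl} links={page.links} />
}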
3. Technical Architecture
In developing SoundDrop, I carefully designed an architecture that could handle complex 3D visualizations, real-time audio processing, and user data management while maintaining a smooth user experience. Here's an overview of the key components:
Frontend Framework
I chose React as the primary frontend framework for its component-based architecture and efficient UI updates. This allowed me to create a modular and maintainable codebase.
For the core of SoundDrop's visual experience, I integrated Three.js with React Three Fiber. This combination allowed me to create complex 3D scenes declaratively within my React components. I also implemented custom shaders for advanced visual effects, covered in the Code Deep Dives section below.
While React's built-in hooks are great for local state, I needed a more robust solution for global state management. I implemented Zustand, a lightweight yet powerful state management library:
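A minimal sketch of a store along these lines (the exact state shape here is illustrative):
import { create } from 'zustand'

// Global store: audio settings and user preferences in one place
const useStore = create((set) => ({
  volume: 0.8,
  isPlaying: false,
  pointSize: 2,
  setVolume: (volume) => set({ volume }),
  togglePlay: () => set((state) => ({ isPlaying: !state.isPlaying })),
  setPointSize: (pointSize) => set({ pointSize }),
}))

// Components subscribe to just the slice they need
function VolumeSlider() {
  const volume = useStore((state) => state.volume)
  const setVolume = useStore((state) => state.setVolume)
  return (
    <input
      type="range"
      min="0"
      max="1"
      step="0.01"
      value={volume}
      onChange={(e) => setVolume(Number(e.target.value))}
    />
  )
}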
This allowed me to efficiently manage global state across the application, from audio settings to user preferences.
Backend and Database
For the backend, I chose Firebase for its real-time capabilities and ease of integration. I implemented authentication, data storage, and real-time updates:
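For example, persisting and live-syncing a page document looks roughly like this (collection and field names are illustrative):
import { getFirestore, doc, setDoc, onSnapshot } from 'firebase/firestore'

const db = getFirestore()

// Persist a page document, merging with any existing fields
async function savePage(pageId, data) {
  await setDoc(doc(db, 'pages', pageId), data, { merge: true })
}

// Subscribe to real-time updates on the same document
function watchPage(pageId, onChange) {
  return onSnapshot(doc(db, 'pages', pageId), (snapshot) => {
    onChange(snapshot.data())
  })
}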
One of the most challenging and rewarding aspects was implementing real-time audio processing. I used the Web Audio API to analyze audio and drive the visualizations:
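A sketch of that setup, assuming the track plays through an <audio> element (audioElement below):
// Create an AnalyserNode and route the track through it to the speakers
const audioContext = new AudioContext()
const source = audioContext.createMediaElementSource(audioElement)
const audioAnalyser = audioContext.createAnalyser()
audioAnalyser.fftSize = 512 // frequencyBinCount will be 256

source.connect(audioAnalyser)
audioAnalyser.connect(audioContext.destination)

// Each frame, getByteFrequencyData fills this with 0-255 magnitudes per
// frequency bin, exactly what updatePointCloudWithAudio consumes above
const audioData = new Uint8Array(audioAnalyser.frequencyBinCount)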
This architecture allowed me to create a seamless integration between the audio input and the visual output, resulting in the dynamic, music-responsive visualizations that are at the heart of SoundDrop.
4. Code Deep Dives
3D Rendering with React Three Fiber
In SoundDrop, I leveraged React Three Fiber to create dynamic 3D visualizations. One of the key components I developed was the PointCloud, which transforms the album cover into an interactive 3D object:
import { useRef, useMemo } from 'react'
import { useFrame } from '@react-three/fiber'
import * as THREE from 'three'

function PointCloud({ imageData, imageWidth, imageHeight, audioData }) {
  const pointsRef = useRef()

  // One particle per pixel, laid out on a centered grid
  const particles = useMemo(() => {
    const tempParticles = []
    for (let i = 0; i < imageData.length; i += 4) {
      const x = (i % (imageWidth * 4)) / 4 - imageWidth / 2
      const y = Math.floor(i / (imageWidth * 4)) - imageHeight / 2
      const z = 0
      tempParticles.push(new THREE.Vector3(x, y, z))
    }
    return tempParticles
  }, [imageData, imageWidth, imageHeight])

  // Per-particle RGB, pulled from the RGBA pixel buffer
  const colors = useMemo(() => {
    const colorArray = new Float32Array(particles.length * 3)
    for (let p = 0; p < particles.length; p++) {
      colorArray[p * 3] = imageData[p * 4] / 255         // R
      colorArray[p * 3 + 1] = imageData[p * 4 + 1] / 255 // G
      colorArray[p * 3 + 2] = imageData[p * 4 + 2] / 255 // B
    }
    return colorArray
  }, [imageData, particles])

  useFrame(() => {
    if (pointsRef.current && audioData) {
      const positions = pointsRef.current.geometry.attributes.position.array
      for (let i = 0; i < positions.length; i += 3) {
        const audioIndex = Math.floor(i / 3) % audioData.length
        positions[i + 2] = (audioData[audioIndex] / 255) * 50 // Z-axis modulation
      }
      pointsRef.current.geometry.attributes.position.needsUpdate = true
    }
  })

  return (
    <points ref={pointsRef}>
      <bufferGeometry>
        <bufferAttribute
          attach="attributes-position"
          count={particles.length}
          array={new Float32Array(particles.flatMap(p => [p.x, p.y, p.z]))}
          itemSize={3}
        />
        <bufferAttribute
          attach="attributes-color"
          count={colors.length / 3}
          array={colors}
          itemSize={3}
        />
      </bufferGeometry>
      <pointsMaterial vertexColors size={2} sizeAttenuation={false} />
    </points>
  )
}
This component creates a point for each pixel in the album cover, positioning it in 3D space. The z-axis of each point is then modulated based on the audio data, creating the dynamic, music-reactive effect.
Custom Shaders
To achieve more complex visual effects, I implemented custom shaders, including one that renders a dynamic, audio-reactive background.
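A minimal fragment-shader sketch of the idea (the uTime, uBass, and uTreble uniforms and the value-noise helper are illustrative; the matching vertex shader appears below):
const fragmentShader = /* glsl */ `
  uniform float uTime;
  uniform float uBass;   // low-frequency energy, 0..1
  uniform float uTreble; // high-frequency energy, 0..1
  varying vec2 vUv;

  // Cheap pseudo-random hash, used to build value noise
  float hash(vec2 p) {
    return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
  }

  // Simple value noise for organic, flowing patterns
  float noise(vec2 p) {
    vec2 i = floor(p);
    vec2 f = fract(p);
    vec2 u = f * f * (3.0 - 2.0 * f);
    return mix(mix(hash(i), hash(i + vec2(1.0, 0.0)), u.x),
               mix(hash(i + vec2(0.0, 1.0)), hash(i + vec2(1.0, 1.0)), u.x), u.y);
  }

  void main() {
    // Drift the noise field over time; bass speeds up the flow
    float n = noise(vUv * 4.0 + uTime * (0.1 + uBass * 0.5));

    // Audio-driven mix between two base tones
    vec3 colorA = vec3(0.05, 0.0, 0.2);
    vec3 colorB = vec3(0.6, 0.1, 0.8);
    vec3 color = mix(colorA, colorB, n * (0.5 + uBass));

    // Treble adds sparkle: random bright flecks
    float sparkle = step(0.98 - uTreble * 0.02, hash(vUv * 800.0 + vec2(uTime, 0.0)));
    color += sparkle * uTreble;

    gl_FragColor = vec4(color, 1.0);
  }
`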
This shader creates a dynamic, flowing background that responds to the bass and treble intensities of the audio. The noise function creates organic patterns, while the audio input influences the color mixing and adds a sparkle effect.
To use a shader like this in my React Three Fiber setup, I created a custom material.
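With drei's shaderMaterial helper, the wiring looks roughly like this (the getBass and getTreble accessors are stand-ins for reading the analyser data each frame):
import { useRef } from 'react'
import { shaderMaterial } from '@react-three/drei'
import { extend, useFrame } from '@react-three/fiber'

// Pass UVs through to the fragment shader above
const vertexShader = /* glsl */ `
  varying vec2 vUv;
  void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`

// shaderMaterial exposes each uniform as a property on the material instance
const AudioBackgroundMaterial = shaderMaterial(
  { uTime: 0, uBass: 0, uTreble: 0 },
  vertexShader,
  fragmentShader
)
extend({ AudioBackgroundMaterial }) // registers <audioBackgroundMaterial />

function Background({ getBass, getTreble }) {
  const materialRef = useRef()
  useFrame((state) => {
    materialRef.current.uTime = state.clock.elapsedTime
    materialRef.current.uBass = getBass()
    materialRef.current.uTreble = getTreble()
  })
  return (
    <mesh>
      <planeGeometry args={[20, 20]} />
      <audioBackgroundMaterial ref={materialRef} />
    </mesh>
  )
}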
Subscription Management
To manage user subscriptions, I utilized Firebase's Firestore database along with Cloud Functions to handle Stripe webhook events. Here's an overview of the process:
When a user subscribes, I store their subscription information in Firestore:
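A sketch of the Cloud Function side, assuming the Firebase UID is attached to the Stripe subscription as metadata (that field, and the user-document fields, are illustrative):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const stripe = require('stripe')(functions.config().stripe.secret);

admin.initializeApp();

exports.stripeWebhook = functions.https.onRequest(async (req, res) => {
  // Verify the event really came from Stripe
  const signature = req.headers['stripe-signature'];
  let event;
  try {
    event = stripe.webhooks.constructEvent(
      req.rawBody,
      signature,
      functions.config().stripe.webhook_secret
    );
  } catch (err) {
    return res.status(400).send(`Webhook error: ${err.message}`);
  }

  if (event.type === 'customer.subscription.created' ||
      event.type === 'customer.subscription.updated') {
    const subscription = event.data.object;
    // Mirror the subscription state onto the user's Firestore document
    await admin.firestore()
      .collection('users')
      .doc(subscription.metadata.firebaseUid)
      .set({
        subscriptionStatus: subscription.status,
        subscriptionId: subscription.id,
        currentPeriodEnd: subscription.current_period_end,
      }, { merge: true });
  }

  return res.status(200).send('ok');
});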
SoundDrop was my first and most robust full-stack web app. I built it over the course of a few months, pushing myself to come up with novel ways to display cover art and create custom filters. It taught me valuable lessons about structuring databases and interfacing with payment APIs. Check it out for yourself ↓