SoundDrop

A revolutionary full-stack web platform for music promotion: SoundDrop converts album art into interactive, audio-reactive 3D visuals, offering artists a unique way to showcase their work.

1. Introduction

Project Overview

SoundDrop is an innovative web application designed to revolutionize the way musicians and artists promote their music online. It combines cutting-edge web technologies to create interactive, audio-reactive visual experiences that serve as unique landing pages for music promotion. SoundDrop allows artists to upload their tracks and cover art, which are then transformed into dynamic 3D visualizations that react to the music in real time, complete with links to their music on all platforms.

The project aims to provide a platform where artists can create engaging, shareable content that stands out in the crowded digital music landscape. By leveraging the power of modern web browsers, SoundDrop delivers a high-fidelity audio-visual experience without the need for any plugins or additional software.

Implementation

As the sole creator and developer of SoundDrop, my role encompassed the entire development lifecycle, from conceptualization to implementation. This included:

  1. Architecting the overall structure of the application

  2. Developing the front-end user interface using React

  3. Implementing 3D graphics and novel audio-reactive visualizations using React Three Fiber/Three.js and custom WebGL shaders

  4. Setting up the backend infrastructure using Firebase for authentication and data storage

  5. Integrating Stripe's API for payment processing and subscription tracking

  6. Optimizing performance for smooth rendering across various devices

  7. Implementing responsive design for both desktop and mobile experiences

This project showcases my ability to work with complex, interdisciplinary technologies and create a cohesive, user-friendly application that pushes the boundaries of what's possible in web development.

Technologies Used

SoundDrop leverages a modern tech stack to deliver its unique features:

  • React: The core framework used for building the user interface and managing component state.

  • Three.js and React Three Fiber: Used for creating and rendering 3D graphics in the browser.

import { Canvas } from '@react-three/fiber'
import { OrbitControls, Box } from '@react-three/drei'

function App() {
  return (
    <Canvas>
      {/* Basic lighting setup */}
      <ambientLight intensity={0.5} />
      <spotLight position={[10, 10, 10]} angle={0.15} penumbra={1} />
      <pointLight position={[-10, -10, -10]} />
      {/* Two demo meshes */}
      <Box position={[-1.2, 0, 0]} />
      <Box position={[1.2, 0, 0]} />
      {/* Mouse-driven camera controls */}
      <OrbitControls />
    </Canvas>
  )
}
  • Web Audio API: For processing and analyzing audio in real-time.

const audioContext = new (window.AudioContext || window.webkitAudioContext)();
const analyser = audioContext.createAnalyser();
analyser.fftSize = 2048;
const bufferLength = analyser.frequencyBinCount;
const dataArray = new Uint8Array(bufferLength);

// In your render loop
analyser.getByteFrequencyData(dataArray);
// Use dataArray to drive your visualizations
  • Firebase: Provides backend services including authentication, database, and hosting.

import { initializeApp } from 'firebase/app';
import { getAuth } from 'firebase/auth';
import { getFirestore } from 'firebase/firestore';

const firebaseConfig = {
  // Firebase configuration
};

const app = initializeApp(firebaseConfig);
const auth = getAuth(app);
const db = getFirestore(app);
  • GLSL: The OpenGL Shading Language used for writing custom shaders for advanced visual effects.

uniform float time;
varying vec2 vUv;

void main() {
  vec2 uv = vUv;
  vec3 color = vec3(0.5 + 0.5 * sin(time + uv.xyx + vec3(0,2,4)));
  gl_FragColor = vec4(color, 1.0);
}

This combination of technologies allows SoundDrop to deliver a seamless, interactive experience that bridges the gap between audio and visual art forms. The project demonstrates the potential of web technologies to create rich, immersive applications that were once only possible with native software.


2. Key Features

Interactive 3D Graphics

SoundDrop transforms album cover art into an interactive 3D point cloud using React Three Fiber. This creates a visually stunning representation of the artist's work.

import { useRef } from 'react'
import { Points, Point } from '@react-three/drei'
import { useFrame } from '@react-three/fiber'

function PointCloud({ imageData, audioData }) {
  const pointsRef = useRef()

  useFrame(() => {
    // Flag the position attribute so Three.js re-uploads it after audio-driven updates
    pointsRef.current.geometry.attributes.position.needsUpdate = true
  })

  return (
    <Points ref={pointsRef}>
      {imageData.map((pixel, index) => (
        <Point key={index} position={[pixel.x, pixel.y, pixel.z]} color={pixel.color} />
      ))}
    </Points>
  )
}

I created a method to process the image data in order to assign a vertices array for the point cloud:

// Read the raw RGBA pixel data from a canvas containing the cover art
var imageData = context.getImageData(0, 0, imageWidth, imageHeight).data;
var vertices2 = [];
var c = 0; // index into the flat RGBA array
var x = imageWidth * -0.5;
var y = imageHeight * 0.5;

for (var i = 0; i < imageHeight; i++) {
  for (var j = 0; j < imageWidth; j++) {
    var color = new THREE.Color();
    color.setRGB(
      imageData[c] / 255,
      imageData[c + 1] / 255,
      imageData[c + 2] / 255
    );

    // weights holds the per-channel brightness contributions; zRange is the total depth span
    var weight =
      color.r * weights[0] + color.g * weights[1] + color.b * weights[2];

    var vertex = new THREE.Vector3();
    vertex.x = x;
    vertex.y = y;
    vertex.z = zRange * -0.5 + zRange * weight;

    vertices2.push(vertex);

    c += 4;
    x++;
  }
  x = imageWidth * -0.5;
  y--;
}

This code iterates through each pixel of the image, builds a color from its RGB values, computes a brightness weight from that color, and then creates a vertex whose x and y coordinates come from the pixel position and whose z coordinate comes from the weight. This effectively turns the 2D image into a 3D point cloud where the depth of each point is determined by the brightness of the corresponding pixel.

I then convert the array of vertices into a Float32Array for use with Three.js:

const vertices3 = [];
for (var i = 0; i < vertices2.length; i++) {
  vertices3.push(vertices2[i].x);
  vertices3.push(vertices2[i].y);
  vertices3.push(vertices2[i].z);
}
const verticesFloat = new Float32Array(vertices3);
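
This typed array is what ultimately feeds the point cloud's geometry. As a rough sketch of that final step (variable names here are illustrative, not taken verbatim from the SoundDrop source), the flattened data becomes the position attribute of a BufferGeometry rendered as THREE.Points:

// Minimal sketch: attach the flattened vertex data as a position attribute
const geometry = new THREE.BufferGeometry();
geometry.setAttribute(
  'position',
  new THREE.BufferAttribute(verticesFloat, 3) // three components (x, y, z) per vertex
);
const material = new THREE.PointsMaterial({ size: 2, sizeAttenuation: false });
const pointCloud = new THREE.Points(geometry, material);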

Audio Reactivity

One of the most impressive features is the real-time audio reactivity. The point cloud dynamically responds to the music, creating a unique visual experience for each track.

function updatePointCloudWithAudio(pointCloud, audioData) {
  const positions = pointCloud.geometry.attributes.position.array
  for (let i = 0; i < positions.length; i += 3) {
    const audioIndex = Math.floor(i / 3) % audioData.length
    positions[i + 2] = audioData[audioIndex] / 256 * 5 // Scale Z-axis based on audio
  }
  pointCloud.geometry.attributes.position.needsUpdate = true
}

useFrame(() => {
  if (audioAnalyser && pointCloudRef.current) {
    const audioData = new Uint8Array(audioAnalyser.frequencyBinCount)
    audioAnalyser.getByteFrequencyData(audioData)
    updatePointCloudWithAudio(pointCloudRef.current, audioData)
  }
})

The FFT data drives the movement of the point cloud, scaling each point along the Z-axis according to the energy in its assigned frequency bin.

User Authentication

Firebase is used for secure user authentication, allowing artists to create and manage their accounts.

import { getAuth, createUserWithEmailAndPassword, signInWithEmailAndPassword } from "firebase/auth";

const auth = getAuth();

// Sign Up
createUserWithEmailAndPassword(auth, email, password)
  .then((userCredential) => {
    // Signed in 
    const user = userCredential.user;
    // ...
  })
  .catch((error) => {
    const errorCode = error.code;
    const errorMessage = error.message;
    // ..
  });

// Sign In
signInWithEmailAndPassword(auth, email, password)
  .then((userCredential) => {
    // Signed in 
    const user = userCredential.user;
    // ...
  })
  .catch((error) => {
    const errorCode = error.code;
    const errorMessage = error.message;
  });

Customizable Pages

Artists can create personalized pages with custom URLs, adding links to their various streaming platforms.

import { useState } from 'react'

function CustomizablePage({ user }) {
  const [links, setLinks] = useState([]);

  const addLink = (platform, url) => {
    setLinks([...links, { platform, url }]);
  }

  return (
    <div>
      <h1>{user.artistName}</h1>
      <PointCloudVisualization coverArt={user.coverArt} />
      {links.map((link, index) => (
        <a key={index} href={link.url}>{link.platform}</a>
      ))}
      <button onClick={() => addLink('Spotify', 'https://spotify.com/...')}>Add Spotify</button>
      {/* Add more platform buttons */}
    </div>
  )
}

Responsive Design

The application is fully responsive, ensuring a great user experience on both desktop and mobile devices.

import { useMediaQuery } from '@mantine/hooks';

function ResponsiveLayout({ children }) {
  const isMobile = useMediaQuery('(max-width: 768px)');

  return (
    <div style={{ padding: isMobile ? '10px' : '20px' }}>
      {isMobile ? <MobileNavigation /> : <DesktopNavigation />}
      {children}
    </div>
  )
}

Subscription Model

Stripe's API is integrated for handling payments and managing subscriptions, allowing for premium features and monetization.

const checkoutProMonth = async () => {
  // Create a checkout session document under the signed-in user's customer record
  const docRef = await fb.firestore
    .collection("customers")
    .doc(authUser.authUser.uid)
    .collection("checkout_sessions")
    .add({
      price: "price_1LzQ4cLFZvJ24cvWWmzL2zeR",
      allow_promotion_codes: true,
      success_url: window.location.origin,
      cancel_url: window.location.origin,
    });

  // Wait for the session document to be populated with a Stripe Checkout URL, then redirect
  docRef.onSnapshot((snap) => {
    const { error, url } = snap.data();
    if (error) {
      alert(`An error occurred: ${error.message}`);
    }
    if (url) {
      window.location.assign(url);
    }
  });
};

3. Architecture Overview

In developing SoundDrop, I carefully designed an architecture that could handle complex 3D visualizations, real-time audio processing, and user data management while maintaining a smooth user experience. Here's an overview of the key components:

Frontend Framework

I chose React as the primary frontend framework for its component-based architecture and efficient UI updates. This allowed me to create a modular and maintainable codebase.

import { Canvas } from '@react-three/fiber'

function App() {
  return (
    <div className="App">
      <Canvas>
        <PointCloudVisualization />
      </Canvas>
      <AudioControls />
      <UserDashboard />
    </div>
  );
}

3D Graphics Engine

For the core of SoundDrop's visual experience, I integrated Three.js with React Three Fiber. This combination allowed me to create complex 3D scenes declaratively within my React components. I also implemented custom shaders for advanced visual effects:

uniform float time;
varying vec2 vUv;

void main() {
  vec2 uv = vUv;
  vec3 color = vec3(0.5 + 0.5 * sin(time + uv.xyx + vec3(0,2,4)));
  gl_FragColor = vec4(color, 1.0);
}

State Management

While React's built-in hooks are great for local state, I needed a more robust solution for global state management. I implemented Zustand, a lightweight yet powerful state management library:

import create from 'zustand'

const useStore = create((set) => ({
  audioEnabled: false,
  currentTrack: null,
  setAudioEnabled: (enabled) => set({ audioEnabled: enabled }),
  setCurrentTrack: (track) => set({ currentTrack: track }),
}))

This allowed me to efficiently manage global state across the application, from audio settings to user preferences.
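
Any component can then subscribe to just the slice of state it needs. The snippet below is a small illustrative sketch rather than code from SoundDrop itself; AudioToggle is a hypothetical consumer of the store:

// Hypothetical component showing how the store is consumed
function AudioToggle() {
  // Subscribe to only the state slices this component needs
  const audioEnabled = useStore((state) => state.audioEnabled)
  const setAudioEnabled = useStore((state) => state.setAudioEnabled)

  return (
    <button onClick={() => setAudioEnabled(!audioEnabled)}>
      {audioEnabled ? 'Mute visuals' : 'Enable audio'}
    </button>
  )
}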

Backend and Database

For the backend, I chose Firebase for its real-time capabilities and ease of integration. I implemented authentication, data storage, and real-time updates:

import { initializeApp } from 'firebase/app';
import { getFirestore, doc, setDoc } from 'firebase/firestore';

const app = initializeApp(firebaseConfig);
const db = getFirestore(app);

export const createUserPage = async (userId, pageData) => {
  await setDoc(doc(db, "users", userId), pageData);
}
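
Firestore's real-time listeners are what make live updates possible. Here is a minimal sketch of how a page document could be watched for changes (subscribeToUserPage is an illustrative helper name, not the actual SoundDrop function):

import { doc, onSnapshot } from 'firebase/firestore';

// Illustrative sketch: watch a user's page document so edits show up in real time
export const subscribeToUserPage = (userId, onChange) => {
  return onSnapshot(doc(db, "users", userId), (snapshot) => {
    if (snapshot.exists()) {
      onChange(snapshot.data());
    }
  });
};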

Audio Processing

One of the most challenging and rewarding aspects was implementing real-time audio processing. I used the Web Audio API to analyze audio and drive the visualizations:

const setupAudioAnalyzer = (audioElement) => {
  const audioContext = new (window.AudioContext || window.webkitAudioContext)();
  const analyser = audioContext.createAnalyser();
  const source = audioContext.createMediaElementSource(audioElement);
  source.connect(analyser);
  analyser.connect(audioContext.destination);

  analyser.fftSize = 2048;
  const bufferLength = analyser.frequencyBinCount;
  const dataArray = new Uint8Array(bufferLength);

  return { analyser, dataArray };
}

This architecture allowed me to create a seamless integration between the audio input and the visual output, resulting in the dynamic, music-responsive visualizations that are at the heart of SoundDrop.


4. Code Deep Dives

3D Rendering with React Three Fiber

In SoundDrop, I leveraged React Three Fiber to create dynamic 3D visualizations. One of the key components I developed was the PointCloud, which transforms the album cover into an interactive 3D object:

import { useRef, useMemo } from 'react'
import { useFrame } from '@react-three/fiber'
import * as THREE from 'three'

// imageWidth and imageHeight are passed in alongside the raw RGBA pixel data
function PointCloud({ imageData, imageWidth, imageHeight, audioData }) {
  const pointsRef = useRef()

  const particles = useMemo(() => {
    const tempParticles = []
    for (let i = 0; i < imageData.length; i += 4) {
      const x = (i % (imageWidth * 4)) / 4 - imageWidth / 2
      const y = Math.floor(i / (imageWidth * 4)) - imageHeight / 2
      const z = 0
      tempParticles.push(new THREE.Vector3(x, y, z))
    }
    return tempParticles
  }, [imageData])

  const colors = useMemo(() => {
    return new Float32Array(particles.length * 3).map((_, i) => {
      if (i % 3 === 0) {
        const index = (i / 3) * 4
        return imageData[index] / 255
      }
      if (i % 3 === 1) {
        const index = (Math.floor(i / 3) * 4) + 1
        return imageData[index] / 255
      }
      if (i % 3 === 2) {
        const index = (Math.floor(i / 3) * 4) + 2
        return imageData[index] / 255
      }
    })
  }, [imageData, particles])

  useFrame(() => {
    if (pointsRef.current && audioData) {
      const positions = pointsRef.current.geometry.attributes.position.array
      for (let i = 0; i < positions.length; i += 3) {
        const audioIndex = Math.floor(i / 3) % audioData.length
        positions[i + 2] = audioData[audioIndex] / 255 * 50 // Z-axis modulation
      }
      pointsRef.current.geometry.attributes.position.needsUpdate = true
    }
  })

  return (
    <points ref={pointsRef}>
      <bufferGeometry>
        <bufferAttribute
          attachObject={['attributes', 'position']}
          count={particles.length}
          array={new Float32Array(particles.flatMap(p => [p.x, p.y, p.z]))}
          itemSize={3}
        />
        <bufferAttribute
          attachObject={['attributes', 'color']}
          count={colors.length / 3}
          array={colors}
          itemSize={3}
        />
      </bufferGeometry>
      <pointsMaterial vertexColors size={2} sizeAttenuation={false} />
    </points>
  )
}

This component creates a point for each pixel in the album cover, positioning it in 3D space. The z-axis of each point is then modulated based on the audio data, creating the dynamic, music-reactive effect.

Custom Shaders

To achieve more complex visual effects, I implemented custom shaders. Here's a complete example of a custom shader I used in SoundDrop to create a dynamic, audio-reactive background:

// Vertex Shader
varying vec2 vUv;
varying vec3 vPosition;

void main() {
  vUv = uv;
  vPosition = position;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// Fragment Shader
uniform float time;
uniform float bassIntensity;
uniform float trebleIntensity;
uniform vec3 color1;
uniform vec3 color2;
uniform vec3 color3;

varying vec2 vUv;
varying vec3 vPosition;

// Simplex 3D Noise
vec4 permute(vec4 x){return mod(((x*34.0)+1.0)*x, 289.0);}
vec4 taylorInvSqrt(vec4 r){return 1.79284291400159 - 0.85373472095314 * r;}

float snoise(vec3 v){ 
  const vec2  C = vec2(1.0/6.0, 1.0/3.0) ;
  const vec4  D = vec4(0.0, 0.5, 1.0, 2.0);

  vec3 i  = floor(v + dot(v, C.yyy) );
  vec3 x0 =   v - i + dot(i, C.xxx) ;

  vec3 g = step(x0.yzx, x0.xyz);
  vec3 l = 1.0 - g;
  vec3 i1 = min( g.xyz, l.zxy );
  vec3 i2 = max( g.xyz, l.zxy );

  vec3 x1 = x0 - i1 + 1.0 * C.xxx;
  vec3 x2 = x0 - i2 + 2.0 * C.xxx;
  vec3 x3 = x0 - 1. + 3.0 * C.xxx;

  i = mod(i, 289.0 ); 
  vec4 p = permute( permute( permute( 
             i.z + vec4(0.0, i1.z, i2.z, 1.0 ))
           + i.y + vec4(0.0, i1.y, i2.y, 1.0 )) 
           + i.x + vec4(0.0, i1.x, i2.x, 1.0 ));

  float n_ = 1.0/7.0;
  vec3  ns = n_ * D.wyz - D.xzx;

  vec4 j = p - 49.0 * floor(p * ns.z *ns.z);

  vec4 x_ = floor(j * ns.z);
  vec4 y_ = floor(j - 7.0 * x_ );

  vec4 x = x_ *ns.x + ns.yyyy;
  vec4 y = y_ *ns.x + ns.yyyy;
  vec4 h = 1.0 - abs(x) - abs(y);

  vec4 b0 = vec4( x.xy, y.xy );
  vec4 b1 = vec4( x.zw, y.zw );

  vec4 s0 = floor(b0)*2.0 + 1.0;
  vec4 s1 = floor(b1)*2.0 + 1.0;
  vec4 sh = -step(h, vec4(0.0));

  vec4 a0 = b0.xzyw + s0.xzyw*sh.xxyy ;
  vec4 a1 = b1.xzyw + s1.xzyw*sh.zzww ;

  vec3 p0 = vec3(a0.xy,h.x);
  vec3 p1 = vec3(a0.zw,h.y);
  vec3 p2 = vec3(a1.xy,h.z);
  vec3 p3 = vec3(a1.zw,h.w);

  vec4 norm = taylorInvSqrt(vec4(dot(p0,p0), dot(p1,p1), dot(p2, p2), dot(p3,p3)));
  p0 *= norm.x;
  p1 *= norm.y;
  p2 *= norm.z;
  p3 *= norm.w;

  vec4 m = max(0.6 - vec4(dot(x0,x0), dot(x1,x1), dot(x2,x2), dot(x3,x3)), 0.0);
  m = m * m;
  return 42.0 * dot( m*m, vec4( dot(p0,x0), dot(p1,x1), 
                                dot(p2,x2), dot(p3,x3) ) );
}

void main() {
  vec2 uv = vUv;
  
  float noise = snoise(vec3(uv * 5.0, time * 0.1));
  
  // Use bass and treble intensities to affect the noise
  noise += bassIntensity * snoise(vec3(uv * 3.0, time * 0.2));
  noise += trebleIntensity * snoise(vec3(uv * 8.0, time * 0.3));
  
  // Create color mix based on noise and audio intensities
  vec3 color = mix(color1, color2, noise);
  color = mix(color, color3, (bassIntensity + trebleIntensity) * 0.5);
  
  // Add some sparkle effect based on treble
  if (fract(noise * 10.0 + trebleIntensity) < 0.1) {
    color += vec3(1.0) * trebleIntensity;
  }
  
  gl_FragColor = vec4(color, 1.0);
}

This shader creates a dynamic, flowing background that responds to the bass and treble intensities of the audio. The noise function creates organic patterns, while the audio input influences the color mixing and adds a sparkle effect.

To use this shader in my React Three Fiber setup, I created a custom material:

import { useRef } from 'react'
import { shaderMaterial } from '@react-three/drei'
import { extend, useFrame } from '@react-three/fiber'
import { Vector3 } from 'three'

const AudioReactiveMaterial = shaderMaterial(
  {
    time: 0,
    bassIntensity: 0,
    trebleIntensity: 0,
    color1: new Vector3(0.1, 0.3, 0.5),
    color2: new Vector3(0.7, 0.2, 0.3),
    color3: new Vector3(0.9, 0.8, 0.1),
  },
  // vertex shader (as string)
  vertexShader,
  // fragment shader (as string)
  fragmentShader
)

extend({ AudioReactiveMaterial })

function AudioReactiveBackground({ audioData }) {
  const materialRef = useRef()

  useFrame((state) => {
    if (materialRef.current) {
      materialRef.current.time = state.clock.elapsedTime
      materialRef.current.bassIntensity = audioData.bassIntensity
      materialRef.current.trebleIntensity = audioData.trebleIntensity
    }
  })

  return (
    <mesh>
      <planeBufferGeometry args={[2, 2]} />
      <audioReactiveMaterial ref={materialRef} />
    </mesh>
  )
}

Audio Processing

Audio processing was crucial for creating the reactive visualizations. I used the Web Audio API to analyze the audio in real-time:

class AudioAnalyzer {
  constructor(audioElement) {
    this.audioContext = new (window.AudioContext || window.webkitAudioContext)()
    this.analyser = this.audioContext.createAnalyser()
    this.analyser.fftSize = 2048
    this.bufferLength = this.analyser.frequencyBinCount
    this.dataArray = new Uint8Array(this.bufferLength)
    
    const source = this.audioContext.createMediaElementSource(audioElement)
    source.connect(this.analyser)
    this.analyser.connect(this.audioContext.destination)
  }

  getAudioData() {
    this.analyser.getByteFrequencyData(this.dataArray)
    return this.dataArray
  }

  getBassLevel() {
    const bass = this.dataArray.slice(0, 10)
    return bass.reduce((a, b) => a + b) / bass.length
  }

  getTrebleLevel() {
    const treble = this.dataArray.slice(-10)
    return treble.reduce((a, b) => a + b) / treble.length
  }
}
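
The bass and treble levels above are averages of byte-scale values (0-255), so they need to be normalized before being used as the bassIntensity and trebleIntensity shader uniforms. Below is a small sketch of that glue code, with the 0-1 scaling offered as an assumption rather than the exact mapping used in SoundDrop:

// Sketch: derive normalized shader-uniform intensities from the analyzer above
function getAudioIntensities(analyzer) {
  analyzer.getAudioData(); // refresh the internal frequency data
  return {
    bassIntensity: analyzer.getBassLevel() / 255,     // assumed 0-1 normalization
    trebleIntensity: analyzer.getTrebleLevel() / 255, // assumed 0-1 normalization
  };
}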

I then used this analyzer in my React components to drive the visualizations:

import { useState, useEffect } from 'react'
import { useFrame } from '@react-three/fiber'

function VisualizerComponent({ audioElement }) {
  const [analyzer, setAnalyzer] = useState(null)
  const [audioData, setAudioData] = useState(new Uint8Array(0))

  useEffect(() => {
    const newAnalyzer = new AudioAnalyzer(audioElement)
    setAnalyzer(newAnalyzer)
  }, [audioElement])

  useFrame(() => {
    if (analyzer) {
      setAudioData(analyzer.getAudioData())
    }
  })

  return <PointCloud audioData={audioData} />
}

Firebase Integration

For user authentication and data storage, I integrated Firebase. Here's how I set up the Firebase configuration and implemented user registration:

import { initializeApp } from 'firebase/app'
import { getAuth, createUserWithEmailAndPassword } from 'firebase/auth'
import { getFirestore, doc, setDoc } from 'firebase/firestore'

const firebaseConfig = {
  // Firebase configuration
}

const app = initializeApp(firebaseConfig)
const auth = getAuth(app)
const db = getFirestore(app)

export const registerUser = async (email, password, username) => {
  try {
    const userCredential = await createUserWithEmailAndPassword(auth, email, password)
    const user = userCredential.user

    await setDoc(doc(db, 'users', user.uid), {
      username,
      email,
      createdAt: new Date().toISOString()
    })

    return user
  } catch (error) {
    console.error('Error registering user:', error)
    throw error
  }
}

Image Storage and Retrieval

To handle image uploads and retrievals efficiently, I implemented a system using Firebase Storage. Here's how I managed this process:

When a user uploads a cover art image, I first resize it client-side to ensure consistent performance, then upload it to Firebase Storage:

import { getStorage, ref, uploadBytes, getDownloadURL } from "firebase/storage";

const storage = getStorage();

async function uploadImage(file, userId) {
  // Resize image (implementation not shown for brevity)
  const resizedFile = await resizeImage(file, 1000, 1000);

  const imageRef = ref(storage, `users/${userId}/coverArt/${Date.now()}`);
  
  try {
    const snapshot = await uploadBytes(imageRef, resizedFile);
    const downloadURL = await getDownloadURL(snapshot.ref);
    return downloadURL;
  } catch (error) {
    console.error("Error uploading image: ", error);
    throw error;
  }
}
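
The resizeImage helper isn't shown above; a typical client-side approach, offered here only as an illustrative sketch rather than the actual implementation, draws the file onto a canvas and reads it back as a Blob:

// Illustrative sketch of a canvas-based client-side resize; not the original implementation
function resizeImage(file, maxWidth, maxHeight) {
  return new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => {
      // Scale down while preserving aspect ratio
      const scale = Math.min(maxWidth / img.width, maxHeight / img.height, 1);
      const canvas = document.createElement('canvas');
      canvas.width = Math.round(img.width * scale);
      canvas.height = Math.round(img.height * scale);
      canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
      canvas.toBlob((blob) => (blob ? resolve(blob) : reject(new Error('Resize failed'))), file.type);
    };
    img.onerror = reject;
    img.src = URL.createObjectURL(file);
  });
}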

To display the images, I fetch the download URL and use it in my components:

import { getStorage, ref, getDownloadURL } from "firebase/storage";

const storage = getStorage();

async function getImageUrl(userId, imagePath) {
  const imageRef = ref(storage, `users/${userId}/${imagePath}`);
  
  try {
    const url = await getDownloadURL(imageRef);
    return url;
  } catch (error) {
    console.error("Error getting image URL: ", error);
    throw error;
  }
}

Subscription Tracking in Firebase

To manage user subscriptions, I utilized Firebase's Firestore database along with Cloud Functions to handle Stripe webhook events. Here's an overview of the process:

When a user subscribes, I store their subscription information in Firestore:

import { getFirestore, doc, setDoc } from "firebase/firestore";

const db = getFirestore();

async function storeSubscription(userId, subscriptionData) {
  const userRef = doc(db, "users", userId);
  
  try {
    await setDoc(userRef, { subscription: subscriptionData }, { merge: true });
  } catch (error) {
    console.error("Error storing subscription: ", error);
    throw error;
  }
}

I use a Cloud Function to listen for Stripe webhook events and update the user's subscription status accordingly:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.stripeWebhookHandler = functions.https.onRequest(async (request, response) => {
  const event = request.body;

  switch (event.type) {
    case 'customer.subscription.updated':
    case 'customer.subscription.deleted':
      const subscription = event.data.object;
      const userId = subscription.metadata.firebaseUID;
      
      await admin.firestore().collection('users').doc(userId).update({
        'subscription.status': subscription.status,
        'subscription.plan': subscription.items.data[0].plan.nickname,
      });
      break;
    // ... handle other event types
  }

  response.sendStatus(200);
});

In my application, I check the user's subscription status to determine feature access:

import { getFirestore, doc, getDoc } from "firebase/firestore";

const db = getFirestore();

async function checkSubscriptionStatus(userId) {
  const userRef = doc(db, "users", userId);
  
  try {
    const docSnap = await getDoc(userRef);
    if (docSnap.exists()) {
      const userData = docSnap.data();
      return userData.subscription?.status === 'active';
    }
    return false;
  } catch (error) {
    console.error("Error checking subscription status: ", error);
    throw error;
  }
}
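
That status check can then gate premium-only UI. The component below is a hypothetical sketch of such a gate, not code taken from SoundDrop:

import { useState, useEffect } from 'react';

// Hypothetical usage: gate premium-only UI on the subscription check above
function PremiumGate({ userId, children }) {
  const [isPro, setIsPro] = useState(false);

  useEffect(() => {
    checkSubscriptionStatus(userId)
      .then(setIsPro)
      .catch(() => setIsPro(false)); // treat lookup failures as not subscribed
  }, [userId]);

  return isPro ? children : <p>Upgrade to unlock this feature.</p>;
}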

Link Management

To allow users to add and manage their music platform links, I created a LinkManager component:

import { useState, useEffect } from 'react'
import { doc, updateDoc, arrayUnion, arrayRemove } from 'firebase/firestore'
import { db } from '../firebaseConfig'

function LinkManager({ userId }) {
  const [links, setLinks] = useState([])
  const [newLink, setNewLink] = useState({ platform: '', url: '' })

  useEffect(() => {
    // Fetch user's links from Firestore
    // ...
  }, [userId])

  const addLink = async () => {
    try {
      await updateDoc(doc(db, 'users', userId), {
        links: arrayUnion(newLink)
      })
      setLinks([...links, newLink])
      setNewLink({ platform: '', url: '' })
    } catch (error) {
      console.error('Error adding link:', error)
    }
  }

  const removeLink = async (linkToRemove) => {
    try {
      await updateDoc(doc(db, 'users', userId), {
        links: arrayRemove(linkToRemove)
      })
      setLinks(links.filter(link => link !== linkToRemove))
    } catch (error) {
      console.error('Error removing link:', error)
    }
  }

  return (
    <div>
      {links.map((link, index) => (
        <div key={index}>
          <span>{link.platform}: {link.url}</span>
          <button onClick={() => removeLink(link)}>Remove</button>
        </div>
      ))}
      <input
        value={newLink.platform}
        onChange={(e) => setNewLink({...newLink, platform: e.target.value})}
        placeholder="Platform"
      />
      <input
        value={newLink.url}
        onChange={(e) => setNewLink({...newLink, url: e.target.value})}
        placeholder="URL"
      />
      <button onClick={addLink}>Add Link</button>
    </div>
  )
}
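
The fetch inside the useEffect is elided above; reading the saved links back is a single document read. Here is a minimal sketch, assuming the links live on the same user document the add/remove calls update (fetchLinks is an illustrative helper name):

import { doc, getDoc } from 'firebase/firestore'
import { db } from '../firebaseConfig'

// Sketch of the elided fetch: load the user's saved links once on mount
async function fetchLinks(userId) {
  const snapshot = await getDoc(doc(db, 'users', userId))
  return snapshot.exists() ? snapshot.data().links || [] : []
}

// Inside LinkManager's useEffect: fetchLinks(userId).then(setLinks)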

SoundDrop was my first and most robust build of a full-stack web app. I built it over the course of a few months, pushing myself to come up with novel ways to display the cover art and create custom filters. It taught me valuable lessons about organizing databases in structured ways and interfacing with payment APIs. Check it out for yourself ↓

Visit site ↗
View example demo 1 ↗
View example demo 2 ↗

Demo captions:
  • Live demo showcasing interactivity and audio-reactivity
  • Users can edit the visuals with filters and backgrounds, all custom GLSL shaders
  • Another live demo with custom kaleidoscope GLSL shader
  • Hooking into Stripe's API to handle subscriptions