
Striking a Chord: The Art of Animating Music With Meaning

by Denis Gonchar, October 18th, 2023

Too Long; Didn't Read

Combining a passion for music with tech expertise, this piece walks readers through the creation of a distinct visual album website. Challenges, coding shortcuts, and the role of the Open Graph protocol are highlighted, while emphasizing the balance between perfectionism and swift execution.

Cover for the album by DALL·E 3


In this article, I'll share how I built a weekend project to release my album (https://evhaevla.netlify.app/). I'm not a trained musician or composer, but sometimes tunes pop into my head. I jot them down and let the computer play them.


In 2021, I released my album titled "Everyone is Happy, Everyone is Laughing." It's a simple album by an unknown "composer" – that's me.


I'm not just into music; I'm also a developer, mainly focusing on frontend work recently. I thought, why not combine these two loves? So, I set out to design a website to visually present my album.


This article won't dive deep into every technical detail — that would be too lengthy and might not appeal to everyone. Instead, I'll highlight the core concepts and the hurdles I encountered. For those interested, all the code can be found on GitHub.

Visual

My album is composed for piano, which made the choice of visuals straightforward: rectangles descending onto the piano keys. Anyone with a musical inclination has likely seen countless YouTube videos depicting notes this way. A rectangle touches a key, the key lights up, and that's the precise moment to strike the note.


I’m uncertain of the origin of this visual style, but a quick Google search predominantly yields screenshots of Synthesia.

Synthesia UI

On YouTube, there are creators who manage to produce visually stunning effects. Viewing such videos is a treat, both from aesthetic and musical perspectives. Watch this or this.


What will we need to implement?

  1. Keys
  2. Rectangles
  3. Audio
  4. Animation


Let's tackle each point and put all of these into action.

Keys

Initially, I assumed that implementing the keys would pose the greatest challenge. However, a quick online search revealed a plethora of examples and guides on how to do just that. Aiming for a design with a touch of elegance, I opted for an example created by Philip Zastrow.


All that was left for me to do was replicate the keys several times and establish a grid for the notes to glide across. I employed Vue.js as the frontend framework, and below is the component code.

<template>
  <ul style="transform: translate3d(0, 0, 0)">
    <li :id="`key_${OFFSET - 1}`" style="display: none"></li>
    <template v-for="key in keys" :key="key.number">
      <li :class="`${key.color} ${key.name}`" :id="`key_${key.number}`"></li>
    </template>
  </ul>
</template>

<script setup lang="ts">
import { ref } from 'vue'
import type { Key } from '@/components/types'

// MIDI note number of the lowest rendered key (24 = C1)
const OFFSET = 24

const template = [
  {
    color: 'white',
    name: 'c' // Do
  },
  {
    color: 'black',
    name: 'cs' // Do-diez
  },
  {
    color: 'white',
    name: 'd' // Re
  },
  /* ... */
]

const keys = ref<Key[]>([])

// 72 keys (six octaves), numbered to match their MIDI note numbers
for (let i = 0; i < 72; i++) {
  keys.value.push({
    ...template[i % 12],
    number: i + OFFSET
  })
}
</script>

I'd like to mention that I appended an id attribute to each key, which will be essential when initiating their animations.

Rectangles

While this may appear to be the simplest segment, a few challenges are hidden in plain sight.


  1. How can the effect of descending notes be accomplished?
  2. Is there a need to maintain a structure that can be queried to retrieve the current notes?
  3. What’s the best approach to render the outcomes of such queries?


Each question presents an obstacle to navigate to achieve the desired effect seamlessly.


I won’t linger on each question but will rather cut straight to the chase. Given the myriad challenges associated with the dynamic approach, it’s wise to heed Occam’s Razor and opt for a static solution.


Here’s how I addressed it: I rendered all 6215 notes simultaneously on a single expansive canvas. This canvas is housed within a container styled with the overflow: hidden property. Achieving the falling notes effect is then simply a matter of animating the scrollTop of this container.
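
To make that concrete, here is a minimal sketch of the layout. The class names, the `notes` array, `totalHeightPx`, and the `noteStyle` helper are hypothetical stand-ins, not the project's actual code:

<template>
  <!-- The viewport shows only a window onto the tall canvas inside it -->
  <div ref="containerRef" class="viewport">
    <!-- The canvas is tall enough to hold every note rectangle at once -->
    <div class="canvas" :style="{ height: `${totalHeightPx}px` }">
      <div v-for="note in notes" :key="note.id" class="note" :style="noteStyle(note)"></div>
    </div>
  </div>
</template>

<style>
.viewport {
  height: 100vh;
  overflow: hidden; /* clips everything outside the window; scrollTop can still be set programmatically */
}
.note {
  position: absolute;
}
</style>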


However, a lingering question remains: how do I obtain the coordinates for each note?


Fortunately, I have a MIDI file where all these notes are archived, a convenience afforded by being the composer of the album. It boils down to rendering the notes utilizing the data extracted from the MIDI file.


Given that the MIDI file is in binary format and I had no intention of parsing it myself, I enlisted the help of the midi-file library.


The midi-file library is efficient in extracting raw data from MIDI files, but for my needs, that’s not sufficient. I aim to transform this data into a more accessible and application-friendly format to facilitate seamless rendering within the app.
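
For context, reading the raw events looks roughly like this (`parseMidi` is the library's entry point; the file path is a placeholder):

import { parseMidi } from 'midi-file'

// Fetch the binary MIDI file and parse it into a header plus tracks of events
const response = await fetch('/album.mid') // placeholder path
const midi = parseMidi(new Uint8Array(await response.arrayBuffer()))

console.log(midi.header.ticksPerBeat)
// Each track is a flat array of events with *relative* timestamps in ticks, e.g.
// { deltaTime: 32, type: 'noteOn', channel: 0, noteNumber: 60, velocity: 80 }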


In a MIDI file, it’s not notes in the usual sense that I’m dealing with, but events. There’s a whole array of these events, but I’m primarily focused on two types: 'noteOn', which is triggered when a key is pressed, and 'noteOff', when the key is released.


Both 'noteOn' and 'noteOff' events specify the particular note number that was pressed or released, respectively. Time, in the conventional sense, is absent in MIDI. Instead, we have "ticks." The number of ticks per beat is detailed in the MIDI file’s header.

Blue - beats (default length is 96 ticks), Red - note events


Indeed, there's more to consider. A tempo track is also present, containing ‘setTempo’ events that are integral to the process, considering the tempo can change during playback. My initial approach involved adjusting the animation speed of the scrollTop property of the container to align with the tempo.


However, I soon realized this wouldn’t yield the expected outcome due to excessive error accumulation. A 'linear' time stretch proved more effective for animating the scrollTop.


Even with the animation aspect sorted, the tempo still required incorporation. I addressed this by adjusting the lengths of the notes’ rectangles themselves. While not the optimal solution (manipulating speed would be ideal), this method ensured smoother operation.


This solution isn’t perfect, mainly because I associate a tempo event with a note only if the tempo event occurs at or before the note starts. This means that if another tempo event arrives while a note is still playing, it is simply disregarded.


This could potentially introduce an error, particularly if a note is very long and there's a dramatic tempo change occurring during its playtime. It’s a trade-off. I’ve accepted this minor flaw as I’m focused on rapid development.


There are instances where speed takes precedence, and it’s more pragmatic not to get entangled in every detail.

Only the first tempo event is associated with the note
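
Putting the tick and tempo handling together, the conversion to seconds can be sketched like this. It assumes a hypothetical `mergedTrack` where note and tempo events have already been merged and sorted by time, which mirrors the simplification discussed above:

// MIDI's default tempo is 120 BPM, i.e. 500,000 microseconds per beat
let microsecondsPerBeat = 500_000
const ticksPerBeat = midi.header.ticksPerBeat

let seconds = 0

for (const event of mergedTrack) {
  // deltaTime is the gap since the previous event, in ticks; convert that
  // span to seconds using whichever tempo was active during it
  seconds += (event.deltaTime / ticksPerBeat) * (microsecondsPerBeat / 1_000_000)

  if (event.type === 'setTempo') {
    microsecondsPerBeat = event.microsecondsPerBeat
  } else if (event.type === 'noteOn') {
    // `seconds` is the press time for event.noteNumber
  } else if (event.type === 'noteOff') {
    // `seconds` is the release time; duration = release time - press time
  }
}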


So, we're equipped with the following information:


  1. The specific key number
  2. The precise moment the key is pressed
  3. The exact time the key is released
  4. The intensity, or "velocity," with which the key is pressed


With these details at hand, I can pinpoint the exact coordinates on the canvas for every note. The key number determines the X-axis, while the commencement of the key press is the Y-axis. The length of the press dictates the rectangle's height.


By using a standard div element and setting its position to 'absolute', I successfully achieved the desired effect.

The rectangles you observe here are merely straightforward `<div>` elements with applied styles. They're positioned absolutely on an elongated canvas, which is then scrolled through.
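
In code, the `noteStyle` helper from the earlier sketch could look something like this. The constants are hypothetical, and for simplicity every semitone gets an equal-width column, which real piano geometry doesn't quite follow:

// Hypothetical scale factors
const KEY_WIDTH_PX = 20 // width of one key column
const PIXELS_PER_SECOND = 150 // vertical distance covered in one second

function noteStyle(note: { noteNumber: number; positionSeconds: number; durationSeconds: number }) {
  return {
    // Key number → X axis (OFFSET is the same constant as in the keys component)
    left: `${(note.noteNumber - OFFSET) * KEY_WIDTH_PX}px`,
    // Press time → Y axis, measured from the canvas bottom: the canvas scrolls
    // from its maximum offset down to zero, so the earliest notes sit lowest
    // and reach the keys first
    bottom: `${note.positionSeconds * PIXELS_PER_SECOND}px`,
    // Press duration → rectangle height
    height: `${note.durationSeconds * PIXELS_PER_SECOND}px`,
    width: `${KEY_WIDTH_PX}px`
  }
}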


Audio

I didn't intend to build a piano synthesizer, as that would have taken up a lot of time. Instead, I used an OGG file I had already "rendered" from the MIDI, with Native Instruments' The Grandeur as the sound library.


Personally, I believe it's the finest piano VST instrument available.


I embedded the resulting OGG file into a standard audio element. My main task then was to synchronize the audio with the scrollTop animation of my note canvas.
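
The markup side of that is deliberately boring; something like the following, with a placeholder path for the file:

<template>
  <!-- A plain audio element; its ref is what the sync code below hooks into -->
  <audio ref="audioRef" src="/album.ogg" controls></audio>
</template>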

Animation

Before I could tackle synchronization, the animation had to be established first. The canvas animation is fairly straightforward — I animate the scrollTop from its maximum value down to zero, using linear interpolation. The duration of this animation matches the length of the album.
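
Using GSAP (more on it below), that tween could look roughly like this; `containerRef` is the container from the earlier sketch, and `ALBUM_DURATION_SECONDS` is a hypothetical constant:

import gsap from 'gsap'

// The maximum scroll offset is the starting value
const container = containerRef.value!
const maxScroll = container.scrollHeight - container.clientHeight

// ease: 'none' gives the linear time stretch; the timeline starts paused
// because it is driven by the audio element's events (see the sync code below)
const globalTl = gsap.timeline({ paused: true })
globalTl.fromTo(
  container,
  { scrollTop: maxScroll },
  { scrollTop: 0, duration: ALBUM_DURATION_SECONDS, ease: 'none' },
  0
)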


When a note descends onto a key, that key illuminates. This means that for each note's descent, I need to "activate" the corresponding key, and once the note completes its course, deactivate it.


With a total of 6215 notes, this equates to a whopping 12,430 note activation and deactivation animations.


Additionally, I aimed to provide users with the capability to rewind the audio, enabling them to navigate anywhere within the album. To implement such a feature, a robust solution is essential.


And when faced with the need for a dependable solution that "just works", my go-to is always the GreenSock Animation Platform.


Look at how little code it takes to create all the animations for the keys. Using ids to animate components isn't the best practice for single-page applications, but this method is a real time-saver. Recall the id I added to each key? This is where they come into play.


const keysTl = gsap.timeline()
notes.value.forEach((note) => {
  // The ids assigned in the keys component make each key addressable by note number
  const keySelector = `#key_${note.noteNumber}`

  keysTl
    // Activate the key at the exact moment the note starts...
    .set(keySelector, KEY_ACTIVE_STATE, note.positionSeconds)
    // ...and deactivate it just before it ends, so repeated notes still flash
    .set(keySelector, KEY_INACTIVE_STATE, note.positionSeconds + note.durationSeconds - 0.02)
})


The synchronization code essentially establishes a connection through events between the audio and the GSAP global timeline.


// 'timeupdate' keeps the GSAP clock locked to the audio clock,
// which also makes seeking in the audio element work for free
audioRef.value?.addEventListener('timeupdate', () => {
  const time = audioRef.value?.currentTime ?? 0
  globalTl.time(time)
})

audioRef.value?.addEventListener('play', () => {
  globalTl.play()
})

audioRef.value?.addEventListener('playing', () => {
  globalTl.play()
})

// Pause the animation while the audio is buffering or paused
audioRef.value?.addEventListener('waiting', () => {
  globalTl.pause()
})

audioRef.value?.addEventListener('pause', () => {
  globalTl.pause()
})


Captions

Just when I felt like wrapping up, an intriguing idea sprang to mind. What if I added a unique twist to the album? It wasn't originally on my to-do list, but I felt the project wouldn't truly shine without this feature. So, I chose to incorporate it as well.


Every time I immerse myself in a track, I find myself reflecting on its deeper meanings. What message was the composer trying to convey? Consider, for instance, a segment in "Nightbook" by Ludovico Einaudi. The piano resonates in the left ear, while the strings echo in the right.


It crafts the ambiance of a dialogue unfolding between the two. It feels as if the piano is asking: "Do you agree?" The strings respond in the affirmative. "Are you sure?" The strings echo their affirmation. The sequence culminates with both instruments converging, symbolizing a realization of unity and harmony. Isn't that a mesmerizing experience?


It's imperative to mention that this is purely my personal interpretation. Once, I had the opportunity to attend a Ludovico concert in Milan. After the performance, I approached him and inquired if he had indeed intended to embed the notion of a dialogue in that particular segment.


His response was enlightening: "I never pondered it that way, but you certainly possess a vivid imagination."


Drawing from that experience, I pondered: what if I integrated subtitles into the sheet music? As specific segments play, commentary could materialize on-screen, providing insights or interpretations into the composer's intent.


This feature could offer listeners a deeper understanding or a fresh perspective on "what did the author truly mean?"


It was fortunate that I chose GSAP as my animation tool. It allowed me to effortlessly integrate another timeline, specifically tasked with animating the commentary. This addition streamlined the process and made the implementation of my idea much smoother.
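
Wiring that in amounts to very little; a sketch, assuming the `globalTl` timeline from the synchronization code above:

import gsap from 'gsap'

// A dedicated timeline for the captions, nested into the global timeline
// so it inherits play/pause/seek behavior from the audio synchronization
const commentsTl = gsap.timeline()
globalTl.add(commentsTl, 0)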


I wanted to define the comments in HTML markup. To achieve this, I crafted a component that registers its animation during the onMounted hook.

<template>
  <div :class="$style.comment" ref="commentRef">
    <slot></slot>
  </div>
</template>

<script setup lang="ts">
/* ... */

onMounted(() => {
  if (!commentRef.value) return

  props.timeline
    // Fade in either at an absolute timestamp (e.g. "0:01"), at an offset
    // after the previous comment, or one second after it by default
    .fromTo(
      commentRef.value,
      {
        autoAlpha: 0
      },
      {
        autoAlpha: 1,
        duration: 0.5
      },
      props.time ? parseTime(props.time) : props.delay ? `+=${props.delay}` : '+=1'
    )
    // Fade back out once the comment's duration has elapsed
    .to(
      commentRef.value,
      {
        autoAlpha: 0,
        duration: 0.5
      },
      props.time ? parseTime(props.time) + props.duration : `+=${props.duration}`
    )
})
</script>


The use of this component would be as follows.

<template>
  <div>
    <Comment time="0:01" :duration="5" :timeline="commentsTl">
      <h1>A title for a track</h1>
    </Comment>
    <Comment :delay="1" :duration="13" :timeline="commentsTl">
      I would like to say...
    </Comment>
  </div>
</template>

Apotheosis

With all the elements in place, the next step was to host the site. I opted for Netlify. Now, I invite you to experience the album and view the final presentation.


https://evhaevla.netlify.app


I genuinely hope there are other piano-loving developers out there, eager to showcase their albums in such a unique manner. If you're one of them, don't hesitate to fork the project.